Probabilistic information processing
How can we extract meaningful and reliable information under uncertainty?
This is a fundamental problem that arises in many situations in information processing.
One of our interests is to evaluate techniques for inference, learning, and adaptation by constructing probabilistic models that are suitable for describing uncertainty in information processing.
In particular, to obtain feasible algorithms for inference and learning that run in moderate time, we examine mathematical formulations of probabilistic models while traversing several research fields, for instance statistical science, information theory, statistical physics, and information geometry.
Statistical mechanics, which was established to describe the nature of matter in our world, can be regarded as a theoretical framework for inference in probabilistic models with an enormous number of degrees of freedom, on the order of Avogadro's number.
We study the application of methods that emerged from statistical mechanics to specific problems in information processing, an approach known as statistical-mechanical informatics.
In particular, our studies span not only mathematical comparisons between the theoretical formulations of information theory and statistical science, but also specific applications: digital wireless communication, statistical learning theory, neural networks, optimization problems, random matrix theory, econophysics, and quantum annealing.
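As a toy illustration of this viewpoint, consider the one-dimensional Ising chain, a standard statistical-mechanics model: its partition function, which a naive computation obtains only by summing over all 2^N spin configurations, can be computed exactly with the transfer-matrix method. This is a minimal sketch chosen for illustration (the function names and parameters are our own, not from the text above):

```python
import itertools
import math

def ising_Z_bruteforce(N, beta, J=1.0):
    # Partition function of a periodic 1D Ising chain,
    # summing exp(-beta * E) over all 2^N spin configurations.
    Z = 0.0
    for s in itertools.product([-1, 1], repeat=N):
        E = -J * sum(s[i] * s[(i + 1) % N] for i in range(N))
        Z += math.exp(-beta * E)
    return Z

def ising_Z_transfer(N, beta, J=1.0):
    # Same quantity via the 2x2 transfer matrix: Z = lambda1^N + lambda2^N,
    # where lambda1,2 = 2*cosh(beta*J), 2*sinh(beta*J) are its eigenvalues.
    lam1 = 2.0 * math.cosh(beta * J)
    lam2 = 2.0 * math.sinh(beta * J)
    return lam1 ** N + lam2 ** N
```

The brute-force sum is exponential in N, while the transfer-matrix form is closed, mirroring how statistical-mechanical techniques turn intractable-looking sums over many degrees of freedom into tractable computations.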
Information and communication theory
How can we transmit reliable information through noisy channels?
Such a problem in information processing can be treated as inference, learning, and adaptation under uncertainty.
From this fascinating point of view, we assess the theoretical capacity of CDMA (Code-Division Multiple Access), which is used in modern mobile-phone communications and wireless LAN; establish algorithms to extract reliable information from noisy received messages; and propose error-correcting codes that approach the theoretical limit.
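The simplest error-correcting code, the repetition code over a binary symmetric channel, already shows the basic idea of trading redundancy for reliability. The following sketch (our own illustration, not one of the codes studied in this project) encodes each bit n times and decodes by majority vote:

```python
from math import comb

def repetition_encode(bit, n=3):
    # Transmit n identical copies of the bit.
    return [bit] * n

def majority_decode(received):
    # Decode to whichever bit value occurs in the majority of copies.
    return 1 if 2 * sum(received) > len(received) else 0

def block_error_prob(p, n=3):
    # Probability that a BSC with flip probability p corrupts more than
    # half of the n copies, causing the majority vote to fail.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))
```

For a raw flip probability p = 0.1, the 3-repetition code reduces the bit error probability to 3p^2(1 - p) + p^3 = 0.028, at the cost of tripling the transmission length; more sophisticated codes approach the channel capacity far more efficiently.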
Statistical learning theory
One of the subjects of our project is the evaluation of techniques for inference, learning, and adaptation.
We currently study image restoration, clustering of multidimensional data, estimation of information measures from data, inference in digital wireless communication, and reinforcement learning.
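To give a concrete flavor of one of these topics, here is a minimal one-dimensional k-means clustering sketch: alternately assign each point to its nearest center and move each center to the mean of its assigned points. This is an illustrative toy (deterministic, non-robust initialization), not the methods actually studied in the project:

```python
def kmeans(points, k, iters=20):
    # Minimal 1-D k-means: illustrative sketch, not a robust implementation.
    # Deterministic initialization: k evenly spaced sample points.
    centers = [points[i * len(points) // k] for i in range(k)]
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: (p - centers[c]) ** 2)
            clusters[j].append(p)
        # Update step: move each center to the mean of its cluster
        # (keep the old center if a cluster happens to be empty).
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)
```

On two well-separated groups of points, the centers converge to the group means within a few iterations; probabilistic formulations of clustering replace the hard assignment step with posterior responsibilities.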
Consider data such as pictures, sounds, and movies.
These kinds of data need compression, since their size keeps growing larger and larger.
Doesn't this seem wasteful?
It would be a better choice to acquire and store only the relevant data in the first place, rather than to compress raw data that contains irrelevant redundancy.
"Compressed sensing" provides us with such a practically useful technique, and it has attracted many researchers.
We study the mathematical aspects and theoretical limits of this fascinating subject, compressed sensing.
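The core phenomenon of compressed sensing is that a sparse signal can be recovered from far fewer measurements than its ambient dimension. The sketch below illustrates this with Orthogonal Matching Pursuit, one standard greedy recovery algorithm (our own illustrative choice; the problem sizes and seed are arbitrary):

```python
import numpy as np

def omp(A, y, k):
    # Orthogonal Matching Pursuit: greedily build a k-sparse estimate
    # of x from the underdetermined linear measurements y = A @ x.
    residual = y.copy()
    support = []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares refit on the selected support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

# A 2-sparse signal in 50 dimensions, observed through only 25
# random Gaussian measurements.
rng = np.random.default_rng(0)
m, n = 25, 50
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[5, 33]] = [1.0, -0.7]
y = A @ x_true
x_hat = omp(A, y, k=3)  # allow one greedy step beyond the true sparsity
```

With high probability over the random measurement matrix, the greedy steps identify the true support and the least-squares refit recovers the signal exactly; characterizing when and why such recovery succeeds is precisely the kind of theoretical limit studied in compressed sensing.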