1 October, 2007  ALT&DS2007 (Sendai, Japan)

Introduction to Probabilistic Image Processing and Bayesian Networks

Kazuyuki Tanaka
Graduate School of Information Sciences, Tohoku University, Sendai, Japan
[email protected]
http://www.smapip.is.tohoku.ac.jp/~kazu/
Contents
1. Introduction
2. Probabilistic Image Processing
3. Gaussian Graphical Model
4. Belief Propagation
5. Other Applications
6. Concluding Remarks
Markov Random Fields for Image Processing
S. Geman and D. Geman (1986): IEEE Transactions on PAMI
Image Processing for Markov Random Fields (MRF) (Simulated Annealing, Line Fields)

J. Zhang (1992): IEEE Transactions on Signal Processing
Image Processing in EM algorithm for Markov Random Fields (MRF) (Mean Field Methods)

Markov Random Fields are one of the probabilistic methods for image processing.
Markov Random Fields for Image Processing
In Markov Random Fields, we have to consider not only the states with high probabilities but also those with low probabilities. We also have to estimate not only the image but also the hyperparameters in the probabilistic model. Both tasks require calculating statistical quantities repeatedly.

Hyperparameter Estimation
Statistical Quantities
Estimation of Image

We need a deterministic algorithm for calculating statistical quantities: Belief Propagation.
Belief Propagation

Belief Propagation was proposed in order to achieve probabilistic inference systems (Pearl, 1988). It has been suggested that Belief Propagation has a close relationship to Mean Field Methods in statistical mechanics (Kabashima and Saad, 1998). Generalized Belief Propagation was proposed based on Advanced Mean Field Methods (Yedidia, Freeman and Weiss, 2000). An interpretation of Generalized Belief Propagation has also been presented in terms of Information Geometry (Ikeda, T. Tanaka and Amari, 2004).
Probabilistic Model and Belief Propagation
Examples:

Tree (nodes 1, 2, 3, 4, 5):
  A(x1, x2) B(x2, x3) C(x2, x4) D(x2, x5)

Cycle (nodes 1, 2, 3):
  A(x1, x2) B(x2, x3) C(x3, x1)

A function consisting of a product of functions of two variables can be assigned a graph representation.

Belief Propagation gives us an exact result for the statistical quantities of probabilistic models with tree graph representations. In general, Belief Propagation cannot give us an exact result for probabilistic models with cycle graph representations.
Applications of Belief Propagation

Turbo and LDPC codes in Error Correcting Codes (Berrou and Glavieux: IEEE Trans. Comm., 1996; Kabashima and Saad: J. Phys. A, 2004, Topical Review)
CDMA Multiuser Detection in Mobile Phone Communication (Kabashima: J. Phys. A, 2003)
Satisfiability (SAT) Problems in Computation Theory (Mezard, Parisi, Zecchina: Science, 2002)
Image Processing (Tanaka: J. Phys. A, 2002, Topical Review; Willsky: Proceedings of IEEE, 2002)
Probabilistic Inference in AI (Kappen and Wiegerinck: NIPS, 2002)

Applying belief propagation to problems formulated as probabilistic models with cycle graph representations has led to many successful results.
Purpose of My Talk
Review of the formulation of probabilistic models for image processing by means of conventional statistical schemes.
Review of probabilistic image processing using the Gaussian graphical model (Gaussian Markov Random Fields) as the most basic example.
Review of how to construct a belief propagation algorithm for image processing.
Image Representation in Computer Vision
A digital image is defined on a set of points arranged on a square lattice. The elements of such a digital array are called pixels. We have to treat more than 100,000 pixels even in digital cameras and mobile phones.

[Figure: pixels indexed by coordinates (x, y) on a square lattice]
640 x 480 = 307,200 pixels
Image Representation in Computer Vision
256 x 256 = 65,536 pixels

At each point, the intensity of light is represented as an integer or a real number in the digital image data. A monochrome digital image is then expressed as a two-dimensional light intensity function f(x, y), whose value is proportional to the brightness of the image at the pixel (x, y), with 0 <= f(x, y) <= 255.
Noise Reduction by Conventional Filters

Smoothing Filters

The function of a linear filter is to take the sum of the products of the mask coefficients and the intensities of the pixels. Example (3x3 averaging): the centre value 219 is replaced by the average, 173, of its 3x3 neighbourhood.

  192 202 190        192 202 190
  202 219 120   ->   202 173 120
  100 218 110        100 218 110

It is expected that probabilistic algorithms for image processing can be constructed from such aspects of conventional signal processing.

Markov Random Fields -> Probabilistic Image Processing Algorithm
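The mask operation described above can be sketched in a few lines of Python. The 3x3 intensities and the averaging mask follow the slide; the function name `smooth` and the interior-only boundary handling are choices of this sketch.

```python
import numpy as np

def smooth(image, mask):
    """Linear filter: at each interior pixel, take the sum of the products
    of the mask coefficients and the neighbouring intensities."""
    h, w = image.shape
    mh, mw = mask.shape
    out = image.astype(float).copy()
    for y in range(mh // 2, h - mh // 2):
        for x in range(mw // 2, w - mw // 2):
            patch = image[y - mh // 2:y + mh // 2 + 1,
                          x - mw // 2:x + mw // 2 + 1]
            out[y, x] = np.sum(mask * patch)
    return out

# 3x3 averaging mask: every coefficient is 1/9
mask = np.full((3, 3), 1.0 / 9.0)
img = np.array([[192, 202, 190],
                [202, 219, 120],
                [100, 218, 110]], dtype=float)
print(round(smooth(img, mask)[1, 1]))  # prints 173, as on the slide
```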
Bayes Formula and Bayesian Network

Bayes Rule:

  Pr{B|A} = Pr{A|B} Pr{B} / Pr{A}

Pr{B|A} is the posterior probability and Pr{B} is the prior probability. Event A is given as the observed data, and event B corresponds to the original information to estimate. Thus the Bayes formula can be applied to the estimation of the original information from the given data.

[Bayesian network: B -> A, the data-generating process]
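As a toy numerical check of the Bayes formula, take B to be "the original pixel is white" and A to be "the observed pixel is white"; all the probability values below are hypothetical.

```python
# Toy check of the Bayes formula Pr{B|A} = Pr{A|B} Pr{B} / Pr{A}.
# B: original pixel is white; A: observed pixel is white.
p_B = 0.7            # prior: original pixel is white (hypothetical)
p_A_given_B = 0.9    # noise keeps a white pixel white (hypothetical)
p_A_given_notB = 0.2 # noise turns a non-white pixel white (hypothetical)

# Pr{A} by the law of total probability
p_A = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)

# posterior of the original information given the data
p_B_given_A = p_A_given_B * p_B / p_A
print(round(p_B_given_A, 4))  # prints 0.913
```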
Image Restoration by Probabilistic Model

Original Image -> (transmission, noise) -> Degraded Image

Bayes Formula:

  Pr{Original Image | Degraded Image}
    = Pr{Degraded Image | Original Image} Pr{Original Image} / Pr{Degraded Image}

The left-hand side is the posterior, Pr{Degraded Image | Original Image} is the likelihood, and Pr{Original Image} is the prior.

Assumption 1: The degraded image is randomly generated from the original image according to the degradation process.
Assumption 2: The original image is randomly generated according to the prior probability.
Image Restoration by Probabilistic Model

fi: light intensity of pixel i in the original image
gi: light intensity of pixel i in the degraded image
ri = (xi, yi): position vector of pixel i

The original image and the degraded image are represented by f = {fi} and g = {gi}, respectively.
Probabilistic Modeling of Image Restoration

Assumption 1: A given degraded image is obtained from the original image by changing the state of each pixel to another state with the same probability, independently of the other pixels:

  Pr{G = g | F = f} = ∏_{i=1}^{N} Ψ(g_i, f_i)

The random fields f = {fi} and g = {gi} are coupled only through these single-pixel factors Ψ.
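Assumption 1 can be sketched as a binary symmetric channel: each pixel flips independently with the same probability q. The factor name `psi` and the value q = 0.1 are hypothetical choices of this sketch.

```python
import numpy as np

# Sketch of Assumption 1 for binary images: each pixel flips
# independently with the same probability q (hypothetical value).
q = 0.1

def psi(g_i, f_i):
    """Single-pixel likelihood factor Pr{g_i | f_i}."""
    return 1 - q if g_i == f_i else q

rng = np.random.default_rng(0)
f = rng.integers(0, 2, size=(8, 8))   # stand-in binary original image
flips = rng.random((8, 8)) < q        # independent flips, probability q
g = np.where(flips, 1 - f, f)         # degraded image

# Pr{G = g | F = f} as the product of the single-pixel factors
likelihood = np.prod([psi(gi, fi) for gi, fi in zip(g.ravel(), f.ravel())])
print(likelihood > 0)  # prints True
```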
Probabilistic Modeling of Image Restoration

Assumption 2: The original image is generated according to a prior probability that consists of a product of functions defined on neighbouring pixels:

  Pr{F = f} = ∏_{(i,j)∈B} Φ(f_i, f_j)

where B is the set of all the nearest-neighbour pairs of pixels, and the product runs over all such pairs.
Prior Probability for Binary Image

It is important how we should choose the function Φ(f_i, f_j) in the prior probability

  Pr{F = f} = ∏_{(i,j)∈B} Φ(f_i, f_j),   f_i ∈ {0, 1}.

We assume that every nearest-neighbour pair of pixels prefers to take the same state in the prior probability:

  Φ(1,1) = Φ(0,0) = p/2  >  Φ(0,1) = Φ(1,0) = (1−p)/2,   p > 1/2.
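A sketch of such a nearest-neighbour prior factor, with agreeing pairs weighted p/2 and disagreeing pairs (1−p)/2 for p > 1/2; the value p = 0.8 is hypothetical.

```python
# Binary prior factor: neighbouring pixels agree with weight p/2 and
# disagree with weight (1-p)/2, where p > 1/2 (p = 0.8 is hypothetical).
p = 0.8

def phi(fi, fj):
    return p / 2 if fi == fj else (1 - p) / 2

# The prior prefers configurations with fewer disagreeing pairs:
# a centre pixel agreeing with its 4 neighbours outweighs one that
# disagrees with all of them.
agree_all = phi(1, 1) ** 4     # centre agrees with 4 white neighbours
disagree_all = phi(0, 1) ** 4  # centre disagrees with all 4
print(agree_all > disagree_all)  # prints True
```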
Prior Probability for Binary Image

Which state should the centre pixel take when the states of the neighbouring pixels are fixed to white? The prior probability prefers the configuration with the least number of disagreeing nearest-neighbour pairs (red lines): with the weights Φ(1,1) = Φ(0,0) = p/2 > Φ(0,1) = Φ(1,0) = (1−p)/2, the all-white configuration wins.
Prior Probability for Binary Image

Which state should the centre pixel take when the states of the neighbouring pixels are fixed as in this figure? The prior probability prefers the configuration with the least number of disagreeing pairs (red lines).
What happens for a large number of pixels?

[Plot: covariance between the nearest-neighbour pairs of pixels as a function of p (axis scaled by ln p), sampled by Markov chain Monte Carlo. Small p gives the disordered state, large p the ordered state, with a critical point and large fluctuations in between.]

Patterns with both ordered states and disordered states are often generated near the critical point.
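Sampling from this binary prior by Markov chain Monte Carlo can be sketched with a single-site Gibbs sampler. The lattice size, the value of p, and the sweep count are hypothetical; the pairwise weights p/2 and (1−p)/2 follow the binary prior above.

```python
import numpy as np

# Minimal Gibbs sampler for the binary nearest-neighbour prior
# Pr{F = f} ∝ ∏ Φ(f_i, f_j); a sketch, not the talk's exact sampler.
rng = np.random.default_rng(1)
L, p, sweeps = 16, 0.8, 50  # hypothetical lattice size and hyperparameter
f = rng.integers(0, 2, size=(L, L))

def cond_prob_one(f, y, x):
    """Pr{f_yx = 1 | neighbours} under the pairwise prior."""
    w1 = w0 = 1.0
    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ny, nx = y + dy, x + dx
        if 0 <= ny < L and 0 <= nx < L:
            w1 *= p / 2 if f[ny, nx] == 1 else (1 - p) / 2
            w0 *= p / 2 if f[ny, nx] == 0 else (1 - p) / 2
    return w1 / (w1 + w0)

for _ in range(sweeps):          # full sweeps over the lattice
    for y in range(L):
        for x in range(L):
            f[y, x] = 1 if rng.random() < cond_prob_one(f, y, x) else 0

print(f.shape)  # prints (16, 16)
```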
Pattern near Critical Point of Prior Probability

[Plot: covariance between the nearest-neighbour pairs of pixels as a function of p (axis scaled by ln p); small p gives disordered patterns, large p ordered patterns.]

We regard the patterns generated near the critical point as similar to the local patterns in real-world images.
Bayesian Image Analysis

  Pr{F = f | G = g} = Pr{G = g | F = f} Pr{F = f} / Pr{G = g}
                    ∝ ∏_{i∈Ω} Ψ(g_i, f_i) ∏_{(i,j)∈B} Φ(f_i, f_j)

where Ω is the set of all the pixels and B is the set of all the nearest-neighbour pairs of pixels. Pr{F = f} is the prior probability, Pr{G = g | F = f} the degradation process, and Pr{F = f | G = g} the posterior probability.

Image processing is reduced to calculations of averages, variances and covariances in the posterior probability.
Estimation of Original Image

We have several choices for estimating the restored image from the posterior probability. In each choice, the computational time is generally of exponential order in the number of pixels.

(1) Thresholded Posterior Mean (TPM) estimation:

  f̂_i = argmin_{z_i} Σ_{z'} (z_i − z'_i)² Pr{F = z' | G = g}

(2) Maximum Posterior Marginal (MPM) estimation:

  f̂_i = argmax_{z_i} Pr{F_i = z_i | G = g},
  Pr{F_i = f_i | G = g} = Σ_{f∖f_i} Pr{F = f | G = g}

(3) Maximum A Posteriori (MAP) estimation:

  f̂ = argmax_z Pr{F = z | G = g}
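A two-pixel toy posterior shows that the MAP and MPM estimates can differ; the probability table below is hypothetical and serves only to illustrate the two criteria.

```python
# Toy contrast of MAP vs MPM on a 2-pixel posterior table.
# posterior[(z1, z2)] = Pr{F = (z1, z2) | G = g}  (hypothetical numbers)
posterior = {(0, 0): 0.4, (0, 1): 0.05, (1, 0): 0.25, (1, 1): 0.3}

# MAP: jointly most probable configuration
map_est = max(posterior, key=posterior.get)

# MPM: maximise each pixel's marginal separately
def marginal(i, zi):
    """Pr{F_i = zi | G = g}, summing the joint table over the other pixel."""
    return sum(p for z, p in posterior.items() if z[i] == zi)

mpm_est = tuple(max((0, 1), key=lambda zi: marginal(i, zi)) for i in (0, 1))
print(map_est, mpm_est)  # prints (0, 0) (1, 0): the two estimates differ
```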
Statistical Estimation of Hyperparameters

The hyperparameters α and σ are determined so as to maximize the marginal likelihood Pr{G = g | α, σ} with respect to α and σ:

  Pr{G = g | α, σ} = Σ_z Pr{G = g | F = z, σ} Pr{F = z | α}

  (α̂, σ̂) = argmax_{α,σ} Pr{G = g | α, σ}

The marginal likelihood is obtained from the prior Pr{F = f | α} and the degradation process Pr{G = g | F = f, σ} by marginalizing with respect to F.
Maximization of Marginal Likelihood by EM Algorithm
Marginal Likelihood:

  Pr{G = g | α, σ} = Σ_z Pr{G = g | F = z, σ} Pr{F = z | α}

Q-Function:

  Q(α, σ | α', σ') = Σ_z Pr{F = z | G = g, α', σ'} ln Pr{F = z, G = g | α, σ}

EM (Expectation Maximization) Algorithm. The E-step and M-step are iterated until convergence:

  E-step: compute Q(α, σ | α(t), σ(t))
  M-step: (α(t+1), σ(t+1)) = argmax_{α,σ} Q(α, σ | α(t), σ(t))
Bayesian Image Analysis by Gaussian Graphical Model
Prior Probability:

  Pr{F = f | α} ∝ exp( −(α/2) Σ_{(i,j)∈B} (f_i − f_j)² ),   f_i ∈ (−∞, +∞)

B: set of all the nearest-neighbour pairs of pixels
Ω: set of all the pixels

[Figure: patterns generated by the Markov Chain Monte Carlo method for α = 0.0001, 0.0005, 0.0030]
Bayesian Image Analysis by Gaussian Graphical Model

The degradation process is assumed to be additive white Gaussian noise: the degraded image is obtained by adding white Gaussian noise to the original image,

  g_i = f_i + n_i,   n_i ~ N(0, σ²),   f_i, g_i ∈ (−∞, +∞), i ∈ Ω

  Pr{G = g | F = f, σ} = (1/√(2πσ²))^{|Ω|} exp( −(1/2σ²) Σ_{i∈Ω} (g_i − f_i)² )

where Ω is the set of all the pixels.

[Figure: original image f, Gaussian noise n (histogram of Gaussian random numbers), degraded image g]
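The additive white Gaussian degradation g_i = f_i + n_i can be simulated directly; the stand-in random image and the value σ = 40 are hypothetical.

```python
import numpy as np

# Sketch of the degradation process g_i = f_i + n_i, n_i ~ N(0, sigma^2);
# the stand-in image and sigma = 40 are hypothetical.
rng = np.random.default_rng(0)
sigma = 40.0
f = rng.integers(0, 256, size=(64, 64)).astype(float)  # stand-in original
g = f + rng.normal(0.0, sigma, size=f.shape)           # degraded image

# The empirical noise standard deviation should be close to sigma.
print(abs(np.std(g - f) - sigma) < 5.0)  # prints True
```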
Bayesian Image Analysis by Gaussian Graphical Model

Posterior Probability:

  P(f | g, α, σ) ∝ exp( −(1/2σ²) Σ_{i∈Ω} (g_i − f_i)² − (α/2) Σ_{(i,j)∈B} (f_i − f_j)² )

with f_i, g_i ∈ (−∞, +∞). The |Ω| × |Ω| matrix C is defined by

  C_{ij} = 4  (i = j),   C_{ij} = −1  ((i,j) ∈ B),   C_{ij} = 0  (otherwise).

The average of the posterior probability can be calculated by using the multi-dimensional Gaussian integral formula:

  ⟨f⟩ = ∫ z P(z | g, α, σ) dz = (I + ασ²C)^{−1} g

B: set of all the nearest-neighbour pairs of pixels
Ω: set of all the pixels
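A sketch of the posterior mean m = (I + ασ²C)⁻¹ g on a small lattice. Here C is built with C_ii equal to the number of neighbours of pixel i (4 in the interior, fewer on the boundary, so the prior stays well defined on a finite lattice) and C_ij = −1 for nearest-neighbour pairs; α, σ and the random stand-in for g are hypothetical.

```python
import numpy as np

# Posterior mean of the Gaussian graphical model on a small L x L lattice:
#   m = (I + alpha * sigma^2 * C)^{-1} g
# alpha and sigma are hypothetical values.
L, alpha, sigma = 8, 0.001, 40.0
N = L * L

def build_C(L):
    """Lattice matrix C: diagonal = number of neighbours, off-diagonal = -1."""
    C = np.zeros((L * L, L * L))
    for y in range(L):
        for x in range(L):
            i = y * L + x
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < L and 0 <= nx < L:
                    C[i, i] += 1
                    C[i, y * L + x - i + ny * L + nx] = -1  # C[i, j] = -1
    return C

rng = np.random.default_rng(0)
g = rng.normal(128.0, 50.0, size=N)        # stand-in degraded image
C = build_C(L)
m = np.linalg.solve(np.eye(N) + alpha * sigma**2 * C, g)

# Smoothing shrinks the spread of intensities.
print(np.var(m) < np.var(g))  # prints True
```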
Bayesian Image Analysis by Gaussian Graphical Model

Iteration procedure of the EM algorithm in the Gaussian graphical model, with posterior mean m(t) = (I + α(t)σ(t)²C)^{−1} g:

  σ(t+1)² = (1/|Ω|) ( ‖g − m(t)‖² + σ(t)² Tr[(I + α(t)σ(t)²C)^{−1}] )

  α(t+1) = |Ω| / ( m(t)ᵀ C m(t) + σ(t)² Tr[C (I + α(t)σ(t)²C)^{−1}] )

The estimate of the restored image is f̂ = m(t) at convergence; for integer intensities, f̂_i = argmin_{z_i} (z_i − m_i(t))².

[Plot: convergence of α(t) (vertical axis 0 to 0.001) over EM iterations t = 0 to 100]
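The EM iteration for the Gaussian graphical model can be sketched as follows, written on a 1-D chain so the matrix algebra stays small. The M-step updates follow from maximizing the Q-function under the quadratic prior and additive Gaussian noise; treat them as a reconstruction, not the talk's exact expressions, and the chain graph and initial values as hypothetical.

```python
import numpy as np

def chain_laplacian(n):
    """Graph Laplacian of a 1-D chain (small stand-in for the lattice C)."""
    C = np.zeros((n, n))
    for i in range(n - 1):
        C[i, i] += 1
        C[i + 1, i + 1] += 1
        C[i, i + 1] = C[i + 1, i] = -1
    return C

def em_ggm(g, C, alpha=1e-3, sigma2=100.0, iters=50):
    """EM for the Gaussian graphical model (sketch).

    Posterior covariance V = sigma2 * (I + alpha*sigma2*C)^{-1},
    posterior mean      m = (I + alpha*sigma2*C)^{-1} g.
    """
    N = len(g)
    I = np.eye(N)
    for _ in range(iters):
        Ainv = np.linalg.inv(I + alpha * sigma2 * C)
        m = Ainv @ g
        tr_V = sigma2 * np.trace(Ainv)       # Tr of posterior covariance
        tr_CV = sigma2 * np.trace(C @ Ainv)  # Tr[C V]
        # M-step updates (computed from the current posterior statistics)
        sigma2 = (np.sum((g - m) ** 2) + tr_V) / N
        alpha = N / (m @ C @ m + tr_CV)
    return alpha, sigma2, m

rng = np.random.default_rng(0)
# stand-in "degraded image": a smooth signal plus noise
g = np.cumsum(rng.normal(size=100)) + rng.normal(0, 2.0, size=100)
alpha, sigma2, m = em_ggm(g, chain_laplacian(100))
print(alpha > 0 and sigma2 > 0)  # prints True
```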
Image Restoration by Markov Random Field Model and Conventional Filters

  MSE = (1/|Ω|) Σ_{i∈Ω} (f̂_i − f_i)²,   Ω: set of all the pixels

  Method                  MSE
  Statistical Method      315
  Lowpass Filter (3x3)    388
  Lowpass Filter (5x5)    413
  Median Filter (3x3)     486
  Median Filter (5x5)     445

[Figure: original image, degraded image, and restored images produced by the MRF model, the (3x3) and (5x5) lowpass filters, and the (3x3) and (5x5) median filters]
Graphical Representation for Tractable Models
Tractable Model (tree graph), with A, B, C ∈ {T, F}:

  Σ_{A} Σ_{B} Σ_{C} f(A, D) g(B, D) h(C, D)
    = ( Σ_{A} f(A, D) ) ( Σ_{B} g(B, D) ) ( Σ_{C} h(C, D) )

It is possible to calculate each summation independently.

Intractable Model (cycle graph):

  Σ_{A} Σ_{B} Σ_{C} f(A, B) g(B, C) h(C, A)

It is hard to calculate each summation independently.
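A quick numerical check of the tree factorization: the triple sum over f(A,D) g(B,D) h(C,D) splits into three independent sums. The tables f, g, h below hold arbitrary hypothetical values.

```python
# Check that the tree-structured triple sum factorizes.
vals = (True, False)

# arbitrary hypothetical function tables (booleans count as 0/1 here)
f = {(a, d): 1.0 + a + 2 * d for a in vals for d in vals}
g = {(b, d): 2.0 + b + d for b in vals for d in vals}
h = {(c, d): 0.5 + 2 * c + d for c in vals for d in vals}

D = True  # the shared neighbour is held fixed

brute = sum(f[a, D] * g[b, D] * h[c, D]
            for a in vals for b in vals for c in vals)
factored = (sum(f[a, D] for a in vals)
            * sum(g[b, D] for b in vals)
            * sum(h[c, D] for c in vals))
print(abs(brute - factored) < 1e-12)  # prints True
```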
Belief Propagation for Tree Graphical Model

After taking the summations over nodes 3, 4 and 5, the function of nodes 1 and 2 can be expressed in terms of messages:

  Σ_{f3} Σ_{f4} Σ_{f5} A(f1, f2) B(f2, f3) C(f2, f4) D(f2, f5)
    = A(f1, f2) M_{3→2}(f2) M_{4→2}(f2) M_{5→2}(f2)

where

  M_{3→2}(f2) = Σ_{f3} B(f2, f3),
  M_{4→2}(f2) = Σ_{f4} C(f2, f4),
  M_{5→2}(f2) = Σ_{f5} D(f2, f5).
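The message computation on this 5-node tree can be verified numerically; A, B, C, D below are hypothetical 2x2 tables.

```python
import numpy as np
from itertools import product

# Verify: summing out f3, f4, f5 from A(f1,f2)B(f2,f3)C(f2,f4)D(f2,f5)
# leaves A(f1,f2) M3(f2) M4(f2) M5(f2).
rng = np.random.default_rng(0)
A, B, C, D = (rng.random((2, 2)) for _ in range(4))  # hypothetical tables

M3 = B.sum(axis=1)  # M_{3->2}(f2) = sum_{f3} B(f2, f3)
M4 = C.sum(axis=1)  # M_{4->2}(f2) = sum_{f4} C(f2, f4)
M5 = D.sum(axis=1)  # M_{5->2}(f2) = sum_{f5} D(f2, f5)

f1, f2 = 1, 0
brute = sum(A[f1, f2] * B[f2, f3] * C[f2, f4] * D[f2, f5]
            for f3, f4, f5 in product((0, 1), repeat=3))
via_messages = A[f1, f2] * M3[f2] * M4[f2] * M5[f2]
print(abs(brute - via_messages) < 1e-12)  # prints True
```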
Belief Propagation for Tree Graphical Model

By taking the summation over all the nodes except node 1, the message from node 2 to node 1 can be expressed in terms of all the messages incoming to node 2 except its own:

  M_{2→1}(f1) = Σ_{z2} A(f1, z2) M_{3→2}(z2) M_{4→2}(z2) M_{5→2}(z2)
Loopy Belief Propagation for Graphical Model in Image Processing

The graphical model for image processing is represented on the square lattice. The square lattice includes a lot of cycles, so belief propagation is applied to the calculation of statistical quantities as an approximate algorithm. Every subgraph consisting of a pixel and its four neighbouring pixels can be regarded as a tree graph; this is the basis of Loopy Belief Propagation.
Loopy Belief Propagation for Graphical Model in Image Processing

Message Passing Rule in Loopy Belief Propagation. Pixel 2 sends a message to its neighbour pixel 1 by combining the messages incoming from its other neighbours 3, 4 and 5:

  M_{2→1}(f1) = Σ_{z2} Φ(f1, z2) M_{3→2}(z2) M_{4→2}(z2) M_{5→2}(z2)
                / Σ_{z1} Σ_{z2} Φ(z1, z2) M_{3→2}(z2) M_{4→2}(z2) M_{5→2}(z2)

Averages, variances and covariances of the graphical model are expressed in terms of the messages.
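A minimal loopy-BP sketch on a small grid, following the 3-incoming/1-outgoing pattern of the rule above; the evidence and pairwise tables are hypothetical, and a local-evidence factor is included in each message (a common general form).

```python
import numpy as np

# Minimal loopy belief propagation for a binary pairwise MRF on a grid.
L = 8
rng = np.random.default_rng(0)
evidence = rng.random((L, L, 2)) + 0.1     # local factors (hypothetical)
pair = np.array([[0.9, 0.1],               # pairwise factor Phi(z_i, z_j)
                 [0.1, 0.9]])              # (hypothetical)

def neighbours(y, x):
    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        if 0 <= y + dy < L and 0 <= x + dx < L:
            yield y + dy, x + dx

# messages[(j, i)] is the length-2 message from pixel j to neighbour i
messages = {(j, i): np.ones(2)
            for y in range(L) for x in range(L)
            for j in [(y, x)] for i in neighbours(y, x)}

for _ in range(30):  # message-passing sweeps
    new = {}
    for (j, i) in messages:
        # combine j's evidence with messages from j's other neighbours
        incoming = np.ones(2)
        for k in neighbours(*j):
            if k != i:
                incoming = incoming * messages[(k, j)]
        m = pair @ (evidence[j] * incoming)
        new[(j, i)] = m / m.sum()          # normalize the message
    messages = new

# Belief (approximate marginal) at the centre pixel
j = (L // 2, L // 2)
b = evidence[j] * np.prod([messages[(k, j)] for k in neighbours(*j)], axis=0)
b /= b.sum()
print(abs(b.sum() - 1.0) < 1e-9)  # prints True
```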
Loopy Belief Propagation for Graphical Model in Image Processing

We have four kinds of message passing rules for each pixel. Each message passing rule includes 3 incoming messages and 1 outgoing message.

[Figure: visualizations of passing messages]
EM Algorithm by means of Belief Propagation

EM Algorithm for Hyperparameter Estimation (EM update rule of Loopy BP):

  (α(t+1), σ(t+1)) = argmax_{α,σ} Q(α, σ | α(t), σ(t), g)

In each EM update, the statistical quantities required by the Q-function are supplied by Loopy Belief Propagation (input: current hyperparameters; output: messages and approximate marginals).
Probabilistic Image Processing by EM Algorithm and Loopy BP for Gaussian Graphical Model

  (α(t+1), σ(t+1)) = argmax_{α,σ} Q(α, σ | α(t), σ(t), g)

Estimates of the hyperparameters and restoration quality:

  Loopy Belief Propagation:  α̂ = 0.000600,  σ̂ = 36.335,  MSE: 327
  Exact:                     α̂ = 0.000713,  σ̂ = 37.624,  MSE: 315

[Plot: convergence of α(t) (vertical axis 0 to 0.001) over EM iterations t = 0 to 100, for loopy BP and the exact calculation]
Digital Image Inpainting based on MRF

[Figure: input image and inpainted output produced by the Markov Random Field model]

M. Yasuda, J. Ohkubo and K. Tanaka: Proceedings of CIMCA & IAWTIC 2005.
Summary
The formulation of probabilistic models for image processing by means of conventional statistical schemes has been summarized.

Probabilistic image processing using the Gaussian graphical model has been shown as the most basic example.

It has been explained how to construct a belief propagation algorithm for image processing.
Statistical Mechanics Informatics for Probabilistic Image Processing
S. Geman and D. Geman (1986): IEEE Transactions on PAMI
Image Processing for Markov Random Fields (MRF) (Simulated Annealing, Line Fields)

J. Zhang (1992): IEEE Transactions on Signal Processing
Image Processing in EM algorithm for Markov Random Fields (MRF) (Mean Field Methods)

K. Tanaka and T. Morita (1995): Physics Letters A
Cluster Variation Method for MRF in Image Processing

The original ideas of several of these techniques (Simulated Annealing, Mean Field Methods and Belief Propagation) are often based on statistical mechanics.

The mathematical structure of Belief Propagation is equivalent to the Bethe Approximation and the Cluster Variation Method (Kikuchi Method), which are advanced mean-field methods in statistical mechanics.
Statistical Mechanical Informatics for Probabilistic Information Processing
It has been suggested that statistical performance estimations for probabilistic information processing are closely related to spin glass theory. The computational techniques of spin glass theory have been applied to many problems in computer science:

Error Correcting Codes (Y. Kabashima and D. Saad: J. Phys. A, 2004, Topical Review)
CDMA Multiuser Detection in Mobile Phone Communication (T. Tanaka: IEEE Information Theory, 2002)
SAT Problems (Mezard, Parisi, Zecchina: Science, 2002)
Image Processing (K. Tanaka: J. Phys. A, 2002, Topical Review)
SMAPIP Project

MEXT Grant-in-Aid for Scientific Research on Priority Areas
Period: 2002 - 2005
Head Investigator: Kazuyuki Tanaka
Webpage URL: http://www.smapip.eei.metro-u.ac.jp./
Members: K. Tanaka, Y. Kabashima, H. Nishimori, T. Tanaka, M. Okada, O. Watanabe, N. Murata, ......
DEX-SMI Project

Deepening and Expansion of Statistical Mechanical Informatics (DEX-SMI)
MEXT Grant-in-Aid for Scientific Research on Priority Areas
Period: 2006 - 2009
Head Investigator: Yoshiyuki Kabashima
http://dex-smi.sp.dis.titech.ac.jp/DEX-SMI/

Statistical-mechanical informatics, GO! DEX-SMI, GO!
References

K. Tanaka: Statistical-Mechanical Approach to Image Processing (Topical Review), J. Phys. A, 35 (2002).
A. S. Willsky: Multiresolution Markov Models for Signal and Image Processing, Proceedings of IEEE, 90 (2002).