
Probabilistic Graphical Modeling: Study Notes



0. Learning Materials

1. Introduction

A great amount of the problems we face involve modeling the real world with some kind of function (the most direct example is estimation/fitting problems; more broadly, our deep learning and machine learning algorithms mostly fit the real problem with a function).

As a result, Probabilistic Graphical Modeling was conceived to solve such kinds of problems.
There are three main elements in PGM: representation, inference, and learning.

2. Representation

2.1 Bayesian network

It is a directed acyclic graph, and it can deal with variables that have causal (i.e. directed) relationships.

$$p(x_{1}, x_{2}, x_{3}, \ldots, x_{n}) = p(x_{1})\,p(x_{2} \mid x_{1}) \cdots p(x_{n} \mid x_{n-1}, \ldots, x_{2}, x_{1})$$

Based on these relationships, a directed graph can be built, and the probability expression can then be formed, under the assumption that each variable depends only on a subset $A_{i}$ of its ancestors:

$$p(x_{i} \mid x_{i-1}, \ldots, x_{2}, x_{1}) = p(x_{i} \mid x_{A_{i}})$$

Formal definition:
A Bayesian network is a directed graph $G = (V, E)$ together with:

- a random variable $x_{i}$ for each node $i \in V$;
- one conditional probability distribution (CPD) $p(x_{i} \mid x_{A_{i}})$ per node, specifying the probability of $x_{i}$ conditioned on its parents' values.

End of definition
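To make the factorization concrete, here is a minimal runnable sketch (not from the original notes: the network structure and all probability values are invented for illustration) that evaluates the joint of a three-variable network Rain → Wet ← Sprinkler as the product of its CPDs.

```python
# Minimal sketch: evaluating the joint of a tiny Bayesian network
# via the factorization p(x1..xn) = prod_i p(x_i | x_{A_i}).
# The network and all probability values are illustrative assumptions.

p_rain = {True: 0.2, False: 0.8}       # p(Rain)
p_sprinkler = {True: 0.3, False: 0.7}  # p(Sprinkler)
p_wet = {                              # p(Wet = True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.05,
}

def joint(rain: bool, sprinkler: bool, wet: bool) -> float:
    """p(rain, sprinkler, wet) = p(rain) p(sprinkler) p(wet | rain, sprinkler)."""
    p_w = p_wet[(rain, sprinkler)] if wet else 1.0 - p_wet[(rain, sprinkler)]
    return p_rain[rain] * p_sprinkler[sprinkler] * p_w

# Sanity check: the joint sums to one over all eight assignments.
total = sum(joint(r, s, w) for r in (True, False)
            for s in (True, False) for w in (True, False))
print(joint(True, False, True))  # one joint probability: 0.2 * 0.7 * 0.9
print(round(total, 10))          # -> 1.0
```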

2.2 Markov Random Fields

There are cases that a Bayesian network cannot describe, but a Markov Random Field (an undirected graph) can handle some of them. This representation is used more often in the computer vision area.

For example, friendship should not be expressed by a directed edge, and it is also hard to express with a conditional probability. It is therefore natural to introduce the undirected edges of a Markov Random Field.

Formal definition:
A Markov Random Field (MRF) is a probability distribution $p$ over variables $x_{1}, x_{2}, \ldots, x_{n}$ defined by an undirected graph $G$ in which nodes correspond to variables $x_{i}$. The probability $p$ has the form:
$$p(x_{1}, x_{2}, \ldots, x_{n}) = \frac{1}{Z} \prod_{c \in C} \phi_{c}(x_{c})$$
where $C$ denotes the set of cliques (fully connected subgraphs) of $G$, and each factor $\phi_{c}$ is a nonnegative function over the variables in the clique. The partition function
$$Z = \sum_{x_{1}, x_{2}, \ldots, x_{n}} \prod_{c \in C} \phi_{c}(x_{c})$$
is a normalizing constant that ensures that the distribution sums to one.
End of definition
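As a concrete illustration of the definition, the sketch below (the graph, cliques, and factor values are assumptions made up for this example) computes the partition function $Z$ by brute-force enumeration over three binary variables and checks that the normalized distribution sums to one.

```python
from itertools import product

# Minimal MRF sketch over binary x1, x2, x3 with cliques
# C = {(x1, x2), (x2, x3)}; factor values are illustrative assumptions.
phi_12 = {(0, 0): 1.0, (0, 1): 0.5, (1, 0): 0.5, (1, 1): 2.0}
phi_23 = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 0.2, (1, 1): 1.0}

def unnormalized(x1, x2, x3):
    """prod_{c in C} phi_c(x_c) for one full assignment."""
    return phi_12[(x1, x2)] * phi_23[(x2, x3)]

# Partition function: sum the factor product over every assignment.
Z = sum(unnormalized(*x) for x in product((0, 1), repeat=3))

def p(x1, x2, x3):
    """p(x) = (1/Z) prod_c phi_c(x_c)."""
    return unnormalized(x1, x2, x3) / Z

print(Z)
print(p(1, 1, 1))
print(round(sum(p(*x) for x in product((0, 1), repeat=3)), 10))  # -> 1.0
```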

A Bayesian network can always be converted into an undirected network with normalization constant one. The converse is also possible, but may be computationally intractable, and may produce a very large (e.g. fully connected) directed graph.
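The forward direction works because each CPD $p(x_{i} \mid x_{A_{i}})$ can be kept as a clique factor over $\{x_{i}\} \cup A_{i}$, which requires every such family to form a clique in the undirected graph. Below is a minimal sketch of that graph step ("moralization": drop edge directions and marry co-parents), with a hypothetical parent map.

```python
from itertools import combinations

def moralize(parents: dict) -> set:
    """Turn a DAG (node -> set of parents) into the edge set of the
    corresponding undirected (moral) graph: drop edge directions and
    connect all co-parents so each family {x_i} + A_i forms a clique."""
    edges = set()
    for child, pa in parents.items():
        for p in pa:                      # undirected child-parent edges
            edges.add(frozenset((child, p)))
        for a, b in combinations(pa, 2):  # "marry" the co-parents
            edges.add(frozenset((a, b)))
    return edges

# Hypothetical DAG: Rain -> Wet <- Sprinkler
print(moralize({"Wet": {"Rain", "Sprinkler"}, "Rain": set(), "Sprinkler": set()}))
# Rain-Wet, Sprinkler-Wet, plus the marrying edge Rain-Sprinkler
```

Because every CPD already sums to one over its child variable, the product of these factors needs no further normalization, which is why the resulting undirected network has normalization constant one.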

2.3 Conditional Random Fields

It is a special case of a Markov Random Field, applied to model a conditional probability distribution $p(y \mid x)$.
A conditional random field results in an instantiation of a new Markov Random Field for each input $x$.
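To see what "a new Markov Random Field for each input $x$" means operationally, here is a toy linear-chain sketch (the feature scores and weights are invented for illustration): for a fixed input $x$ every label sequence $y$ is scored, and the partition function $Z(x)$ has to be recomputed per input.

```python
from itertools import product
from math import exp

# Toy linear-chain CRF sketch: p(y | x) = exp(score(x, y)) / Z(x).
# The weights and feature choices below are illustrative assumptions.
LABELS = (0, 1)

def score(x, y):
    """Sum of emission and transition scores; a stand-in for w . f(x, y)."""
    s = sum(2.0 * (y[i] == x[i]) for i in range(len(x)))           # emission
    s += sum(0.5 * (y[i] == y[i + 1]) for i in range(len(x) - 1))  # transition
    return s

def p_cond(y, x):
    """Conditional probability; note that Z depends on the input x."""
    Z_x = sum(exp(score(x, yy)) for yy in product(LABELS, repeat=len(x)))
    return exp(score(x, y)) / Z_x

x = (1, 0, 1)                # observed input sequence
print(p_cond((1, 0, 1), x))  # the most compatible labeling
print(round(sum(p_cond(yy, x) for yy in product(LABELS, repeat=3)), 10))  # -> 1.0
```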

2.4 Factor Graph

A factor graph is a bipartite graph where one group is the variables in the distribution being modeled, and the other group is the factors defined on these variables. Edges go between factors and variables that those factors depend on.
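Since the structure is just bipartite adjacency, it can be represented directly; the sketch below (factor names are hypothetical, mirroring the MRF example above) stores each factor's scope and checks that every edge connects a factor node to a variable node.

```python
# Minimal bipartite factor-graph sketch: one node set for variables,
# one for factors; edges connect each factor to the variables it reads.
variables = {"x1", "x2", "x3"}
factors = {"phi_12": ("x1", "x2"), "phi_23": ("x2", "x3")}

edges = [(f, v) for f, scope in factors.items() for v in scope]

# Bipartiteness check: every edge joins a factor node to a variable node.
assert all(f in factors and v in variables for f, v in edges)
print(edges)
```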


Source: https://blog.csdn.net/weixin_44492024/article/details/104574272