2018 10-708 (CMU) Probabilistic Graphical Models {Lecture 10} [HMM and CRF]
Posted by ecoflex
Between tags and words there is Table 1: the emission probabilities p(word | tag).
Between consecutive tags there is Table 2: the transition probabilities p(tag_t | tag_{t-1}).
Combining the two tables (multiplying entries along the chain) gives the joint p(x, y).
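A minimal sketch of how the two tables combine into the joint probability; the tags, words, and all numbers below are hypothetical toy values:

```python
import numpy as np

tags = ["N", "V"]                        # hidden states (tags)
words = ["dog", "runs", "fast"]          # observations

# Table 1: emission probabilities p(word | tag), each row sums to 1.
emission = np.array([[0.6, 0.1, 0.3],    # from tag N
                     [0.1, 0.7, 0.2]])   # from tag V

# Table 2: transition probabilities p(tag_t | tag_{t-1}), each row sums to 1.
transition = np.array([[0.3, 0.7],       # from N
                       [0.8, 0.2]])      # from V

start = np.array([0.6, 0.4])             # initial tag distribution p(y_1)

def joint_prob(word_ids, tag_ids):
    """p(x, y) = p(y_1) * prod_t p(y_t | y_{t-1}) * prod_t p(x_t | y_t)."""
    p = start[tag_ids[0]]
    for t in range(1, len(tag_ids)):
        p *= transition[tag_ids[t - 1], tag_ids[t]]
    for t, w in enumerate(word_ids):
        p *= emission[tag_ids[t], w]
    return p

# p(x = "dog runs", y = "N V") = 0.6 * 0.7 * 0.6 * 0.7
print(joint_prob([0, 1], [0, 1]))
```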
MRF: the factors in the tables are not necessarily probabilities (arbitrary nonnegative potentials; a global normalization constant Z is needed).
BN: the tables must be conditional probabilities. => a BN is easier to learn than an MRF (no global normalization during learning).
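A toy contrast of the two cases (my own illustrative numbers): an MRF factor holds arbitrary nonnegative entries and must be normalized by the partition function Z, whereas a BN stores tables that are already normalized conditional distributions:

```python
import numpy as np

# Unnormalized MRF factor psi(a, b) over two binary variables;
# the entries are arbitrary nonnegative numbers, not probabilities.
psi = np.array([[5.0, 1.0],
                [1.0, 3.0]])

Z = psi.sum()                  # partition function
p_mrf = psi / Z                # now a proper joint distribution

# A BN would instead store p(b | a) directly: each row already sums to 1,
# so no global Z is ever computed.
p_b_given_a = psi / psi.sum(axis=1, keepdims=True)

print(p_mrf.sum())                 # 1.0 after dividing by Z
print(p_b_given_a.sum(axis=1))     # each row normalized on its own
```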
Maximum-Entropy Markov Model (MEMM)
Marginals:
1) Forward: alpha_t(k) = p(x_1..x_t, y_t = k), computed left to right.
2) Belief: p(y_t | x_1..x_n) ∝ alpha_t(k) * beta_t(k), combining the forward message with the backward message beta.
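The forward and belief computations can be sketched as a standard forward-backward pass; the transition, emission, and start numbers are hypothetical toy values:

```python
import numpy as np

transition = np.array([[0.3, 0.7],       # p(y_t | y_{t-1})
                       [0.8, 0.2]])
emission = np.array([[0.6, 0.1, 0.3],    # p(x_t | y_t)
                     [0.1, 0.7, 0.2]])
start = np.array([0.6, 0.4])             # p(y_1)

def forward_backward(obs):
    """Return posterior marginals p(y_t | x_1..x_n) for each position t."""
    n, K = len(obs), len(start)
    alpha = np.zeros((n, K))
    beta = np.ones((n, K))
    alpha[0] = start * emission[:, obs[0]]                 # forward init
    for t in range(1, n):                                  # forward pass
        alpha[t] = (alpha[t - 1] @ transition) * emission[:, obs[t]]
    for t in range(n - 2, -1, -1):                         # backward pass
        beta[t] = transition @ (emission[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta                                   # belief = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

print(forward_backward([0, 1, 2]))   # one row of tag marginals per position
```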
The HMM is generative: it models the joint probability P(x, y),
but tagging only needs the conditional P(y|x).
https://cedar.buffalo.edu/~srihari/CSE574/Discriminative-Generative.pdf
Full observation!
(like the offline SLAM?)
Biased! Because we only look at the local observation (the label bias problem of MEMMs).
P(x_2 | x_1) can be relabeled as a potential Psi(x_1, x_2).
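A sketch of this potential-function view: relabeling the conditional table as Psi changes nothing mathematically, but in a CRF the potentials become exponentiated feature scores rather than probabilities. The feature names and weights below are hypothetical:

```python
import math

def psi(y_prev, y_cur, x_cur, weights):
    """Psi(y_prev, y_cur, x_cur) = exp(w . f(y_prev, y_cur, x_cur)),
    with indicator features for the tag bigram and the (tag, word) pair."""
    score = weights.get((y_prev, y_cur), 0.0)   # transition feature weight
    score += weights.get((y_cur, x_cur), 0.0)   # emission-style feature weight
    return math.exp(score)

# Hypothetical feature weights for illustration only.
weights = {("N", "V"): 1.2, ("V", "runs"): 0.8}

print(psi("N", "V", "runs", weights))   # exp(1.2 + 0.8)
```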
If Y_1, ..., Y_{n-2} are connected somehow (longer-range dependencies), what should be changed?
How close is the model to the truth?