Point-wise Mutual Information

Posted by fengyubo



(Yao et al., 2019) give a clear description of Point-wise Mutual Information, restated below:
\[
PMI(i, j) = \log \frac{p(i,j)}{p(i)\,p(j)}, \qquad
p(i, j) = \frac{\#(i,j)}{\#W}, \qquad
p(i) = \frac{\#(i)}{\#W}
\]
where \(\#(i)\) is the number of sliding windows in a corpus that contain word \(i\), \(\#(i,j)\) is the number of sliding windows that contain both words \(i\) and \(j\), and \(\#W\) is the total number of sliding windows in the corpus.
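
As a concrete sketch (my own minimal Python, not code from either paper; the whitespace tokenization and the window size are arbitrary assumptions), the function below counts sliding windows and computes PMI exactly as defined above:

```python
from collections import Counter
from itertools import combinations
from math import log

def pmi_table(tokens, window_size=10):
    """Compute PMI(i, j) for all co-occurring word pairs via sliding windows."""
    # Enumerate every sliding window of `window_size` consecutive tokens;
    # a corpus shorter than the window is treated as a single window.
    n_positions = max(1, len(tokens) - window_size + 1)
    windows = [tokens[k:k + window_size] for k in range(n_positions)]

    num_windows = len(windows)  # #W: total number of sliding windows
    word_count = Counter()      # #(i): windows containing word i
    pair_count = Counter()      # #(i, j): windows containing both i and j
    for window in windows:
        seen = sorted(set(window))       # count each word once per window
        word_count.update(seen)
        pair_count.update(combinations(seen, 2))

    pmi = {}
    for (i, j), n_ij in pair_count.items():
        p_ij = n_ij / num_windows
        p_i = word_count[i] / num_windows
        p_j = word_count[j] / num_windows
        pmi[(i, j)] = log(p_ij / (p_i * p_j))
    return pmi

# Toy usage: positive PMI means the pair co-occurs more often than chance.
tokens = "the cat sat on the mat while the dog sat on the rug".split()
scores = pmi_table(tokens, window_size=4)
print(scores[("cat", "sat")])
```

Note that each word is counted at most once per window, which matches the window-count definitions of \(\#(i)\) and \(\#(i,j)\) above rather than raw token frequencies.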

(Levy et al., 2014) write the same quantity in a simplified form; substituting the window-count estimates into the definition and collecting the \(\#W\) factors gives:
\[
PMI(i,j) = \log\frac{\#(i,j)\,\#W}{\#(i)\,\#(j)}
\]

Since \(\#W\) is a constant once the sliding-window size and the corpus are fixed, the formula can be simplified further (up to an additive constant \(\log \#W\)):
\[
PMI(i, j) = \log\frac{\#(i,j)}{\#(i)\,\#(j)}
\]
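
A one-line expansion (my own check, not from either paper) makes explicit that dropping \(\#W\) only shifts every score by the same constant, so the relative ordering of word pairs is preserved:
\[
\log\frac{\#(i,j)\,\#W}{\#(i)\,\#(j)}
= \log\frac{\#(i,j)}{\#(i)\,\#(j)} + \log \#W
\]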

References

Liang Yao et al., 2019. Graph Convolutional Networks for Text Classification. AAAI.

Omer Levy et al., 2014. Neural Word Embedding as Implicit Matrix Factorization. NIPS.
