This article looks at how to compute the conditional mutual information of three discrete variables; hopefully the discussion below is a useful reference for anyone facing the same problem.

Problem description

I am trying to find the conditional mutual information between three discrete random variables using the pyitlib package for Python, with the help of the formula:

I(X;Y|Z)=H(X|Z)+H(Y|Z)-H(X,Y|Z)

The expected conditional mutual information value is 0.011.

My first code:

import numpy as np
from pyitlib import discrete_random_variable as drv

X=[0,1,1,0,1,0,1,0,0,1,0,0]
Y=[0,1,1,0,0,0,1,0,0,1,1,0]
Z=[1,0,0,1,1,0,0,1,1,0,0,1]

a = drv.entropy_conditional(X, Z)
# print(a)
b = drv.entropy_conditional(Y, Z)
# print(b)
c = drv.entropy_conditional(X, Y, Z)
# print(c)

p = a + b - c
print(p)

The answer I am getting here is 0.4632245116328402.

My second code:

import numpy as np
from pyitlib import discrete_random_variable as drv

X=[0,1,1,0,1,0,1,0,0,1,0,0]
Y=[0,1,1,0,0,0,1,0,0,1,1,0]
Z=[1,0,0,1,1,0,0,1,1,0,0,1]

a = drv.information_mutual_conditional(X, Y, Z)
print(a)

The answer I am getting here is 0.1583445441575102.

The expected result is 0.011.

Can anybody help? I am in big trouble right now. Any kind of help will be appreciated. Thanks in advance.

Recommended answer

Based on the definition of conditional entropy, calculating in bits (i.e. base 2), I obtain H(X|Z) = 0.784159, H(Y|Z) = 0.325011, and H(X,Y|Z) = 0.950826. Based on the definition of conditional mutual information you provide above, I obtain I(X;Y|Z) = H(X|Z) + H(Y|Z) - H(X,Y|Z) = 0.158344. Noting that pyitlib uses base 2 by default, drv.information_mutual_conditional(X, Y, Z) appears to be computing the correct result.
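These values can be checked by hand with a short, self-contained script that uses only the standard library and base-2 logarithms (no pyitlib required); it is a sketch of the textbook definitions, not pyitlib's implementation:

```python
from collections import Counter
from math import log2

X = [0, 1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0]
Y = [0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0]
Z = [1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1]

def entropy_conditional(a, z):
    """H(A|Z) in bits: sum over values z of p(z) * H(A | Z=z)."""
    n = len(z)
    h = 0.0
    for zv, nz in Counter(z).items():
        # Restrict A to the samples where Z takes the value zv.
        group = [av for av, zz in zip(a, z) if zz == zv]
        probs = [c / nz for c in Counter(group).values()]
        h += (nz / n) * -sum(p * log2(p) for p in probs)
    return h

hx_z = entropy_conditional(X, Z)                 # H(X|Z)   ≈ 0.784159
hy_z = entropy_conditional(Y, Z)                 # H(Y|Z)   ≈ 0.325011
hxy_z = entropy_conditional(list(zip(X, Y)), Z)  # H(X,Y|Z) ≈ 0.950826
i_xy_z = hx_z + hy_z - hxy_z                     # I(X;Y|Z) ≈ 0.158344

print(hx_z, hy_z, hxy_z, i_xy_z)
```

The last value agrees with the 0.1583445441575102 returned by drv.information_mutual_conditional(X, Y, Z), which supports the conclusion that pyitlib's result is correct for this data.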

Note that your use of drv.entropy_conditional(X, Y, Z) in your first example to compute conditional entropy is incorrect; you can, however, use drv.entropy_conditional(XY, Z), where XY is a 1D array representing the joint observations of X and Y, for example XY = [2*xy[0] + xy[1] for xy in zip(X, Y)].
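As a sketch of that fix: the joint-encoding step itself needs nothing beyond plain Python, and the pyitlib calls (shown here as comments, assuming the package is installed) then follow the formula from the question:

```python
X = [0, 1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0]
Y = [0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0]
Z = [1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1]

# Encode each (x, y) pair as one symbol: 2*x + y maps
# (0,0)->0, (0,1)->1, (1,0)->2, (1,1)->3 for binary X and Y,
# so distinct pairs stay distinct in the 1D array.
XY = [2 * x + y for x, y in zip(X, Y)]
print(XY)

# With pyitlib (base 2 by default):
# from pyitlib import discrete_random_variable as drv
# i = (drv.entropy_conditional(X, Z) + drv.entropy_conditional(Y, Z)
#      - drv.entropy_conditional(XY, Z))
# This should match drv.information_mutual_conditional(X, Y, Z).
```

Any encoding that maps distinct (x, y) pairs to distinct integers would work equally well; 2*x + y is just the simplest choice for binary variables.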
