Mutual information captures the idea that learning something about one variable can tell you about another. For example, learning that it's daytime gives you information about whether the sun is shining. It could still be cloudy, but you can be more confident that it's sunny than before you learned it was daytime.
Mathematically, mutual information is defined in terms of entropy. The information gained about a variable $X$ when you learn $Y$ is given by

$$I(X;Y) = H(X) - H(X \mid Y)$$

In this case, $H(X)$ is a measure of the entropy of $X$, i.e. of how uncertain it is. It is given by

$$H(X) = -\sum_x p(x) \log_2 p(x)$$
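As a quick sketch, that entropy formula is easy to compute directly (the function name here is my own):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # a fair coin: 1.0 bit
print(entropy([0.25, 0.75]))  # a biased coin: about 0.811 bits
```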
Mutual information is supposed to be symmetric ($I(X;Y) = I(Y;X)$), but I'm interested in how that works in a causal context.
Let's say you have a lightbulb that can be turned on from either of two light switches. If either light switch is on, then the bulb is on. Learning that one light switch is on tells you the bulb is on, but learning that the bulb is on does *not* tell you that one specific light switch is on. It tells you that at least one is on (but not which one).
Let's assume for the sake of argument that each light switch has a probability p(on) = 0.25 of being turned on (and correspondingly a probability p(off) = 0.75 of being off). Assume also that they're independent.
The entropy of switch 1 is

$$H(S_1) = -0.25 \log_2 0.25 - 0.75 \log_2 0.75 \approx 0.811 \text{ bits}$$
Since each switch has a probability of 0.25 of being on, and they're independent, the bulb is off only when both switches are off. So the bulb itself has a probability of $1 - 0.75^2 = 7/16$ of being on.
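We can check that figure by enumerating the four switch configurations (a sketch with my own variable names):

```python
from itertools import product

p_on = 0.25  # probability each switch is on

# Sum the probability of every switch configuration that lights the bulb.
p_bulb_on = sum(
    (p_on if s1 else 1 - p_on) * (p_on if s2 else 1 - p_on)
    for s1, s2 in product([True, False], repeat=2)
    if s1 or s2  # the bulb is on when at least one switch is on
)
print(p_bulb_on)  # 0.4375, i.e. 7/16
```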
The entropy of the bulb is

$$H(B) = -\tfrac{7}{16} \log_2 \tfrac{7}{16} - \tfrac{9}{16} \log_2 \tfrac{9}{16} \approx 0.989 \text{ bits}$$
If you know switch 1's state, then the information you have about the light is given by

$$I(B;S_1) = H(B) - H(B \mid S_1) = 0.989 - 0.75 \times 0.811 \approx 0.380 \text{ bits}$$

(If $S_1$ is on, the bulb is certainly on, contributing no entropy; if $S_1$ is off, which happens with probability 0.75, the bulb's state has the entropy of switch 2, 0.811 bits.)
If instead you know the bulb's state, then the information you have about switch 1 is given by

$$I(S_1;B) = H(S_1) - H(S_1 \mid B) = 0.811 - \tfrac{7}{16} \times 0.985 \approx 0.380 \text{ bits}$$

(If the bulb is off, $S_1$ is certainly off; if the bulb is on, which happens with probability $7/16$, then $S_1$ is on with probability $4/7$, a distribution with entropy of about 0.985 bits.)
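To confirm the numbers, here's a sketch that builds the joint distribution of switch 1 and the bulb and computes the mutual information in both directions (variable names are my own):

```python
import math
from collections import defaultdict

p_on = 0.25  # probability each switch is on, from the setup above

# Joint distribution over (switch 1 state, bulb state),
# marginalizing out switch 2.
joint = defaultdict(float)
for s1 in (True, False):
    for s2 in (True, False):
        p = (p_on if s1 else 1 - p_on) * (p_on if s2 else 1 - p_on)
        joint[(s1, s1 or s2)] += p  # bulb is on iff at least one switch is on

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

h_s1 = entropy([p_on, 1 - p_on])
h_bulb = entropy([7 / 16, 9 / 16])
h_joint = entropy(joint.values())

# I(B;S1) = H(B) - H(B|S1), where H(B|S1) = H(S1,B) - H(S1)
mi_forward = h_bulb - (h_joint - h_s1)
# I(S1;B) = H(S1) - H(S1|B), where H(S1|B) = H(S1,B) - H(B)
mi_backward = h_s1 - (h_joint - h_bulb)

print(f"{mi_forward:.3f} {mi_backward:.3f}")  # prints "0.380 0.380"
```

Both directions come out to the same value, as the symmetry demands.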
So even in a causal case the mutual information is still symmetric.
For me, the point that helps give an intuitive sense of this symmetry is that if you know S1 is on, you know the bulb is on. Symmetrically, if you know the bulb is off, you know that S1 is off.