Conditional independence

In probability theory, conditional independence describes situations wherein an observation is irrelevant or redundant when evaluating the certainty of a hypothesis. Conditional independence is usually formulated in terms of conditional probability, as a special case where the probability of the hypothesis given the uninformative observation is equal to the probability without. If $A$ is the hypothesis, and $B$ and $C$ are observations, conditional independence can be stated as an equality:

$P(A \mid B, C) = P(A \mid C)$
where $P(A \mid B, C)$ is the probability of $A$ given both $B$ and $C$. Since the probability of $A$ given $C$ is the same as the probability of $A$ given both $B$ and $C$, this equality expresses that $B$ contributes nothing to the certainty of $A$. In this case, $A$ and $B$ are said to be conditionally independent given $C$, written symbolically as $(A \perp\!\!\!\perp B \mid C)$. In the language of causal equality notation, two functions $f$ and $g$ which both depend on a common variable $x$ are described as conditionally independent, which is equivalent to the notation $(f \perp\!\!\!\perp g \mid x)$.
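As a concrete illustration (an assumed toy example, not part of the original text), the following Python sketch builds a small joint distribution over binary variables in which $A$ and $B$ are conditionally independent given $C$, then checks the defining equality numerically. All names and probability values here are hypothetical choices for demonstration.

# Toy check of P(A | B, C) = P(A | C); standard library only.
from itertools import product

# Assumed conditional building blocks: P(C), P(A=1 | C), P(B=1 | C).
p_c = {0: 0.3, 1: 0.7}
p_a_given_c = {0: 0.9, 1: 0.2}
p_b_given_c = {0: 0.4, 1: 0.6}

def bern(p, x):
    """Probability that a Bernoulli(p) variable takes value x in {0, 1}."""
    return p if x == 1 else 1.0 - p

# The joint factorizes as P(a, b, c) = P(c) * P(a|c) * P(b|c),
# which is exactly what conditional independence of A and B given C means.
joint = {
    (a, b, c): p_c[c] * bern(p_a_given_c[c], a) * bern(p_b_given_c[c], b)
    for a, b, c in product([0, 1], repeat=3)
}

def prob(pred):
    """Total probability of all outcomes (a, b, c) satisfying a predicate."""
    return sum(p for abc, p in joint.items() if pred(*abc))

# Compare P(A=1 | B=1, C=1) with P(A=1 | C=1): they coincide,
# so observing B adds nothing once C is known.
p_a_given_bc = (prob(lambda a, b, c: a == 1 and b == 1 and c == 1)
                / prob(lambda a, b, c: b == 1 and c == 1))
p_a_given_c_only = (prob(lambda a, b, c: a == 1 and c == 1)
                    / prob(lambda a, b, c: c == 1))
print(p_a_given_bc, p_a_given_c_only)  # both print 0.2

Running the sketch prints the same value (0.2) for both conditional probabilities, mirroring the equality above.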

The concept of conditional independence is essential to graph-based theories of statistical inference, as it establishes a mathematical relation between a collection of conditional statements and a graphoid.
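For orientation, the relation underlying that connection can be made explicit: conditional independence statements satisfy the semi-graphoid axioms, a standard result added here for context rather than drawn from the passage above. Writing $(X \perp\!\!\!\perp Y \mid Z)$ for "$X$ is conditionally independent of $Y$ given $Z$":

Symmetry: $(X \perp\!\!\!\perp Y \mid Z) \implies (Y \perp\!\!\!\perp X \mid Z)$
Decomposition: $(X \perp\!\!\!\perp (Y, W) \mid Z) \implies (X \perp\!\!\!\perp Y \mid Z)$
Weak union: $(X \perp\!\!\!\perp (Y, W) \mid Z) \implies (X \perp\!\!\!\perp Y \mid (Z, W))$
Contraction: $(X \perp\!\!\!\perp Y \mid Z) \;\wedge\; (X \perp\!\!\!\perp W \mid (Z, Y)) \implies (X \perp\!\!\!\perp (Y, W) \mid Z)$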