Information Theory Glossary

25 essential terms — because precise language is the foundation of clear thinking in Information Theory.

Binary Symmetric Channel
A memoryless channel where each input bit is flipped independently with a fixed crossover probability p.

Related: Channel Capacity, Binary Entropy Function, Error Probability
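
As a minimal Python sketch (using the standard result that this channel's capacity is C = 1 - H2(p), where H2 is the binary entropy function):

    import math

    def binary_entropy(p: float) -> float:
        """Binary entropy function H2(p) in bits; H2(0) = H2(1) = 0."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p: float) -> float:
        """Capacity of a binary symmetric channel with crossover probability p."""
        return 1.0 - binary_entropy(p)

    print(bsc_capacity(0.11))  # roughly 0.5 bits per channel use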

Bit
The basic unit of information, equal to the information content of a single binary choice between two equally likely outcomes.

Related: Nat, Shannon, Binary Digit

Channel
A system that transmits information from a sender to a receiver, potentially introducing noise or distortion during transmission.

Related: Channel Capacity, Binary Symmetric Channel, Gaussian Channel

Channel Capacity
The tight upper bound on the rate at which information can be reliably transmitted over a channel, measured in bits per channel use; for a discrete memoryless channel, C = max over input distributions p(x) of I(X; Y).

Related: Shannon-Hartley Theorem, Noisy-Channel Coding Theorem

Channel Coding
The process of adding structured redundancy to transmitted data to enable error detection and correction at the receiver.

Related: Error-Correcting Code, Channel Capacity, Redundancy

Conditional Entropy
The expected amount of information needed to describe a random variable Y given that the value of another random variable X is known.

Related: Joint Entropy, Mutual Information, Chain Rule
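
A minimal Python sketch, using a small hypothetical joint distribution and the chain rule H(Y|X) = H(X, Y) - H(X):

    import math

    # Hypothetical joint distribution p(x, y) over two binary variables.
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

    def H(dist):
        """Shannon entropy in bits of a distribution given as {outcome: prob}."""
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p

    # Chain rule: H(Y|X) = H(X, Y) - H(X)
    print(H(joint) - H(px))  # about 0.72 bits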

Cross-Entropy
The average number of bits needed to identify an event from distribution P when using a coding scheme optimized for distribution Q.

Related: KL Divergence, Entropy, Loss Function
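
A short Python sketch with illustrative distributions; note that H(P, Q) equals the entropy of P only when Q = P:

    import math

    def cross_entropy(p, q):
        """H(P, Q) = -sum_x P(x) log2 Q(x): average bits when events drawn
        from P are coded with lengths optimal for Q.
        Assumes Q(x) > 0 wherever P(x) > 0."""
        return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.5, 0.5]  # true distribution
    q = [0.9, 0.1]  # mismatched model
    print(cross_entropy(p, p))  # 1.0 bit (equals the entropy of P)
    print(cross_entropy(p, q))  # about 1.74 bits; the excess is D(P || Q)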

Data Processing Inequality
The principle that no processing of data Y can increase the information it contains about a source X: post-processing can only lose information. Formally, if X -> Y -> Z forms a Markov chain, then I(X; Z) <= I(X; Y).

Related: Mutual Information, Markov Chain, Sufficient Statistic

Entropy
The average amount of information or surprise produced by a stochastic source, defined as H(X) = -sum p(x) log p(x).

Related: Shannon Entropy, Conditional Entropy, Joint Entropy
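
A minimal Python sketch of the formula; the illustrative distributions show that entropy peaks for a uniform distribution and vanishes for a deterministic one:

    import math

    def entropy(probs):
        """H(X) = -sum p log2 p, in bits; zero-probability terms contribute 0."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.25] * 4))            # 2.0 bits: uniform over 4 outcomes
    print(entropy([0.7, 0.1, 0.1, 0.1]))  # about 1.36 bits: less uncertainty
    print(entropy([1.0, 0.0, 0.0, 0.0]))  # 0.0 bits: no uncertainty at all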

Error-Correcting Code
A coding scheme that enables detection and correction of errors in transmitted data by introducing controlled redundancy.

Related: Hamming Code, Reed-Solomon Code, LDPC Code

Gaussian Channel
A channel model where additive white Gaussian noise is added to the transmitted signal, characterized by bandwidth and signal-to-noise ratio.

Related: Shannon-Hartley Theorem, Channel Capacity, AWGN
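
A one-function Python sketch of the Shannon-Hartley theorem for this channel, C = B log2(1 + S/N), with illustrative numbers:

    import math

    def shannon_hartley(bandwidth_hz, snr_linear):
        """Capacity in bits/second of an AWGN channel: C = B * log2(1 + S/N)."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    # A 3 kHz telephone-grade channel at 30 dB SNR (linear SNR = 1000):
    print(shannon_hartley(3000, 1000))  # roughly 29.9 kbit/s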

Huffman Code
An optimal prefix-free variable-length code that minimizes average codeword length for a known symbol probability distribution.

Related: Prefix-Free Code, Source Coding, Entropy
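
A compact Python sketch of Huffman's algorithm (repeatedly merging the two least probable subtrees), returning codeword lengths rather than the codewords themselves:

    import heapq
    from itertools import count

    def huffman_lengths(freqs):
        """Codeword lengths of an optimal binary Huffman code.
        freqs: {symbol: probability or count}; assumes at least two symbols."""
        tiebreak = count()  # keeps heap entries comparable when weights tie
        heap = [(w, next(tiebreak), {sym: 0}) for sym, w in freqs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            w1, _, d1 = heapq.heappop(heap)
            w2, _, d2 = heapq.heappop(heap)
            # Merging two subtrees pushes every leaf one level deeper.
            merged = {s: l + 1 for s, l in {**d1, **d2}.items()}
            heapq.heappush(heap, (w1 + w2, next(tiebreak), merged))
        return heap[0][2]

    print(huffman_lengths({'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}))
    # {'a': 1, 'b': 2, 'c': 3, 'd': 3}: average length 1.75 bits, which here
    # equals the source entropy exactly (dyadic probabilities)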

Information
A quantifiable reduction in uncertainty about the state of a system, measured in bits (base-2 logarithm) or nats (natural logarithm).

Related: Entropy, Bit, Surprise

Joint Entropy
The total entropy of a pair of random variables considered together, measuring the combined uncertainty.

Related: Conditional Entropy, Chain Rule, Mutual Information

KL Divergence
Kullback-Leibler divergence: a non-symmetric measure of the difference between two probability distributions P and Q.

Related: Cross-Entropy, Mutual Information, Relative Entropy
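
A minimal Python sketch with illustrative distributions, highlighting the asymmetry:

    import math

    def kl_divergence(p, q):
        """D(P || Q) = sum_x P(x) log2(P(x) / Q(x)), in bits.
        Assumes Q(x) > 0 wherever P(x) > 0."""
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.5, 0.5]
    q = [0.9, 0.1]
    print(kl_divergence(p, q))  # about 0.74 bits
    print(kl_divergence(q, p))  # about 0.53 bits: not symmetric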

Kraft Inequality
A necessary and sufficient condition for the existence of a prefix-free code: sum of 2^(-l_i) <= 1, where l_i are the codeword lengths.

Related: Prefix-Free Code, Huffman Code, Codeword Length
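
A two-line Python check of the condition on some illustrative length sets:

    def kraft_sum(lengths):
        """Sum of 2^(-l) over the proposed codeword lengths; a binary
        prefix-free code with these lengths exists iff the sum is <= 1."""
        return sum(2 ** -l for l in lengths)

    print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> a prefix-free code exists
    print(kraft_sum([1, 1, 2]))     # 1.25 -> no such code can exist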

LDPC Code
Low-density parity-check code: a linear error-correcting code with a sparse parity-check matrix, offering near-capacity performance with iterative decoding.

Related: Error-Correcting Code, Channel Capacity, Belief Propagation

Mutual Information
The amount of information that one random variable provides about another, equal to the reduction in entropy of one variable given knowledge of the other.

Related: Entropy, Conditional Entropy, KL Divergence
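
A minimal Python sketch using the identity I(X; Y) = H(X) + H(Y) - H(X, Y) on the same hypothetical joint distribution as in the conditional-entropy sketch above:

    import math

    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

    def H(dist):
        """Shannon entropy in bits of a distribution given as {outcome: prob}."""
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p

    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    print(H(px) + H(py) - H(joint))  # about 0.28 bits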

Polar Code
A class of error-correcting codes, invented by Erdal Arikan, that provably achieve channel capacity with efficient encoding and successive cancellation decoding.

Related: Channel Capacity, 5G NR, Channel Polarization

Prefix-Free Code
A code in which no codeword is a prefix of any other codeword, allowing unambiguous instantaneous decoding.

Related: Huffman Code, Kraft Inequality, Uniquely Decodable
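
A small Python sketch of a prefix-freeness check; sorting makes any prefix adjacent to a word that extends it, so only neighbors need comparing:

    def is_prefix_free(codewords):
        """True iff no codeword is a prefix of another (instantaneous code)."""
        words = sorted(codewords)
        return all(not b.startswith(a) for a, b in zip(words, words[1:]))

    print(is_prefix_free(["0", "10", "110", "111"]))  # True
    print(is_prefix_free(["0", "01", "11"]))          # False: "0" prefixes "01"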

Quantum Entropy
The von Neumann entropy S(rho) = -Tr(rho log rho), extending Shannon entropy to quantum states described by density matrices.

Related: Quantum Information, Von Neumann Entropy, Qubit
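
A minimal NumPy sketch computing S(rho) from the eigenvalues of a density matrix, with two illustrative one-qubit states:

    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of the
        density matrix (log base 2 gives the answer in bits/qubits)."""
        eigs = np.linalg.eigvalsh(rho)
        eigs = eigs[eigs > 1e-12]  # 0 * log 0 is taken as 0
        return float(-np.sum(eigs * np.log2(eigs)))

    pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # a pure state
    mixed = np.array([[0.5, 0.0], [0.0, 0.5]])  # maximally mixed qubit
    print(von_neumann_entropy(pure))   # 0.0
    print(von_neumann_entropy(mixed))  # 1.0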

Rate-Distortion Theory
The branch of information theory that characterizes the minimum bit rate required for lossy compression at a given distortion level.

Related: Lossy Compression, Distortion, Source Coding

Redundancy
The portion of a message that is not essential for conveying information, representing the difference between maximum and actual entropy.

Related: Compression, Entropy, Error Correction

Source Coding
The process of compressing a source's output to approach its entropy rate, removing statistical redundancy.

Related: Huffman Coding, Lempel-Ziv, Entropy

Typical Set
The set of sequences whose empirical entropy is close to the true entropy of the source, containing approximately 2^(nH) sequences for length n.

Related: AEP, Source Coding Theorem, Entropy
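
A brute-force Python sketch for a hypothetical Bernoulli(0.2) source: it enumerates all length-12 binary sequences and counts those whose per-symbol surprise is within epsilon of H:

    import math
    from itertools import product

    p = 0.2     # source emits 1 with probability 0.2
    n = 12      # block length
    eps = 0.15  # tolerance around the entropy rate
    H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # about 0.722 bits

    typical = 0
    for seq in product((0, 1), repeat=n):
        k = sum(seq)  # number of ones
        prob = (p ** k) * ((1 - p) ** (n - k))
        # Is the per-symbol surprise -log2(prob)/n within eps of H?
        if abs(-math.log2(prob) / n - H) <= eps:
            typical += 1

    print(typical, 2 ** n, 2 ** (n * H))
    # 286 of the 4096 sequences are typical; 2^(nH) is about 405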