Coding Theory Cheat Sheet
The core ideas of Coding Theory distilled into a single, scannable reference — perfect for review or quick lookup.
Quick Reference
Hamming Distance
The number of positions at which two codewords of equal length differ. The minimum Hamming distance of a code determines its error-detecting and error-correcting capability: a code with minimum distance d can detect up to d-1 errors and correct up to floor((d-1)/2) errors.
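Both quantities above are easy to compute directly. A minimal sketch (the codewords here are arbitrary illustrative bit strings, not from any particular code):

```python
# Hamming distance between two equal-length codewords, plus the
# detect/correct capability implied by a code's minimum distance.
def hamming_distance(a: str, b: str) -> int:
    if len(a) != len(b):
        raise ValueError("codewords must have equal length")
    return sum(x != y for x, y in zip(a, b))

d_min = 3                        # e.g. the (7,4) Hamming code
detectable = d_min - 1           # detects up to 2 errors
correctable = (d_min - 1) // 2   # corrects up to 1 error

print(hamming_distance("1011101", "1001001"))  # → 2
```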
Linear Code
A code in which any linear combination of codewords is also a codeword, meaning the set of codewords forms a vector subspace over a finite field. Linear codes are described by an [n, k, d] notation where n is the block length, k is the message dimension, and d is the minimum distance.
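The subspace property can be seen directly by encoding with a generator matrix. A sketch using a standard-form generator for the [7, 4, 3] Hamming code (the matrix G below is one conventional choice, not the only one):

```python
import numpy as np

# Encoding with a [7, 4] linear code over GF(2): the codeword is the
# message times a generator matrix G, reduced mod 2.
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

def encode(msg):
    return (np.array(msg) @ G) % 2

# Linearity: the sum (XOR) of two codewords is again a codeword.
c1 = encode([1, 0, 1, 0])
c2 = encode([0, 1, 1, 0])
assert np.array_equal((c1 + c2) % 2, encode([1, 1, 0, 0]))
```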
Shannon's Channel Capacity Theorem
Claude Shannon's noisy channel coding theorem states that for any communication channel with capacity C, it is possible to transmit data at any rate R < C with an arbitrarily low probability of error, given sufficiently long block codes. Conversely, rates above capacity inevitably produce errors.
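For a concrete capacity value, consider the binary symmetric channel (BSC) with crossover probability p, whose capacity is C = 1 - H2(p) with H2 the binary entropy function. A sketch:

```python
from math import log2

# Capacity of a binary symmetric channel: C = 1 - H2(p).
def binary_entropy(p: float) -> float:
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p: float) -> float:
    return 1.0 - binary_entropy(p)

# At p ≈ 0.11 the capacity is about 0.5: rate-1/2 codes can in
# principle be made reliable on this channel, per the theorem.
print(round(bsc_capacity(0.11), 3))
```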
Reed-Solomon Codes
A family of non-binary cyclic error-correcting codes that operate on symbols from a finite field rather than individual bits. An RS(n, k) code over GF(q) can correct up to t = floor((n-k)/2) symbol errors, making RS codes especially effective against burst errors: a burst of bit errors confined to a few symbols counts only as that many symbol errors.
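The correction capability is simple arithmetic. A sketch using the widely deployed RS(255, 223) parameters over GF(2^8):

```python
# An RS(n, k) code adds n - k parity symbols and corrects
# t = (n - k) // 2 symbol errors. With m-bit symbols, a single burst
# of up to (t - 1) * m + 1 bits touches at most t symbols.
def rs_correctable_symbols(n: int, k: int) -> int:
    return (n - k) // 2

t = rs_correctable_symbols(255, 223)  # RS(255, 223) over GF(2^8)
print(t)  # → 16
```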
Parity-Check Matrix
A matrix H that defines a linear code by specifying the constraints that every valid codeword must satisfy: a vector c is a codeword if and only if Hc^T = 0. The parity-check matrix provides a compact representation of the code and is used in syndrome decoding.
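Syndrome decoding follows directly from the definition. A sketch with the (7,4) Hamming code, whose H can be arranged so that its columns are the binary representations of 1 through 7:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column i (1-based)
# is the binary representation of i, so the syndrome of a single-bit
# error reads off the error position directly.
H = np.array([
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
])

def syndrome(c):
    return (H @ np.array(c)) % 2

codeword = [0, 0, 0, 0, 0, 0, 0]     # the all-zero word is always a codeword
assert not syndrome(codeword).any()  # H c^T = 0 for valid codewords

received = codeword.copy()
received[4] = 1                      # flip bit 5 (1-based)
s = syndrome(received)
print(int("".join(map(str, s)), 2))  # → 5: the syndrome names the error position
```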
Convolutional Codes
Codes that process the input data stream continuously using shift registers, where each output depends on the current and several previous input bits. Unlike block codes, convolutional codes have memory and are decoded with algorithms such as the Viterbi algorithm or BCJR algorithm.
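The shift-register structure is compact enough to sketch directly. Below is the classic rate-1/2, constraint-length-3 encoder with generator polynomials (7, 5) in octal (111 and 101 in binary):

```python
# Rate-1/2 convolutional encoder, constraint length 3, generators
# (7, 5) octal. Each input bit produces two output bits that depend
# on the current bit and the two previous bits held in the register.
def conv_encode(bits):
    s1 = s2 = 0  # two-stage shift-register state
    out = []
    for b in bits:
        out.append(b ^ s1 ^ s2)  # generator 111
        out.append(b ^ s2)       # generator 101
        s1, s2 = b, s1           # shift the register
    return out

print(conv_encode([1, 0, 1, 1]))  # → [1, 1, 1, 0, 0, 0, 0, 1]
```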
Turbo Codes
A class of high-performance error-correcting codes discovered by Berrou, Glavieux, and Thitimajshima in 1993 that use two or more constituent convolutional encoders connected in parallel through an interleaver. They are decoded iteratively and were the first practical codes to approach Shannon's capacity limit.
Low-Density Parity-Check (LDPC) Codes
Linear block codes defined by a sparse parity-check matrix, originally invented by Robert Gallager in 1960 and rediscovered in the 1990s. They are decoded using iterative belief propagation on a Tanner graph and can approach the Shannon limit within a fraction of a decibel.
Cyclic Codes
A subclass of linear codes in which any cyclic shift of a codeword produces another valid codeword. They have efficient encoding and decoding algorithms based on polynomial arithmetic and are described by a generator polynomial over a finite field.
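A sketch of the polynomial view, using the (7, 4) cyclic Hamming code with generator g(x) = x^3 + x + 1 (polynomials are bit lists, lowest degree first; non-systematic encoding by multiplication, for illustration):

```python
# Cyclic-code encoding over GF(2): codewords are exactly the
# multiples of the generator polynomial g(x), taken mod x^n + 1.

def poly_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

def poly_mod(a, g):
    a = a[:]
    for i in range(len(a) - 1, len(g) - 2, -1):
        if a[i]:
            for j, gj in enumerate(g):
                a[i - len(g) + 1 + j] ^= gj
    return a[:len(g) - 1]

g = [1, 1, 0, 1]             # g(x) = 1 + x + x^3
msg = [1, 0, 0, 1]           # m(x) = 1 + x^3
codeword = poly_mul(msg, g)  # degree <= 6: a length-7 codeword

# A cyclic shift (multiply by x mod x^7 + 1) is still divisible by g(x),
# so it is again a valid codeword.
shifted = codeword[-1:] + codeword[:-1]
assert not any(poly_mod(shifted, g))
```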
Singleton Bound and MDS Codes
The Singleton bound states that for any [n, k, d] code, d <= n - k + 1. Codes that achieve this bound with equality are called Maximum Distance Separable (MDS) codes and offer the optimal trade-off between redundancy and error correction.
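Checking the bound is one line of arithmetic; comparing the (7,4) Hamming code with an RS code shows the difference:

```python
# Singleton bound: for any [n, k, d] code, d <= n - k + 1.
# Equality means the code is Maximum Distance Separable (MDS);
# Reed-Solomon codes achieve it with d = n - k + 1.
def singleton_bound(n: int, k: int) -> int:
    return n - k + 1

def is_mds(n: int, k: int, d: int) -> bool:
    assert d <= singleton_bound(n, k), "violates the Singleton bound"
    return d == singleton_bound(n, k)

print(is_mds(7, 4, 3))        # Hamming [7,4,3]: d = 3 < 4, not MDS → False
print(is_mds(255, 223, 33))   # RS(255,223): d = 33 = n - k + 1 → True
```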