Contents
Preface & Acknowledgements
Symbols & Abbreviations
1 INFORMATION AND CODING THEORY
1.1 Information
1.1.1 Information measure
1.2 Entropy and Information Rate
1.3 Extended Discrete Memoryless Source
1.4 Channels and Mutual Information
1.4.1 Information transmission over discrete channels
1.4.2 Information channels
1.5 Channel Probability Relationships
1.6 The a priori and a posteriori Entropies
1.7 Mutual Information
1.7.1 Mutual information: Definition
1.7.2 Mutual information: Properties
1.8 Capacity of a Discrete Channel
1.9 Shannon’s Theorems
1.9.1 Source coding theorem
1.9.2 Channel capacity and coding
1.9.3 Channel coding theorem
1.10 Signal Spaces and the Channel Coding Theorem
1.10.1 Capacity of the Gaussian Channel
1.11 Error Control Coding
1.12 Limits to Communication and their Consequences
Bibliography and References
Problems
2 BLOCK CODES
2.1 Error Control Coding
2.2 Error Detection and Correction
2.2.1 Simple codes: The repetition code
2.3 Block Codes: Introduction and Parameters
2.4 The Vector Space over the Binary Field
2.4.1 Vector subspaces
2.4.2 Dual subspaces
2.4.3 Matrix form
2.4.4 Dual subspace matrix
2.5 Linear Block Codes
2.5.1 Generator matrix
2.5.2 Block codes in systematic form
2.5.3 Parity check matrix
2.6 Syndrome Error Detection
2.7 Minimum Distance of a Block Code
2.7.1 Minimum distance and the structure of the matrix
2.8 Error Correction Capability of a Block Code
2.9 Syndrome Detection and the Standard Array
2.10 Hamming Codes
2.11 Forward Error Correction (FEC) and Automatic Repeat ReQuest (ARQ)
2.11.1 Forward error correction (FEC)
2.11.2 Automatic Repeat ReQuest (ARQ)
2.11.3 ARQ schemes
2.11.3.1 Stop and Wait
2.11.3.2 Go back N
2.11.3.3 Selective Repeat
2.11.4 ARQ scheme efficiencies
2.11.5 Hybrid-ARQ schemes
Bibliography and References
Problems
3 CYCLIC CODES
3.1 Description
3.2 Polynomial Representation of Codewords
3.3 Generator Polynomial of a Cyclic Code
3.4 Cyclic Codes in Systematic Form
3.5 Generator Matrix of a Cyclic Code
3.6 Syndrome Calculation and Error Detection
3.7 Decoding of Cyclic Codes
3.8 An Application Example: CRC Code for the Ethernet Standard
Bibliography and References
Problems
4 BCH CODES
4.1 Introduction: The Minimal Polynomial
4.2 Description of BCH Cyclic Codes
4.2.1 Bounds on the error-correction capability of a BCH code: the Vandermonde determinant
4.3 Decoding of BCH Codes
4.4 Error Location and Error Evaluation Polynomials
4.5 The Key Equation
4.6 Decoding of BCH Codes using the Euclidean Algorithm
4.6.1 The Euclidean algorithm
Bibliography and References
Problems
5 REED-SOLOMON CODES
5.1 Introduction
5.2 Error Correction Capability of RS Codes: The Vandermonde Determinant
5.3 RS Codes in Systematic Form
5.4 Syndrome Decoding of RS Codes
5.5 The Euclidean Algorithm. Error Location and Evaluation Polynomials
5.6 Decoding of RS Codes using the Euclidean Algorithm
5.6.1 Steps of the Euclidean algorithm
5.7 Decoding of RS and BCH Codes using the Berlekamp-Massey Algorithm
5.7.1 Berlekamp-Massey iterative algorithm for finding the error location polynomial
5.7.2 Berlekamp-Massey decoding of RS codes
5.7.3 Relationship between the error location polynomials of the Euclidean and Berlekamp-Massey algorithms
5.8 A Practical Application: Error-Control Coding for the Compact Disc (CD)
5.8.1 CD characteristics
5.8.2 Channel characteristics
5.8.3 Coding procedure
5.9 Encoding for RS Codes
5.10 Decoding of RS Codes
5.10.1 Berlekamp-Massey decoding
5.10.2 Alternative decoding methods
5.10.3 Direct solution of syndrome equations
5.11 Importance of Interleaving
Bibliography and References
Problems
6 CONVOLUTIONAL CODES
6.1 Linear Sequential Circuits
6.2 Convolutional Codes and Encoders
6.3 Description in the D-Transform Domain
6.4 Convolutional Encoder Representations
6.4.1 Representation of connections
6.4.2 State diagram representation
6.4.3 Trellis representation
6.5 Convolutional Codes in Systematic Form
6.6 General Structure of FIR and IIR Finite State Sequential Machines
6.6.1 FIR FSSM
6.6.2 IIR FSSM
6.7 State Transfer Function Matrix: Calculation of the Transfer Function
6.7.1 State transfer function for FIR FSSMs
6.7.2 State transfer function for IIR FSSMs
6.8 Relationship between the Systematic and Non-Systematic Forms
6.9 Distance Properties of Convolutional Codes
6.10 Minimum Free Distance of a Convolutional Code
6.11 Maximum Likelihood Detection (MLD)
6.12 Decoding of Convolutional Codes: The Viterbi Algorithm
6.13 Extended and Modified State Diagram
6.14 Error Probability Analysis for Convolutional Codes
6.15 Hard and Soft Decisions
6.15.1 Maximum likelihood criterion for the Gaussian channel
6.15.2 Bounds for soft decision detection
6.15.3 An example of soft decision decoding of convolutional codes
6.16 Punctured Convolutional Codes and Rate Compatible Schemes
Bibliography and References
Problems
7 TURBO CODES
7.1 A Turbo Encoder
7.2 Decoding of Turbo Codes
7.2.1 Turbo decoder
7.2.2 Probabilities and estimates
7.2.3 Symbol detection
7.2.4 The log likelihood ratio
7.3 Markov Sources and Discrete Channels
7.4 The BCJR Algorithm: Trellis Coding and Discrete Memoryless Channels
7.5 Iterative Coefficient Calculation
7.6 MAP BCJR Algorithm and the Log Likelihood Ratio
7.6.1 The BCJR MAP algorithm: LLR calculation
7.6.2 Calculation of coefficients
7.7 Turbo Decoding
7.7.1 Initial conditions of coefficients
7.8 Construction Methods for Turbo Codes
7.8.1 Interleavers
7.8.2 Block interleavers
7.8.3 Convolutional interleavers
7.8.4 Random interleavers
7.8.5 Linear interleavers
7.8.6 Code concatenation methods
7.8.6.1 Serial concatenation
7.8.6.2 Parallel concatenation
7.8.7 Turbo code performance as a function of size and type of interleaver
7.9 Other Decoding Algorithms for Turbo Codes
7.10 EXIT Charts for Turbo Codes
7.10.1 Introduction to EXIT charts
7.10.2 Construction of the EXIT chart
7.10.3 Extrinsic transfer characteristics of the constituent decoders
Bibliography and References
Problems
8 LOW-DENSITY PARITY-CHECK (LDPC) CODES
8.1 Different Systematic Forms of a Block Code
8.2 Description of LDPC Codes
8.3 Construction of LDPC Codes
8.3.1 Regular LDPC codes
8.3.2 Irregular LDPC codes
8.3.3 Decoding of LDPC codes: the Tanner graph
8.4 The Sum-Product Algorithm
8.5 Sum-Product Algorithm for LDPC Codes: An Example
8.6 Simplifications of the Sum-Product Algorithm
8.7 A Logarithmic LDPC Decoder
8.7.1 Initialization
8.7.2 Horizontal step
8.7.3 Vertical step
8.7.4 Summary of the logarithmic decoding algorithm
8.7.5 Construction of the lookup table
8.8 EXIT Charts for LDPC Codes
8.8.1 Introduction
8.8.2 Iterative decoding of block codes
8.8.3 EXIT chart construction for LDPC codes
8.8.4 Mutual information function
8.8.5 EXIT chart for the symbol node decoder (SND)
8.8.6 EXIT chart for the parity check node decoder (PCND)
8.9 Fountain and LT Codes
8.9.1 Introduction
8.9.2 Fountain codes
8.9.3 Linear random codes
8.9.4 LT codes
8.9.4.1 LT encoder
8.9.4.2 LT decoder
8.10 LDPC and Turbo Codes
Bibliography and References
Problems
APPENDIX A: Error Probability in the Transmission of Digital Signals
A.1 Digital Signalling
A.1.1 Pulse amplitude modulated digital signals
A.2 Bit Error Rate (BER)
Bibliography
APPENDIX B: Galois Fields GF(q)
B.1 Groups
B.2 Addition and Multiplication modulo-2
B.3 Fields
B.4 Polynomials over Binary Fields
B.5 Construction of a Galois Field GF(2^m)
B.6 Properties of Extended Galois Fields
B.7 Minimal Polynomials
B.7.1 Properties of Minimal Polynomials
Bibliography
Answers to Problems
Index