UH ECE 4371 - ECE 4371 Lecture Notes


Contents (slide titles): Overview; Convolutional Code Introduction; Encoder; Example; Slide 6; Generator sequences; Convolution point of view in encoding and generator matrix; Example: Using generator matrix; Representing convolutional codes: Code tree; Representing convolutional codes compactly: code trellis and state diagram; Inspecting state diagram: Structural properties of convolutional codes; Distance for some convolutional codes; Puncture Code; Decoding of convolutional codes; Example of exhaustive maximum likelihood detection; Slide 17; The Viterbi algorithm; The survivor path; Example of using the Viterbi algorithm; The maximum likelihood path; How to end up decoding?; Concepts to Learn; Viterbi Algorithm; Hamming Code Example; Error Correction; Example of CRC; Checking for errors; Slide 29; Slide 30; Slide 31; Turbo Codes; Motivation: Performance of Turbo Codes; Concatenated Coding; Parallel Concatenated Codes; Pseudo-random Interleaving; Recursive Systematic Convolutional Encoding; Why Interleaving and Recursive Encoding?; Iterative Decoding; The Turbo-Principle; Performance as a Function of Number of Iterations; Performance Factors and Tradeoffs; Influence of Interleaver Size; Power Efficiency of Existing Standards; Turbo Code Summary; LDPC Introduction; Tanner Graph (1/2); Decoding of LDPC Codes; Pro and Con; Trellis Coded Modulation; Set Partitioning; Coding Gain

ECE 4331, Fall 2009
Zhu Han, Department of Electrical and Computer Engineering
Class 24, Nov. 12th, 2009

Overview
- Convolutional code
  – Encoder
  – Decoder: Viterbi decoding
- Turbo Code
- LDPC Code
- TCM modulation
- Homework: 10.5, 10.6, 10.7, 10.8, 10.9, 10.15, 10.16, 10.20, 10.21, 10.22, 10.23, 10.24, 10.25; due 12/2

Convolutional Code Introduction
- Convolutional codes map information to code bits sequentially by convolving a sequence of information bits with "generator" sequences (see the sketch below).
- A convolutional encoder encodes K information bits into N > K code bits at each time step.
- Convolutional codes can be regarded as block codes for which the encoder has a structure such that the encoding operation can be expressed as a convolution.
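The following is a minimal Python sketch, not part of the original slides, of that convolution point of view: each output stream is the mod-2 convolution of the message with one generator sequence, and the streams are interleaved bit by bit. The generator sequences g^(1) = [1 0 1 1] and g^(2) = [1 1 1 1] are taken from the "Example: Using generator matrix" slide further down; the message u is invented for illustration.

```python
import numpy as np

g1 = np.array([1, 0, 1, 1])   # generator sequence g^(1) (from the slide example below)
g2 = np.array([1, 1, 1, 1])   # generator sequence g^(2) (from the slide example below)

def conv_encode(u, generators):
    """Rate-1/n convolutional encoding: mod-2 convolution of the message
    with each generator sequence, outputs interleaved bit by bit."""
    streams = [np.convolve(u, g) % 2 for g in generators]
    return np.array(streams).T.reshape(-1)   # x'_1 x''_1 x'_2 x''_2 ...

u = np.array([1, 0, 1, 1, 0])                # example message bits (made up)
print(conv_encode(u, [g1, g2]))              # 16 coded bits; tail from the full convolution included
```

With n = 2 output streams per input bit this is a rate-1/2 code, consistent with the (n, k, L) notation introduced below.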
Encoder
- Convolutional codes are applied where good performance is required at low implementation cost.
- They operate on code streams (not on blocks).
- Convolutional codes have memory: previous bits are used to encode or decode the following bits (block codes are memoryless).
- Convolutional codes achieve good performance by expanding their memory depth.
- Convolutional codes are denoted by (n, k, L), where L is the code (or encoder) memory depth (number of register stages).
- The constraint length C = n(L + 1) is the number of encoded bits that a single message bit can influence.

Example
Convolutional encoder with k = 1, n = 2, L = 2:
– A convolutional encoder is a finite state machine (FSM) that processes information bits serially.
– The generated code is therefore a function of the input and of the state of the FSM.
– In this (n, k, L) = (2, 1, 2) encoder, each message bit influences a span of C = n(L + 1) = 6 successive output bits (the constraint length C).
– Thus, for generation of the n-bit output, n shift registers are required in a k = 1 convolutional encoder.

Example
A (3, 2, 1) convolutional encoder with output equations
  x'_j   = m_{j-3} ⊕ m_{j-2} ⊕ m_j
  x''_j  = m_{j-3} ⊕ m_{j-1} ⊕ m_j
  x'''_j = m_{j-2} ⊕ m_j
Here each message bit influences a span of C = n(L + 1) = 3(1 + 1) = 6 successive output bits.

Generator sequences

Convolution point of view in encoding and generator matrix

Example: Using generator matrix
g^(1) = [1 0 1 1],  g^(2) = [1 1 1 1]

Representing convolutional codes: Code tree
For the (n, k, L) = (2, 1, 2) encoder the two output bits are
  x'_j  = m_{j-2} ⊕ m_{j-1} ⊕ m_j
  x''_j = m_{j-2} ⊕ m_j
and the output stream is x_out = x'_1 x''_1 x'_2 x''_2 x'_3 x''_3 ...
The code tree shows how one input bit is transformed into two output bits (initially the register is all zero). For example, with register contents m_{j-2} = 0 and m_{j-1} = 1, an input m_j = 1 produces the branch output x'_j x''_j = 01, while an input m_j = 0 produces x'_j x''_j = 10.

Representing convolutional codes compactly: code trellis and state diagram
[figure: code trellis and state diagram; the states are the shift-register contents, and an input '1' is indicated by a dashed line]

Inspecting state diagram: Structural properties of convolutional codes
- Each new block of k input bits causes a transition into a new state.
- Hence there are 2^k branches leaving each state.
- Assuming the encoder starts in the zero state, the encoded word for any input of k-bit blocks can thus be obtained. For instance, for u = (1 1 1 0 1) the encoded word v = (11, 10, 01, 01, 11, 10, 11, 11) is produced.
- The figure shows the encoder state diagram for the (n, k, L) = (2, 1, 2) code; note that the number of states is 2^(L+1) = 8.

Distance for some convolutional codes
The lower the coding rate and the larger L, the larger the distance.

Puncture Code
- A sequence of coded bits is punctured by deleting some of the bits in the sequence according to a fixed rule.
- The resulting coding rate is increased, so a lower-rate code can be extended to a sequence of higher-rate codes.

Decoding of convolutional codes

Example of exhaustive maximum likelihood detection
Assume a three-bit message is transmitted and encoded by the (2, 1, 2) convolutional encoder. To clear the decoder, two zero bits are appended after the message, so 5 bits are encoded, resulting in 10 code bits. Assume the channel error probability is p = 0.1. After the channel, 10, 01, 10, 11, 00 is received (including some errors). What comes out of the decoder, i.e. what was most likely the transmitted code word, and what were the corresponding message bits?

[figure: decoding trellis with states a, b, c, d and the decoder outputs for the selected path; note also the Hamming distances]

For the selected path, the metric is the sum of the per-bit log-likelihoods (ln 0.9 ≈ -0.11, ln 0.1 ≈ -2.30):
- correct bits per branch: 1 + 1 + 2 + 2 + 2 = 8, contributing 8 × (-0.11) = -0.88
- erroneous bits per branch: 1 + 1 + 0 + 0 + 0 = 2, contributing 2 × (-2.30) = -4.6
- total path metric: -0.88 - 4.6 = -5.48
This is the largest metric; verify that you get the same result.
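As a hedged sketch, not taken from the slides, the metric above can be reproduced by counting matching and differing bits between a candidate code word and the received word; the candidate below is the code word of the minimum-distance path, derived here from the (2, 1, 2) encoder equations given earlier.

```python
from math import log

def path_metric(candidate, received, p=0.1):
    """ML path metric for a binary symmetric channel: each matching bit
    contributes ln(1-p), each differing bit contributes ln(p)."""
    errors = sum(c != r for c, r in zip(candidate, received))
    return (len(received) - errors) * log(1 - p) + errors * log(p)

received  = [1,0, 0,1, 1,0, 1,1, 0,0]    # 10, 01, 10, 11, 00 from the example
candidate = [0,0, 1,1, 1,0, 1,1, 0,0]    # code word of the best path (8 matches, 2 errors)
print(path_metric(candidate, received))  # about -5.45; the slide rounds to -0.88 - 4.6 = -5.48
```

Exhaustive ML decoding evaluates this metric for every possible path and keeps the largest; the Viterbi algorithm described next obtains the same result without enumerating all paths.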
The Viterbi algorithm
- The problem of optimum decoding is to find the minimum-distance path from the initial state back to the initial state (below, from S0 to S0). The minimum distance is the sum of the path metrics, which is maximized by the correct path.
- The exhaustive maximum likelihood method must search all the paths in the phase trellis (2^k paths emerging/entering from 2^(L+1) …
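As a minimal sketch, not the slides' implementation, the hard-decision Viterbi idea described above can be written as follows for the (2, 1, 2) encoder of the example; over a binary symmetric channel with p < 0.5, maximizing the log-likelihood metric is equivalent to minimizing the Hamming distance, which is what the survivor selection below does.

```python
def viterbi_decode(received_pairs):
    """Hard-decision Viterbi decoding for the (2,1,2) encoder above:
    x'_j = m_j ^ m_{j-1} ^ m_{j-2}, x''_j = m_j ^ m_{j-2}.
    State = (m_{j-1}, m_{j-2}); at each step only the survivor with the
    smallest accumulated Hamming distance is kept per state."""
    def branch(state, bit):                        # one encoder transition
        m1, m2 = state
        return (bit, m1), (bit ^ m1 ^ m2, bit ^ m2)

    states = [(0, 0), (0, 1), (1, 0), (1, 1)]
    metric = {s: 0 if s == (0, 0) else float("inf") for s in states}
    paths = {s: [] for s in states}
    for r in received_pairs:
        new_metric, new_paths = {}, {}
        for s in states:
            for bit in (0, 1):
                ns, out = branch(s, bit)
                d = metric[s] + (out[0] != r[0]) + (out[1] != r[1])
                if ns not in new_metric or d < new_metric[ns]:
                    new_metric[ns], new_paths[ns] = d, paths[s] + [bit]
        metric, paths = new_metric, new_paths
    return paths[(0, 0)]                           # survivor ending in the zero state

# Received word from the example: 10, 01, 10, 11, 00
print(viterbi_decode([(1, 0), (0, 1), (1, 0), (1, 1), (0, 0)]))
# -> [0, 1, 0, 0, 0]: message bits 0 1 0 followed by the two flush zeros
```

Keeping only the best survivor per state at each step makes the work grow linearly with the message length rather than exponentially, which is the advantage over the exhaustive search above.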

