
Information Theory and Coding

Free

Overview
- Introduction to Information Theory and Coding
- Definition of Information Measure and Entropy
- Extension of an Information Source and Markov Source
- Adjoint of an Information Source; Joint and Conditional Information Measures
- Properties of Joint and Conditional Information Measures and a Markov Source
- Asymptotic Properties of Entropy and Problem Solving in Entropy
- Block Code and Its Properties
- Instantaneous Code and Its Properties
- Kraft-McMillan Inequality and Compact Codes
- Shannon's First Theorem
- Coding Strategies and Introduction to Huffman Coding
- Huffman Coding and Proof of Its Optimality
- Competitive Optimality of the Shannon Code
- Non-Binary Huffman Code and Other Codes
- Adaptive Huffman Coding
- Shannon-Fano-Elias Coding and Introduction to Arithmetic Coding
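To illustrate two of the topics covered (entropy and Huffman coding), here is a minimal sketch, assuming a memoryless source specified as a symbol-to-probability table; the probabilities and symbol names below are made up for illustration:

```python
import heapq
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_code(source):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    # Each heap entry: (weight, tiebreak index, {symbol: codeword-so-far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(source.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least-probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        # Prepend 0 to one subtree's codewords and 1 to the other's.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

source = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}  # illustrative source
code = huffman_code(source)
H = entropy(source.values())
L = sum(p * len(code[s]) for s, p in source.items())
kraft = sum(2 ** -len(w) for w in code.values())
print(code)             # exact codewords may vary with tie-breaking
print(H, L, kraft)      # Shannon's first theorem: H <= L < H + 1
```

The final `print` makes two of the listed results concrete: the Kraft-McMillan sum over the codeword lengths equals 1 for this complete prefix code, and the average codeword length L lies between H and H + 1 bits.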

Expiry period: Lifetime
Language: English
Level: Beginner
Total lectures: 41
Total quizzes: 0
Total duration: 35:42:29 hours
Total enrolment: 0
Number of reviews: 0
Avg rating: n/a
Outcomes
Requirements