Information Theory 2014
News:
30 Nov 2014: Philip added some more topics to the list of topics for the final presentations.
26 Nov 2014: The final exam will consist of student presentations. Check the list of topics here.
27 Oct 2014: Good reasons to put away your mobile phone during class
Content of the course
Information theory was developed by Claude E. Shannon in the late 1940s to investigate the fundamental limits of signal-processing operations such as compressing data and reliably storing and communicating data. These tasks have turned out to be fundamental for all of computer science. In this course, we quickly review the basics of (discrete) probability theory and introduce concepts such as (conditional) Shannon entropy, mutual information, and Rényi entropy. Then, we prove Shannon's theorems about data compression and channel coding. In the course, we also cover some aspects of information-theoretic security and show applications of information theory to the area of machine learning.
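To give a first feel for the central quantities of the course, here is a minimal Python sketch (not part of the official course material) computing Shannon entropy and mutual information directly from their definitions:

```python
import math

def entropy(dist):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution
    given as a 2D list joint[x][y]."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))                               # 1.0
# Two independent fair coins share no information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

The identity I(X;Y) = H(X) + H(Y) - H(X,Y) used here is one of the relations that the entropy diagrams from the lectures make easy to read off.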
Intended Learning Outcomes
At the end of the course, you are able to:
 Define Shannon entropy and mutual information, and compute these quantities on examples
 Work with joint discrete random variables (conditioning, Bayes' rule)
 Define basic discrete probability distributions (Bernoulli, Binomial, Geometric) and compute their expected value and variance
 State Jensen's inequality for convex and concave functions and use it in proofs
 Use entropy diagrams to read off and find new relations between entropic quantities
 Prove Shannon's theorem about perfectly secure encryption (e.g. by using an entropy diagram)
 Define typical and jointly typical sets, prove properties about their size and probability mass and know how they are used in source coding and channel coding
 Use Kraft's inequality, e.g. to check whether a prefix-free code exists for given codeword lengths
 Compute a d-ary Huffman code
 Describe how much a given source can be compressed and give a way to do it
 Prove properties about Arithmetic Codes
 Find the confusability graph of a given channel, and find channels for a given confusability graph
 Compute the independence number and (zero-error) Shannon capacity of a given confusability graph
 Compute the capacity and maximizing input distribution of a given channel
 Define basic channels (binary symmetric, erasure channel)
 State Shannon's noisy channel-coding theorem and understand the key ingredients of the proof
 Grasp definitions of types of entropy other than Shannon entropy
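Two of the outcomes above, checking Kraft's inequality and computing a (binary) Huffman code, can be tried out concretely. The following is a small Python sketch, not part of the course material, that computes Huffman codeword lengths and verifies that they satisfy Kraft's inequality:

```python
import heapq

def kraft_sum(lengths, d=2):
    """Kraft's inequality: a prefix-free d-ary code with these codeword
    lengths exists iff sum_i d^(-l_i) <= 1."""
    return sum(d ** -l for l in lengths)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities,
    obtained by repeatedly merging the two least likely symbols."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:  # every symbol in a merged group gains one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

lens = huffman_lengths([0.5, 0.25, 0.125, 0.125])
print(lens)             # [1, 2, 3, 3]
print(kraft_sum(lens))  # 1.0
```

For this dyadic distribution the Huffman lengths equal the ideal lengths -log2 p(x), and the Kraft sum is exactly 1, i.e. the code is complete.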
Course website
Updated information about the course can be found on https://staff.science.uva.nl/c.schaffner/courses/inftheory/2014/
Study Material
The material will be presented in blackboard lectures. The following are good references:
 [CF] Ronald Cramer, Serge Fehr: The Mathematical Theory of Information, and Applications, lecture notes, Version 2.0
 [CT] Thomas M. Cover, Joy A. Thomas. Elements of information theory, 2nd Edition. New York: WileyInterscience, 2006. ISBN 0471241954.
 [MacKay] David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge: Cambridge University Press, 2003. ISBN 0521642981
Lectures and Exercise sessions (2 x 45min each)
Please check Datanose for the definite times and locations.
Homework, exam, and grading
This is a 6 ECTS course, which comes to roughly 20 hours of work per week. There will be homework exercises every week, to be handed in one week later. The answers should be in English. Feel free to use LaTeX; here is a template to get you started, but readable handwritten solutions are fine, too. Cooperation while solving the exercises is allowed and encouraged, but everyone has to hand in their own solution set in their own words.
The final exam will consist of student presentations about slightly more advanced topics connected to the course. The detailed procedure and list of topics can be found here.
The final grade for the course consists for 1/2 of the average homework grade (ignoring the worst grade) and for 1/2 of the grade obtained at the final exam.
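The grading rule above can be written out as a tiny Python sketch; the grade values in the example are made up for illustration (assuming the usual Dutch 1–10 scale):

```python
def final_grade(homework_grades, exam_grade):
    """Final grade: 1/2 * average homework grade with the single worst
    grade dropped, plus 1/2 * final-exam grade."""
    kept = sorted(homework_grades)[1:]  # drop the worst homework grade
    return 0.5 * sum(kept) / len(kept) + 0.5 * exam_grade

# Hypothetical example: five homework grades and an exam grade of 8.
print(final_grade([6, 8, 9, 7, 10], exam_grade=8))  # 8.25
```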
Course schedule Fall 2014
Date | Content | Homework
Mon, 27 Oct | Overview of the course, Probability Theory (Section 2.1 of [CF]) | 
Wed, 29 Oct | Shannon Entropy, Jensen's inequality, Properties of Shannon entropy (Sections 3.1-3.3 of [CF]) | Exercises #1
Mon, 3 Nov | Chain Rule, Mutual Information, Entropy Diagrams, Markov Chains (Sections 3.3, 3.4 of [CF]; Sections 2.5, 2.8 of [CT]) | Homework #1
Wed, 5 Nov | Data-Processing Inequality, Sufficient Statistics, Perfectly Secure Encryption: One-time pad (Sections 2.8-2.9 of [CT]; Section 4 of [CF]) | Exercises #2
Mon, 10 Nov | Perfectly Secure Encryption: Shannon's theorem. Fano's inequality. Data compression / Source coding (Section 4 of [CF]; Sections 2.10 and 3.1 of [CT]; Insecurity of Key Reuse in OTP) | Homework #1
Wed, 12 Nov | Data Compression: Asymptotic Equipartition Property, Typical set, Source-coding Theorem, high-probability set (Sections 3.2+3.3 of [CT]) | Homework #3
Mon, 17 Nov | Data Compression: symbol codes, properties, source-coding theorem reloaded, Kraft's inequality (Section 5 of [CF]; Chapter 5 of [CT]; Chapter 5 of [MacKay]) | Homework #1
Wed, 19 Nov | Huffman codes and their optimality, Game of 20 Questions, Arithmetic Codes (Sections 5.4+5.5 of [CF]; Section 6.2 of [MacKay]) | Homework #4
Mon, 24 Nov | Advantages of Arithmetic Codes, Noisy-Channel Coding, Basic Definitions, Graph Theory (Pages 4+5 of this survey paper; Section 7.5 of [CT]) | Homework #1
Wed, 26 Nov | Zero-error channel coding | Homework #5
Mon, 1 Dec | Noisy-channel coding: capacity, setup and proof of converse (Sections 7.1, 7.4, 7.5, 7.9, 7.12 of [CT]) | Homework #1
Wed, 3 Dec | Shannon's theorem about noisy-channel coding (Sections 7.6, 7.7 of [CT]; Chapter 10 of [MacKay]) | Homework #6
Mon, 8 Dec | Source-channel Separation, Error-correcting codes (Section 7.13 of [CT]; Chapter 1 of [MacKay]) | 
Wed, 10 Dec | Philip Schulz about Information Theory in Machine Translation | Homework #7
Wed, 17 Dec, 10:00-12:30 | Final presentations (Location: CWI, ground floor of the new wing, room L0.17) | 
Thu, 18 Dec, 13:00-15:00 | Final presentations | 
Life after "Information Theory"
If you got hooked on the world of entropies, you have several options after the course to pursue the topics of information theory and cryptography:
 Talk to Christian about the possibilities of doing a semester project or master project in information theory or cryptography. He can also hook you up with other people at the ILLC, at CWI, or in the rest of the world working on different aspects of information theory.
 Follow Ronald de Wolf's course about quantum computing at the University of Amsterdam, starting Spring 2015.
 Follow this Mastermath course about cryptology by Marc Stevens and Tanja Lange, starting in Spring 2015.
 Follow Harry Buhrman's course about computational complexity at the University of Amsterdam, starting Fall 2015.
 Follow various online classes such as Raymond W. Yeung's Information Theory course, Dan Boneh's crypto, crypto II, Jon Katz's crypto class or Umesh Vazirani's course about quantum computing.