Information and Communication
News:
20 Jan 2016: Added the last pictures and updated the presentation times.
5 March 2015: It is important to show up personally for the first lecture on Monday, 4 January 2016, because the details of the course will be organised further then.
Content of the course
Information theory was developed by Claude E. Shannon in the late 1940s to investigate the fundamental limits of signal-processing operations such as compressing data and reliably storing and communicating data. These tasks have turned out to be fundamental for all of computer science.
In this course, we introduce the basics of probability theory and then study concepts such as (conditional) Shannon entropy and mutual information. We then treat Shannon's theorems about data compression and channel coding. We will also cover some aspects of information-theoretic security for encryption.
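To give a first taste of these quantities, here is a minimal Python sketch (the joint distribution is a made-up example) that computes Shannon entropies and the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A made-up joint distribution of (X, Y), given as {(x, y): probability}.
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.0, (1, 1): 0.25}

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px.values())       # H(X)
h_y = entropy(py.values())       # H(Y)
h_xy = entropy(joint.values())   # H(X,Y)
mi = h_x + h_y - h_xy            # mutual information I(X;Y)
```

For this example, H(Y) = 1 bit and H(X,Y) = 1.5 bits, so X and Y share roughly 0.31 bits of information.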
Intended Learning Outcomes
At the end of the course, you are able to:
 Define Shannon entropy and mutual information, and compute these quantities on examples
 Work with joint discrete random variables (conditioning, Bayes' rule)
 Define basic discrete probability distributions (Bernoulli, Binomial, Geometric) and compute their expected value and variance
 State Jensen's inequality for convex and concave functions
 Use entropy diagrams to read off and find new relations between entropic quantities
 State Shannon's theorem about perfectly secure encryption
 Compute a d-ary Huffman code
 Describe how much a given source can be compressed and give a way to do it
 Define basic channels (binary symmetric, erasure channel)
 State Shannon's noisy channel-coding theorem
 Study a slightly more advanced topic about information and communication
 Present this topic to the class
 Write a final report about it
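As an illustration of one of the outcomes above, the following Python sketch builds a binary (2-ary) Huffman code by repeatedly merging the two least likely subtrees; the symbol weights are a made-up example:

```python
import heapq

def huffman_code(freqs):
    """Binary Huffman code for a dict symbol -> weight (assumes >= 2 symbols).
    Returns a dict symbol -> codeword (bitstring)."""
    # Heap entries: (weight, tie-breaking id, {symbol: code-so-far}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)  # two least likely subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, next_id, merged))
        next_id += 1
    return heap[0][2]

# For this dyadic distribution the code lengths match -log2(p): 1, 2, 3, 3.
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
```

The resulting code is prefix-free, so any concatenation of codewords can be decoded unambiguously.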
Course website
Updated information about the course can be found at https://staff.science.uva.nl/c.schaffner/courses/infcom/2015/
Prerequisites
Basic calculus, e.g. working with logarithms. Basic notions of discrete probability (as taught e.g. in stochastiek 1) are also helpful, but not a strict requirement. This course is well-suited for students who are pursuing a double bachelor in mathematics and computer science.
Study Material
The material will be presented on slides and on the blackboard during lectures. The following are good references:
 [CF] Ronald Cramer, Serge Fehr. The Mathematical Theory of Information, and Applications. Lecture notes, Version 2.0.
 [CT] Thomas M. Cover, Joy A. Thomas. Elements of Information Theory, 2nd Edition. New York: Wiley-Interscience, 2006. ISBN 0-471-24195-4.
 [MacKay] David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge: Cambridge University Press, 2003. ISBN 0-521-64298-1.
Schedule
Please check Datanose for the definitive times and locations. It is important to show up personally for the first lecture on Monday, 4 January, because the details of the course will be organised further then. If you cannot make it on Monday but want to attend the course nevertheless, please send me an email.
Language
The lectures will be given in English. The homework and final report may be written in Dutch, and the presentation may be delivered in Dutch, but the use of English is encouraged.
Credits, homework, final presentation, report
This is a 6 ECTS course, which will keep you busy full-time (40h/week) for the month of January 2016. There will be lectures in the first two weeks (4–15 January) and homework exercises to solve and hand in. In the third week of the course, you choose a topic from this list and study it. In the final week, you present the topic to the class and write a final report about it.
Grades
Your grade for the final presentation will be determined by the quality of the presentation and your ability to answer questions about the subject (we will use this list for the evaluation). The final presentation counts 1/3 towards your final grade for the course, 1/3 is determined by the report, and 1/3 by the average of the 3 homework exercises.
Course schedule for January 2016
(preliminary version)
Mon, 4 Jan 2016, 9:00–10:00
Overview and organisation of the course. It is essential to attend this first lecture if you want to follow the course.

Mon, 4 Jan 2016, 10:00–12:00
Discrete Probability Theory
Reading: [CF] 2.1; [CT] 2.1, 2.2

Mon, 4 Jan 2016, 12:00–13:00
Exercise session (on Discrete Probability Theory)
Exercises: Homework #1

Tue, 5 Jan 2016, 9:00–11:00
Jensen's inequality, Entropy (Blackboard photos 1–7)
Reading: [CF] 2.2, 3.1, 3.2; [CT] 2.1, 2.6; [MacKay] 2.7

Tue, 5 Jan 2016, 11:00–13:00
Exercise Session on Probability Theory

Wed, 6 Jan 2016, 11:00–13:00
Exercise Session on Entropy
Reading: [CF] 3
Exercises: Homework #2

Thu, 7 Jan 2016, 9:00–11:00
Data Compression: symbol codes, Kraft's inequality, source-coding theorem (symbol-code version), Huffman codes (Entropy of Alice in Wonderland; Hex Editor with statistics; Blackboard photos 1–7)
Reading: [CF] 5.1, 5.2; [CT] 5; [MacKay] 5, L4

Thu, 7 Jan 2016, 11:00–13:00
Exercise Session

Fri, 8 Jan 2016, 11:00–13:00
Huffman coding
Preparation homework: figure out how to construct a Huffman code

Fri, 8 Jan 2016, 13:00–15:00
Exercise Session

Mon, 11 Jan 2016, 9:15–11:00
Entropy Diagrams, Markov chains, Data-Processing Inequality, Fano's inequality (ILLC meeting room F1.15; Blackboard photos 1–9)
Reading: [CF] 3.4; [CT] 4

Mon, 11 Jan 2016, 11:00–13:00
Exercise Session

Tue, 12 Jan 2016, 9:15–11:00
Sufficient Statistic, Perfectly Secure Encryption: definitions, One-time Pad and Shannon's theorem (Insecurity of Key Reuse in OTP; try it yourself by solving this crypto challenge!)
Reading: [CF] 4

Tue, 12 Jan 2016, 11:00–13:00
Exercise Session
Exercises: Homework #3

Thu, 14 Jan 2016, 9:15–12:00
Topic selection for continuation of the course, Exercise session (ILLC meeting room F1.15)

Tue, 26 Jan 2016, 10:30–13:00
Student presentations at ILLC meeting room F1.15

Wed, 27 Jan 2016, 13:30–16:00
Student presentations at CWI L017

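The one-time pad covered on 12 January, and the insecurity of key reuse, can be sketched in a few lines of Python (the messages and names are illustrative):

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    assert len(a) == len(b)
    return bytes(x ^ y for x, y in zip(a, b))

msg = b"attack at dawn."
key = secrets.token_bytes(len(msg))  # uniformly random key, as long as the message

ct = xor_bytes(msg, key)             # one-time pad encryption
assert xor_bytes(ct, key) == msg     # decryption is the same XOR

# Key reuse is insecure: the XOR of two ciphertexts encrypted under the
# same key equals the XOR of the two plaintexts, leaking information.
msg2 = b"retreat at noon"
ct2 = xor_bytes(msg2, key)
assert xor_bytes(ct, ct2) == xor_bytes(msg, msg2)
```

Shannon's theorem says this is essentially the best possible: perfect secrecy requires a fresh random key at least as long as the message.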
Life after "Information & Communication"
If you got hooked on the world of entropies, you have several options after the course to pursue the topics of information theory and cryptography:
 Talk to Christian about the possibilities of doing a semester project or bachelor project in information theory or cryptography. He can also put you in touch with other people at the ILLC, at CWI, or in the rest of the world who work on different aspects of information theory.
 Follow various online classes, such as Raymond W. Yeung's Information Theory course, Dan Boneh's crypto and crypto II, Jon Katz's crypto class, or Umesh Vazirani's course about quantum computing.