Associate Professor, Georgia Institute of Technology

Topic: Physical-layer security: information-theoretic tools and code constructions

Abstract: In this short course, we will explore some of the core concepts and mathematical tools behind physical-layer security. The emphasis will be on developing a conceptually simple yet powerful framework for approaching physical-layer security problems; the usefulness of the approach will be illustrated by revisiting several examples, including covert and secret communications.


Matthieu Bloch received the Engineering degree from Supélec, Gif-sur-Yvette, France, the M.S. degree in Electrical Engineering from the Georgia Institute of Technology, Atlanta, in 2003, the Ph.D. degree in Engineering Science from the Université de Franche-Comté, Besançon, France, in 2006, and the Ph.D. degree in Electrical Engineering from the Georgia Institute of Technology in 2008.

In 2008-2009, he was a postdoctoral research associate at the University of Notre Dame, South Bend, IN, USA. Since July 2009, Dr. Bloch has been on the faculty of the School of Electrical and Computer Engineering at the Georgia Institute of Technology, where he is currently an Associate Professor. His research interests are in the areas of information theory, error-control coding, wireless communications, and cryptography.

Dr. Bloch is a member of the IEEE and has served on the organizing committee of several international conferences; he is the current chair of the Online Committee of the IEEE Information Theory Society. He is the co-recipient of the IEEE Communications Society and IEEE Information Theory Society 2011 Joint Paper Award and the co-author of the textbook Physical-Layer Security: From Information Theory to Security Engineering published by Cambridge University Press.

Homepage: http://bloch.ece.gatech.edu/

Professor of Electrical Engineering, Stanford University

Topic: Statistical inference of and with information theoretic quantities via compression

Abstract: The talks will cover both classical and recent results with two intertwined themes. The first is that the characterization of fundamental limits in data compression and communication gives rise to quantities - such as entropy, mutual information, and directed information - that are of relevance to statistical inference at large. The second is that analytic tools and algorithmic know-how from compression (both lossless and lossy) can be harnessed to construct and analyze schemes for general tasks such as sequential decision making, non-sequential inference (e.g. denoising), and estimation of the degree to which one phenomenon is relevant to predicting another. Examples will illustrate these ideas on such data as financial time series, genomic sequences, text, and multimedia signals.

Biography: Tsachy Weissman graduated summa cum laude with a B.Sc. in Electrical Engineering from the Technion in 1997, and earned his Ph.D. from the same institution in 2001. He worked with the Information Theory Group at Hewlett-Packard Laboratories until 2003, and has been a professor of Electrical Engineering at Stanford University since 2003. He spent the academic years 2007-2009 on leave with the Department of Electrical Engineering at the Technion.

His research focuses on Information Theory and Communications, Statistical Signal Processing, the interplay between them, and their applications. He is the inventor of several patents and is involved with a number of companies as a researcher or member of the technical board. His honors include an NSF CAREER award, several best paper awards, the Horev Fellowship for Leaders in Science and Technology, and the Henry Taub Prize for Excellence in Research; he is the incumbent of the STMicroelectronics Chair in the School of Engineering. He is an IEEE Fellow. He also serves on the editorial boards of the IEEE Transactions on Information Theory and of Foundations and Trends in Communications and Information Theory, and is a founding Director of the Stanford Compression Forum.

Homepage: http://web.stanford.edu/~tsachy/

Professor, University of California at San Diego

Topic: Private Information Retrieval: Coding Instead of Replication

Abstract: Private information retrieval protocols allow a user to retrieve a data item from a database without revealing any information about the identity of the item being retrieved. Specifically, in information-theoretic k-server PIR, the database is replicated among k non-communicating servers, and each server learns nothing about the item retrieved by the user. The effectiveness of PIR protocols is usually measured in terms of their communication complexity, which is the total number of bits exchanged between the user and the servers. However, another important cost parameter is the storage overhead, which is the ratio between the total number of bits stored on all the servers and the number of bits in the database. Since single-server information-theoretic PIR is impossible, the storage overhead of all existing PIR protocols is at least 2 (or k, in the case of k-server PIR).
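As background for the replication-based setting described above, the simplest instance of information-theoretic 2-server PIR can be sketched as follows. This is a minimal illustration of the classical XOR protocol over a replicated n-bit database (function and variable names are ours, not from the talk): the user sends a uniformly random index set to one server and the same set with the desired index flipped to the other, so each server alone sees a uniformly random query.

```python
import secrets

def pir_2server(db_bits, i):
    """Retrieve db_bits[i] from two replicated servers, revealing i to neither.

    Server 1 receives a uniformly random subset S of indices; server 2
    receives the symmetric difference S xor {i}. Each replies with the
    XOR of the requested bits, and the two answers XOR to db_bits[i].
    """
    n = len(db_bits)
    S = {j for j in range(n) if secrets.randbits(1)}  # query to server 1
    T = S ^ {i}                                       # query to server 2
    a1 = 0
    for j in S:                                       # server 1's answer
        a1 ^= db_bits[j]
    a2 = 0
    for j in T:                                       # server 2's answer
        a2 ^= db_bits[j]
    return a1 ^ a2                                    # all terms except bit i cancel
```

Both queries are individually uniform over subsets of {0, ..., n-1}, which is why each (non-communicating) server learns nothing about i; the storage overhead of this replicated scheme is 2, which is exactly the cost the coded construction below drives toward 1.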

In this work, we show that information-theoretic PIR can be achieved with storage overhead arbitrarily close to the optimal value of 1, without sacrificing the communication complexity. Specifically, we prove that all known k-server PIR protocols can be efficiently emulated, while preserving both privacy and communication complexity but significantly reducing the storage overhead. To this end, we distribute the n bits of the database among s+r servers, each storing n/s coded bits (rather than replicas). Notably, our coding scheme remains the same, regardless of the specific k-server PIR protocol being emulated. For every fixed k, the resulting storage overhead (s+r)/s approaches 1 as s grows; explicitly we have r < k \sqrt{s}(1 + o(1)). Moreover, in the special case k = 2, the storage overhead is only 1 + (1/s).

In order to achieve these results, we introduce and study a new family of binary linear codes. We call them k-server PIR codes, although they could also be called "availability codes". We then show how such codes can be constructed from Steiner systems, from one-step majority-logic decodable codes, from constant-weight codes, and from certain locally recoverable codes. We also establish several bounds on the parameters of k-server PIR codes, and tabulate the results for all s \le 32 and k \le 16.

Biography: Professor Vardy is a leading expert in coding theory. With coding and decoding of error-correcting codes taking place in a plethora of devices, from satellites to cell phones and from computer drives to CDs, his work has widespread implications. His research is leading to a better understanding of the uses and limitations of error-correcting codes in encoding data for transmission and storage. Soft-decision decoders capture more information from an encoded signal than conventional hard-decision decoders, and thereby enable reliable communication at lower signal power.

Alexander Vardy has co-developed an efficient algebraic soft-decision decoder for Reed-Solomon codes, which are the most popular class of error-correcting codes in use today. It has long been known that random codes achieve channel capacity. However, Vardy proved that computing the minimum distance of a code, which is necessary for its evaluation, is an intractable problem. He has also helped explain mathematically why iterative codes (such as turbo codes and LDPC codes) come close to achieving capacity.

Alexander Vardy moved to the Jacobs School at UC San Diego in 1998 from the faculty of the University of Illinois at Urbana-Champaign. Before that, he worked in the private sector, including a stint as a visiting scientist at the IBM Almaden Research Center. Among other honors, he is a Fellow of the IEEE and a past Editor-in-Chief of the IEEE Transactions on Information Theory. He received his Ph.D. in Electrical Engineering in 1991 from Tel Aviv University. He is a recipient of a David and Lucile Packard Fellowship and an NSF CAREER award.

Homepage: http://jacobsschool.ucsd.edu/faculty/faculty_bios/index.sfe?fmp_recid=76

Associate Professor, University of Washington at Seattle

Topic: Coding and Information in Interactive Communication

Abstract: The study of efficient interactive communication protocols using tools from information theory has led to many interesting results for basic problems in the last few decades. In this mini-course, we shall learn the basic concepts of communication complexity and take a quick tour of some of its applications to distributed computing, data structures, and complexity theory, stressing the central role played by information theory in this arena. We shall also make a brief exploration of coding theory in the context of interactive communication protocols.

Biography: Anup Rao received his Bachelor's degree in mathematics and computer science from Georgia Tech in 2002 and his Ph.D. in computer science from the University of Texas at Austin in 2007. He was a postdoctoral researcher at the Institute for Advanced Study in Princeton during 2007-2009, and is currently an associate professor at the University of Washington at Seattle.

He was a recipient of the best student paper award at the 2006 ACM Symposium On Theory of Computing. His research interests include communication complexity, pseudorandomness, coding theory, and combinatorics.

Homepage: http://homes.cs.washington.edu/~anuprao/

Research Scientist, Nokia Bell Labs

Topic: Cache Networks: Fundamentals, Opportunities, and Challenges

Abstract: Caching is an essential technique for improving throughput and latency in a vast variety of applications, from CPU design to web content delivery networks (CDNs). Companies like Akamai, Facebook, Netflix, and Google are investing heavily in their cache networks to increase system performance. There is a rich and beautiful theory, developed mostly in the computer science community during the 80s and 90s, for systems with a single cache. However, when it comes to networks of caches, the existing theory falls short, and engineers instead rely on heuristics and the intuition gained from the analysis of single-cache systems. Recent results suggest that information theory can in fact provide the theoretical underpinning for the deployment and operation of cache networks. Applying information-theoretic tools to the analysis of cache networks reveals that the conventional way to operate these networks can be substantially suboptimal, and that new concepts such as coded multicasting are needed. The goal of this short course is to discuss various opportunities and challenges for cache networks, with emphasis on the role of information theory in offering a fundamental view of this problem. We start by giving an overview of caching systems, followed by a review of the theory of single-cache systems. We proceed with recent results on cache networks in a variety of scenarios. During the tutorial we will discuss various open problems motivated by real-life caching applications. If time allows, we will also review some very recent results on how information theory and coding can be used in distributed computing to save communication bandwidth and increase computation speed. The ultimate goal is to come closer to an information-theoretic understanding of the tension and tradeoffs among computing, storage, and communication resources in distributed systems.
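The coded multicasting idea mentioned above can be illustrated with the canonical two-user, two-file example: each file is split into two halves, each user caches one half of every file, and a single XORed transmission then serves both users at once (a minimal sketch; the file contents and variable names here are illustrative):

```python
def xor(x, y):
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(a ^ b for a, b in zip(x, y))

# Two files, each split into two halves
A = (b"A1", b"A2")
B = (b"B1", b"B2")

# Placement phase: each user caches one half of every file
cache1 = {"A1": A[0], "B1": B[0]}   # user 1's cache
cache2 = {"A2": A[1], "B2": B[1]}   # user 2's cache

# Delivery phase: user 1 requests file A, user 2 requests file B.
# Uncoded delivery would send A2 and B1 separately (two half-file
# transmissions); coded multicasting sends a single XOR instead.
multicast = xor(A[1], B[0])         # A2 xor B1, heard by both users

# Each user cancels the half it already holds in its cache
user1_A2 = xor(multicast, cache1["B1"])   # user 1 recovers A2
user2_B1 = xor(multicast, cache2["A2"])   # user 2 recovers B1

assert user1_A2 == A[1] and user2_B1 == B[0]
```

One multicast transmission thus replaces two unicast ones, even though the two users requested different files; this gain, which grows with the number of users, is what the single-cache theory cannot capture.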

Biography: Mohammad Ali Maddah-Ali received the B.Sc. degree from Isfahan University of Technology, the M.A.Sc. degree from the University of Tehran, and the Ph.D. degree from the University of Waterloo, Canada. He then worked for one year at Nortel Networks as a research scientist. From 2008 to 2010, he was a postdoctoral fellow at the University of California, Berkeley. In September 2010, he joined Nokia Bell Labs as a research scientist.

Dr. Maddah-Ali has received several awards, including recognition from the IEEE Information Theory Society for introducing interference alignment in 2009, a best paper award from the IEEE International Conference on Communications (ICC) in 2014, the IEEE Communications Society and IEEE Information Theory Society Joint Paper Award in 2015, and the IEEE Information Theory Society Paper Award in 2016.

Homepage: https://www.bell-labs.com/usr/mohammadali.maddah-ali


  • Hostel check-in can be made in the afternoon of 9th July. We will have some ice-breaking activities in the evening.
  • Hostel check-out shall be made by the morning of 15th July.
  • See this page for more accommodation details.
  • Forum: Discussions, announcements, and study material will be posted on piazza. All participants have been sent a link to join the forum.
  • "Poster competition": All posters will be judged by panel. Prizes for the best poster. In addition, the audience can vote for the best poster. The one with most votes gets a prize.
  • Video abstract competition: participants can create 2-minute video abstracts of their poster presentation. Prizes for the best video abstract. example 1 example 2 example 3
  • Information Theory Speed Dating
    • 13 JULY 2017 THUR - Room 1009, 10/F, William M.W. Mong Engineering Building
    • Participants take turns explaining information theory problems (from their own research, or any problem they are interested in) to others in 2 minutes, "speed dating" style. "Partners" vote for the best presenter.

Information desk

The information desk will be set up at the Foyer of the ERB LT, on the 8/F of the William M.W. Mong Engineering Building; the same area serves as the discussion and refreshment area. The desk will be open during the following hours:
  • 10-11 JULY 2017 Mon-Tue - 8:30-16:00
  • 12 JULY 2017 Wed - 8:30-13:00
  • 13 JULY 2017 Thu - 8:45-16:00
  • 14 JULY 2017 Fri - 8:45-14:30
Participants can collect their registration kit during these hours. Name badges may also be collected from the volunteers at iHouse from 2:30-6:30PM on 9th July.

Poster session-lunch-dinner

  • Welcome Reception
    • 9 JULY 2017 SUN 19:00-21:00 - Orchid Lodge. Buffet style with canapes, salad, hot food, dessert and drinks
  • Lunches:
    • 10 JULY 2017 MON - Western Set Lunch at Staff Common Room
    • 11 JULY 2017 TUE - Vegetarian Lunch at Vegether, Benjamin Franklin Centre
    • 12 JULY 2017 WED - Pizza Lunch at Foyer
    • 13 JULY 2017 THUR - Chinese Style with Dim Sum at Benjamin Franklin Centre Staff Canteen
    • 14 JULY 2017 FRI - Buffet Lunch at Staff Common Room
  • Open Air Poster Forum with Dinner
  • Participants must collect posters from the information desk anytime during 1:00-4:00PM on the day of their poster presentation.
  • 10 JULY 2017 MON - BBQ by the Beach at Siu Lam, Tuen Mun. BBQ dinner
    • Depart: 4:00pm by Coaches from William M.W. Mong Engineering Building.
    • Return: Coaches are arranged to go back to Hyatt Regency Hong Kong Sha Tin and hostel after the dinner.
  • 11 JULY 2017 TUE - Boat trip at Victoria Harbour. Buffet style on the Boat
    • Depart: 4:00pm by Coaches from William M.W. Mong Engineering Building.
    • Return: Coaches are arranged to go back to Hyatt Regency Hong Kong Sha Tin and hostel after the dinner.
  • Excursion with Dinner
    • 12 JULY 2017 WED - Lamma Island. Chinese style seafood dinner.
    • Activities: hiking, playing at the beach, and visiting Lamma Winds (a wind farm), Tin Hau temples, the Kamikaze Grottos (caves), etc.
    • Depart: 1:00pm by Coaches from William M.W. Mong Engineering Building to the pier and then take boat to Lamma Island.
    • Return: Boat and Coaches are arranged to go back to Hyatt Regency Hong Kong Sha Tin and hostel after the dinner.
  • Information Theory Speed Dating
    • 13 JULY 2017 THUR - Room 1009, 10/F, William M.W. Mong Engineering Building
  • Banquet
    • 13 JULY 2017 THUR - Serenada Chinese Restaurant, Tsim Sha Tsui. Chinese style dinner.
    • Depart: 6:30pm by Coaches from William M.W. Mong Engineering Building
    • Return: Coaches are arranged to go back to Hyatt Regency Hong Kong Sha Tin and hostel after the dinner.
    • You may also stay in Tsim Sha Tsui after the dinner, but you will have to arrange your own transportation. You can easily return to University Station by taking the MTR West Rail Line from East Tsim Sha Tsui station and then changing to the MTR East Rail Line at Hung Hom station.