**Information and Coding Theory Seminar**

**Abstracts of talks**

**Fall 2007**

**09/20 Alexander Barg (UMD), An Introductory Overview of Coding Theory**

ABSTRACT:

The talk is a leisurely overview of some coding theory
problems, given for the benefit of students who want to attend
regular meetings of the seminar but are not familiar with coding theory.
Students who do not plan to attend, but would like to know what
coding theory is, are also invited. We will give basic definitions
and mention applications such as error correction, cryptography, and puzzles.

**10/11, 10/18 Prakash Narayan (UMD), Information Theory for Beginners**

ABSTRACT:

In this introductory lecture, basic concepts in
information theory -- entropy, Kullback-Leibler
divergence, mutual information, channel capacity,
rate-distortion function -- will be introduced,
along with their operational significance in digital
communication. Illustrations will also be provided
of the use of standard tools such as random coding
arguments, Fano's lemma, the method of types, the data-processing
lemma, etc. This effort is intended to serve as an
introduction to pertinent technical material for
our seminar series this semester, as well as a
warmup act for the graduate course in
information theory (ENEE 627) in Spring 2008.
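For small discrete distributions, the quantities listed above can be computed directly from a probability mass function. A minimal sketch (not part of the lecture; the function names are our own):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """I(X;Y) = D(p_XY || p_X p_Y), from a joint pmf given as a
    nested list joint[x][y]."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(pxy * math.log2(pxy / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, pxy in enumerate(row) if pxy > 0)

# A fair coin has exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))                               # 1.0
# Independent X and Y carry no mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

The same three functions suffice to experiment numerically with the channel-capacity and rate-distortion definitions discussed in the lecture.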

**10/25 Wei Kang (UMD), The Relay Channel**

ABSTRACT:

As the simplest model of cooperative communications, the relay channel has been extensively studied since 1971, when it was first introduced by van der Meulen. In 1979, Cover and El Gamal proposed two major achievability schemes for the relay channel, widely known as decode-and-forward (DAF) and compress-and-forward (CAF). In this talk, I will first review these two schemes. Then, I will present our recent result on improving the CAF scheme by preserving the correlation between the channel inputs of the transmitter and the relay.
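For reference, the standard forms of the two Cover-El Gamal achievable rates (the notation is ours, not from the talk: X and X_1 are the transmitter's and relay's channel inputs, Y_1 and Y the relay's and destination's channel outputs, and \hat{Y}_1 a quantized version of Y_1):

```latex
% Decode-and-forward (DAF) achievable rate:
R_{\mathrm{DAF}} = \max_{p(x,x_1)} \min\bigl\{\, I(X;Y_1|X_1),\; I(X,X_1;Y) \,\bigr\}

% Compress-and-forward (CAF) achievable rate:
R_{\mathrm{CAF}} = \max\; I(X;Y,\hat{Y}_1|X_1)
\quad \text{subject to} \quad I(X_1;Y) \ge I(Y_1;\hat{Y}_1|X_1,Y)
```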

**11/01 Nuno C. Martins (UMD), Entropy-power inequalities in networked control**

ABSTRACT:

In this talk, I will cover a few examples of the use
of entropy-power inequalities in networked control.
I will start by deriving a few basic inequalities
relating the moments of a random variable and its
differential entropy. I will then show how they can
be used to establish the optimality of certain basic
control schemes. Subsequently, I will extend
such a paradigm to show the optimality of a
control policy in the presence of preview, where the
preview information is conveyed via a white Gaussian
channel with input power constraints. If time permits,
I will also show the optimality of a linear scheme
for a decentralized decision problem and relate it to
Witsenhausen's counterexample. The presentation is
calibrated to be accessible to anyone with a basic
understanding of information-theoretic principles, at
the level expounded by Prof. Narayan in prior lectures.
Examples were chosen to be as simple as possible.
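Two standard inequalities of the kind the talk builds on, stated here for reference: the first says that the Gaussian maximizes differential entropy under a variance constraint, and the second is the entropy-power inequality for independent random variables X and Y:

```latex
h(X) \le \tfrac{1}{2} \log\!\bigl( 2 \pi e \, \operatorname{Var}(X) \bigr)

e^{2 h(X+Y)} \ge e^{2 h(X)} + e^{2 h(Y)}
```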

**11/08 Brooke Shrader (UMD), Feedback Capacity of the Compound Channel**

ABSTRACT:

The compound channel is a model for communication over an unknown channel and for multicast transmission.
In this talk, we will explore the role of feedback in increasing the capacity of the compound channel.
We will begin with the memoryless compound channel, where a training sequence can be used together
with the feedback to allow the transmitter to "learn" the channel. We will then discuss the compound
channel with memory, including the compound finite-state channel and the compound Gilbert-Elliot channel.
We will present our recent results on the feedback capacity of the compound finite-state channel.
This talk is based on joint work with H. Permuter (Stanford).
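For context, the classical capacity of the memoryless compound channel without feedback, where the code must work for every channel W_theta in the uncertainty set Theta (a standard result, stated here for reference):

```latex
C = \max_{p(x)} \; \inf_{\theta \in \Theta} I(X; Y_\theta)
```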

**11/15 Venkatesan Guruswami (University of Washington & Institute for Advanced Study), List Decoding with Optimal Rate**

ABSTRACT:

Suppose you want to communicate a message of k packets on a noisy communication channel. So you judiciously encode it as a redundant collection of n = O(k) packets and transmit these. What is the fewest number of correct packets one needs to receive in order to have any hope of recovering the message?
Well, clearly one needs at least k correct packets. In this talk, I will describe an encoding scheme that attains this information-theoretic limit: for any desired eps > 0, it enables decoding the message (in an error-recovery
model called list decoding) as long as at least k(1+eps) packets are received intact. The location of the correct
packets and the errors on the remaining packets can be picked *adversarially* by the channel.
Additionally, our codes are very simple to describe: they are certain folded Reed-Solomon codes, which are just the well-known Reed-Solomon codes viewed as a code over a larger alphabet via a careful bundling together of codeword symbols. The codes also have a very strong list-recovery property (an
extension of list decodability) which is very useful in concatenation schemes, both to achieve the optimal trade-off over constant-sized alphabets and to construct good list-decodable binary codes.
Based on joint work with Atri Rudra.
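The claim that k correct packets are necessary is easiest to appreciate in the simpler erasure setting: with plain Reed-Solomon codes, any k surviving evaluations of the degree-&lt;k message polynomial determine it by Lagrange interpolation (the adversarial-error setting of the talk is much harder and is where list decoding enters). A minimal sketch over a small prime field; the field size and all names are our choices:

```python
P = 101  # small prime modulus, chosen for illustration

def encode(message, n):
    """Encode k message symbols as n packets: evaluate the message
    polynomial sum_j message[j] * x^j at x = 1, ..., n over GF(P)."""
    return [sum(c * pow(x, j, P) for j, c in enumerate(message)) % P
            for x in range(1, n + 1)]

def _mul_linear(poly, root):
    """Multiply a coefficient list (low to high) by (x - root) mod P."""
    out = [0] * (len(poly) + 1)
    for t, c in enumerate(poly):
        out[t] = (out[t] - root * c) % P
        out[t + 1] = (out[t + 1] + c) % P
    return out

def decode_from_any_k(points):
    """Recover the k message symbols from any k intact (x, y) packets
    by Lagrange interpolation over GF(P)."""
    k = len(points)
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(points):
        basis, denom = [1], 1
        for j, (xj, _) in enumerate(points):
            if j != i:
                basis = _mul_linear(basis, xj)
                denom = denom * (xi - xj) % P
        # Fermat inverse: denom^(P-2) = denom^(-1) mod P, since P is prime.
        scale = yi * pow(denom, P - 2, P) % P
        for t, c in enumerate(basis):
            coeffs[t] = (coeffs[t] + scale * c) % P
    return coeffs

message = [42, 7, 13]                      # k = 3 symbols
packets = encode(message, 6)               # n = 6 packets, 3 may be lost
survivors = [(2, packets[1]), (4, packets[3]), (6, packets[5])]
assert decode_from_any_k(survivors) == message
```

With adversarial *errors* rather than known erasures this simple interpolation fails, which is exactly the gap the folded Reed-Solomon construction in the talk closes.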

**11/29 Raef Bahi Youssef (UMD), Broadcast Channels with Confidential Messages**

ABSTRACT:

The problem of reliable communication under a secrecy constraint has recently gained substantial attention. The first information-theoretic approach to the problem of secret communications was introduced by Shannon in his seminal "Communication theory of secrecy systems" (1949). Applying Shannon's concepts of secret communication to a communication system was pioneered by Wyner in "The wire-tap channel" (1975). In such a communication system, information must be communicated reliably over a discrete memoryless channel that is subject to a wire-tap at the receiver (usually called the eavesdropper). Wyner used the conditional entropy rate of the signal received by the eavesdropper given the transmitted message (the equivocation rate) to measure the level of confidentiality (secrecy) guaranteed by the system. Wyner gave a single-letter characterization of the rate-equivocation region under a limiting assumption: the signal received by the eavesdropper is a degraded version of the one received by the intended receiver.
In this talk, I will discuss the cornerstone results of I. Csiszar and J. Korner in their paper "Broadcast channels with confidential messages" (1978). This paper introduced new ideas that generalize Wyner's results to the case where the signal received by the eavesdropper is no longer a degraded version of the one received by the intended receiver. Further, the paper shows that Wyner's formula for the secrecy capacity extends to the weaker "less noisy" and "more capable" classes of channels.
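For reference, the two secrecy-capacity expressions being contrasted, in their standard forms (notation is ours: X is the channel input, Y and Z the outputs seen by the intended receiver and the eavesdropper, and V an auxiliary random variable with V -> X -> (Y, Z)):

```latex
% Wyner (degraded wiretap channel):
C_s = \max_{p(x)} \bigl[ I(X;Y) - I(X;Z) \bigr]

% Csiszar--Korner (general broadcast channel with confidential messages):
C_s = \max_{p(v)\,p(x|v)} \bigl[ I(V;Y) - I(V;Z) \bigr]
```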

**12/06 Richard Hyong-Jun La (UMD), On the Scalability of Cooperative Time Synchronization in Pulse-connected Networks**

ABSTRACT:

In this talk we will discuss the time synchronization mechanism proposed by
Hu and Servetto in the paper referenced below. The proposed scheme exploits
spatial averaging to improve the quality of the timing information available
at each node. As the density of the network increases, the proposed mechanism
is shown to average out random errors and maintain global synchronization even
in multi-hop networks (in the absence of propagation or processing delay). We
will go over the model developed by the authors to define the time
synchronization mechanism, along with some of the key ideas and observations
used in the analysis.

Reference: An-Swol Hu and Sergio Servetto, "On the scalability of cooperative time
synchronization in pulse-connected networks," IEEE Trans. Inform. Theory, vol. 52,
no. 6, pp. 2725-2748, June 2006.

**12/13 Vladimir Blinovsky (Universität Bielefeld, Germany), About Convexity of One Function from Coding Theory**

ABSTRACT:

When deriving the upper coding bound for multiple packings
in q-ary Hamming space, one needs to prove the convexity
of a certain key polynomial. In this talk we demonstrate
the properties of this polynomial that allow us to complete the
proof of the upper bound and, in some sense, complete the theory
of multiple packing in Hamming space.