**Information and Coding Theory Seminar**

**Abstracts of talks**

*Spring 2006*

**2/2 Damianos Karakos (JHU), Estimating Conditional Densities from Sparse Data for Statistical Language Modeling**

ABSTRACT:

The Maximum Likelihood Set (MLS) was recently introduced in [Jedynak-Khudanpur05] as an effective, parameter-free technique for estimating a probability mass function (pmf) from sparse data. The MLS contains all pmfs that assign a higher likelihood to the observed counts than to any other set of counts of the same sample size. In this work, the MLS is extended to the case of conditional density estimation. First, it is shown that, when the criterion for selecting a pmf from the MLS is the KL-divergence, the selected conditional pmf naturally has a back-off form, except for a ceiling on the probability of high-frequency unigrams that are not seen in particular contexts. Second, the pmf has a sparse parameterization, leading to efficient algorithms for KL-divergence minimization. Finally, a novel fattening of the MLS, called the High Likelihood Set (HLS), is introduced; it contains the MLS and some neighboring pmfs. Experimental results from bigram and trigram estimation indicate that pmfs selected from the HLS are competitive with state-of-the-art estimates.

This is joint work with Sanjeev Khudanpur.
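The back-off form mentioned above can be illustrated with a generic absolute-discounting back-off bigram estimator. This is a simplified sketch of the general back-off idea, not the MLS/HLS selection procedure of the talk; the corpus, discount value, and function names are illustrative assumptions:

```python
from collections import Counter

def backoff_bigram(tokens, d=0.5):
    """Build a back-off bigram model with absolute discounting.

    Seen bigrams get a discounted relative frequency; the freed
    probability mass is redistributed over unseen successors in
    proportion to their unigram probabilities.
    """
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n = len(tokens)
    vocab = set(tokens)

    def p_uni(w):
        return unigrams[w] / n

    def p(w, h):
        seen = {v for v in vocab if bigrams[(h, v)] > 0}
        total = sum(bigrams[(h, v)] for v in seen)
        if w in seen:
            return (bigrams[(h, w)] - d) / total
        # back-off weight: leftover discount mass, normalized over unseen words
        alpha = (d * len(seen) / total) / sum(p_uni(v) for v in vocab - seen)
        return alpha * p_uni(w)

    return p, vocab

# Usage: probabilities conditioned on an observed history sum to one.
p, vocab = backoff_bigram("the cat sat on the mat the cat ran".split())
mass = sum(p(w, "the") for w in vocab)
```

By construction the discounted mass of seen successors plus the backed-off mass of unseen ones totals one, which is the defining property of a back-off estimate.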

**2/28 Achilleas Anastasopoulos (UMich/UMD), Performance analysis and decoding complexity of turbo-like codes in noisy channels**

ABSTRACT:

Turbo-like codes have been shown (experimentally) to achieve performance close to capacity with decoding complexity that is linear in the code length (i.e., constant complexity per information bit). However, it is only for the trivial binary erasure channel (BEC) that we have provable results on the performance of message-passing algorithms. This is attributed to the fact that the only tool available for the performance analysis of iterative decoding, namely density evolution, is a simple one-dimensional recursion on the BEC, but an infinite-dimensional recursion on any other noisy channel (the evolution of a probability density function). In this presentation we will review the latest results on the performance/complexity tradeoff known for the BEC and noisy channels. We will then present a novel method for analyzing the performance of low-density parity-check (LDPC) codes transmitted over noisy channels and decoded using message-passing decoding. This new approach results in a one-dimensional recursion, almost identical to the one derived for the BEC.
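For the BEC, the one-dimensional recursion referred to above is the standard density-evolution equation x ← ε·λ(1 − ρ(1 − x)) for an LDPC ensemble with degree polynomials λ and ρ. A minimal sketch for the (3,6)-regular ensemble (λ(x) = x², ρ(x) = x⁵), whose erasure threshold is known to be ε* ≈ 0.429 (the function name and iteration count are illustrative):

```python
def bec_density_evolution(eps, iters=2000):
    """Track the erasure probability of variable-to-check messages
    for a (3,6)-regular LDPC ensemble on the BEC(eps).

    Recursion: x <- eps * lambda(1 - rho(1 - x)), with
    lambda(x) = x**2 (variable degree 3) and rho(x) = x**5 (check degree 6).
    """
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** 5) ** 2
    return x

# Below the threshold the erasure probability is driven to zero;
# above it, the recursion stalls at a nonzero fixed point.
low = bec_density_evolution(0.40)
high = bec_density_evolution(0.45)
```

The contrast between the two runs is exactly the threshold phenomenon that density evolution captures with a single scalar on the BEC, and that requires tracking an entire probability density on other channels.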

**3/7 (i) Achilleas Anastasopoulos (cont'd) (ii) Andrew Duggan (UMD) Performance bounds on the list decoding of RS codes**

ABSTRACT: A method for Algebraic Soft-Decision Decoding (ASD) of Reed-Solomon codes was first proposed by Guruswami and Sudan in their landmark paper of 1999. This list-decoding method has shown promise in correcting errors well beyond half the minimum distance, but so far its performance has mostly been supported only by experimental results.

In this talk, I will present a decoding radius for ASD in the case of an additive channel. This radius exceeds that of other hard-decision decoding methods for certain code rates. The tradeoff between complexity and performance, as measured by the decoding radius, will also be discussed.
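For context (standard figures, not results of this talk): a rate-R Reed-Solomon code can be uniquely decoded up to roughly a fraction (1 − R)/2 of errors, while the Guruswami-Sudan list decoder reaches the larger fraction 1 − √R:

```latex
\[
  \underbrace{\tau_{\mathrm{unique}} \;=\; \frac{1-R}{2}}_{\text{half the minimum distance}}
  \;\le\;
  \underbrace{\tau_{\mathrm{GS}} \;=\; 1 - \sqrt{R}}_{\text{Guruswami--Sudan radius}}
  \qquad \text{for all } 0 \le R \le 1,
\]
```

since the difference is (1 − √R)²/2 ≥ 0; the gap is what list decoding buys, and it is largest at low rates.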

**4/3 (11:30) Poorvi Vora (GWU) Related-Key Linear Cryptanalysis of Block Ciphers**

ABSTRACT: We will describe a coding-theory framework for related-key linear cryptanalytic attacks on block ciphers. The framework treats linear cryptanalysis as communication, by the adversary, over a low-capacity channel, and a related-key attack (RKA) as a concatenated code. We will use the framework to show that, in a sense we will make precise, RKAs are asymptotically more efficient than single-key attacks. We will describe a specific RKA that is predicted to produce a 30% improvement for some of Matsui's original DES attacks. Experiments to test this prediction are ongoing.

Note that the above results do not require that the original cipher be vulnerable
to related-key attacks, only that it be vulnerable to linear cryptanalysis.

This is joint work with Darakhshan Mir. A paper (currently under review) describing these results may be found at:

http://www.seas.gwu.edu/~poorvi/RKA06.pdf

**4/10 Andrew Duggan (UMD) Correcting Errors Beyond the Guruswami-Sudan Radius (and the Johnson Bound) in Polynomial Time**

ABSTRACT: Parvaresh and Vardy present a new set of codes, based on Reed-Solomon (RS) codes, which can be list-decoded to a radius that exceeds the previous best radius achieved by Guruswami and Sudan. Guruswami and Sudan were able to achieve their result by constructing bivariate polynomials and factoring them to obtain a list of candidate codewords. Parvaresh and Vardy extend this concept to trivariate polynomials (and beyond) through the introduction of new codes. This talk will discuss the properties of these new codes and derive the error-correcting bound presented in their paper.

**4/17 Pascal Vontobel (MIT) Graph-Cover Decoding: Connecting Iterative Decoding and Linear Programming Decoding**

ABSTRACT: Whenever information
is transmitted across a channel, we have to ensure its integrity against errors.
The ground-breaking work of Shannon showed (at least theoretically) how such
integrity can be achieved, namely by using an appropriately chosen encoder at
the sender side and an appropriately chosen decoder at the receiver side.

From a practical point of view, so-called low-density parity-check (LDPC)
and turbo codes together with iterative decoders have become increasingly popular
in the last decade. It is fair to say that these codes and decoding algorithms
(and ideas related to them) have thoroughly changed much of modern communications.
Against this backdrop, a good understanding of these types of communication techniques
is obviously highly desirable, especially the understanding of iterative decoding
of finite-length codes.

Another interesting development in coding theory is the linear programming decoder
that was recently proposed by Feldman, Karger, and Wainwright. Simulation results
indicate that this decoding algorithm behaves very similarly to iterative decoding. In this talk -- by introducing a theoretical tool called
the graph-cover decoder -- we will argue why this is so.

These connections allow one to characterize a given finite-length code under iterative decoding; in particular, one can study a (possibly existing) error floor in the word-error-rate vs. SNR curve. Let us emphasize that this is in
stark contrast to most other existing iterative decoding analysis techniques
which only characterize ensembles of codes or infinitely long codes.

Moreover, in light of the above connections, one naturally wonders whether it is possible to efficiently solve the linear program that appears in the linear
programming decoder. A priori this is not clear, and, in fact, experiments show
that general-purpose linear programming solvers do not seem to work efficiently
enough for codes of reasonable length. However, as recent results show, by taking
advantage of the specific structure of the linear program at hand, one can formulate
algorithms that efficiently find the decision made by the linear programming
decoder.

(This talk is based on joint work with Ralf Koetter, UIUC.)
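Feldman-style LP decoding can be sketched concretely for a toy single-parity-check code. This is an illustrative sketch only (the cost vector, the code, and the use of a general-purpose solver are assumptions; as the abstract notes, general-purpose solvers are exactly what structured algorithms aim to beat):

```python
from itertools import combinations
import numpy as np
from scipy.optimize import linprog

def lp_decode(gamma, checks, n):
    """Feldman-style LP decoding: minimize sum(gamma[i] * f[i]) over the
    relaxed codeword polytope.  For each check node with neighborhood N
    and each odd-sized subset S of N, add the forbidden-set inequality
        sum_{i in S} f_i - sum_{i in N \\ S} f_i <= |S| - 1.
    """
    A, b = [], []
    for nbhd in checks:
        for size in range(1, len(nbhd) + 1, 2):  # odd subset sizes
            for S in combinations(nbhd, size):
                row = np.zeros(n)
                row[list(nbhd)] = -1.0
                row[list(S)] = 1.0  # S overrides its entries in nbhd
                A.append(row)
                b.append(len(S) - 1)
    res = linprog(gamma, A_ub=np.array(A), b_ub=b,
                  bounds=[(0, 1)] * n, method="highs")
    return res.x

# Toy example: the [3, 2] single parity-check code (one check on all bits).
# The costs favor setting bits 0 and 1, so the LP lands on the codeword 110.
f = lp_decode(gamma=[-2.0, 0.5, 1.0], checks=[(0, 1, 2)], n=3)
```

For a single check the relaxation is exact (the parity polytope is integral), so the LP returns a codeword; on codes with cycles the relaxation can have fractional vertices, the "pseudocodewords" that graph covers explain.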

**4/24 Sirin Nitinawarat (UMD)**

ABSTRACT:

**5/8 Ahmed Sadek (UMD) Impact of Cooperation at the MAC Layer**

ABSTRACT: User cooperation has emerged recently as a promising means of achieving MIMO-like diversity gains. The basic model consists of three terminals: a source, a relay, and a destination. A number of strategies have been developed for this model including, for example, decode-and-forward, amplify-and-forward, selective relaying, incremental relaying, and distributed space-time codes. The theoretical analysis and simulations presented recently in different works reveal significant gains enabled by cooperation in terms of error-rate performance, outage capacity, cell coverage, power efficiency, and many other performance metrics. Most of the work on cooperation has focused on the gains achieved at the physical layer, and very little work has studied cooperation at higher network layers. In this talk, we will present some recent results that try to quantify the impact of cooperation at the multiple-access layer. In particular, we consider a multiple-access channel with a finite number of terminals and a single relay, and we address the question of whether the gains of cooperation can be leveraged at the MAC layer.
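As background for the three-terminal model (a standard result, not the MAC-layer analysis of this talk): the classical decode-and-forward lower bound of Cover and El Gamal (1979) on the relay-channel capacity, with source input X₁, relay input X₂, relay observation Y₁, and destination observation Y, reads

```latex
\[
  C \;\ge\; \max_{p(x_1, x_2)} \, \min \bigl\{\,
     I(X_1 ; Y_1 \mid X_2),\;
     I(X_1, X_2 ; Y)
  \,\bigr\},
\]
```

where the first term is the rate the relay can reliably decode and the second is the rate the destination can resolve jointly from source and relay transmissions.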