**Communication, Control and Signal Processing Seminar**

**Abstracts of talks**

*Spring 2011*

**02/24 Arya Mazumdar (UMD), Constructions of matrices with restricted isometry property for compressive sampling**

ABSTRACT: Compressive sampling is a technique of recovering sparse
N-dimensional signals from low-dimensional sketches, i.e., their linear images
in R^m, m << N. The main question associated with this technique is the
construction of linear operators that allow faithful recovery of the signal from
its sketch. The most frequently used sufficient condition for robust recovery
is the restricted isometry property (RIP) of the operator when restricted to
k-sparse signals.
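For reference, in the standard formulation a matrix Φ satisfies the RIP of order k with constant δ_k if, for every k-sparse signal x,

```latex
(1-\delta_k)\,\|x\|_2^2 \;\le\; \|\Phi x\|_2^2 \;\le\; (1+\delta_k)\,\|x\|_2^2
\qquad \text{for all } x \text{ with } \|x\|_0 \le k .
```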

The talk will begin with a brief overview of the main problem of Compressive
Sampling and known results. Our emphasis will be on the construction of binary m
x N matrices that satisfy the restricted isometry property of order k for almost
all k-sparse signals (statistical k-RIP). Previous results were concerned with
specific constructions of matrices obtained from DFT-like arrays. Here we
discuss a general method of constructing sampling matrices for which a
statistical version of k-RIP holds, enabling one to recover almost all k-sparse
signals from sketches much shorter than those in previous works. Our
constructions are practical and cover a large set of parameters.

We also show that k-RIP matrices that recover all (rather than almost all)
sparse signals can be found constructively with time complexity polynomial (in
N), while the number of samples is close to the smallest possible dimension. The
construction involves derandomizing the construction of generator matrices of
Gilbert-Varshamov codes. Joint work with Alexander Barg.
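As a small numerical illustration of the statistical near-isometry idea, the sketch below uses a generic random ±1 binary matrix (not the specific constructions of the talk; all parameter values are illustrative) and checks that the squared norm of random k-sparse signals is approximately preserved:

```python
import numpy as np

rng = np.random.default_rng(0)
N, m, k = 256, 64, 4  # ambient dimension, sketch length, sparsity

# Random binary (+-1/sqrt(m)) sensing matrix -- a generic random
# construction, used here only to illustrate the statistical RIP idea.
Phi = rng.choice([-1.0, 1.0], size=(m, N)) / np.sqrt(m)

def isometry_ratio(Phi, x):
    """Return ||Phi x||^2 / ||x||^2 for a nonzero signal x."""
    return np.linalg.norm(Phi @ x) ** 2 / np.linalg.norm(x) ** 2

# Empirical check on random k-sparse signals: ratios concentrate near 1.
ratios = []
for _ in range(1000):
    x = np.zeros(N)
    support = rng.choice(N, size=k, replace=False)
    x[support] = rng.standard_normal(k)
    ratios.append(isometry_ratio(Phi, x))
print(min(ratios), max(ratios))
```

For "almost all" k-sparse signals the ratio stays close to 1, which is exactly the statistical k-RIP behavior; a worst-case (deterministic) RIP guarantee would require controlling the extremes over all supports simultaneously.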

03/03 **No seminar**, instead:
**Marilyn Wolf** (Georgia Tech), **Control and Computing in Cyber-Physical Systems**

**03/07 (MONDAY, 11am, AVW1146) Pascal Vontobel (HP Labs), Should We Believe in Numbers Computed by Loopy Belief Propagation?**

ABSTRACT: Loopy belief propagation (LBP)
has been very popular in the last fifteen years in the area of coded data
transmission (and beyond) because of its low implementation complexity and its
outstanding performance. In this talk, we discuss an analysis technique that
allows one to obtain a better understanding of why LBP works so well for some
problems, and also of where its limitations lie.

Although this LBP analysis technique is broadly applicable, to be concrete, we
present it in a specific context. Namely, we focus on the use of LBP for
estimating the sum of the weights of so-called perfect matchings in a weighted complete
bipartite graph, a setup that subsumes many interesting counting problems.

Common wisdom would suggest that LBP does not work well in this context because
the underlying graph is dense and has many short cycles. However, it turns out
that LBP gives a very valuable estimate for this counting problem, and the
above-mentioned analysis technique allows us to see why this is so by providing
a combinatorial characterization of the LBP-based estimate: on the one hand,
this characterization gives insight into why the LBP-based estimate is
useful; on the other hand, it shows why the estimate differs in
general from the correct value.

At the end, we will contemplate the use of the above LBP-based estimates for the
analysis of pseudo-codewords and for kernel-based techniques in machine
learning.

**03/10 Armand Makowski (UMD), Some remarks on random graphs**

**04/07 Isaak Mayergoyz (UMD), Landau-Lifshitz Magnetization Dynamics Driven by a Jump-Noise Process**

Abstract: In this self-contained talk, the basic mathematical and physical facts related to the Landau-Lifshitz equation will be reviewed first along with relevant technological applications. Then, stochastic Landau-Lifshitz dynamics driven by a jump-noise process will be discussed and some new results in this area will be presented.
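A minimal numerical sketch of the deterministic Landau-Lifshitz dynamics (the jump-noise driving term of the talk is omitted), assuming the standard precession-plus-damping form with illustrative parameter values:

```python
import numpy as np

# Explicit-Euler sketch of the deterministic Landau-Lifshitz equation
#   dm/dt = -gamma * (m x H) - alpha * gamma * m x (m x H),
# which preserves |m| = 1 up to discretization error (we renormalize).
gamma, alpha, dt = 1.0, 0.1, 1e-3      # illustrative units
H = np.array([0.0, 0.0, 1.0])          # constant effective field
m = np.array([1.0, 0.0, 0.0])          # initial magnetization

for _ in range(20000):
    precession = -gamma * np.cross(m, H)
    damping = -alpha * gamma * np.cross(m, np.cross(m, H))
    m = m + dt * (precession + damping)
    m = m / np.linalg.norm(m)          # project back onto the unit sphere

print(m)  # damping drives m toward alignment with H
```

The damping term relaxes the magnetization toward the effective field while the precession term rotates it around the field; adding a jump-noise process to the right-hand side yields the stochastic dynamics discussed in the talk.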

**04/14 Benjamin Kedem (UMD), Integration of information from multiple sources**

ABSTRACT: A great deal of the statistical literature deals with a single sample
coming from a distribution, univariate or multivariate, and the problem is to
identify the distribution or parts thereof by an array of estimation and
testing procedures. As such, this practice neglects to bring in information
from other sources which could improve the desired inference. An approach which
fuses information from many sources will be presented and applied in various
statistical problems, including time series forecasting. The basic idea:
finding relationships between probability distributions using many data
sources.
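A generic toy illustration of the gain from fusing sources (inverse-variance weighting of two samples with a common mean; this is not the speaker's methodology, just a baseline example of why combining sources helps):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, trials = 2.0, 5000
err_single, err_fused = [], []
for _ in range(trials):
    x = rng.normal(mu, 1.0, size=50)   # source 1
    y = rng.normal(mu, 2.0, size=50)   # source 2, noisier
    m1, m2 = x.mean(), y.mean()
    v1, v2 = 1.0 / 50, 4.0 / 50        # known sampling variances of the means
    # Inverse-variance weighted combination of the two sample means.
    fused = (m1 / v1 + m2 / v2) / (1 / v1 + 1 / v2)
    err_single.append((m1 - mu) ** 2)
    err_fused.append((fused - mu) ** 2)
print(np.mean(err_single), np.mean(err_fused))  # fused error is smaller
```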

**04/21 Abram Kagan (UMD), Semiparametric Estimation for Kernel Families**

ABSTRACT: [pdf]

**04/22 Tsachy Weissman (Stanford), Mutual Information, Relative Entropy, and the Costs of Causality and of Mismatch in Estimation**

ABSTRACT: I will start with a brief tour through some of the literature -
ranging from the 1950s to the 2010s - on relations between information and
estimation in the Gaussian channel. A remarkably parallel set of relations
will then be seen to hold for the Poissonian channel. As I will illustrate
with a few examples, beyond their elegance, these relations give insight into
and a quantitative understanding of the costs of causality and of mismatch in
estimation, and allow one to transfer both theoretical tools and algorithmic
know-how from one field to the other. I will end by mentioning some future
research directions and speculating about how some of these results carry over
to other channels. The parts of the talk presenting new results are based on
joint work with Rami Atar.
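Two representative relations from that literature, stated here for context in notation of my own choosing (the Guo-Shamai-Verdu identity and Duncan's causal relation for the Gaussian channel):

```latex
% Guo-Shamai-Verdu (2005): derivative of mutual information equals half the MMSE
\frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\,
  I\bigl(X;\sqrt{\mathrm{snr}}\,X+N\bigr)
  = \tfrac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
  \qquad N \sim \mathcal{N}(0,1)

% Duncan (1970): mutual information equals half snr times the integrated causal MMSE
I\bigl(X_0^T;Y_0^T\bigr)
  = \frac{\mathrm{snr}}{2}\int_0^T
    \mathbb{E}\Bigl[\bigl(X_t-\mathbb{E}[X_t \mid Y_0^t]\bigr)^2\Bigr]\,\mathrm{d}t
```

Comparing the two exhibits the "cost of causality": the causal (filtering) error appears in Duncan's relation where the non-causal (smoothing) error appears in the integrated Guo-Shamai-Verdu identity.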

**05/05 Himanshu Tyagi (UMD), Relationships Between Certain Quantities of Information Theory and Statistics**

ABSTRACT: In this talk, we will first present de Bruijn's identity, which
relates differential entropy and Fisher information. Then we will present the
work of Guo-Shamai-Verdu, which relates Shannon's mutual information to the
minimum mean square error for a Gaussian channel. The equivalence of the two
results will be established, and de Bruijn's identity will be applied to prove
the entropy-power inequality [Stam '59].
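The Guo-Shamai-Verdu relation dI/dsnr = mmse/2 can be sanity-checked numerically in the one case where everything is in closed form, a standard Gaussian input, for which I(snr) = ½ log(1 + snr) and mmse(snr) = 1/(1 + snr):

```python
import math

def I(snr):
    """Mutual information of the Gaussian channel with Gaussian input."""
    return 0.5 * math.log(1.0 + snr)

def mmse(snr):
    """Minimum mean square error of estimating the Gaussian input."""
    return 1.0 / (1.0 + snr)

snr, h = 3.0, 1e-6
dI = (I(snr + h) - I(snr - h)) / (2 * h)  # central finite difference
print(dI, mmse(snr) / 2)                  # both are 1/(2*(1+snr)) = 0.125
```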

**05/12 Raef Bassily (UMD), Cryptographic Security and Public Key Encryption**

ABSTRACT: In this talk, the notion of cryptographic security of public key
encryption schemes will be defined. We will then introduce the factoring
problem, the RSA (Rivest-Shamir-Adleman) problem, the Discrete Logarithm
problem, and the DH (Diffie-Hellman) problem.

Based on the hardness assumption of the RSA problem, the RSA public key
encryption scheme will be presented. We will also present the El Gamal scheme,
which is based on the DH key exchange protocol, and prove its security (in the
cryptographic sense) under the Chosen Ciphertext Attack based on the hardness
of a stronger version of the DH problem, namely, the Decisional DH problem.

If time permits, the notion of Trapdoor Permutations (TDPs) and hardcore
predicates will be introduced, along with a method of constructing
cryptographically secure public key encryption schemes based on TDPs.
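As a concrete companion to the RSA discussion, here is the standard textbook RSA computation with toy parameters (utterly insecure at this key size; for illustration of the key-generation and encrypt/decrypt arithmetic only):

```python
# Textbook RSA with toy parameters -- insecure, for illustration only.
p, q = 61, 53
n = p * q                      # public modulus, 3233
phi = (p - 1) * (q - 1)        # Euler totient, 3120
e = 17                         # public exponent, gcd(e, phi) = 1
d = pow(e, -1, phi)            # private exponent: e * d = 1 (mod phi)

msg = 65
cipher = pow(msg, e, n)        # encryption: c = m^e mod n
plain = pow(cipher, d, n)      # decryption: m = c^d mod n
print(cipher, plain)           # plain recovers msg
```

Note that plain textbook RSA is deterministic and malleable, which is exactly why the stronger security notions defined in the talk (and padded/hybrid constructions) are needed in practice.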