Communication in the presence of sparsity

With Yiannis Kontoyiannis (Cambridge)

In his seminal 1948 work, Claude Shannon determined the fundamental limits of
the best achievable performance in point-to-point communication. His analysis
depended on two assumptions: that data are communicated in asymptotically
large blocks, and that the information sources and the noise channels involved
satisfy certain statistical regularity properties. We revisit the central
information-theoretic problems of efficient data compression and channel
transmission without these assumptions. First, we describe a general
development of non-asymptotic coding theorems, providing useful expressions
for the fundamental limits of performance on finite data. These results may
play an important role in applications where performance or delay guarantees
are critical, and they offer valuable operational guidelines for the design of
practical compression algorithms and error correcting codes. Second, motivated
by modern applications involving sparse and often “big” data (e.g., in
neuroscience, web and social network analysis, medical imaging, and optical
media recording), we state and prove a series of theorems that determine the
best achievable rates of compression and transmission when the information or
the noise is appropriately “sparse”. Interestingly, in these cases, the
classical results, expressed in terms of entropy and channel capacity, are
shown to be inaccurate, even as first-order approximations.
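As a rough illustration of what “sparse” means here (an assumption for exposition, not part of the talk's results): a memoryless Bernoulli(p) source with p near zero emits mostly zeros, and its Shannon entropy, the classical first-order compression limit, shrinks toward zero like p·log2(1/p). The sketch below computes this entropy for increasingly sparse sources.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits per symbol, of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# As p -> 0 the source becomes sparse and the per-symbol entropy vanishes,
# which is one reason finer, non-asymptotic analyses become relevant.
for p in (0.5, 0.1, 0.01, 0.001):
    print(f"p = {p}: H(p) = {binary_entropy(p):.4f} bits/symbol")
```

For p = 0.5 the entropy is exactly 1 bit per symbol, while for p = 0.01 it is already below 0.1 bits, so the dominant cost of describing such a source lies in locating the rare ones rather than in the bulk of the data.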

No background in information theory will be assumed.
