
Combinatorial Methods and Models : Rudolf Ahlswede's Lectures on Information Theory 4, Hardback Book


Edited by Alexander Ahlswede, Ingo Althoefer, Christian Deppe, Ulrich Tamm

Part of the Foundations in Signal Processing, Communications and Networking series


Description

The fourth volume of Rudolf Ahlswede’s lectures on Information Theory is focused on Combinatorics.

Ahlswede was originally motivated to study combinatorial aspects of Information Theory through zero-error codes: in this setting the structure of the coding problems typically changes drastically from probabilistic to combinatorial.

The best-known example is Shannon’s zero-error capacity, where independent sets in graphs have to be examined.

The extension to multiple access channels leads to the Zarankiewicz problem. A code can be regarded combinatorially as a hypergraph; and many coding theorems can be obtained by appropriate colourings or coverings of the underlying hypergraphs.
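As a small illustration of the zero-error viewpoint described above, the following sketch (not taken from the book) computes the size of the largest zero-error code for Shannon’s classic pentagon channel by brute-force search for a maximum independent set in the confusability graph; the channel model and the function name are illustrative assumptions.

```python
from itertools import combinations, product

# Confusability graph of the "pentagon" channel C5 (Shannon's classic
# example): input i can be confused with i-1 and i+1 (mod 5).
n = 5

def confusable(a, b):
    return (a - b) % n in (1, n - 1)

def max_zero_error_code(uses):
    """Largest set of input words of the given length that are
    pairwise non-confusable, i.e. a maximum independent set in the
    strong product of the confusability graph with itself."""
    words = list(product(range(n), repeat=uses))

    def clash(u, v):
        # Two distinct words are confusable iff in every coordinate
        # the letters are equal or confusable.
        return all(a == b or confusable(a, b) for a, b in zip(u, v))

    best = 1
    for k in range(2, len(words) + 1):
        found = False
        for cand in combinations(words, k):
            if all(not clash(u, v) for u, v in combinations(cand, 2)):
                best, found = k, True
                break
        if not found:
            break
    return best

print(max_zero_error_code(1))  # 2: e.g. the inputs {0, 2}
print(max_zero_error_code(2))  # 5: two channel uses beat 2*2 = 4 codewords
```

The jump from 2 codewords per use to 5 codewords per two uses is exactly the phenomenon behind Shannon’s zero-error capacity of the pentagon, later determined by Lovász to be log √5 per use; the brute force here is only feasible for such tiny examples.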

Several such colouring and covering techniques and their applications are introduced in this book.

Furthermore, codes produced by permutations and one of Ahlswede’s favourite research fields -- extremal problems in Combinatorics -- are presented. Whereas the first part of the book concentrates on combinatorial methods in order to analyse classical codes, such as prefix codes or codes in the Hamming metric, the second is devoted to combinatorial models in Information Theory.

Here the code concept already relies on a rather combinatorial structure, as in several concrete models of multiple access channels or more refined distortions.

An analytical tool coming into play, especially during the analysis of perfect codes, is the use of orthogonal polynomials.

Classical information processing concerns the main tasks of gaining knowledge and the storage, transmission and hiding of data.

The first task is the prime goal of Statistics. For the transmission and hiding of data, Shannon developed an impressive mathematical theory called Information Theory, which he based on probabilistic models.

The theory largely involves the concept of codes with small error probabilities in spite of noise in the transmission, which is modeled by channels.

The lectures presented in this work are suitable for graduate students in Mathematics, and also for those working in Theoretical Computer Science, Physics, and Electrical Engineering with a background in basic Mathematics.

The lectures can be used as the basis for courses or to supplement courses in many ways.

Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis.

More advanced researchers may find questions which form the basis of entire research programs.
