Universal Estimation of Information Measures for Analog Sources (Paperback)

Qing Wang, Sanjeev R. Kulkarni, Sergio Verdú

R2,050

Or split into 4x interest-free payments of 25% on orders over R50

Discovery Miles: 20 500
Mobicred: R192 pm x 12*
Free Delivery
Delivery Advice: Ships in 10 - 15 working days



Product Description

Entropy, mutual information and divergence measure the randomness, dependence and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among other fields, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several nonparametric algorithms have been proposed to estimate information measures. Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of the universal algorithms and the corresponding sufficient conditions, as well as their speed of convergence. It provides a comprehensive review of an increasingly important topic and will be of interest to students, practitioners and researchers working in information theory.
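For readers new to these quantities, the standard definitions for analog (continuous) sources are summarized below. These are textbook definitions, not excerpts from the monograph:

```latex
\begin{align*}
  h(X)    &= -\int f_X(x)\,\log f_X(x)\,\mathrm{d}x
             && \text{(differential entropy)} \\
  I(X;Y)  &= \iint f_{XY}(x,y)\,\log\frac{f_{XY}(x,y)}{f_X(x)\,f_Y(y)}\,\mathrm{d}x\,\mathrm{d}y
             && \text{(mutual information)} \\
  D(P\|Q) &= \int p(x)\,\log\frac{p(x)}{q(x)}\,\mathrm{d}x
             && \text{(divergence)}
\end{align*}
```

As a rough illustration of the kind of nonparametric algorithm surveyed in books of this type, here is a minimal Python sketch of a k-nearest-neighbor divergence estimator. The function name `knn_divergence`, the default `k=4`, and the Gaussian sanity check are illustrative assumptions, not the book's reference implementation:

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_divergence(x, y, k=4):
    """Nonparametric k-NN estimate of D(P||Q) in nats.

    x : (n, d) samples from P;  y : (m, d) samples from Q.
    A sketch of the nearest-neighbor family of divergence estimators;
    consistency holds only under regularity conditions on P and Q of
    the kind such surveys discuss.
    """
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    # Distance from each x_i to its k-th nearest neighbor among the
    # other x's (query k+1 neighbors; column 0 is the point itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]
    # Distance from each x_i to its k-th nearest neighbor among the y's.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, k - 1]
    # Guard against zero distances (duplicate points) in the log.
    eps = np.finfo(float).tiny
    return (d * np.mean(np.log(np.maximum(nu, eps) / np.maximum(rho, eps)))
            + np.log(m / (n - 1)))

if __name__ == "__main__":
    # Sanity check: D(N(0,1) || N(1,1)) = 0.5 nats in closed form.
    rng = np.random.default_rng(0)
    p = rng.normal(0.0, 1.0, size=(5000, 1))
    q = rng.normal(1.0, 1.0, size=(5000, 1))
    print(knn_divergence(p, q, k=4))  # should be close to 0.5
```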


Product Details

General

Imprint

Now Publishers Inc

Country of origin

United States

Series

Foundations and Trends® in Communications and Information Theory

Release date

May 2009

Availability

Expected to ship within 10 - 15 working days

First published

May 2009

Authors

Qing Wang, Sanjeev R. Kulkarni, Sergio Verdú

Dimensions

234 x 156 x 5mm (L x W x T)

Format

Paperback

Pages

104

ISBN-13

978-1-60198-230-8

Barcode

9781601982308

LSN

1-60198-230-5


