Description

This book is about the definition of the Shannon measure of information (SMI) and some derived quantities, such as conditional information and mutual information. Unlike many books, which refer to the SMI as 'entropy,' this book makes a clear distinction between the SMI and entropy. In the last chapter, entropy is derived as a special case of the SMI. Ample examples are provided to help the reader understand the different concepts discussed in this book. As with previous books by the author, this book aims at a clear and mystery-free presentation of the central concept of information theory: Shannon's measure of information. The book presents the fundamental concepts of information theory in simple, friendly language, free of the fancy and pompous statements made by authors of popular science books on this subject. It is unique in its presentation of Shannon's measure of information and in the clear distinction it draws between this concept and the thermodynamic entropy. Although some mathematical knowledge is required of the reader, the emphasis is on the concepts and their meaning rather than on the mathematical details of the theory.
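As a quick illustration of the book's central quantity (this sketch is not taken from the book itself): Shannon's measure of information for a discrete probability distribution is H(p) = -Σᵢ pᵢ log₂ pᵢ, measured in bits. A minimal Python version:

```python
import math

def smi(probs):
    """Shannon's measure of information (SMI) in bits:
    H(p) = -sum(p_i * log2(p_i)), with the convention 0*log(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit; a certain outcome carries 0 bits.
print(smi([0.5, 0.5]))  # 1.0
print(smi([1.0]))       # 0.0
```

A uniform distribution over 4 outcomes gives 2 bits, the maximum for that alphabet size; any skew lowers the SMI.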

Information Theory - Part I: An Introduction to the Fundamental Concepts

Product form

£28.00

Includes FREE delivery
Usually despatched within 3 days
Paperback / softback by Arieh Ben-Naim

1 in stock


    Publisher: World Scientific Publishing Co Pte Ltd
    Publication Date: 02/08/2017
    ISBN13: 9789813208834, 978-9813208834
    ISBN10: 981320883X

    Number of Pages: 368

    Non Fiction , Dictionaries, Reference & Language



© 2024 Book Curl
