Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permeated the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future. To give a solid introduction to this burgeoning field, J. R. Pierce has revised his well-received 1961 study of information theory for an up-to-date second edition.

Beginning with the origins of the field, Dr. Pierce follows the brilliant formulations of Claude Shannon and describes such aspects of the subject as encoding and binary digits, entropy, language and meaning, efficient encoding, and the noisy channel. He then goes beyond the strict confines of the topic to explore the ways in which information theory relates to physics, cybernetics, psychology, and art. Mathematical formulas are introduced at the appropriate points for the benefit of serious students. A glossary of terms and an appendix on mathematical notation are provided to help the less mathematically sophisticated.

J. R. Pierce worked for many years at the Bell Telephone Laboratories, where he became Director of Research in Communications Principles. He is currently affiliated with the engineering department of the California Institute of Technology. While his background is impeccable, Dr. Pierce also possesses an engaging writing style that makes his book all the more welcome. An Introduction to Information Theory continues to be the most impressive non-technical account available and a fascinating introduction to the subject for laymen. "An uncommonly good study. . . . Pierce's volume presents the most satisfying discussion to be found." – Scientific American.
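As a small taste of the entropy concept these books introduce, here is a minimal sketch (the function name and sample strings are our own illustration, not drawn from any of the texts) that computes the Shannon entropy of a message's empirical symbol distribution:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy, in bits per symbol, of the empirical
    distribution of symbols in `message`."""
    counts = Counter(message)
    total = len(message)
    # H = -sum over symbols of p * log2(p)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Four equally likely symbols carry 2 bits per symbol;
# a single repeated symbol carries none.
print(shannon_entropy("abcd"))  # prints 2.0
print(shannon_entropy("aaaa"))  # prints 0.0 (log2(1) = 0)
```

The value is the average number of binary digits per symbol that an ideal code would need for a source with that symbol distribution, which is the link between "entropy" and "efficient encoding" in the blurb above.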
Written in an informal style, with a comprehensive glossary and tutorial appendices, this text is an ideal primer for novices who wish to learn the essential principles and applications of information theory.
An effective blend of carefully explained theory and practical applications, this text imparts the fundamentals of both information theory and data compression.
First comprehensive introduction to information theory explores the work of Shannon, McMillan, Feinstein, and Khinchin. Topics include the entropy concept in probability theory, fundamental theorems, and other subjects. 1957 edition.
The Second Edition features: chapters reorganized to improve teaching; 200 new problems; new material on source coding, portfolio theory, and feedback capacity; and updated references. Now current and enhanced, the Second Edition of Elements of ...
This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science.
Introduction to Information Theory
Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems, plus references, a glossary, and an appendix. 1968 second, revised edition.