Information Theory And Coding
Introduction:
What is “Information Theory”?
“Information Theory” answers two fundamental questions in communication theory:
• What is the ultimate data compression? (Answer: the entropy of the data, H, is its compression limit.)
• What is the ultimate transmission rate of communication? (Answer: the channel capacity, C, is its rate limit.)
It provides the basic theoretical foundations of communication theory.
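To make the first answer concrete, the entropy H = −Σ pᵢ log₂ pᵢ of a source's symbol distribution gives the average number of bits per symbol below which lossless compression is impossible. A minimal illustrative sketch (not part of the lecture):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per symbol; a biased source carries less,
# so it can be compressed below 1 bit per symbol on average.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # about 0.469
```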
Information theory and coding covers three areas:
• Source coding: reduce the number of binary digits required to convey a given amount of information (data compression).
• Channel coding: add redundant bits to the message to protect it against transmission errors (error correction and error detection).
• Cryptography: security/privacy.
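As a toy illustration of channel coding (an assumed example, not from the lecture), the 3-repetition code sends each bit three times; a majority vote at the receiver then corrects any single bit error per triple, at the cost of tripling the number of transmitted bits:

```python
def encode(bits):
    """3-repetition channel code: transmit each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote over each triple corrects one flipped bit per triple."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

msg = [1, 0, 1]
tx = encode(msg)            # [1, 1, 1, 0, 0, 0, 1, 1, 1]
tx[1] ^= 1                  # the channel flips one bit
assert decode(tx) == msg    # the error is corrected
```

This is the simplest possible error-correcting code; practical codes achieve the same protection with far less redundancy.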
• Examples of information sources:
Speech, image, video, text file, music
• Examples of channels:
Airwaves (EM radiation), cable, telephone line, hard disk, CD, DVD, flash memory device, optical path, Internet
• Examples of information receivers:
TV screen, audio system and listener, computer file, image printer and viewer
Difference between Information Theory, Communications Theory, and Signal Processing:
Information theory is a field of science first developed by Claude Shannon to determine the limits of information transfer. From information theory we learn the theoretical capacity of a channel and the envelope of performance we can achieve. It drives the development of codes and efficient communications but says nothing about how this may be done. The important sub-fields of information theory are source coding and channel coding.

Communications theory is about how to make the information transfer happen from A to B within the constraints of information theory. It is concerned with the choice of media and carriers, the maximum number of bits that can be transferred in a given bandwidth, the mapping of information to carriers, channel-degradation mitigation, and link performance. It uses transform theory as its mathematical basis: to fully understand communications, one needs to know the Fourier, Hilbert, Laplace, and Z transforms as well as convolution and filtering.

Signal processing is a largely mathematical science of mapping one domain to another: analog to digital, frequency to time. To design digital hardware one needs to understand in detail how operations such as A/D and D/A conversion and digital arithmetic are done. Signal processing provides the tools that implement communications designs.

Source coding:
The source encoder converts the symbols from the source into groups of bits. It should use the fewest possible bits, in order to save processing time, utilize the bandwidth efficiently, and reduce energy consumption. Most sources of information contain redundant information that does not need to be actually transmitted (or stored).
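As a toy illustration of removing redundancy (an assumed example, not part of the lecture), run-length encoding is one of the simplest source coders: it collapses runs of a repeated symbol into a (symbol, count) pair, so a highly redundant source is described with fewer symbols:

```python
from itertools import groupby

def rle_encode(s):
    """Run-length encoding: collapse each run of a repeated symbol
    into a (symbol, run_length) pair."""
    return [(ch, len(list(run))) for ch, run in groupby(s)]

def rle_decode(pairs):
    """Expand (symbol, run_length) pairs back to the original string."""
    return "".join(ch * n for ch, n in pairs)

data = "aaaabbbcca"
packed = rle_encode(data)        # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
assert rle_decode(packed) == data
```

Real compressors (Huffman coding, arithmetic coding, LZ variants) exploit statistical redundancy far more thoroughly, but the principle is the same: do not transmit what the receiver can reconstruct.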