Processamento de Áudio e Vídeo
Leonardo Araújo
05 March 2012
1 Compression
Compression
I have made this letter longer than usual because I lack the time to make it shorter. (Blaise Pascal)
Compression
The concept of data compression comes naturally to people who are interested in communications. Data compression is the process of converting an input data stream (the source stream, or the original raw data) into another data stream (the output, the bitstream, or the compressed stream) that has a smaller size. Its two main applications are storage and transmission.
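As a minimal illustration of this input-stream-to-smaller-stream idea, run-length encoding replaces runs of repeated symbols with (symbol, count) pairs. The sketch below is only illustrative (the function names are my own, not from these notes):

```python
def rle_encode(data: str) -> list[tuple[str, int]]:
    """Collapse runs of repeated symbols into (symbol, count) pairs."""
    encoded = []
    for symbol in data:
        if encoded and encoded[-1][0] == symbol:
            encoded[-1] = (symbol, encoded[-1][1] + 1)
        else:
            encoded.append((symbol, 1))
    return encoded

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Expand (symbol, count) pairs back into the original stream."""
    return "".join(symbol * count for symbol, count in pairs)

source = "AAAABBBCCD"
compressed = rle_encode(source)   # [('A', 4), ('B', 3), ('C', 2), ('D', 1)]
assert rle_decode(compressed) == source
```

Note that this only shrinks data with long runs; on data without runs it can expand the stream, which previews the redundancy discussion below.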
Source Coding
The field of data compression is often called source coding.
[Diagram: source → coding → decoding → destination]
Source
The source can be memoryless (each symbol is independent of its predecessors) or with memory (each symbol depends on some of its predecessors and, perhaps, also on its successors, so the symbols are correlated).
Redundancy
Basic principle: compress data by removing redundancy.
Figure 2: Redundancy.
Contextual Redundancy
Contextual redundancy is illustrated by the fact that the letter Q is almost always followed by the letter U (i.e., certain digrams and trigrams are more common in plain English than others).
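This Q-followed-by-U observation is easy to check empirically by tallying adjacent letter pairs. A small sketch (the sample sentence is invented for illustration):

```python
from collections import Counter

def digram_counts(text: str) -> Counter:
    """Count adjacent letter pairs (digrams), ignoring non-letters."""
    letters = [c for c in text.upper() if c.isalpha()]
    return Counter(zip(letters, letters[1:]))

sample = "The quick question required a quote."
counts = digram_counts(sample)
# Every Q in the sample is followed by U:
print(counts[("Q", "U")])  # → 4
```

A model that exploits such digram statistics (a source with memory) can compress English text better than one that treats letters as independent.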
Structure
The general law of data compression: assign short codes to common events (symbols or phrases) and long codes to rare events.
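Huffman coding is the classic realization of this law. The sketch below is a simplified construction (not necessarily the one used in these notes): it repeatedly merges the two least frequent entries, so frequent symbols end up near the root with short codewords.

```python
import heapq
from collections import Counter

def huffman_code(text: str) -> dict[str, str]:
    """Build a prefix code: frequent symbols get short codewords."""
    freq = Counter(text)
    # Heap entries: (frequency, tie-breaker, {symbol: codeword-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Merge the two rarest subtrees, prefixing their codewords.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

codes = huffman_code("AAAABBC")
# 'A' (the most common symbol) gets the shortest codeword.
assert all(len(codes["A"]) <= len(codes[s]) for s in codes)
```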
Compression is therefore possible only because data is normally represented in the computer in a format that is longer than absolutely necessary. The reason that inefficient (long) data representations are used all the time is that they make it easier to process the data, and data processing is more common and more important than data compression.
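How much longer than necessary? Shannon entropy gives the shortest average codeword length (in bits per symbol) that any symbol-by-symbol coder can achieve, which can be compared against a fixed 8-bit character code. A rough sketch, assuming the standard entropy formula:

```python
import math
from collections import Counter

def entropy_bits_per_symbol(text: str) -> float:
    """Shannon entropy: minimum average bits per symbol for this source."""
    freq = Counter(text)
    n = len(text)
    return -sum((f / n) * math.log2(f / n) for f in freq.values())

text = "ABRACADABRA"
h = entropy_bits_per_symbol(text)
print(f"fixed coding: 8.00 bits/symbol, entropy bound: {h:.2f} bits/symbol")
```

The gap between the two numbers is exactly the redundancy a compressor can hope to remove.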
Dilemmas
Modifying an algorithm to improve compression by 1% may increase the run time by 10% and the complexity of the program by more than that.
Conjectures
Data compression may be interpreted as a process of removing unnecessary complexity (redundancy) in information, and thereby maximizing simplicity while preserving as much as possible of its nonredundant descriptive power.
A conjecture is a proposition that is unproven but is thought to be true and has not been disproven.
Conjectures
All kinds of computing and formal reasoning may usefully be understood as information compression by pattern matching, unification, and search. The process of finding redundancy and removing it may always be understood at a fundamental level as a process of searching for patterns that match each other, and merging or unifying repeated instances of any pattern to make one.