Definition of Terms

A thru H

Block Code
A code that converts a fixed-length block of K data bits into a fixed-length code word of N bits, where N > K. The rate of the code is K/N.

Code Rate
The number of information symbols per code word divided by the total number of symbols per code word.

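As a worked illustration of the two definitions above, the short sketch below computes the rate of a hypothetical (N, K) block code; the (255, 223) values are placeholders chosen only for the example, not parameters taken from this document.

    # Rate of a hypothetical (N, K) block code: K information symbols out of
    # N total symbols per code word.  The values are illustrative placeholders.
    N = 255   # total symbols per code word
    K = 223   # information symbols per code word
    rate = K / N
    print(f"Code rate = {K}/{N} = {rate:.3f}")   # prints ~0.875
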
Code Word
A block of N symbols.

Coding Gain
The reduction in required transmit power, for a given bit error rate, obtained by applying Reed-Solomon coding before transmission.

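Coding gain is usually quoted in dB as the difference between the Eb/N0 needed without coding and the Eb/N0 needed with coding to reach the same bit error rate. The sketch below only shows that arithmetic; the Eb/N0 figures are placeholders, not measured values.

    # Coding gain in dB at a fixed target bit error rate.
    # The Eb/N0 values below are illustrative placeholders only.
    ebno_uncoded_db = 10.5   # Eb/N0 needed without coding (dB)
    ebno_coded_db = 6.0      # Eb/N0 needed with coding (dB)
    coding_gain_db = ebno_uncoded_db - ebno_coded_db
    print(f"Coding gain = {coding_gain_db:.1f} dB")
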
Decoder
The block responsible for correcting transmission errors and stripping the extra bits that the Encoder appended to the digital data.

Encoder
The block responsible for appending extra (redundant) bits to the digital data before transmission.

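The toy sketch below is not Reed-Solomon; it only illustrates the append/strip relationship between the two blocks, using a single XOR parity byte. All names and parameters are illustrative assumptions.

    # Toy illustration of the Encoder/Decoder relationship (NOT Reed-Solomon):
    # the encoder appends one XOR parity byte, the decoder checks and strips it.
    def encode(data: bytes) -> bytes:
        parity = 0
        for b in data:
            parity ^= b
        return data + bytes([parity])           # appended redundancy

    def decode(codeword: bytes) -> bytes:
        data, parity = codeword[:-1], codeword[-1]
        check = 0
        for b in data:
            check ^= b
        if check != parity:
            raise ValueError("error detected")  # a real decoder would correct it
        return data                             # redundancy stripped

    assert decode(encode(b"hello")) == b"hello"
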
Error Polynomial
A polynomial computed during decoding whose non-zero coefficients give the locations and values of the errors in the received code word.

Euclid
A Greek mathematician (ca. 300 BC) whose work served as the basis for modern geometry.

Evariste Galois
A French mathematician famous for his contributions to group theory; he produced a method for determining when a general polynomial equation can be solved by radicals.

FEC
Forward Error Correction, a methodology that applies error-correction coding to a transmission so that errors can be corrected at the receiver. This is the opposite of ARQ (automatic repeat request), which relies on retransmission of data.

Galois Theory
A branch of mathematics dealing with the solvability of general polynomial equations.

Hamming Codes
The first class of linear binary codes used for error correction in long-distance telephony.

I thru Z

Noise
An ever-present, unwanted background signal that must be compensated for or removed.

Non-encoded PSK
Data modulated by PSK (Phase Shift Keying) that is not modified by the Reed-Solomon encoder before transmission.

Shannon Limit
Refers to Shannon's noisy-channel coding theorem, which states that every channel has an associated "channel capacity" such that error-control codes exist that allow transmission across the channel at any rate below the channel capacity with an arbitrarily small bit error rate.

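For the band-limited AWGN channel, for example, the capacity is C = B * log2(1 + S/N). The sketch below evaluates this for an illustrative bandwidth and signal-to-noise ratio; the numbers are placeholders, not figures from this document.

    import math

    # Shannon capacity of a band-limited AWGN channel: C = B * log2(1 + S/N).
    # Bandwidth and SNR values are illustrative placeholders only.
    bandwidth_hz = 1.0e6              # 1 MHz channel
    snr_db = 20.0                     # signal-to-noise ratio in dB
    snr_linear = 10 ** (snr_db / 10)
    capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)
    print(f"Channel capacity = {capacity_bps / 1e6:.2f} Mbit/s")  # about 6.66 Mbit/s
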
Symbol Width
The number of bits per symbol.

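For Reed-Solomon codes specifically, a symbol width of m bits fixes the maximum code word length at N = 2^m - 1 symbols. The sketch below works this out for m = 8, a common but purely illustrative choice.

    # For a Reed-Solomon code over GF(2**m), the maximum code word length
    # is N = 2**m - 1 symbols.  m = 8 is an illustrative example only.
    m = 8                      # symbol width in bits
    n_max = 2 ** m - 1         # maximum symbols per code word
    print(f"Symbol width {m} bits -> up to {n_max} symbols per code word")  # 255
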