Explain the difference between single-bit errors and burst errors in error control in communications systems. (3 marks)

(a) If a noise event causes a burst error to occur that lasts for 0.1 ms (millisecond) and data is being transmitted at 100Mbps, how many data bits will be affected? (3 marks)

(b) Under what circumstances is the use of parity bits an appropriate error control technique?

Answer:

a) Single-bit error: only one bit in a symbol is affected.

   Burst error: several consecutive bits in a symbol are affected.

               b)  10000 bits.

c) When the most likely type of error is a single-bit error.

Explanation: a) This is explained in the answer above.

b) If data is being transmitted at 100 Mbps, then 100 × 10⁶ = 10⁸ bits are sent each second, so one bit on the line lasts 10⁻⁸ s (10 ns).

A noise event that lasts 0.1 ms corresponds to 10⁻⁴ s.

Number of bits in error = 10⁻⁴ s ÷ 10⁻⁸ s per bit = 10⁴ = 10,000 bits (a small sketch of this calculation follows below).
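A minimal Python sketch of the same calculation (the function name and variable names are illustrative, not part of the original answer):

```python
def bits_affected(burst_duration_s: float, bit_rate_bps: float) -> float:
    """Number of bits spanned by a noise burst of the given duration."""
    bit_time_s = 1.0 / bit_rate_bps       # duration of one bit on the line
    return burst_duration_s / bit_time_s  # bits covered by the burst

# 0.1 ms burst at 100 Mbps -> 10,000 bits
print(bits_affected(burst_duration_s=0.1e-3, bit_rate_bps=100e6))  # 10000.0
```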

c) Parity is appropriate when errors affect only a single bit in a symbol. With even parity, a correct codeword contains an even number of 1s; a single-bit error changes that count to an odd number, so the error is easily detected. A burst that flips an even number of bits leaves the parity unchanged, which is why parity bits are only suitable when single-bit errors are the dominant error type.
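As an illustration (a hypothetical sketch, not part of the original answer), even parity catches a single flipped bit but misses a two-bit burst:

```python
def even_parity_ok(bits: list[int]) -> bool:
    """True if the word contains an even number of 1s (even parity holds)."""
    return sum(bits) % 2 == 0

codeword     = [1, 0, 1, 1, 0, 0, 1, 0]  # four 1s -> even parity
single_error = [0, 0, 1, 1, 0, 0, 1, 0]  # one bit flipped
double_error = [0, 1, 1, 1, 0, 0, 1, 0]  # two adjacent bits flipped (tiny burst)

print(even_parity_ok(codeword))      # True  - accepted
print(even_parity_ok(single_error))  # False - error detected
print(even_parity_ok(double_error))  # True  - burst goes undetected
```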