Answer: a) Single-bit errors: only one bit in a symbol is affected.
Burst errors: several consecutive bits in a symbol are affected.
b) 10000 bits.
c) When the most common type of error is a single-bit error.
Explanation: a) See the answer above.
b) If data is transmitted at 100 Mbps, then 100 × 10^6 = 10^8 bits are sent per second, so a single bit lasts 10^-8 s.
The noise event that causes the burst error lasts 0.1 ms, which is 10^-4 s.
Number of bits in error = 10^-4 / 10^-8 = 10^4 = 10000 bits.
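The calculation in b) can be sketched as follows (variable names are my own, chosen for illustration):

```python
# Burst-length calculation for part b): a 100 Mbps link
# hit by a 0.1 ms noise event.
data_rate_bps = 100e6              # 100 Mbps = 10^8 bits per second
bit_time_s = 1 / data_rate_bps     # each bit lasts 10^-8 s
noise_duration_s = 0.1e-3          # 0.1 ms = 10^-4 s

# Number of bit slots covered by the noise event
bits_in_error = noise_duration_s / bit_time_s
print(int(bits_in_error))          # 10000
```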
c) With even parity, the number of 1s in each symbol (including the parity bit) is made even. A single-bit error changes this count to an odd number, so the error is easily detected. By contrast, an even number of errors in the same symbol (as in a burst) restores even parity and goes undetected, which is why parity checking is best suited to single-bit errors.
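A minimal sketch of the even-parity argument in c); the helper names here are my own, not part of the original answer:

```python
def even_parity_bit(bits):
    """Parity bit that makes the total count of 1s even."""
    return sum(bits) % 2

def parity_ok(codeword):
    """A codeword is valid when its total number of 1s is even."""
    return sum(codeword) % 2 == 0

data = [1, 0, 1, 1, 0, 1, 0]               # four 1s
codeword = data + [even_parity_bit(data)]  # parity bit is 0
assert parity_ok(codeword)

# A single-bit error flips the parity count to odd and is detected:
corrupted = codeword.copy()
corrupted[2] ^= 1
print(parity_ok(corrupted))   # False -> error detected

# A second error in the same symbol restores even parity: undetected.
corrupted[3] ^= 1
print(parity_ok(corrupted))   # True -> burst error missed
```

This is exactly why parity works well when the dominant error type is a single flipped bit, but fails on bursts that flip an even number of bits.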