Effect of Dispersion on Spectrum-Sliced WDM Systems
Committee Chairman: Dr. Ira Jacobs
The purpose of this thesis is to investigate the effect of dispersion on a spectrum-sliced WDM (SS-WDM) system, specifically a system employing a single-mode optical fiber. The system performance is expressed in terms of the receiver sensitivity, defined as the average number of photons per bit Np required for a given probability of bit error Pe. The receiver sensitivity is expressed in terms of two normalized parameters: the ratio of the optical bandwidth per channel to the bit rate, m = B0/Rb = B0T, and the transmission distance normalized by the dispersion distance, z/LD. The former represents the effect of the excess beat noise caused by the signal fluctuation; the latter represents the effect of dispersion. The excess beat noise can be reduced by increasing the value of m (increasing the optical bandwidth B0 for a given bit rate Rb). However, a large m implies severe degradation due to dispersion in a system employing a single-mode fiber. Therefore, the trade-off between the two effects should yield an optimum m. The theoretical results obtained from our analysis confirm this prediction. It is also shown that the optimum m (mopt) decreases with increasing normalized distance. This suggests that dispersion strongly affects the system performance: an increase in the excess beat noise is accepted in exchange for a reduction in the dispersion effect. Additionally, the maximum transmission distance is relatively short compared to that of a laser-based system. This suggests that SS-WDM systems with single-mode fibers are suitable for short-haul applications, such as high-speed local-access networks, where the operating bit rate is high but the transmission distance is relatively short.
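The trade-off described above can be illustrated with a toy numerical model. The sketch below is not the thesis's actual analysis: it assumes a standard power-penalty form for excess beat noise (relative intensity variance ~ 1/m, giving a penalty 1/(1 - Q^2/m) for m > Q^2) and a simple Gaussian-style broadening penalty that grows with the product of m and the normalized distance d = z/LD. Both model forms, and all numerical values, are illustrative assumptions chosen only to reproduce the qualitative behavior: an interior optimum m appears once d > 0, and m_opt decreases as d increases.

```python
import numpy as np

Q = 6.0  # Q-factor corresponding to Pe ~ 1e-9 (assumed target)

def beat_noise_penalty(m):
    # Excess (signal-signal) beat noise with m = B0*T degrees of freedom:
    # assumed power-penalty form 1/(1 - Q^2/m), valid only for m > Q^2.
    return np.where(m > Q**2, 1.0 / (1.0 - Q**2 / m), np.inf)

def dispersion_penalty(m, d):
    # Broadband-source broadening grows with both the slice bandwidth
    # (through m) and the normalized distance d = z/LD.  This Gaussian-like
    # form sqrt(1 + (m*d)^2) is an illustrative assumption, not the
    # thesis's derived expression.
    return np.sqrt(1.0 + (m * d)**2)

def photons_per_bit(m, d, Np0=Q**2 / 2):
    # Required average photons per bit: a baseline sensitivity Np0
    # multiplied by the two (assumed) penalty factors.
    return Np0 * beat_noise_penalty(m) * dispersion_penalty(m, d)

m = np.linspace(40.0, 400.0, 2000)
for d in (0.0, 0.01, 0.02):
    Np = photons_per_bit(m, d)
    m_opt = m[np.argmin(Np)]
    print(f"z/LD = {d:.2f}: m_opt ~ {m_opt:.0f}, Np_min ~ {Np.min():.1f}")
```

Running the sketch shows that for d = 0 the sensitivity improves monotonically with m (no dispersion, so a wider slice only reduces beat noise), while for d > 0 an interior optimum appears and shifts toward smaller m as d grows, consistent with the qualitative conclusion of the abstract.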