This thesis investigates the effect of fiber dispersion and Multiple Access Interference (MAI) on the Bit Error Rate (BER) performance of a Direct Sequence Optical Code Division Multiple Access (DS-OCDMA) network employing intensity modulation and optical correlator receivers. Using Matlab simulations, the Signal-to-Noise Ratio (SNR) versus Received Optical Power (ROP) of an OCDMA transmission system is evaluated with a 7-chip m-sequence code for different numbers of simultaneous users. BER versus ROP is likewise evaluated for various lengths of single-mode optical fiber, taking the dispersion effect in the fiber into account. Further simulations illustrate the reduction of the dispersion index gamma and visualize related scenarios, e.g., how much transmitted power is required to maintain a BER of 10⁻⁹ as the fiber length increases.
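The two building blocks named in the abstract — a 7-chip m-sequence spreading code and a BER-versus-SNR evaluation — can be sketched as follows. This is a minimal illustrative sketch, not the thesis's Matlab model: the LFSR tap positions and seed are assumptions chosen to produce a standard maximal-length sequence, and the BER expression is the usual Gaussian approximation Q(√SNR) for on-off keying.

```python
import math

def m_sequence(taps=(3, 2), length=7, seed=(1, 0, 0)):
    """Generate a 7-chip m-sequence from a 3-stage LFSR.

    Taps (3, 2) correspond to the primitive polynomial x^3 + x^2 + 1;
    the tap positions and seed here are illustrative assumptions,
    not parameters taken from the thesis.
    """
    state = list(seed)
    seq = []
    for _ in range(length):
        seq.append(state[-1])           # output the last register stage
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]  # feedback XOR
        state = [fb] + state[:-1]       # shift the register
    return seq

def ber_from_snr(snr_linear):
    """Gaussian-approximation BER for intensity-modulated OOK: Q(sqrt(SNR))."""
    return 0.5 * math.erfc(math.sqrt(snr_linear) / math.sqrt(2))
```

A 7-chip m-sequence has exactly four ones and three zeros (the balance property of maximal-length sequences), and an SNR of about 36 (≈ 15.6 dB, i.e. Q(6)) yields the BER ≈ 10⁻⁹ target discussed in the abstract.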
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:bth-5479 |
Date | January 2009 |
Creators | Gafur, Abdul |
Publisher | Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation |
Source Sets | DiVA Archive at Upsalla University |
Language | English |
Detected Language | English |
Type | Student thesis, info:eu-repo/semantics/bachelorThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |