Simulating rare events in optical transmission systems

William Kath
Northwestern University
Applied Mathematics

Next-generation optical communication systems are being designed to carry information at astounding transmission speeds. For example, it
is anticipated that systems will have several tens of channels, each with a bit rate of 20 to 40 gigabits per second, for an aggregate
capacity of several terabits per second. Because transmission errors are handled at slower electronic speeds, however, systems (even
current ones) must have extremely small error rates, typically at most one error per 10^(12) bits. Accurately modelling system
performance when it is determined by such extremely rare events presents a severe mathematical and computational challenge.

In this talk, recent work aimed at overcoming this difficulty will be described: in particular, the application of importance sampling (a
member of the general family of variance reduction techniques) to the numerical simulation of transmission impairments induced by amplified
spontaneous emission noise in soliton-based optical transmission systems. The method concentrates the numerical simulations on the noise
realizations that are most likely to result in transmission errors, leading to speedups of several orders of magnitude over standard
Monte Carlo methods. The technique is demonstrated by using it to calculate the probability density functions of the amplitude and
timing fluctuations.
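The core idea of importance sampling can be illustrated on a toy problem that is far simpler than the soliton system discussed in the talk: estimating the small tail probability P(X > a) for a standard Gaussian random variable. The sketch below (a hypothetical illustration, not the talk's actual simulation code) draws samples from a Gaussian shifted toward the rare region and reweights each sample by the likelihood ratio p(x)/q(x) = exp(-a x + a^2/2), so that the estimator remains unbiased while almost every sample lands in the region of interest.

```python
import math
import random

def standard_mc(a, n, rng):
    """Standard Monte Carlo estimate of P(X > a), X ~ N(0, 1).
    For a = 4 the target probability is ~3e-5, so most of the
    n samples are wasted on the uninteresting bulk of the density."""
    hits = sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > a)
    return hits / n

def importance_sampling_mc(a, n, rng):
    """Importance-sampled estimate: draw from the biased density
    q = N(a, 1), centered on the rare region, and reweight each
    sample by the likelihood ratio p(x)/q(x) = exp(-a*x + a*a/2)."""
    total = 0.0
    for _ in range(n):
        x = rng.gauss(a, 1.0)  # biased draw, concentrated near the threshold
        if x > a:
            total += math.exp(-a * x + 0.5 * a * a)
    return total / n

if __name__ == "__main__":
    rng = random.Random(0)
    a, n = 4.0, 200_000
    exact = 0.5 * math.erfc(a / math.sqrt(2.0))  # P(X > 4) ~ 3.17e-5
    print("exact              :", exact)
    print("standard MC        :", standard_mc(a, n, rng))
    print("importance sampling:", importance_sampling_mc(a, n, rng))
```

With these parameters the standard estimator typically sees only a handful of threshold crossings (often zero), while the importance-sampled estimator uses essentially every sample and attains a relative error well under one percent: the same variance-reduction effect that, in the full soliton simulations, yields the orders-of-magnitude speedups mentioned above. In the actual transmission problem the biasing is applied to the amplified-spontaneous-emission noise realizations rather than to a single scalar variable.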


Back to Emerging Applications of the Nonlinear Schrödinger Equations