Abstract
We derive a model that optimizes the performance of a laser satellite communication link with an optical pre-amplifier in the presence of random jitter in the transmitter-receiver line of sight. The system uses a transceiver containing a single telescope with a circulator; the telescope serves for both transmitting and receiving, which reduces the dimensions and weight of the communication terminal. The optimization model was derived under the assumption that the dominant noise source is amplifier spontaneous-emission noise. It is shown that, given the required bit-error rate (BER) and the rms random pointing jitter, an optimal transceiver gain exists that minimizes the transmitted power. We investigate the effect of the amplifier spontaneous-emission noise on the optimal transmitted power and gain by performing the optimization procedure for various combinations of amplifier gain and noise figure. We demonstrate that the amplifier noise figure determines the optimal transmitted power needed to achieve the desired BER but does not affect the optimal transceiver telescope gain. Our numerical example shows that, for a BER of 10⁻⁹, doubling the amplifier noise figure results in an 80% increase in the minimal transmitted power for an rms pointing jitter of 0.44 μrad.
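The trade-off behind the result can be illustrated numerically: raising the transmitter telescope gain narrows the beam, which increases on-axis gain but also increases the loss caused by pointing jitter, so the required transmitted power has a minimum at some intermediate gain. The sketch below is not the paper's model; it is a minimal illustration assuming a Gaussian far-field beam approximation, a fixed receiver gain, and placeholder link parameters (wavelength, distance, receiver sensitivity, jitter margin) chosen only for demonstration. Only the 0.44 μrad rms jitter value is taken from the abstract.

```python
# Illustrative sketch only (not the paper's actual optimization model).
# Shows why an optimal transmitter telescope gain exists under pointing jitter.
import numpy as np

WAVELENGTH = 1.55e-6      # m, assumed optical wavelength
LINK_DISTANCE = 40_000e3  # m, assumed GEO-class link distance
P_RX_REQUIRED = 1e-9      # W, assumed receiver sensitivity for the target BER
SIGMA_JITTER = 0.44e-6    # rad, rms pointing jitter (value quoted in the abstract)
JITTER_MARGIN = 3.0       # size the link for a 3-sigma pointing excursion (assumption)
RX_GAIN = 1e11            # assumed fixed receiver telescope gain

def required_tx_power(tx_gain):
    """Transmit power needed to close the link at a worst-case pointing error.

    Uses the common Gaussian far-field approximation
        G(theta) ~ G_t * exp(-G_t * theta**2),
    so a higher gain narrows the beam and amplifies the pointing loss.
    """
    theta_worst = JITTER_MARGIN * SIGMA_JITTER
    free_space_loss = (WAVELENGTH / (4.0 * np.pi * LINK_DISTANCE)) ** 2
    pointing_loss = np.exp(-tx_gain * theta_worst ** 2)
    link_gain = tx_gain * RX_GAIN * free_space_loss * pointing_loss
    return P_RX_REQUIRED / link_gain

# Sweep candidate transmitter gains and locate the minimum-power point.
gains = np.logspace(9, 13, 2000)
powers = required_tx_power(gains)
i_opt = np.argmin(powers)

print(f"optimal tx gain ~ {gains[i_opt]:.3e} "
      f"(analytic optimum 1/theta^2 = {1 / (JITTER_MARGIN * SIGMA_JITTER) ** 2:.3e})")
print(f"minimum required tx power ~ {powers[i_opt]:.3e} W")
```

In this simplified model the required power scales as exp(G θ²)/G, which is minimized at G = 1/θ²; the amplifier noise figure would enter only through the receiver sensitivity, which rescales the power curve without moving the location of the optimum, consistent with the abstract's conclusion.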
| Original language | English |
| --- | --- |
| Pages (from-to) | 1307-1315 |
| Number of pages | 9 |
| Journal | Journal of the Optical Society of America A: Optics, Image Science, and Vision |
| Volume | 21 |
| Issue number | 7 |
| DOIs | |
| State | Published - 1 Jan 2004 |
ASJC Scopus subject areas
- Electronic, Optical and Magnetic Materials
- Atomic and Molecular Physics, and Optics
- Computer Vision and Pattern Recognition