Guideline of choosing optical delay time to optimize the performance of an interferometry-based in-band OSNR monitor

Opt Lett. 2016 Sep 15;41(18):4178-81. doi: 10.1364/OL.41.004178.

Abstract

We propose and experimentally demonstrate a guideline for choosing the optical delay time in an interferometry-based optical signal-to-noise ratio (OSNR) monitor, which achieves optimal monitoring performance by calculating the normalized autocorrelation function of the channel noise. The delay time is set according to the position of the first zero of the calculated autocorrelation function; with this choice, an OSNR monitoring range of up to 29 dB (with an error ≤ ±0.5 dB) is achieved for a 112 Gb/s PM-QPSK signal with a channel filter bandwidth of 100 GHz. The experimental results also show that the guideline is applicable to channel filters with different bandwidths and shapes. In simulation, the guideline is shown to be valid for OSNR monitoring of a 28 Gbaud PM-16QAM signal and a 50 Gbaud PM-QPSK signal.