"seeing" ENF From Neuromorphic Events: Modeling and Robust Estimation

IEEE Trans Pattern Anal Mach Intell. 2024 Apr 9:PP. doi: 10.1109/TPAMI.2024.3386813. Online ahead of print.

Abstract

Most artificial lights exhibit subtle fluctuations in intensity and frequency induced by the grid's alternating current, which makes it possible to estimate the Electric Network Frequency (ENF) from conventional frame-based videos. However, the performance of Video-based ENF (V-ENF) estimation relies heavily on imaging quality and can therefore suffer significant interference from non-ideal sampling, scene diversity, motion interference, and extreme lighting conditions. In this paper, we show that the ENF can be extracted, free of the above limitations, from a new modality provided by the so-called event camera, a neuromorphic sensor that encodes light intensity variations and asynchronously emits events with extremely high temporal resolution and high dynamic range. Specifically, we formulate and validate the physical mechanism by which the ENF is captured in events, and then propose a simple yet robust Event-based ENF (E-ENF) estimation method based on mode filtering and harmonic enhancement. To validate its effectiveness, we build the first Event-Video ENF Dataset (EV-ENFD) and its extension EV-ENFD+, covering diverse scenarios including static, dynamic, and extreme-lighting scenes. Comprehensive experiments on these datasets show that E-ENF significantly outperforms V-ENF in extracting accurate ENF traces, especially in challenging environments. The code and dataset are available at https://xlx-creater.github.io/Improved_E-ENF/.
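To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of how an ENF trace might be recovered from a stream of event timestamps: asynchronous events are binned into a uniformly sampled rate signal, a band-limited spectral peak around the flicker harmonic (100 Hz for a 50 Hz grid) is tracked per window, and a running median stands in for the paper's mode filtering. The function name, all parameters, and the NumPy-based approach are assumptions made for illustration only.

    import numpy as np

    def estimate_enf_from_events(t, fs=10_000.0, f_nominal=50.0,
                                 harmonic=2, win_s=2.0, band=1.0):
        # Bin asynchronous event timestamps t (seconds, sorted) into a
        # uniformly sampled event-rate signal; flicker at
        # harmonic * f_nominal (100 Hz here) modulates this rate.
        n = int(t.max() * fs) + 1
        rate = np.bincount((t * fs).astype(int), minlength=n).astype(float)
        rate -= rate.mean()

        f0 = harmonic * f_nominal
        win = int(win_s * fs)
        estimates = []
        for start in range(0, n - win + 1, win):
            seg = rate[start:start + win] * np.hanning(win)
            spec = np.abs(np.fft.rfft(seg))
            freqs = np.fft.rfftfreq(win, d=1.0 / fs)
            mask = (freqs >= f0 - band) & (freqs <= f0 + band)
            # Peak within a narrow band around the flicker harmonic,
            # mapped back to the fundamental grid frequency.
            estimates.append(freqs[mask][np.argmax(spec[mask])] / harmonic)

        # Running median over windows: a crude stand-in for mode
        # filtering, suppressing outlier windows caused by motion
        # interference or sparse events.
        est = np.asarray(estimates)
        pad = np.pad(est, 1, mode='edge')
        return np.array([np.median(pad[i:i + 3]) for i in range(est.size)])

The band-limited peak search plays the role of harmonic enhancement here, restricting attention to the flicker harmonic where the ENF signature is strongest; note that mHz-level ENF resolution, as needed in forensic applications, would require longer windows or spectral interpolation beyond this coarse FFT peak search.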