Gravitational waves have opened new avenues in the study of the Universe. The Advanced LIGO and Advanced Virgo interferometers are probing an increasingly large volume of space, expanding the discovery potential for new gravitational-wave sources. Characterizing these detectors is crucial to identifying the main sources of noise and optimizing their sensitivity. In particular, glitches are transient noise events that degrade data quality, and their detection and classification are important for improving the performance of the interferometers. Deep learning techniques are a promising approach to recognizing and classifying glitches and to studying noise in general. We will present an approach based on the time-frequency evolution of glitches and show its capability to investigate these noise sources, thus contributing to low-latency detector characterization.
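The time-frequency representation underlying this approach can be illustrated with a toy sketch. This is not the pipeline described in the abstract: the sample rate, glitch model (a sine-Gaussian transient in white noise), and all parameter values below are illustrative assumptions, and a deep network would take the resulting spectrogram image as its input.

```python
import numpy as np
from scipy.signal import spectrogram

# Toy data: 1 s of white "detector noise" with an injected sine-Gaussian
# transient standing in for a glitch (all parameters are hypothetical).
fs = 4096                        # sample rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
f0, t0, tau = 200.0, 0.5, 0.01   # glitch frequency, centre time, duration
rng = np.random.default_rng(0)
strain = rng.normal(0.0, 1.0, t.size)
strain += 5.0 * np.sin(2 * np.pi * f0 * t) * np.exp(-((t - t0) / tau) ** 2)

# Time-frequency map: power spectrogram over short overlapping windows.
# Images like this are what a classifier would be trained on.
f, tt, Sxx = spectrogram(strain, fs=fs, nperseg=256, noverlap=192)

# The loudest pixel should sit near the injected glitch (t0, f0).
i, j = np.unravel_index(np.argmax(Sxx), Sxx.shape)
print(f"peak at t = {tt[j]:.2f} s, f = {f[i]:.0f} Hz")
```

In a real analysis the spectrogram would be computed from whitened strain data, and the resulting images fed to a convolutional classifier; the sketch only shows how a transient stands out in the time-frequency plane.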