Background reduction in searches for gravitational-wave signals from supernovae: A machine learning approach

Marco Cavaglia
Missouri University of Science and Technology

About 20% of the data collected by Advanced LIGO and Virgo in the next observing runs will be single-interferometer data, i.e., data collected at times when only one detector in the network is operating in observing mode. Searches for gravitational-wave signals from supernova events cannot rely on matched-filtering techniques because state-of-the-art simulations do not yet allow gravitational waveforms to be computed with the precision and accuracy required for a template search. If a galactic supernova occurs during single-interferometer time, separating its unmodelled gravitational-wave signal from noise will be even more difficult because of the lack of coherence between detectors. Reducing the noise background will be crucial to confidently identify the signal and decrease the false alarm rate of the search. We present a method to reduce the noise background of LIGO and Virgo single-interferometer supernova searches, based on the standard LIGO-Virgo coherent WaveBurst (cWB) pipeline and genetic programming, a supervised machine learning algorithm that uses the strategy of natural selection to solve complex problems. We show that it is possible to discriminate galactic gravitational-wave supernova signals from noise transients with high efficiency, thus increasing the supernova detection reach of Advanced LIGO and Virgo.
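To illustrate the idea of genetic programming as a classifier, the sketch below is a minimal, self-contained Python example: it evolves arithmetic expressions over a few trigger features and labels a trigger as a signal when the evolved expression is positive. Everything in it is an illustrative assumption, not the method presented in the talk: the feature names, the mutation-only evolution loop (a full genetic-programming system would typically also use crossover), and the Gaussian toy data stand in for real cWB trigger attributes.

# Minimal genetic-programming sketch: evolve an expression that separates
# simulated supernova triggers (label 1) from noise transients (label 0).
# All names and the toy data are assumptions for illustration only.
import random

random.seed(0)

OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
}
N_FEATURES = 3  # e.g. central frequency, duration, coherent energy (assumed)

def random_tree(depth=3):
    """Grow a random expression tree over the trigger features."""
    if depth == 0 or random.random() < 0.3:
        if random.random() < 0.7:
            return ("x", random.randrange(N_FEATURES))  # feature leaf
        return ("c", random.uniform(-1.0, 1.0))         # constant leaf
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Evaluate an expression tree on one feature vector x."""
    kind = tree[0]
    if kind == "x":
        return x[tree[1]]
    if kind == "c":
        return tree[1]
    return OPS[kind](evaluate(tree[1], x), evaluate(tree[2], x))

def fitness(tree, data):
    """Accuracy when classifying 'signal' wherever the expression > 0."""
    hits = sum((evaluate(tree, x) > 0) == bool(label) for x, label in data)
    return hits / len(data)

def mutate(tree):
    """Replace a random subtree with a freshly grown one."""
    if tree[0] in ("x", "c") or random.random() < 0.3:
        return random_tree(depth=2)
    return (tree[0], *(mutate(c) if random.random() < 0.5 else c
                       for c in tree[1:]))

def evolve(data, pop_size=100, generations=30):
    """Selection loop: keep the fittest quarter, refill by mutating survivors."""
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: fitness(t, data), reverse=True)
        elite = pop[: pop_size // 4]
        pop = elite + [mutate(random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=lambda t: fitness(t, data))

# Toy data: two Gaussian clusters standing in for signal and noise triggers.
data = ([([random.gauss(1, 0.3) for _ in range(N_FEATURES)], 1) for _ in range(50)]
        + [([random.gauss(-1, 0.3) for _ in range(N_FEATURES)], 0) for _ in range(50)])

best = evolve(data)
print("training accuracy:", fitness(best, data))

In a search of the kind described, training labels would presumably come from simulated supernova waveforms injected into detector noise versus known noise transients, with the evolved expression then applied as a veto on single-interferometer cWB triggers.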
