A GPU Algorithm for Outliers Detection in TESS Light Curves.

2021
In recent years, Machine Learning (ML) algorithms have proved to be very helpful in several research fields, such as engineering, health science, and physics. Among these fields, Astrophysics has also started to develop a stronger need for ML techniques to manage the big data collected by ongoing and future all-sky surveys (e.g. Gaia, LAMOST, LSST). NASA's Transiting Exoplanet Survey Satellite (TESS) is a space-based all-sky time-domain survey searching for planets outside the solar system by means of the transit method. During its first two years of operations, TESS collected hundreds of terabytes of photometric observations at a two-minute cadence. ML approaches allow fast recognition of planet candidates in TESS light curves, but they require assimilated data; therefore, different pre-processing operations need to be performed on the light curves. In particular, cleaning the data of inconsistent values is a critical initial step, but, because of the large number of TESS light curves, this process requires a long execution time. In this context, High-Performance Computing techniques make it possible to significantly accelerate the procedure, thus dramatically improving the efficiency of the outlier rejection. Here, we demonstrate that the GPU-parallel algorithm we developed improves the efficiency, accuracy and reliability of outlier rejection in TESS light curves.
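The abstract does not spell out the rejection criterion or the kernel structure, so the following is only a minimal CUDA sketch of the general idea: each GPU thread inspects one photometric point of a light curve and flags it if it deviates from the series mean by more than an assumed kappa-sigma threshold. The array names, the 3-sigma cut and the host-side statistics are illustrative assumptions, not the authors' implementation.

// Illustrative sketch, not the published algorithm: kappa-sigma flagging
// of outlying flux values, one thread per photometric point.
#include <cstdio>
#include <cmath>
#include <vector>
#include <cuda_runtime.h>

__global__ void flag_outliers(const float *flux, int n,
                              float mean, float sigma, float kappa,
                              unsigned char *is_outlier)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        // Each thread tests one point independently of the others.
        is_outlier[i] = fabsf(flux[i] - mean) > kappa * sigma;
    }
}

int main()
{
    // Toy light curve: constant flux with one injected spurious point.
    std::vector<float> flux(4096, 1.0f);
    flux[1000] = 5.0f;
    int n = static_cast<int>(flux.size());

    // Mean and standard deviation computed on the host for brevity;
    // a full pipeline would use a parallel reduction on the GPU instead.
    double sum = 0.0, sq = 0.0;
    for (float f : flux) { sum += f; sq += static_cast<double>(f) * f; }
    float mean  = static_cast<float>(sum / n);
    float sigma = static_cast<float>(std::sqrt(sq / n - (sum / n) * (sum / n)));

    float *d_flux;
    unsigned char *d_mask;
    cudaMalloc(&d_flux, n * sizeof(float));
    cudaMalloc(&d_mask, n * sizeof(unsigned char));
    cudaMemcpy(d_flux, flux.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    flag_outliers<<<blocks, threads>>>(d_flux, n, mean, sigma, 3.0f, d_mask);

    std::vector<unsigned char> mask(n);
    cudaMemcpy(mask.data(), d_mask, n * sizeof(unsigned char), cudaMemcpyDeviceToHost);

    int count = 0;
    for (int i = 0; i < n; ++i) count += mask[i];
    printf("flagged %d outlier(s)\n", count);

    cudaFree(d_flux);
    cudaFree(d_mask);
    return 0;
}

Because every point is tested independently, the kernel is embarrassingly parallel, which is what makes a GPU attractive when the cut must be applied to hundreds of terabytes of two-minute-cadence light curves.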