KDCTime: Knowledge distillation with calibration on InceptionTime for time-series classification

2022
Time-series classification approaches easily overfit on the UCR datasets, a consequence of those datasets' few-shot nature. To alleviate this overfitting and further improve accuracy, we first propose label smoothing for InceptionTime (LSTime), which uses soft label information in addition to hard labels. Next, instead of adjusting soft labels manually as in LSTime, knowledge distillation for InceptionTime (KDTime) is proposed to generate soft labels automatically via a teacher model while compressing the inference model. Finally, to rectify soft labels that the teacher model predicts incorrectly, knowledge distillation with calibration for InceptionTime (KDCTime) is proposed, which contains two optional calibration strategies: KDC by translating (KDCT) and KDC by reordering (KDCR). Experimental results show that KDCTime's accuracy is promising, while its inference time is orders of magnitude faster than that of state-of-the-art approaches.
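The three ingredients above (manually smoothed labels, teacher-generated soft labels, and calibration of wrong teacher predictions) can be sketched in a few lines of numpy. This is a minimal illustration assuming a standard Hinton-style distillation loss and a swap-based reading of "calibration by reordering"; the paper's exact formulations may differ, and all function names here are illustrative, not from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    # temperature-scaled softmax; T > 1 softens the distribution
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def smooth_labels(hard, n_classes, eps=0.1):
    # label smoothing (the LSTime idea): manually soften one-hot targets
    onehot = np.eye(n_classes)[hard]
    return onehot * (1.0 - eps) + eps / n_classes

def kd_loss(student_logits, teacher_logits, hard, T=4.0, alpha=0.9):
    # assumed Hinton-style KD objective: cross-entropy with hard labels
    # plus T^2-scaled KL divergence toward the teacher's soft labels
    n = student_logits.shape[0]
    p_s = softmax(student_logits, T)
    p_t = softmax(teacher_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1).mean()
    p = softmax(student_logits)
    ce = -np.log(p[np.arange(n), hard] + 1e-12).mean()
    return (1.0 - alpha) * ce + alpha * (T ** 2) * kl

def calibrate_by_reordering(teacher_probs, hard):
    # hypothetical sketch of calibration: when the teacher's argmax is not
    # the ground-truth class, swap the two probabilities so the true class
    # carries the largest mass while the distribution's values are reused
    p = teacher_probs.copy()
    for i, y in enumerate(hard):
        j = p[i].argmax()
        if j != y:
            p[i, y], p[i, j] = p[i, j], p[i, y]
    return p
```

In this sketch the student is trained against calibrated teacher soft labels instead of raw ones, which prevents a confidently wrong teacher prediction from being distilled into the compressed inference model.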