Understanding Feature Selection and Feature Memorization in Recurrent Neural Networks.

2019
In this paper, we propose a test, called the Flagged-1-Bit (F1B) test, to study the intrinsic capability of recurrent neural networks in sequence learning. Four different recurrent network models are studied both analytically and experimentally using this test. Our results suggest that in general there exists a conflict between feature selection and feature memorization in sequence learning. Such a conflict can be resolved either by using a gating mechanism, as in LSTM, or by increasing the state dimension, as in Vanilla RNN. Gated models resolve this conflict by adaptively adjusting their state-update equations, whereas Vanilla RNN resolves it by assigning different tasks to different state dimensions. We provide insights into feature selection and memorization in recurrent networks.
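As context for the abstract's contrast between gated and non-gated state updates, the equations below give the standard Vanilla RNN and LSTM formulations. These are the common textbook forms, not necessarily the exact parameterization analyzed in the paper.

\begin{align*}
  h_t &= \tanh(W_h h_{t-1} + W_x x_t + b) & &\text{(Vanilla RNN state update)}\\
  f_t &= \sigma(W_f [h_{t-1}; x_t] + b_f) & &\text{(LSTM forget gate)}\\
  i_t &= \sigma(W_i [h_{t-1}; x_t] + b_i) & &\text{(LSTM input gate)}\\
  o_t &= \sigma(W_o [h_{t-1}; x_t] + b_o) & &\text{(LSTM output gate)}\\
  \tilde{c}_t &= \tanh(W_c [h_{t-1}; x_t] + b_c) & &\text{(candidate cell state)}\\
  c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t & &\text{(gated cell update)}\\
  h_t &= o_t \odot \tanh(c_t) & &\text{(LSTM hidden state)}
\end{align*}

Because the gates f_t, i_t, and o_t depend on the current input, a gated model can decide at each step whether to overwrite or preserve its state, which is consistent with the abstract's observation that gated models adaptively adjust their state-update equations. A Vanilla RNN applies the same fixed update at every step and, as the abstract notes, must instead rely on a larger state and assign different tasks to different dimensions.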