Recombination of Artificial Neural Networks.

2019
We propose a genetic algorithm (GA) for hyperparameter optimization of artificial neural networks which includes chromosomal crossover as well as a decoupling of parameters (i.e., weights and biases) from hyperparameters (e.g., learning rate, weight decay, and dropout) during sexual reproduction. Children are produced from three parents; two contributing hyperparameters and one contributing the parameters. Our version of population-based training (PBT) combines traditional gradient-based approaches such as stochastic gradient descent (SGD) with our GA to optimize both parameters and hyperparameters across SGD epochs. Our improvements over traditional PBT provide an increased speed of adaptation and a greater ability to shed deleterious genes from the population. Our methods improve final accuracy as well as time to fixed accuracy on a wide range of deep neural network architectures including convolutional neural networks, recurrent neural networks, dense neural networks, and capsule networks.
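The three-parent reproduction described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the uniform crossover scheme, the hyperparameter names, and all function names are assumptions made for clarity.

```python
import random

# Illustrative sketch: a child is produced from three parents.
# Two parents contribute hyperparameters (recombined via crossover);
# the third parent contributes the parameters (weights and biases).

HYPERPARAM_KEYS = ["learning_rate", "weight_decay", "dropout"]

def crossover_hyperparams(hp_a, hp_b, rng):
    """Uniform chromosomal crossover (assumed scheme): each
    hyperparameter 'gene' is inherited from one of the two
    hyperparameter-contributing parents at random."""
    return {k: (hp_a if rng.random() < 0.5 else hp_b)[k]
            for k in HYPERPARAM_KEYS}

def reproduce(parent_hp_a, parent_hp_b, parent_params, rng=None):
    """Produce a child from three parents: hyperparameters are
    recombined from the first two; parameters are copied from the
    third, unchanged (the decoupling described in the abstract)."""
    rng = rng or random.Random()
    child_hp = crossover_hyperparams(parent_hp_a, parent_hp_b, rng)
    child_params = dict(parent_params)  # weights/biases pass through intact
    return child_hp, child_params

# Toy example with made-up values
hp_a = {"learning_rate": 0.1, "weight_decay": 1e-4, "dropout": 0.5}
hp_b = {"learning_rate": 0.01, "weight_decay": 1e-5, "dropout": 0.2}
params = {"w1": [0.3, -0.2], "b1": [0.0]}

child_hp, child_params = reproduce(hp_a, hp_b, params, random.Random(0))
```

In a full PBT loop, each population member would then be trained with SGD for an epoch under its own hyperparameters before the next round of selection and reproduction.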