TY - JOUR
T1 - A Neuro-Genetic Technique for Pruning and Optimization of ANN Weights
AU - Sakshi, Sakshi
AU - Kumar, Ravi
N1 - Publisher Copyright:
© 2018 Taylor & Francis.
PY - 2019/1/2
Y1 - 2019/1/2
N2 - A novel technique for the optimization of artificial neural network (ANN) weights, combining pruning with a Genetic Algorithm (GA), is proposed. The technique first defines the “relevance” of the initialized weights in a statistical sense by introducing a coefficient of dominance for each weight and then applying the concept of a complexity penalty. Based on the complexity penalty of each weight, candidate solutions are initialized to participate in the genetic optimization. The GA stage uses the mean square error as the fitness function, which is evaluated once for all candidate solutions by running the forward pass of backpropagation. Subsequent reproduction cycles generate fitter individuals, and the GA is terminated after a small number of cycles. ANNs trained with GA-optimized weights exhibit improved convergence, lower execution time, and a higher success rate in the test phase. Furthermore, the proposed technique yields a substantial reduction in computational resources.
AB - A novel technique for the optimization of artificial neural network (ANN) weights, combining pruning with a Genetic Algorithm (GA), is proposed. The technique first defines the “relevance” of the initialized weights in a statistical sense by introducing a coefficient of dominance for each weight and then applying the concept of a complexity penalty. Based on the complexity penalty of each weight, candidate solutions are initialized to participate in the genetic optimization. The GA stage uses the mean square error as the fitness function, which is evaluated once for all candidate solutions by running the forward pass of backpropagation. Subsequent reproduction cycles generate fitter individuals, and the GA is terminated after a small number of cycles. ANNs trained with GA-optimized weights exhibit improved convergence, lower execution time, and a higher success rate in the test phase. Furthermore, the proposed technique yields a substantial reduction in computational resources.
UR - http://www.scopus.com/inward/record.url?scp=85055558108&partnerID=8YFLogxK
U2 - 10.1080/08839514.2018.1525524
DO - 10.1080/08839514.2018.1525524
M3 - Article
AN - SCOPUS:85055558108
SN - 0883-9514
VL - 33
SP - 1
EP - 26
JO - Applied Artificial Intelligence
JF - Applied Artificial Intelligence
IS - 1
ER -