Abstract:
To address the shortcomings of neural architecture search (NAS), such as its high computing-power requirements and long search time, an NAS algorithm that initializes the network architecture from the manual design experience of deep neural networks is proposed. The algorithm redesigns the search space and selects VGG-11 as the initial architecture, which effectively reduces the wasted search effort caused by random initialization of the architecture parameters. Based on this design, experiments were carried out on the classic image-classification dataset CIFAR-10. A structure named VGG-Lite was obtained after 12 hours of search, with an error rate of 2.63%; VGG-Lite was 0.83 percentage points more accurate than DenseNet-BC, the best-performing manually designed architecture to date. The architecture has 1.48 M parameters, about 1/17 of the parameter count of DenseNet-BC. The results show that this method can discover excellent network architectures while significantly improving search efficiency, which is of great significance for the wider adoption of NAS algorithms.
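As a rough illustration of the initialization idea only (not the paper's actual implementation), the sketch below seeds a NAS search population with the hand-designed VGG-11 layer configuration instead of randomly sampled architectures; the names VGG11_CONFIG, CHANNEL_CHOICES, and mutate are hypothetical helpers introduced for this example.

```python
# Illustrative sketch: start the architecture search from VGG-11
# rather than from a random point in the search space.
import copy
import random

# VGG-11 convolutional configuration: conv channel counts, 'M' = max-pooling.
VGG11_CONFIG = [64, 'M', 128, 'M', 256, 256, 'M', 512, 512, 'M', 512, 512, 'M']

# Per-layer channel choices the search is allowed to explore (assumed, not from the paper).
CHANNEL_CHOICES = [32, 64, 128, 256, 512]

def mutate(config, p=0.2):
    """Return a neighboring architecture by resampling some conv widths."""
    child = copy.deepcopy(config)
    for i, layer in enumerate(child):
        if layer != 'M' and random.random() < p:
            child[i] = random.choice(CHANNEL_CHOICES)
    return child

# Seed the population with the hand-designed VGG-11 point and explore its
# neighborhood, instead of evaluating arbitrary random architectures.
population = [VGG11_CONFIG] + [mutate(VGG11_CONFIG) for _ in range(7)]
```

Starting from a strong manually designed architecture narrows the initial search to a promising region of the space, which is the intuition behind the reduced search time reported above.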