Xinggang Wang (王兴刚)

Personal Information

Professor   Doctoral Supervisor   Master's Supervisor

Gender: Male

Employment Status: Employed

Affiliation: School of Electronic Information and Communications

Education: Postgraduate (Doctoral) graduate

Degree: Doctor of Engineering

Alma Mater: Huazhong University of Science and Technology

Disciplines: Communication and Information Systems; Signal and Information Processing

Densely Connected Search Space for More Flexible Neural Architecture Search

Paper Type: Conference Paper

First Authors: Yuzhu Sun, Jiemin Fang

Corresponding Author: Xinggang Wang

Co-authors: Wenyu Liu, Yuan Li, Qian Zhang

Published In: 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)

Publication Date: 2020-06-13

Abstract: Neural architecture search (NAS) has dramatically advanced the development of neural network design. We revisit the search space design in most previous NAS methods and find that the number and widths of blocks are set manually. However, block counts and block widths determine the network scale (depth and width) and have a great influence on both the accuracy and the model cost (FLOPs/latency). In this paper, we propose to search block counts and block widths by designing a densely connected search space, i.e., DenseNAS. The new search space is represented as a dense super network, which is built upon our designed routing blocks. In the super network, routing blocks are densely connected and we search for the best path between them to derive the final architecture. We further propose a chained cost estimation algorithm to approximate the model cost during the search. Both the accuracy and the model cost are optimized in DenseNAS. For experiments on the MobileNetV2-based search space, DenseNAS achieves 75.3% top-1 accuracy on ImageNet with only 361M FLOPs and 17.9 ms latency on a single TITAN-XP. The larger model searched by DenseNAS achieves 76.1% accuracy with only 479M FLOPs. DenseNAS further improves the ImageNet classification accuracies of ResNet-18, -34 and -50-B by 1.5%, 0.5% and 0.3% with 200M, 600M and 680M FLOPs reductions respectively. The related code is available at https://github.com/JaminFong/DenseNAS.
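To give a concrete sense of the chained cost estimation described above, here is a minimal sketch, not the paper's implementation (see the GitHub repository for that): it models the super network as a directed acyclic graph of routing blocks with edge probabilities and accumulates per-block costs backward along the dense connections to obtain an expected model cost. The function name expected_cost, the cost values, and the probability layout are illustrative assumptions.

```python
# Hedged sketch: expected-cost estimation over a densely connected
# super network of routing blocks. All names and numbers are illustrative.

from typing import Dict, List


def expected_cost(
    block_cost: List[float],            # per-block cost (e.g., FLOPs)
    out_probs: List[Dict[int, float]],  # out_probs[i][j] = probability of edge i -> j (j > i)
) -> float:
    """Approximate the expected cost of an architecture sampled from the
    super network by accumulating costs backward along the connections."""
    num_blocks = len(block_cost)
    # cost_from[i]: expected cost of the sub-network starting at block i
    cost_from = [0.0] * num_blocks
    for i in reversed(range(num_blocks)):
        tail = sum(p * cost_from[j] for j, p in out_probs[i].items())
        cost_from[i] = block_cost[i] + tail
    return cost_from[0]


# Toy usage: 4 routing blocks; block 0 either routes to block 1 or
# skips directly to block 2, and so on.
costs = [10.0, 20.0, 15.0, 5.0]
probs = [
    {1: 0.7, 2: 0.3},
    {2: 0.6, 3: 0.4},
    {3: 1.0},
    {},  # final block: no outgoing edges
]
print(expected_cost(costs, probs))  # 39.8
```

In this reading, the backward recursion makes the cost estimate differentiable in the edge probabilities, which is what would allow accuracy and model cost to be optimized jointly during the search, as the abstract states.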