
# Multinomial Distribution Learning for Effective Neural Architecture Search

Here we propose a method to dramatically accelerate neural architecture search (NAS), without reinforcement learning or gradient-based search. Architectures are sampled from a distribution and compared against one another, estimating their relative rather than absolute performance, and the parameters of the distribution are updated iteratively during training.
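As a rough illustration, this loop can be sketched in plain Python. Note that the five-way candidate space, the `relative_perf` oracle, and the multiplicative update rule below are hypothetical stand-ins for the operation choices, comparison signal, and distribution update described in the paper, not the exact formulation:

```python
import random

def sample_architecture(probs):
    """Sample one candidate index from the multinomial distribution."""
    r = random.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1  # guard against floating-point round-off

def update_distribution(probs, winner, lr=0.1):
    """Move probability mass toward the candidate that compared better."""
    probs = [p * (1.0 - lr) for p in probs]
    probs[winner] += lr
    return probs

random.seed(0)

# Toy relative-performance oracle: candidate 2 is secretly best.
# In the actual method, the comparison signal comes from training statistics.
def relative_perf(i):
    return -abs(i - 2) + random.gauss(0.0, 0.1)

probs = [1.0 / 5] * 5  # uniform prior over 5 hypothetical candidates
for _ in range(200):
    a = sample_architecture(probs)
    b = sample_architecture(probs)
    winner = a if relative_perf(a) >= relative_perf(b) else b
    probs = update_distribution(probs, winner)
```

The key point the sketch captures is that only pairwise comparisons are needed, so no architecture ever has to be trained to convergence to get an absolute accuracy estimate.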

Our paper is available at *Multinomial Distribution Learning for Effective Neural Architecture Search*.

Here we provide our test code and pretrained models. Our code is based on DARTS and ProxylessNAS; the pretrained models can be downloaded here.

## Requirements

- PyTorch 1.0
- DALI

## Evaluate

You need to modify the dataset paths in `data_providers/cifar10.py` and `data_providers/imagenet.py`. `config.sh` is used to prepare your environment; you should write this file yourself, and here we use it to prepare the dataset and packages.

To evaluate the model in the DARTS setting, just run:

For CIFAR-10:

```shell
chmod +x run_darts_cifar.sh
./run_darts_cifar.sh
```

For ImageNet:

```shell
chmod +x run_darts_imagenet.sh
./run_darts_imagenet.sh
```

To evaluate the model in the Mobile setting, just run:

For GPU:

```shell
chmod +x run_gpu_imagenet.sh
./run_gpu_imagenet.sh
```

For CPU:

```shell
chmod +x run_cpu_imagenet.sh
./run_cpu_imagenet.sh
```

## Performance