
nasbench301's Introduction

NAS-Bench-301

This repository contains code for the paper: "NAS-Bench-301 and the Case for Surrogate Benchmarks for Neural Architecture Search".

The surrogate models can be downloaded from figshare. This includes the models for v0.9 and v1.0 as well as the dataset that was used to train the surrogate models. We also provide the full training logs for all architectures, which include learning curves on the training, validation and test sets. These can be downloaded automatically; see nasbench301/example.py.

To install all requirements (this may take a few minutes), run

$ cat requirements.txt | xargs -n 1 -L 1 pip install
$ pip install nasbench301

To install directly from GitHub, run

$ git clone https://github.com/automl/nasbench301
$ cd nasbench301
$ cat requirements.txt | xargs -n 1 -L 1 pip install
$ pip install .

To run the example

$ python3 nasbench301/example.py
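For orientation, here is a minimal sketch of what the example roughly does, assuming the surrogate models have already been downloaded to a local directory: load the performance ensemble and query it with a DARTS-style genotype. The predict() call and the "genotype" representation string are assumptions about the API here; nasbench301/example.py remains the authoritative reference.

# Minimal sketch (assumed API): load the performance surrogate and predict the
# validation accuracy of a DARTS-style architecture.
from collections import namedtuple

import nasbench301 as nb

Genotype = namedtuple('Genotype', 'normal normal_concat reduce reduce_concat')

# Assumed local path to a downloaded surrogate model directory; adjust as needed.
ensemble_dir_performance = 'nb_models_0.9/xgb_v0.9'
performance_model = nb.load_ensemble(ensemble_dir_performance)

# The original DARTS_V2 cell as an example architecture.
genotype = Genotype(
    normal=[('sep_conv_3x3', 0), ('sep_conv_3x3', 1), ('sep_conv_3x3', 0), ('sep_conv_3x3', 1),
            ('sep_conv_3x3', 1), ('skip_connect', 0), ('skip_connect', 0), ('dil_conv_3x3', 2)],
    normal_concat=[2, 3, 4, 5],
    reduce=[('max_pool_3x3', 0), ('max_pool_3x3', 1), ('skip_connect', 2), ('max_pool_3x3', 1),
            ('max_pool_3x3', 0), ('skip_connect', 2), ('skip_connect', 2), ('max_pool_3x3', 1)],
    reduce_concat=[2, 3, 4, 5])

# predict() and representation="genotype" are assumed here; check example.py.
print(performance_model.predict(config=genotype, representation="genotype", with_noise=True))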

To fit a surrogate model, run

$ python3 fit_model.py --model gnn_gin --nasbench_data PATH_TO_NB_301_DATA_ROOT --data_config_path configs/data_configs/nb_301.json  --log_dir LOG_DIR

Training models from scratch

To create the dataset used for the benchmark, we trained the models using a version of Auto-PyTorch, which can be found here: https://github.com/automl/Auto-PyTorch/tree/nb301.

To train a model whose hyperparameters and architecture are described by a ConfigSpace representation (.json file), first download the CIFAR10 data as used by Auto-PyTorch and extract it to nasbench301/data_generation/datasets. Then run the following to start the training:

$ cd nasbench301/data_generation
$ python3 run_proxy.py --run_id 1 --config configs/config_0.json

The configuration file can also be any custom-generated one. See nasbench301/representations.py for converting an object from the DARTS Genotype representation into a ConfigSpace object.
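A hedged sketch of that conversion, assuming representations.py exposes a helper for it; the name convert_genotype_to_config below is hypothetical, so check the module for the actual function it provides.

# Hypothetical usage sketch; the helper name is assumed, see representations.py.
from nasbench301 import representations

# 'genotype' is a DARTS-style Genotype namedtuple as in the example above.
config = representations.convert_genotype_to_config(genotype)
print(config)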

NOTE: This codebase is still subject to change. Upcoming updates include improved versions of the surrogate models and code for all experiments from the paper. The API may also still change.

nasbench301's People

Contributors

arberzela, crwhite14, eddiebergman, lmzimmer, neonkraft, walkerning


nasbench301's Issues

PyPI version does not seem to work (v0.2)

The PyPI package appears to be missing most of the source code.

This can be confirmed by going to PyPI, looking up nasbench301 under "Download files", and then downloading and inspecting the tarball.

I can also confirm it happens locally:

> mkdir ~/test

> cd test

> python -m venv ./.myvenv

> source ./.myvenv/bin/activate

> pip install --no-cache-dir nasbench301
Collecting nasbench301
  Downloading nasbench301-0.2-py3-none-any.whl (9.7 kB)
Installing collected packages: nasbench301
Successfully installed nasbench301-0.2

> ls .myvenv/lib/python3.8/site-packages/nasbench301
__pycache__  .  ..  api.py  example.py  __init__.py  representations.py

> python -c "import nasbench301"        
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/eddiebergman/test/.myvenv/lib/python3.8/site-packages/nasbench301/__init__.py", line 1, in <module>
    from nasbench301.api import load_ensemble
  File "/home/eddiebergman/test/.myvenv/lib/python3.8/site-packages/nasbench301/api.py", line 6, in <module>
    from surrogate_models import utils
ModuleNotFoundError: No module named 'surrogate_models'

Only these four files are present in the installed package.

Unfortunately, I have not actually released a package to PyPI myself, so I am not sure of the fix, but I would imagine it is a relatively straightforward issue with setup.py.

Creating models from the search space

Could you release the code for generating the models from a ConfigSpace instance of the search space, so that it is possible to retrain them?

While it is probably possible to get something close to what you used based on the code in the original DARTS repo, I'd much rather get identical models if possible.

Thanks

About NasBenchDataset

I'm a beginner in deep learning and need an example of how to process the data in this code, i.e. how to turn the JSON files into the Graph objects that NASBenchDataset expects later in the pipeline. Could someone please give me an example?
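A rough starting point, assuming you have one of the per-architecture result JSON files from the NAS-Bench-301 dataset on disk; the path below is a placeholder and the exact key names depend on the dataset version, so inspect them before building graphs.

# Inspect one result file before turning it into a graph for NASBenchDataset.
import json

result_path = "nb_301_data/run_0/results.json"  # placeholder path; point it at one of your files
with open(result_path) as f:
    result = json.load(f)

# Print the top-level keys to locate the architecture encoding and the
# accuracy / learning-curve entries you need as graph features and targets.
print(list(result.keys()))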

Mention that requirements.txt uses libraries built specifically for CUDA 10.2

It might be worth mentioning that requirements.txt specifically installs torch libraries built for CUDA 10.2, as this took a while to debug.

I am using this with torch 1.8.0+cu111 and it all seems to work as intended.

> pip list | grep torch
torch                  1.8.0+cu111
torch-cluster          1.5.9
torch-geometric        1.6.3
torch-scatter          2.0.6
torch-sparse           0.6.9
torch-spline-conv      1.2.1
torchaudio             0.8.0
torchvision            0.9.0+cu111
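A quick way to double-check which CUDA build of torch you ended up with (a mismatch with the CUDA 10.2 wheels pinned in requirements.txt shows up here):

# Sanity check of the installed torch build versus the local CUDA setup.
import torch

print(torch.__version__)         # e.g. 1.8.0+cu111
print(torch.version.cuda)        # CUDA version the wheel was built against
print(torch.cuda.is_available())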

Perfect installation

  • I've already installed nasbench301 on Windows 10 successfully. It took me only a few moments to run the code without a bug. I found it easy to get started with NAS-Bench-301 using example.py.

Adapt sys paths

import nasbench301 yields

Exception has occurred: ModuleNotFoundError
No module named 'surrogate_models'

due to surrogate_models not being added to sys.path
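One possible workaround until the packaging is fixed, assuming surrogate_models ships inside the installed nasbench301 package directory: put that directory on sys.path before the package's own absolute imports run.

# Workaround sketch: locate the installed package without importing it (the
# import itself is what fails), then add its directory to sys.path.
import importlib.util
import os
import sys

spec = importlib.util.find_spec("nasbench301")
package_dir = os.path.dirname(spec.origin)
if package_dir not in sys.path:
    sys.path.append(package_dir)

import nasbench301  # "from surrogate_models import utils" can now resolve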

Significance of MSE vs Kendall's Tau / Spearman's Rank Correlation

Hi folks. I am working on designing a surrogate benchmark for some hardware-specific performance metrics, based on the principles suggested in your work.

I am currently using a small dataset of 500-1000 model architectures from an MNASNet-like search space, with XGB as the surrogate model, to evaluate how well the surrogate performs on such a small dataset. The XGB hyperparameters are copied from your work.

I am getting high validation/test MSE results (~ 0.4 to 0.6) but with a high Kendall's Tau (~0.92) and Spearman's rank correlation (~0.98).

When I use the same number of models, selected randomly from the nb301_dataset you provide (from the random search directory), to train the surrogate, I get a low MSE (~0.16) but also low KT (0.60) and Spearman's (0.78).

I'm wondering if this disparity could potentially be due to sub-optimal hyperparameter values. Do you have some insights on what could cause such a huge difference in the performance of the predictor? Furthermore, for evaluating the performance of a surrogate, do you think Kendall's Tau or Spearman's rank correlation is a better metric than MSE, or vice versa?
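As a small illustration of why the two kinds of metric can disagree (with made-up numbers): a constant offset in the predictions inflates MSE but leaves the rank correlations untouched, since Kendall's tau and Spearman's rho depend only on the ordering.

# Toy example: biased but perfectly ranked predictions give non-zero MSE yet
# perfect rank correlation.
import numpy as np
from scipy import stats

true_acc = np.array([91.2, 92.5, 93.1, 93.8, 94.4])
pred_acc = true_acc + 0.7  # systematic offset, identical ranking

print("MSE:      ", np.mean((pred_acc - true_acc) ** 2))                 # 0.49
print("Kendall:  ", stats.kendalltau(true_acc, pred_acc).correlation)    # 1.0
print("Spearman: ", stats.spearmanr(true_acc, pred_acc).correlation)     # 1.0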

Direct pip install throws binary incompatibility error

If I directly pip install nasbench301, it throws an error when I go to execute the example.

Distributor ID: Ubuntu
Description: Ubuntu 18.04.5 LTS
Release: 18.04
Codename: bionic

(nasbench301) REDMOND.dedey@GCRAZGDL0459:~/nasbench301$ python nasbench301/example.py 
Traceback (most recent call last):
  File "nasbench301/example.py", line 4, in <module>
    from ConfigSpace.read_and_write import json as cs_json
  File "/home/dedey/.local/lib/python3.8/site-packages/ConfigSpace/__init__.py", line 37, in <module>
    from ConfigSpace.configuration_space import Configuration, \
  File "ConfigSpace/configuration_space.pyx", line 39, in init ConfigSpace.configuration_space
  File "__init__.pxd", line 242, in init ConfigSpace.hyperparameters
ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject

Hyperparameter 'q'

I am running into an error when calling example.py. Do you know what could cause this?

==> Loading performance surrogate model...
c:\Users\nasbench301\nb_models_0.9\xgb_v0.9

Traceback (most recent call last):
  File "c:/Users/nasbench301/example.py", line 38, in <module>
    performance_model = nb.load_ensemble(ensemble_dir_performance)
  File "C:\Users\nasbench301\api.py", line 56, in load_ensemble
    surrogate_model.load(model_paths=ensemble_member_dirs)
  File "C:\Users\nasbench301\surrogate_models\ensemble.py", line 119, in load
    ens_mem = utils.model_dict[self.member_model_name](log_dir=None, seed=self.starting_seed+ind, **self.member_model_init_dict)
  File "C:\Users\nasbench301\surrogate_models\gradient_boosting\xgboost.py", line 13, in __init__
    super(XGBModel, self).__init__(data_root, log_dir, seed, model_config, data_config)
  File "C:\Users\nasbench301\surrogate_models\surrogate_model.py", line 31, in __init__
    self.config_loader = utils.ConfigLoader(configspace_path)
  File "C:\Users\nasbench301\surrogate_models\utils.py", line 83, in __init__
    self.config_space = self.load_config_space(config_space_path)
  File "C:\Users\nasbench301\surrogate_models\utils.py", line 153, in load_config_space
    config_space = config_space_json_r_w.read(json_string)
  File "C:\Users\miniconda3\envs\NASLib\lib\site-packages\ConfigSpace\read_and_write\json.py", line 444, in read
    _construct_hyperparameter(
  File "C:\Users\miniconda3\envs\NASLib\lib\site-packages\ConfigSpace\read_and_write\json.py", line 524, in _construct_hyperparameter
    q=hyperparameter["q"],
KeyError: 'q'

Training scripts for the architectures

Hi all,

Many thanks for the great work!

I am aware that the training logs of the architectures used as train data for the surrogate in NAS-Bench-301 are released, but I'm wondering whether I could access the original scripts used to train the architectures from scratch? I appreciate that the training details are mentioned in App A of the paper, but with some additional augmentations used (e.g. MixUp) compared to standard DARTS procedure, I think it'd be great if I could use the exact original training script used to construct the benchmark.

If they are already in the repo, I'd be grateful if you could point me to the right place.

Many thanks in advance for your help.

Xingchen

Errors when installing dependencies on Ubuntu 18.04 LTS, python 3.8

Hi,

When I go to install the dependencies via cat requirements.txt | xargs -n 1 -L 1 pip install in a fresh conda Python 3.8 environment on Ubuntu 18.04 64-bit LTS, I get a number of errors, especially with respect to the tensorflow version and then the various geometric libraries. See the screenshots below. I am working through them, but if you have any leads please let me know :-)

[two screenshots of the installation errors]

Distributor ID: Ubuntu
Description: Ubuntu 18.04.5 LTS
Release: 18.04
Codename: bionic
