kleiaaro Kavlahkaff committed on
Commit 09f2562 · verified · 1 parent: 8c2c2f2

fix: add hpob and tabrepo to readme (#1)

- fix: add hpob and tabrepo to readme (1676dc5a2d56e4919d57d381b6900b5171f27639)


Co-authored-by: Luca Thale-Bombien <[email protected]>

Files changed (1): README.md (+2 −0)
README.md CHANGED
@@ -8,6 +8,8 @@ This dataset contains hyperparameter optimization (HPO) evaluations from several
 - nasbench201: NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search. Dong, X. and Yang, Y. 2020.
 - pd1: Pre-trained Gaussian Processes for Bayesian Optimization. Wang, Z., Dahl, G., Swersky, K., Lee, C., Mariet, Z., Nado, Z., Gilmer, J., Snoek, J. and Ghahramani, Z. 2021.
 - yahpo: YAHPO Gym - An Efficient Multi-Objective Multi-Fidelity Benchmark for Hyperparameter Optimization. Pfisterer, F., Schneider, S., Moosbauer, J., Binder, M. and Bischl, B. 2022.
+- tabrepo: TabRepo: A Large Scale Repository of Tabular Model Evaluations and its AutoML Applications. Salinas, D. and Erickson, N. 2024.
+- hpob: HPO-B: A Large-Scale Reproducible Benchmark for Black-Box HPO based on OpenML. Arango, S., Jomaa, H., Wistuba, M. and Grabocka, J. 2021.
 
 The evaluations can be accessed through the [Syne Tune](https://github.com/syne-tune/syne-tune) HPO library by calling the following:
 