google-vizier 0.1.20
Open Source Vizier: Distributed service framework for blackbox optimization and research.
pip install google-vizier
Requires Python: >=3.8
Dependencies
- attrs ==23.1.0
- absl-py >=1.0.0
- numpy <2.0,>=1.21.5
- protobuf >=3.6
- portpicker >=1.3.1
- grpcio >=1.35.0
- grpcio-tools >=1.35.0
- googleapis-common-protos >=1.56.4
- sqlalchemy >=1.4
- cvxpy ==1.2.1; extra == "algorithms"
- cvxopt ==1.3.0; extra == "algorithms"
- scikit-learn ==1.1.2; extra == "algorithms"
- evojax ==0.2.15; extra == "algorithms"
- lightgbm ==2.2.3; extra == "algorithms"
- jax >=0.4.34; extra == "all"
- jaxlib >=0.4.34; extra == "all"
- jaxopt >=0.8.3; extra == "all"
- flax >=0.10.0; extra == "all"
- optax >=0.2.3; extra == "all"
- chex >=0.1.87; extra == "all"
- tfp-nightly[jax]; extra == "all"
- equinox ==0.11.7; extra == "all"
- jaxtyping >=0.2.34; extra == "all"
- typeguard <=2.13.3; extra == "all"
- tensorflow >=2.9.1; extra == "all"
- cvxpy ==1.2.1; extra == "all"
- cvxopt ==1.3.0; extra == "all"
- scikit-learn ==1.1.2; extra == "all"
- evojax ==0.2.15; extra == "all"
- lightgbm ==2.2.3; extra == "all"
- matplotlib; extra == "all"
- pandas; extra == "all"
- ale-py; extra == "all"
- nats-bench; extra == "all"
- xgboost ==1.5.1; extra == "all"
- ray ==2.3.1; extra == "all"
- optproblems ==1.3; extra == "all"
- diversipy ==0.9; extra == "all"
- coverage <=6.4.2,>=4.5; extra == "all"
- mock <=4.0.3,>=3.0; extra == "all"
- pytest; extra == "all"
- matplotlib; extra == "benchmarks"
- pandas; extra == "benchmarks"
- ale-py; extra == "benchmarks"
- nats-bench; extra == "benchmarks"
- xgboost ==1.5.1; extra == "benchmarks"
- ray ==2.3.1; extra == "benchmarks"
- optproblems ==1.3; extra == "benchmarks"
- diversipy ==0.9; extra == "benchmarks"
- jax >=0.4.34; extra == "jax"
- jaxlib >=0.4.34; extra == "jax"
- jaxopt >=0.8.3; extra == "jax"
- flax >=0.10.0; extra == "jax"
- optax >=0.2.3; extra == "jax"
- chex >=0.1.87; extra == "jax"
- tfp-nightly[jax]; extra == "jax"
- equinox ==0.11.7; extra == "jax"
- jaxtyping >=0.2.34; extra == "jax"
- typeguard <=2.13.3; extra == "jax"
- coverage <=6.4.2,>=4.5; extra == "test"
- mock <=4.0.3,>=3.0; extra == "test"
- pytest; extra == "test"
- tensorflow >=2.9.1; extra == "tf"
Open Source Vizier: Reliable and Flexible Black-Box Optimization.
Google AI Blog | Getting Started | Documentation | Installation | Citing and Highlights
What is Open Source (OSS) Vizier?
OSS Vizier is a Python-based service for black-box optimization and research, based on Google Vizier, one of the first hyperparameter tuning services designed to work at scale.
Getting Started
As a basic example, the snippet below shows how to tune a simple objective over all four flat search space parameter types (float, integer, discrete, and categorical):
from vizier.service import clients
from vizier.service import pyvizier as vz

# Objective function to maximize.
def evaluate(w: float, x: int, y: float, z: str) -> float:
  return w**2 - y**2 + x * ord(z)

# Algorithm, search space, and metrics.
study_config = vz.StudyConfig(algorithm='DEFAULT')
study_config.search_space.root.add_float_param('w', 0.0, 5.0)
study_config.search_space.root.add_int_param('x', -2, 2)
study_config.search_space.root.add_discrete_param('y', [0.3, 7.2])
study_config.search_space.root.add_categorical_param('z', ['a', 'g', 'k'])
study_config.metric_information.append(vz.MetricInformation('metric_name', goal=vz.ObjectiveMetricGoal.MAXIMIZE))

# Setup client and begin optimization. Vizier Service will be implicitly created.
study = clients.Study.from_study_config(study_config, owner='my_name', study_id='example')
for i in range(10):
  suggestions = study.suggest(count=2)
  for suggestion in suggestions:
    params = suggestion.parameters
    objective = evaluate(params['w'], params['x'], params['y'], params['z'])
    suggestion.complete(vz.Measurement({'metric_name': objective}))
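After the loop, the best trial(s) found so far can be read back from the study client. This is a minimal sketch, assuming the client's optimal_trials() and materialize() methods; check the client API reference for the exact calls:

# Retrieve the best trial(s) found so far (sketch; optimal_trials() and
# materialize() are assumed from the client API).
for optimal_trial in study.optimal_trials():
  print(optimal_trial.materialize())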
Documentation
OSS Vizier's interface consists of three main APIs:
- User API: Allows a user to optimize their blackbox objective and optionally set up a server for distributed multi-client settings (see the first sketch below).
- Developer API: Defines abstractions and utilities for implementing new optimization algorithms, both for research and for hosting in the service (see the second sketch below).
- Benchmarking API: A wide collection of objective functions and methods to benchmark and compare algorithms.
Additionally, it contains advanced API for:
- Tensorflow Probability: For writing Bayesian Optimization algorithms using Tensorflow Probability and Flax.
- PyGlove: For large-scale evolutionary experimentation and program search using OSS Vizier as a distributed backend.
Please see OSS Vizier's ReadTheDocs documentation for detailed information.
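As an illustration of the User API's distributed mode, a Vizier server can be started in one process and clients pointed at its endpoint. This is a minimal sketch, assuming servers.DefaultVizierServer and clients.environment_variables from the service and client APIs; consult the documentation for the exact setup:

from vizier.service import clients, servers

# Server side: start a Vizier service (DefaultVizierServer and its arguments
# are assumptions; see the documentation for the supported options).
server = servers.DefaultVizierServer(host='localhost')
print('Vizier server endpoint:', server.endpoint)

# Client side (possibly another process or machine): point the client library
# at the server before creating studies, then use clients.Study as usual.
clients.environment_variables.server_endpoint = server.endpoint
study = clients.Study.from_study_config(
    study_config,  # e.g. the StudyConfig from the Getting Started example.
    owner='my_name',
    study_id='distributed_example')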
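For the Developer API, new algorithms are typically written against a Designer-style abstraction. The toy sketch below assumes a vza.Designer base class with suggest/update methods (names, signatures, and ParameterConfig attributes may differ between versions) and simply proposes the midpoint of each float parameter:

from typing import Optional, Sequence

from vizier import algorithms as vza
from vizier import pyvizier as vz


class MidpointDesigner(vza.Designer):
  """Toy Designer sketch: always suggests the midpoint of each float parameter."""

  def __init__(self, problem: vz.ProblemStatement):
    self._problem = problem

  def update(self, completed, all_active) -> None:
    # A real algorithm would use completed/active trials to adapt its model;
    # this toy designer ignores feedback entirely.
    del completed, all_active

  def suggest(self, count: Optional[int] = None) -> Sequence[vz.TrialSuggestion]:
    params = {}
    for pc in self._problem.search_space.parameters:
      low, high = pc.bounds  # Assumes float parameters with numeric bounds.
      params[pc.name] = (low + high) / 2.0
    return [vz.TrialSuggestion(parameters=dict(params)) for _ in range(count or 1)]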
Installation
Quick start: For tuning objectives using our state-of-the-art JAX-based Bayesian Optimizer, run:
pip install google-vizier[jax]
Advanced Installation
Minimal version: To install only the core service and client APIs from requirements.txt, run:
pip install google-vizier
Full installation: To support all algorithms and benchmarks, run:
pip install google-vizier[all]
Specific installation: If you only need a specific part "X" of OSS Vizier, run:
pip install google-vizier[X]
which installs add-ons from requirements-X.txt. Possible options:
- requirements-jax.txt: Jax libraries shared by both algorithms and benchmarks.
- requirements-tf.txt: Tensorflow libraries used by benchmarks.
- requirements-algorithms.txt: Additional repositories (e.g. EvoJAX) for algorithms.
- requirements-benchmarks.txt: Additional repositories (e.g. NASBENCH-201) for benchmarks.
- requirements-test.txt: Libraries needed for testing code.
Check that all unit tests pass by running run_tests.sh after a full installation. OSS Vizier requires Python 3.10+, while client-only packages require Python 3.8+.
Citing and Highlights
Citing Vizier: Please consider citing whichever of the following papers you found useful: Algorithm, OSS Package, and Google System.
Highlights: We track notable users and media attention - let us know if OSS Vizier was helpful for your work.
Thanks!
@article{gaussian_process_bandit,
author = {Xingyou Song and
Qiuyi Zhang and
Chansoo Lee and
Emily Fertig and
Tzu-Kuo Huang and
Lior Belenki and
Greg Kochanski and
Setareh Ariafar and
Srinivas Vasudevan and
Sagi Perel and
Daniel Golovin},
title = {The Vizier Gaussian Process Bandit Algorithm},
journal = {Google DeepMind Technical Report},
year = {2024},
eprinttype = {arXiv},
eprint = {2408.11527},
}
@inproceedings{oss_vizier,
author = {Xingyou Song and
Sagi Perel and
Chansoo Lee and
Greg Kochanski and
Daniel Golovin},
title = {Open Source Vizier: Distributed Infrastructure and API for Reliable and Flexible Black-box Optimization},
booktitle = {Automated Machine Learning Conference, Systems Track (AutoML-Conf Systems)},
year = {2022},
}
@inproceedings{google_vizier,
author = {Daniel Golovin and
Benjamin Solnik and
Subhodeep Moitra and
Greg Kochanski and
John Karro and
D. Sculley},
title = {Google Vizier: {A} Service for Black-Box Optimization},
booktitle = {Proceedings of the 23rd {ACM} {SIGKDD} International Conference on
Knowledge Discovery and Data Mining, Halifax, NS, Canada, August 13
- 17, 2017},
pages = {1487--1495},
publisher = {{ACM}},
year = {2017},
url = {https://doi.org/10.1145/3097983.3098043},
doi = {10.1145/3097983.3098043},
}