
evosax: Evolution Strategies in JAX 🦎


Tired of having to handle asynchronous processes for neuroevolution? Do you want to leverage massive vectorization and high-throughput accelerators for Evolution Strategies? evosax provides a comprehensive, high-performance library that implements Evolution Strategies (ES) in JAX. By leveraging XLA compilation and JAX's transformation primitives, evosax enables researchers and practitioners to efficiently scale evolutionary algorithms to modern hardware accelerators without the traditional overhead of distributed implementations.

The API follows the classical ask-eval-tell cycle of ES, with full support for JAX's transformations (jit, vmap, lax.scan). The library includes 30+ evolution strategies, from classics like CMA-ES and Differential Evolution to modern approaches like OpenAI-ES and Diffusion Evolution.

Get started here 👉 Colab

Basic evosax API Usage 🍲

import jax
import jax.numpy as jnp

from evosax.algorithms import CMA_ES


# Placeholder template defining the shape of the search space
dummy_solution = jnp.zeros((2,))
num_generations = 100

# Instantiate the search strategy
es = CMA_ES(population_size=32, solution=dummy_solution)
params = es.default_params

# Initialize state
key = jax.random.key(0)
state = es.init(key, params)

# Ask-Eval-Tell loop
for i in range(num_generations):
    key, key_ask, key_eval = jax.random.split(key, 3)

    # Generate a set of candidate solutions to evaluate
    population, state = es.ask(key_ask, state, params)

    # Evaluate the fitness of the population
    fitness = ...

    # Update the evolution strategy
    state = es.tell(population, fitness, state, params)

# Get best solution
state.best_solution, state.best_fitness
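To make the ask-eval-tell pattern concrete independently of the library, here is a minimal, self-contained NumPy sketch of the same cycle for a simple Gaussian evolution strategy on the sphere function. The `ask`/`tell` helpers below are hypothetical illustrations of the pattern, not the evosax API:

```python
import numpy as np


def ask(rng, mean, sigma, population_size):
    """Sample candidate solutions around the current mean."""
    return mean + sigma * rng.standard_normal((population_size, mean.shape[0]))


def tell(population, fitness, elite_frac=0.25):
    """Update the mean as the average of the best candidates (lower is better)."""
    num_elite = max(1, int(elite_frac * len(population)))
    elite_idx = np.argsort(fitness)[:num_elite]
    return population[elite_idx].mean(axis=0)


rng = np.random.default_rng(0)
mean, sigma = np.ones(4), 0.3
for _ in range(50):
    population = ask(rng, mean, sigma, population_size=32)
    fitness = np.sum(population**2, axis=1)  # sphere objective
    mean = tell(population, fitness)

print(np.sum(mean**2))  # approaches the optimum at 0
```

evosax wraps this same loop behind `es.ask` and `es.tell`, but keeps every step a pure function of explicit state, which is what makes the whole cycle compatible with `jit`, `vmap`, and `lax.scan`.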

Implemented Evolution Strategies 🦎

Strategy | Reference | Import | Example
Simple Evolution Strategy | Rechenberg (1978) | SimpleES | Colab
OpenAI-ES | Salimans et al. (2017) | Open_ES | Colab
CMA-ES | Hansen & Ostermeier (2001) | CMA_ES | Colab
Sep-CMA-ES | Ros & Hansen (2008) | Sep_CMA_ES | Colab
xNES | Wierstra et al. (2014) | XNES | Colab
SNES | Wierstra et al. (2014) | SNES | Colab
MA-ES | Beyer & Sendhoff (2017) | MA_ES | Colab
LM-MA-ES | Loshchilov et al. (2017) | LM_MA_ES | Colab
Rm-ES | Li & Zhang (2017) | Rm_ES | Colab
PGPE | Sehnke et al. (2010) | PGPE | Colab
ARS | Mania et al. (2018) | ARS | Colab
ESMC | Merchant et al. (2021) | ESMC | Colab
Persistent ES | Vicol et al. (2021) | PersistentES | Colab
Noise-Reuse ES | Li et al. (2023) | NoiseReuseES | Colab
CR-FM-NES | Nomura & Ono (2022) | CR_FM_NES | Colab
Guided ES | Maheswaranathan et al. (2018) | GuidedES | Colab
ASEBO | Choromanski et al. (2019) | ASEBO | Colab
Discovered ES | Lange et al. (2023a) | DES | Colab
Learned ES | Lange et al. (2023a) | LES | Colab
EvoTF | Lange et al. (2024) | EvoTF_ES | Colab
iAMaLGaM-Full | Bosman et al. (2013) | iAMaLGaM_Full | Colab
iAMaLGaM-Univariate | Bosman et al. (2013) | iAMaLGaM_Univariate | Colab
Gradientless Descent | Golovin et al. (2019) | GLD | Colab
Simulated Annealing | Rasdi Rere et al. (2015) | SimAnneal | Colab
Hill Climbing | Rasdi Rere et al. (2015) | HillClimbing | Colab
Random Search | Bergstra & Bengio (2012) | RandomSearch | Colab
SV-CMA-ES | Braun et al. (2024) | SV_CMA_ES | Colab
SV-OpenAI-ES | Liu et al. (2017) | SV_OpenES | Colab
Simple Genetic Algorithm | Such et al. (2017) | SimpleGA | Colab
MR15-GA | Rechenberg (1978) | MR15_GA | Colab
SAMR-GA | Clune et al. (2008) | SAMR_GA | Colab
GESMR-GA | Kumar et al. (2022) | GESMR_GA | Colab
LGA | Lange et al. (2023b) | LGA | Colab
Diffusion Evolution | Zhang et al. (2024) | DiffusionEvolution | Colab
Differential Evolution | Storn & Price (1997) | DE | Colab
Particle Swarm Optimization | Kennedy & Eberhart (1995) | PSO | Colab
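Several of the strategies in the table (OpenAI-ES, ARS, PGPE) share the same core idea: estimate a search gradient from antithetic perturbation pairs. As a hedged illustration of that estimator (Salimans et al., 2017) in plain NumPy, not evosax code, with a hypothetical helper name:

```python
import numpy as np


def openai_es_gradient(objective, mean, sigma, num_pairs, rng):
    """Antithetic ES gradient estimate of E[f(mean + sigma * eps)] w.r.t. mean."""
    eps = rng.standard_normal((num_pairs, mean.shape[0]))
    f_plus = np.array([objective(mean + sigma * e) for e in eps])
    f_minus = np.array([objective(mean - sigma * e) for e in eps])
    # Each antithetic pair contributes (f+ - f-) / 2 * eps / sigma
    return ((f_plus - f_minus)[:, None] * eps).mean(axis=0) / (2 * sigma)


rng = np.random.default_rng(0)
sphere = lambda x: float(np.sum(x**2))
mean = np.ones(3)
grad = openai_es_gradient(sphere, mean, sigma=0.1, num_pairs=256, rng=rng)
# For the sphere, the estimate should be close to the true gradient 2 * mean
```

The strategies differ mainly in how perturbations are sampled, how fitness is shaped, and how the estimate is fed to an optimizer, which is why a unified ask-eval-tell API covers all of them.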

Installation ⏳

You will need Python 3.10 or later, and a working JAX installation.

Then, install evosax from PyPI:

pip install evosax

To install the latest development version directly from GitHub, use:

pip install git+https://github.com/RobertTLange/evosax.git@main

Examples 📖

Key Features 💎

  • Comprehensive Algorithm Collection: 30+ classic and modern evolution strategies with a unified API
  • JAX Acceleration: Fully compatible with JAX transformations for speed and scalability
  • Vectorization & Parallelization: Fast execution on CPUs, GPUs, and TPUs
  • Production Ready: Well-tested, documented, and used in research environments
  • Batteries Included: Comes with optimizers like ClipUp, fitness shaping, and restart strategies
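The vectorization claim can be made concrete: because every update is a pure array operation on explicit state, many independent searches batch along a leading axis. The NumPy sketch below (illustrative only, not the evosax implementation) advances 8 independent Gaussian-ES runs in one batched update, which is exactly the pattern `jax.vmap` automates on accelerators:

```python
import numpy as np

rng = np.random.default_rng(0)
num_runs, popsize, dim, sigma = 8, 32, 4, 0.3
means = rng.standard_normal((num_runs, dim))  # one mean per independent run

for _ in range(50):
    # Sample populations for all runs at once: shape (num_runs, popsize, dim)
    noise = rng.standard_normal((num_runs, popsize, dim))
    population = means[:, None, :] + sigma * noise
    fitness = np.sum(population**2, axis=-1)  # sphere objective per candidate
    # Select the elite quarter of each run and average it into the new mean
    elite_idx = np.argsort(fitness, axis=-1)[:, : popsize // 4]
    elite = np.take_along_axis(population, elite_idx[..., None], axis=1)
    means = elite.mean(axis=1)

print(np.sum(means**2, axis=-1))  # all 8 runs approach the optimum at 0
```

With evosax, the same effect is obtained by `vmap`-ing `init`/`ask`/`tell` over a batch of keys, so XLA fuses the batched updates into a single accelerator program.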

Related Resources 📚

Citing evosax ✏️

If you use evosax in your research, please cite the following paper:

@article{evosax2022github,
    author = {Robert Tjarko Lange},
    title = {evosax: JAX-based Evolution Strategies},
    journal={arXiv preprint arXiv:2212.04180},
    year = {2022},
}

Acknowledgements 🙏

We acknowledge financial support by the Google TRC and the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy - EXC 2002/1 "Science of Intelligence" - project number 390523135.

Contributing 👷

Contributions are welcome! If you find a bug or are missing your favorite feature, please open an issue or submit a pull request following our contribution guidelines 🤗.

Disclaimer ⚠️

This repository contains independent reimplementations of LES and DES and is unrelated to Google DeepMind. The implementations have been tested to reproduce the officially reported results on a range of tasks.