OMBRL: Optimistic Exploration in Model-based RL

This repository provides an open-source implementation of algorithms for optimistic exploration in model-based reinforcement learning. It is based on the MaxInfoRL repository, which itself builds on jaxrl.

Among others, the repository includes implementations of the following algorithms:

Did this help you?

If you have any questions, feel free to reach out to us at kiten[at]ethz[dot]ch.

If you found this repository useful in your work, we would appreciate a citation:

@misc{sukhija2025sombrl,
  title         = {SOMBRL: Scalable and Optimistic Model-Based RL},
  author        = {Bhavya Sukhija and Lenart Treven and Carmelo Sferrazza and Florian Dörfler and Pieter Abbeel and Andreas Krause},
  year          = {2025},
  eprint        = {2511.20066},
  archivePrefix = {arXiv},
  primaryClass  = {cs.LG},
  url           = {https://arxiv.org/abs/2511.20066},
}

@misc{iten2026sampleefficient,
  title         = {Sample-efficient and Scalable Exploration in Continuous-Time RL},
  author        = {Klemens Iten and Lenart Treven and Bhavya Sukhija and Florian Dörfler and Andreas Krause},
  year          = {2026},
  eprint        = {2510.24482},
  archivePrefix = {arXiv},
  primaryClass  = {cs.LG},
  url           = {https://arxiv.org/abs/2510.24482},
}

Getting started

Local Installation

  1. Requirements:

    • Python >=3.11
    • CUDA >= 12.1
    • cudnn >= 8.9
  2. Install JAX for either CPU or GPU (install exactly one of the two):

    pip install -U "jax[cpu]"     # CPU-only
    pip install -U "jax[cuda12]"  # NVIDIA GPU (CUDA 12)
  3. Install in a conda environment:

    conda create -n ombrl python=3.11 -y
    conda activate ombrl
    git clone https://github.com/lasgroup/ombrl.git
    pip install -e .

    This also installs the jaxrl and MaxInfoRL libraries.

  4. Set up wandb (Weights & Biases) for experiment logging.

  5. Add ombrl to your Python path: export PYTHONPATH=$PYTHONPATH:/path/to/ombrl.

  6. Launch experiments with the launcher:

    python path/to/ombrl/experiments/experiment_name/launcher.py
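
The PYTHONPATH step above can be sketched in shell as follows; /path/to/ombrl is a placeholder for your actual checkout location, and python3 is used in place of python for portability:

```shell
# Append the checkout to PYTHONPATH so imports from the repository resolve
# in any Python process started from this shell. /path/to/ombrl is a
# placeholder for your actual clone location.
export PYTHONPATH="$PYTHONPATH:/path/to/ombrl"

# Sanity check: a fresh interpreter should now see the directory on sys.path.
python3 -c "import sys; print('/path/to/ombrl' in sys.path)"
```

Because the variable is exported, every interpreter launched from that shell picks it up; to make it persistent, add the export line to your shell profile.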
    

Remote Deployment on euler.ethz.ch

  1. Set up remote development from your computer to Euler in either PyCharm or VSCode.

  2. Set up Git on Euler (see GitHub's guide "Connecting to GitHub with SSH").

  3. Set up a .ombrl_setup file on your login node:

    export XLA_PYTHON_CLIENT_MEM_FRACTION=.7
    export TF_FORCE_GPU_ALLOW_GROWTH=true
    export TF_DETERMINISTIC_OPS=0
    
    module load stack/2024-06
    module load gcc/12.2.0
    module load eth_proxy
    module load python/3.11.6
    
    PYTHONPATH=$PYTHONPATH:path/on/euler/to/ombrl
    export PYTHONPATH

    Source it with source .ombrl_setup.

  4. Create a miniconda environment or a Python virtual environment.

  5. Activate the virtual environment:

    source path/on/euler/to/venv/bin/activate
  6. Install JAX for GPU (see the JAX documentation):

    pip install "jax[cuda12]"
  7. Clone and pip install the ombrl library:

    git clone https://github.com/lasgroup/ombrl.git
    pip install .
  8. Set up wandb on Euler.

  9. Add ombrl to your Python path: export PYTHONPATH=$PYTHONPATH:/path/on/euler/to/ombrl. You can also add this line to your .ombrl_setup file.

  10. Launch experiments with the launcher:

    python path/on/euler/to/ombrl/experiments/experiment_name/launcher.py
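
On Euler, compute jobs go through the Slurm batch system rather than running on the login node. A minimal job-script sketch is shown below; the resource values (time, GPU count, memory) are assumptions to adapt to your experiment, and the paths mirror the placeholders used in the steps above:

```shell
#!/bin/bash
#SBATCH --time=04:00:00    # wall-clock limit (assumed value)
#SBATCH --gpus=1           # request one GPU (assumed value)
#SBATCH --mem-per-cpu=8G   # memory per core (assumed value)

# Load modules and environment variables defined in the setup file above.
source ~/.ombrl_setup
source path/on/euler/to/venv/bin/activate

python path/on/euler/to/ombrl/experiments/experiment_name/launcher.py
```

Submit it with sbatch job.sh; squeue shows its status once queued.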
    
