
GenSBI: Generative Methods for Simulation-Based Inference in JAX

Overview

GenSBI is a JAX-based library for Simulation-Based Inference (SBI) built on state-of-the-art generative methods, currently centered on Optimal Transport Flow Matching and Diffusion Models.

It is designed for researchers and practitioners who need a flexible, high-performance toolkit to solve complex inference problems where the likelihood function is intractable.

Key Features

  • Modern SBI Algorithms: Implements cutting-edge techniques like Optimal Transport Conditional Flow Matching and Diffusion Models for robust and flexible posterior inference.
  • Built on JAX and Flax NNX: Leverages the power of JAX for automatic differentiation, vectorization, and seamless execution on CPUs, GPUs, and TPUs.
  • High-Level Recipes API: A simplified interface for common workflows, allowing you to train models and run inference with just a few lines of code.
  • Powerful Transformer Models: Includes implementations of recent, high-performing models like Flux1, Flux1Join, and Simformer for handling complex, high-dimensional data.
  • Modular and Extensible: A clean, well-structured codebase that is easy to understand, modify, and extend for your own research.

Installation

Using uv (recommended):

uv add gensbi
# or, for a standalone install:
uv pip install gensbi

For GPU support:

uv add gensbi[cuda12]
# or
uv pip install gensbi[cuda12]

Or using pip:

pip install gensbi

For GPU support and other options, including how to install uv, see the Installation Guide.

Quick Start

To get started immediately, you can use the high-level Recipes API to train a model and sample from the posterior.

Tip

Check out the my_first_model.ipynb notebook for a complete, step-by-step introductory tutorial.

import jax
from flax import nnx
from gensbi.recipes import Flux1FlowPipeline
from gensbi.models import Flux1Params

train_dataset = ... # define a training dataset (infinite iterator)
val_dataset = ...   # define a validation dataset (infinite iterator)
dim_obs = ...       # dimension of the parameters (theta)
dim_cond = ...      # dimension of the simulator observations (x)
x_observed = ...    # the observation to condition the posterior on
params = Flux1Params(...) # the parameters for your model

# Instantiate the pipeline
pipeline = Flux1FlowPipeline(
    train_dataset,
    val_dataset,
    dim_obs,
    dim_cond,
    params=params,
)

# Train the model
# Note: GenSBI uses Flax NNX, so we pass a random key generator
pipeline.train(rngs=nnx.Rngs(0))

# After training, get a sampler for posterior sampling
key = jax.random.PRNGKey(42)
samples = pipeline.sample(key, x_observed, num_samples=10_000)
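The `train_dataset` and `val_dataset` placeholders above must be infinite iterators over batches. A minimal sketch of how such an iterator could be built with a plain Python generator (the `(theta, x)` batch layout and the toy Gaussian "simulator" here are assumptions for illustration; check the GenSBI documentation for the exact batch structure each pipeline expects):

```python
import numpy as np

def make_loader(rng, batch_size=256):
    """Infinite iterator yielding (theta, x) batches from a toy simulator."""
    while True:
        # Draw parameters from a uniform prior.
        theta = rng.uniform(-1.0, 1.0, size=(batch_size, 2))
        # "Simulate" observations: the parameters plus Gaussian noise.
        x = theta + 0.1 * rng.standard_normal((batch_size, 2))
        yield theta, x

rng = np.random.default_rng(0)
train_dataset = make_loader(rng)
theta, x = next(train_dataset)  # each batch has shape (256, 2)
```

Because the iterator never terminates, the training loop decides how many steps to run rather than consuming a fixed-size epoch.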

Examples

Examples for this library are available separately in the GenSBI-examples repository.

Some key examples include:

Getting Started:

  • my_first_model.ipynb Open In Colab
    A beginner-friendly notebook introducing the core concepts of GenSBI on a simple problem.

Unconditional Density Estimation:

  • flow_matching_2d_unconditional.ipynb Open In Colab
    Demonstrates how to use flow matching in 2D for unconditional density estimation.
  • diffusion_2d_unconditional.ipynb Open In Colab
    Demonstrates how to use diffusion models in 2D for unconditional density estimation.
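To give a flavor of what these notebooks train: flow matching regresses a time-dependent vector field onto a known target velocity along an interpolation path between noise and data. A minimal numpy sketch of the standard conditional flow matching target with a linear path (this illustrates the methodology only, not GenSBI's API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample base (noise) points x0, data points x1, and random times t,
# then form the linear interpolation x_t = (1 - t) * x0 + t * x1.
x0 = rng.standard_normal((4, 2))        # base distribution samples
x1 = rng.standard_normal((4, 2)) + 3.0  # "data" samples
t = rng.uniform(size=(4, 1))

x_t = (1.0 - t) * x0 + t * x1
v_target = x1 - x0  # regression target for the vector field v(x_t, t)

# A trained v(x_t, t) is then integrated from t=0 to t=1 with an ODE
# solver to transport base samples into data samples.
```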

Conditional Density Estimation:

  • two_moons_flow_flux.ipynb Open In Colab
    Uses the Flux1 model for posterior density estimation on the two-moons benchmark.
  • two_moons_diffusion_flux.ipynb Open In Colab
    Uses the Diffusion model for posterior density estimation on the two-moons benchmark.
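The diffusion-based notebooks rest on the same idea with a stochastic forward process: data is progressively noised, and a network learns to undo the noising. A minimal numpy sketch of one step of a variance-preserving forward process (again purely illustrative of the methodology; the noise-schedule value is a made-up constant, not a GenSBI default):

```python
import numpy as np

rng = np.random.default_rng(0)

# Variance-preserving corruption: x_t = sqrt(abar_t)*x0 + sqrt(1-abar_t)*eps.
x0 = rng.standard_normal((4, 2))   # clean "data" samples
eps = rng.standard_normal((4, 2))  # injected Gaussian noise
abar_t = 0.5                       # cumulative noise-schedule value at some t

x_t = np.sqrt(abar_t) * x0 + np.sqrt(1.0 - abar_t) * eps

# The network is trained to predict eps (or the score) from (x_t, t);
# sampling then runs the learned reverse process starting from pure noise.
```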

Note

A complete list of the currently available examples can be found at the examples documentation page.

Citing GenSBI

If you use this library, please consider citing it along with the original methodology papers; see the references.

@misc{GenSBI,
  author       = {Amerio, Aurelio},
  title        = "{GenSBI: Generative Models for Simulation-Based Inference in JAX}",
  year         = {2026}, 
  publisher    = {GitHub},
  journal      = {GitHub repository},
  howpublished = {\url{https://github.com/aurelio-amerio/GenSBI}}
}

Reference implementations

Note

AI Usage Disclosure
This project utilized large language models, specifically Google Gemini and GitHub Copilot, to assist with code suggestions, documentation drafting, and grammar corrections. All AI-generated content has been manually reviewed and verified by human authors to ensure accuracy and adherence to scientific standards.
