This repository contains supporting code and examples for the paper:
**Generative approaches to optimization**
Victor Alves and John R. Kitchin
Preprint available at: https://chemrxiv.org/doi/full/10.26434/chemrxiv-2025-hk886
We demonstrate how generative machine learning models can solve optimization and inverse design problems by learning joint distributions over inputs and outputs, enabling bidirectional inference. The repository includes implementations and examples using:
- Gaussian Mixture Models (GMM) for conditional generation via Gaussian Mixture Regression
- Conditional Flow Matching for inverse design using neural ODE-based generative models
Both approaches learn to generate inputs conditioned on desired outputs, providing:
- Multi-modal solution discovery (finding all solutions, not just one)
- Uncertainty quantification through sample distributions
- No gradient requirements for the objective function
This project uses uv for Python environment management.
```sh
# Install uv (if needed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Or with Homebrew (macOS)
brew install uv

# Create environment and install dependencies
uv sync
```

This creates a `.venv/` directory with all required dependencies, including:
- NumPy, SciPy, Matplotlib
- scikit-learn, gmr (Gaussian Mixture Regression)
- PyTorch (Flow Matching neural networks)
- JAX
- Jupyter
```sh
# Run a Python script
uv run python your_script.py

# Start Jupyter
uv run jupyter notebook

# Or activate the environment
source .venv/bin/activate
```

The repository is pre-configured for VS Code. After running `uv sync`:
- Reload VS Code (`Cmd+Shift+P` → "Developer: Reload Window")
- Select the interpreter: `Cmd+Shift+P` → "Python: Select Interpreter" → `.venv/bin/python`
- For Jupyter notebooks, select the `.venv` kernel from the kernel picker
```
├── generative_optimization.py   # Core module with GMM and Flow Matching utilities
├── readme.ipynb                 # Interactive setup and navigation guide
├── SKILL.md                     # Claude Code skill for expert guidance
├── pyproject.toml               # uv/Python project configuration
│
├── GMM Examples (00*.ipynb):
│   ├── 00a_root_finding              # PR-EOS cubic root finding
│   ├── 00b_optimization              # Unconstrained optimization
│   ├── 00c_equality_constraints      # Gasoline blending (Lagrangian)
│   ├── 00d_inequality_constraints    # Barrier method + reaction equilibrium
│   ├── 00e_parameter_estimation      # Series reaction rate constants
│   └── 00h_space_mapping             # CSTR input/output mapping
│
├── Flow Matching Examples (01*.ipynb):
│   ├── 01a_fm_root_finding           # Polynomial root finding
│   ├── 01b_fm_optimization           # Unconstrained optimization
│   ├── 01c_fm_equality_constraints
│   ├── 01d_fm_inequality_constraints
│   ├── 01e_fm_parameter_estimation
│   ├── 01f_fm_space_mapping          # CSTR design
│   └── 01g_fm_eos_fitting            # Van der Waals EOS fitting
│
├── Visualizations:
│   ├── 02_flow_field_visualization
│   └── 03_space_mapping_flow_matching
│
└── Results (results-*.ipynb)    # Publication figures comparing GMM and FM
```
```python
import numpy as np
from generative_optimization import generate_samples, ConditionalFlowMatching

# Define a forward model: y = x^2
def forward_model(x):
    return x ** 2

# Generate training data
x_data = generate_samples(bounds=[[-2, 2]], n_samples=512)
y_data = forward_model(x_data)

# Train flow matching for the inverse problem: given y, find x
fm = ConditionalFlowMatching(x_dim=1, c_dim=1, hidden_dim=64, n_layers=3)
fm.fit(x_data, y_data, epochs=500)

# Inverse problem: find x where y = 1
samples = fm.sample(c_values=[[1.0]], n_samples=500)
print(f"Found x values: {samples.mean():.3f} ± {samples.std():.3f}")
# Expected: x = ±1.0 (both solutions discovered)
```

The repository includes `SKILL.md`, a Claude Code skill that provides expert guidance on applying generative optimization to your own problems.
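For intuition about what conditional flow matching does under the hood, here is a stripped-down PyTorch sketch of the training objective (an illustrative toy, not the repository's `ConditionalFlowMatching` implementation): regress a velocity field on straight-line paths from noise to data, then integrate it with Euler steps while conditioning on the desired output y.

```python
import torch
import torch.nn as nn

# Velocity field v(t, x_t, c); input is the concatenation [x_t, t, c]
net = nn.Sequential(nn.Linear(3, 64), nn.SiLU(),
                    nn.Linear(64, 64), nn.SiLU(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

# Training data for the inverse of y = x^2
x1 = torch.rand(512, 1) * 4 - 2       # target inputs x in [-2, 2]
c = x1 ** 2                           # conditioning outputs y

for step in range(200):
    x0 = torch.randn_like(x1)         # noise endpoints
    t = torch.rand(x1.shape[0], 1)    # random times in [0, 1]
    xt = (1 - t) * x0 + t * x1        # linear interpolation path
    target_v = x1 - x0                # constant velocity along that path
    v = net(torch.cat([xt, t, c], dim=1))
    loss = ((v - target_v) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Sampling: integrate dx/dt = v(t, x, c) from noise with Euler steps
with torch.no_grad():
    xs = torch.randn(500, 1)
    cond = torch.ones(500, 1)         # condition on y = 1
    for i in range(50):
        ti = torch.full((500, 1), i / 50)
        xs = xs + net(torch.cat([xs, ti, cond], dim=1)) / 50
```

With enough training, samples drawn this way concentrate near the solutions of the conditioned inverse problem; the repository's implementation adds the architecture and ODE-integration details omitted here.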
- Problem assessment framework: When to use generative optimization vs traditional methods
- Method selection criteria: GMM vs Flow Matching decision guide
- Implementation patterns: Complete code for common problem types (unconstrained, equality/inequality constraints, inverse problems, parameter estimation, multi-objective)
- Best practices: Training tips, hyperparameter selection, troubleshooting
Option 1: Local skill (this repository)
The SKILL.md file is already present. When you use Claude Code in this directory, it will automatically have access to the skill.
Option 2: Global skill (use across all projects)

```sh
mkdir -p ~/.claude/skills
cp SKILL.md ~/.claude/skills/generative-optimization.md
```

Then describe your optimization problem to Claude Code, and it will help you formulate, implement, and analyze solutions using these techniques.
If you use this code in your research, please cite our preprint:
```bibtex
@article{alves2025generative,
  title={Generative approaches to optimization},
  author={Alves, Victor and Kitchin, John R.},
  journal={ChemRxiv},
  year={2025},
  doi={10.26434/chemrxiv-2025-hk886}
}
```

See the repository for license information.