Examples

OpenModal is API-compatible with Modal: replace import modal with import openmodal and your existing code runs unchanged.
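The one-line swap can be sketched mechanically. The script contents below are illustrative placeholders, not taken from a real example:

```python
# Illustrative sketch of the one-line migration: an existing Modal
# script needs only its import rewritten (file contents are made up).
before = "import modal\n\napp = modal.App('demo')\n"
after = before.replace("import modal", "import openmodal as modal", 1)
print(after.splitlines()[0])  # → import openmodal as modal
```

Aliasing the import (import openmodal as modal) keeps every later modal.* reference intact, so no other lines need to change.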

All examples run on both local Docker (via --local) and GCP.

For hundreds of additional examples, see the Modal examples gallery.

Getting started

GPU serving

  • vLLM serving — deploy an LLM on a GPU with auto scale-to-zero

Sandboxes

  • Sandboxes — isolated containers for SWE agents, parallel execution

Training

Benchmarks

  • CooperBench — run multi-agent coding benchmarks with OpenModal (one-line import swap)
  • SWE-bench with Harbor — run SWE-bench evaluations with OpenModal as compute backend