Getting Started with OpenAI
This guide shows how to set up a minimal deployment to use the TensorZero Gateway with the OpenAI API.
Setup
For this minimal setup, you’ll need just two files in your project directory:
- config/
  - tensorzero.toml
- docker-compose.yml
For production deployments, see our Deployment Guide.
Configuration
Create a minimal configuration file that defines a model and a simple chat function:
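As a sketch, a minimal tensorzero.toml might look like the following. The model, provider, function, and variant names here are illustrative — use whatever names fit your project, and check the Configuration Reference for the exact schema.

```toml
# A model backed by OpenAI. "gpt-4o-mini" is an example model name;
# substitute any model available on OpenAI.
[models.gpt_4o_mini]
routing = ["openai"]

[models.gpt_4o_mini.providers.openai]
type = "openai"
model_name = "gpt-4o-mini"

# A simple chat function with a single variant that uses the model above.
# "my_function_name" and "my_variant_name" are placeholders.
[functions.my_function_name]
type = "chat"

[functions.my_function_name.variants.my_variant_name]
type = "chat_completion"
model = "gpt_4o_mini"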
See the list of models available on OpenAI.
See the Configuration Reference for optional fields (e.g. overriding api_base).
Credentials
You must set the OPENAI_API_KEY environment variable before running the gateway.
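For example, in the shell session where you will run Docker Compose (the key below is a placeholder, not a real credential):

```shell
# Replace the placeholder with your actual OpenAI API key.
export OPENAI_API_KEY="sk-placeholder"
```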
Deployment (Docker Compose)
Create a minimal Docker Compose configuration:
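A minimal docker-compose.yml could look like the sketch below. The volume path, command flag, and port are assumptions based on a typical gateway setup — confirm the details against the Deployment Guide (a production deployment may also include services such as ClickHouse for observability).

```yaml
services:
  gateway:
    image: tensorzero/gateway
    volumes:
      # Mount the config/ directory created above into the container (read-only).
      - ./config:/app/config:ro
    command: --config-file /app/config/tensorzero.toml
    environment:
      # Forward the OpenAI credential from the host environment.
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    ports:
      - "3000:3000"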
You can start the gateway with docker compose up.
Inference
Make an inference request to the gateway:
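As a sketch, the request body for the gateway's inference endpoint can be built as below. The function name must match one defined in your tensorzero.toml ("my_function_name" here is a placeholder), and the endpoint URL assumes the gateway is listening on localhost port 3000.

```python
import json

# Hypothetical request body for the TensorZero Gateway inference endpoint.
# "my_function_name" must match a function defined in tensorzero.toml.
payload = {
    "function_name": "my_function_name",
    "input": {
        "messages": [
            {"role": "user", "content": "What is the capital of Japan?"}
        ]
    },
}

# Serialize the payload; with a running gateway you would POST it, e.g.:
#   curl -X POST http://localhost:3000/inference \
#        -H "Content-Type: application/json" \
#        -d "$BODY"
body = json.dumps(payload)
print(body)
```

The gateway resolves the function to one of its variants, forwards the request to the configured model provider (OpenAI here), and returns the response.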