Getting Started with Together AI
This guide shows how to set up a minimal deployment to use the TensorZero Gateway with the Together AI API.
Setup
For this minimal setup, you’ll need just two files in your project directory:
- config/
  - tensorzero.toml
- docker-compose.yml
For production deployments, see our Deployment Guide.
Configuration
Create a minimal configuration file that defines a model and a simple chat function:
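For example, a configuration along these lines (the model, function, and variant names are illustrative placeholders, and the Together model ID is just one example from their catalog):

```toml
# config/tensorzero.toml

# Define a model backed by the Together AI provider
[models.llama3_1_8b_instruct]
routing = ["together"]

[models.llama3_1_8b_instruct.providers.together]
type = "together"
model_name = "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo"

# Define a simple chat function with a single variant that uses the model above
[functions.my_function_name]
type = "chat"

[functions.my_function_name.variants.my_variant]
type = "chat_completion"
model = "llama3_1_8b_instruct"
```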
See the list of models available on Together AI. Dedicated endpoints and custom models are also supported.
See the Configuration Reference for optional fields (e.g. overwriting `api_base`).
Credentials
You must set the `TOGETHER_API_KEY` environment variable before running the gateway.
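For example, in your shell (the value shown is a placeholder for your own Together AI key):

```bash
# Placeholder value; use the API key from your Together AI account
export TOGETHER_API_KEY="..."
```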
Deployment (Docker Compose)
Create a minimal Docker Compose configuration:
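A sketch along these lines should work, assuming the published `tensorzero/gateway` image and the gateway's default port of 3000:

```yaml
# docker-compose.yml
services:
  gateway:
    image: tensorzero/gateway
    volumes:
      # Mount the configuration directory created above (read-only)
      - ./config:/app/config:ro
    command: --config-file /app/config/tensorzero.toml
    environment:
      # Forward your Together AI credentials into the container
      - TOGETHER_API_KEY=${TOGETHER_API_KEY:?Environment variable TOGETHER_API_KEY must be set.}
    ports:
      - "3000:3000"
```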
You can start the gateway with `docker compose up`.
Inference
Make an inference request to the gateway:
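For example, a request against the gateway's inference endpoint, assuming it is running locally on port 3000 and using the illustrative function name from the configuration sketch above:

```bash
curl -X POST http://localhost:3000/inference \
  -H "Content-Type: application/json" \
  -d '{
    "function_name": "my_function_name",
    "input": {
      "messages": [
        {
          "role": "user",
          "content": "What is the capital of Japan?"
        }
      ]
    }
  }'
```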