TensorZero UI Deployment Guide

The TensorZero UI is a self-hosted web application that streamlines the use of TensorZero with features like observability and optimization. It’s easy to get started with the TensorZero UI.

Setup

To use the TensorZero UI, you only need your ClickHouse database URL (TENSORZERO_CLICKHOUSE_URL) and TensorZero Gateway URL (TENSORZERO_GATEWAY_URL). Optionally, you can also provide credentials for fine-tuning APIs.
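For example, a minimal .env file might look like the following sketch (the hostnames, ports, and credentials are placeholders for your own deployment):

```shell
# .env — environment variables for the TensorZero UI
# Required: ClickHouse database URL (placeholder credentials and host)
TENSORZERO_CLICKHOUSE_URL=http://chuser:chpassword@localhost:8123/tensorzero
# Required: TensorZero Gateway URL (placeholder host and port)
TENSORZERO_GATEWAY_URL=http://localhost:3000
# Optional: fine-tuning provider credentials, only if you use those features
# OPENAI_API_KEY=...
```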

Credentials for Fine-tuning

The TensorZero UI integrates with model providers like OpenAI to streamline workflows like fine-tuning. To use these features, you need to provide credentials for the relevant model providers as environment variables. You don’t need to provide credentials if you’re not using the fine-tuning features for those providers.

The supported fine-tuning providers and their required credentials (environment variables) are:

Provider | Required Credentials
---------|---------------------
Fireworks AI | FIREWORKS_ACCOUNT_ID, FIREWORKS_API_KEY
OpenAI | OPENAI_API_KEY
Together AI | TOGETHER_API_KEY
GCP Vertex | GCP account credentials

We’re planning to add support for more fine-tuning providers in the near future.

Optional Environment Variables

The TensorZero UI supports the following optional environment variables.

You can set TENSORZERO_UI_CONFIG_PATH to a custom path to the TensorZero configuration file. When using the official Docker image, this value defaults to /app/config/tensorzero.toml.
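For instance, if you mount your configuration somewhere other than the default location, you could point the UI at it when launching the container (the paths below are illustrative):

```shell
# Illustrative: override the default config path inside the container
docker run \
  --volume ./my-config:/app/my-config:ro \
  --env TENSORZERO_UI_CONFIG_PATH=/app/my-config/tensorzero.toml \
  --env-file ./.env \
  --publish 4000:4000 \
  tensorzero/ui
```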

Deployment

The TensorZero UI is available on Docker Hub as tensorzero/ui.

Running with Docker Compose (Recommended)

You can easily run the TensorZero UI using Docker Compose:

services:
  ui:
    image: tensorzero/ui
    # Mount your configuration folder (e.g. tensorzero.toml) to /app/config
    volumes:
      - ./config:/app/config:ro
    # Add your environment variables to the .env file
    env_file:
      - ${ENV_FILE:-.env}
    # Publish the UI on port 4000
    ports:
      - "4000:4000"
    restart: unless-stopped

Make sure to create a .env file with the relevant environment variables.

For more details, see the example docker-compose.yml file in the GitHub repository.

Running with Docker

Alternatively, you can launch the UI directly with the following command:

docker run \
  --volume ./config:/app/config:ro \
  --env-file ./.env \
  --publish 4000:4000 \
  tensorzero/ui

Make sure to create a .env file with the relevant environment variables.

Running with Kubernetes (k8s) and Helm

We provide a reference Helm chart contributed by the community in our GitHub repository. You can use it to run TensorZero in Kubernetes.
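As a rough sketch, a Helm-based deployment typically looks like the following (the chart path, release name, and values file below are hypothetical placeholders; see the chart's documentation in the repository for the actual location and configuration options):

```shell
# Illustrative only: fetch the repository containing the community chart
git clone https://github.com/tensorzero/tensorzero.git
# Hypothetical chart path and values file — adjust to the actual chart location
helm install tensorzero-ui ./path/to/chart --values my-values.yaml
```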

Building from Source

Alternatively, you can build the UI from source. See our GitHub repository for more details.