The OpenAI Agents SDK is an open-source, production-grade library for building agentic applications on the Codex harness. One of its most interesting features is the ability to create agents backed by remote execution environments (sandboxes) in the cloud: the agent effectively gets its own computer on which to execute commands, work with files and directories, write code, and perform computations. This tutorial shows how to deploy your OpenAI Agents SDK projects to Blaxel with minimal code changes (and zero configuration), letting you colocate them close to the sandboxes your agents work in.

Prerequisites

The OpenAI Agents SDK is provider-agnostic and can be used with both OpenAI and non-OpenAI models. This tutorial uses OpenAI models, but you can also read about integrating other models.

1. Install the Blaxel CLI and log in to Blaxel

The main way to deploy an agent on Blaxel is with the Blaxel CLI, and that is the method used in this tutorial. Alternatively, you can connect a GitHub repository - any push to the main branch of the repository will automatically update the deployment on Blaxel - or deploy from a variety of pre-built templates using the Blaxel Console. Install the Blaxel CLI and log in to Blaxel using this command:
bl login

2. Install required dependencies

Create a directory for the project:
mkdir sandbox-agent && cd sandbox-agent
In your project directory, install the OpenAI Agents SDK with Blaxel support for the agent loop:
python3 -m venv .venv && source .venv/bin/activate # if new project
pip install "openai-agents[blaxel]"
Agents deployed on Blaxel Agents Hosting must expose an HTTP endpoint for requests. An easy way to do this is with FastAPI; install it together with the uvicorn server:
pip install fastapi uvicorn

3. Configure the environment

Add your OpenAI API key, Blaxel API key and Blaxel workspace name to a .env file in the project directory:
echo "OPENAI_API_KEY=your_key_here" > .env
echo "BL_API_KEY=your_key_here" >> .env
echo "BL_WORKSPACE=your_workspace_name_here" >> .env
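Note that a .env file is not read automatically by Python itself; the python-dotenv package is the usual way to load one. If you want to avoid the extra dependency, a minimal stdlib-only loader might look like the sketch below (this assumes plain KEY=value lines with no quoting or export statements, and it will not override variables already set in the environment):

```python
import os
import tempfile

def load_env(path: str = ".env") -> None:
    """Load simple KEY=value lines into the process environment."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                # setdefault: values already in the environment win
                os.environ.setdefault(key.strip(), value.strip())

# Demo against a throwaway file so the sketch is runnable anywhere:
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("DEMO_VAR=hello\n")
    demo_path = f.name
load_env(demo_path)
print(os.environ["DEMO_VAR"])  # hello
```

When running under `bl serve` or on Blaxel's runtime, environment variables are injected for you, so explicit loading mainly matters when you start the server directly with Python.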

4. Build a simple agent

In your project directory, create a file named main.py with the following code.
import os
import uvicorn
from fastapi import FastAPI, Request

from agents import ModelSettings, Runner, set_tracing_disabled
from agents.run import RunConfig
from agents.sandbox import Manifest, SandboxAgent, SandboxRunConfig
from agents.extensions.sandbox import BlaxelSandboxClient, BlaxelSandboxClientOptions


app = FastAPI()
host = os.getenv("HOST", "0.0.0.0")
port = int(os.getenv("PORT", "8000"))

@app.post("/query")
async def query(request: Request):
    body = await request.json()
    task = body["task"]

    # Each request provisions its own sandbox on Blaxel, using the
    # options below; the sandbox is created and managed dynamically.
    client = BlaxelSandboxClient()
    run_config = RunConfig(
        sandbox=SandboxRunConfig(
            manifest=Manifest(root="/blaxel"),
            client=client,
            options=BlaxelSandboxClientOptions(
                name="agent-sandbox",
                image="blaxel/base-image",
                region="us-pdx-1",
                memory=4096,
            ),
        )
    )

    # The sandbox-backed agent: its tools operate inside the sandbox
    # rooted at /blaxel.
    agent = SandboxAgent(
        name="Sandbox agent",
        model="gpt-5.4",
        instructions="You have access to a minimal sandbox environment. You can execute commands, manage files, and inspect processes inside the sandbox.",
        model_settings=ModelSettings(tool_choice="auto"),
        default_manifest=Manifest(root="/blaxel"),
    )

    result = await Runner.run(agent, task, run_config=run_config)
    return {"result": result.final_output}


if __name__ == "__main__":
    set_tracing_disabled(True)
    print(f"Server listening on {host}:{port}")
    uvicorn.run(app, host=host, port=port)
This creates a simple sandbox-backed agent with the OpenAI Agents SDK. Queries that you send to the agent are executed in a sandbox named agent-sandbox that is dynamically created and managed in your workspace on Blaxel’s infrastructure.
The agent’s HTTP service must be bound to the host and port provided by Blaxel. Blaxel automatically injects these values as HOST and PORT variables into the runtime environment. It is important to read these variables in your code and ensure that the agent’s HTTP service binds to the correct host and port.

5. Enable telemetry (optional)

Instrumentation happens automatically when workloads run on Blaxel. To enable telemetry:
  • Add the required package to your project:
    pip install "blaxel[telemetry]"
    
  • Import the package in your code:
    import blaxel.telemetry
    

6. Test the agent locally

Test the agent by making the endpoint available locally:
bl serve --hotreload &
This starts the agent locally while handling the core agent logic, function calls, and model API calls exactly as they would run when deployed on Blaxel. The --hotreload flag monitors and reloads the agent if the source code changes. Note the host port on which the agent is running. In another terminal, send the agent a request (update the endpoint URL below with the correct port number for your agent):
curl -X POST http://0.0.0.0:8000/query \
  -H "Content-Type: application/json" \
  -d '{"task": "Install a complete Go development environment. Return the Go version and Go environment variables."}'
The agent will create and deploy a Blaxel sandbox using the blaxel/base-image image, then investigate the sandbox and install all the tools required for Go development. Once done, it will return a report of its work and automatically delete the sandbox.
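The same request can also be sent from Python using only the standard library. The sketch below assumes the agent is listening on port 8000, as in the curl command above; adjust the port if `bl serve` reported a different one:

```python
import json
import urllib.request

# Local endpoint from the tutorial; change the port if bl serve chose another.
url = "http://localhost:8000/query"
payload = {
    "task": "Install a complete Go development environment. "
            "Return the Go version and Go environment variables."
}

# Build the POST request with a JSON body, mirroring the curl command.
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment to send the request once the agent is running locally:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["result"])
```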

7. Deploy the agent on Blaxel

You’re now ready to deploy the agent on Blaxel. When deploying to Blaxel, your workloads run on infrastructure optimized to minimize cold starts and latency while enforcing your deployment policies. First, create a requirements.txt file in the project directory with the following dependencies:
fastapi>=0.135.3
uvicorn>=0.35.0
openai-agents[blaxel]
Then, deploy the agent by running the following command:
bl deploy
The Blaxel CLI will prompt for the type of resource you are deploying (“agent”). It will then auto-detect other details of your project and begin the deployment. The deployment process usually takes a few minutes, and you can watch progress from the Blaxel Console. Once the deployment process is complete, log in to the Blaxel Console to find the global endpoint for your agent service. Typically, this will be of the form https://run.blaxel.ai/WORKSPACE/agents/AGENT.

8. Test the agent on Blaxel

By default, agents deployed on Blaxel are not public. All agent requests must be authenticated via a bearer token, and can be made either via the Blaxel API or the Blaxel CLI. Test the deployed agent by sending an authenticated request to its global endpoint (replace the endpoint below with your agent’s URL, and modify the task as desired):
curl -X POST https://run.blaxel.ai/$(bl workspace --current)/agents/sandbox-agent/query  \
  -H "X-Blaxel-Workspace: $(bl workspace --current)" \
  -H "X-Blaxel-Authorization: Bearer $(bl token)" \
  -H "Content-Type: application/json" \
  -d '{"task": "Write a Python program to reverse words in a string. Test it. Return the test results and the final working code."}'
As before, the agent will create a sandbox, generate the code and provide the result. You can also send a request through the Blaxel CLI:
bl run agent sandbox-agent/query --data '{"task": "Write a Python program to reverse words in a string. Test it. Return the test results and the final working code."}'
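If you prefer to script the call, the same authenticated request can be sketched in Python with the standard library. The workspace name and token below are placeholders read from the environment; in the shell example above they come from `bl workspace --current` and `bl token`:

```python
import json
import os
import urllib.request

# Placeholders: substitute your workspace name and a token from `bl token`.
workspace = os.getenv("BL_WORKSPACE", "my-workspace")
token = os.getenv("BL_TOKEN", "placeholder-token")

url = f"https://run.blaxel.ai/{workspace}/agents/sandbox-agent/query"
payload = {
    "task": "Write a Python program to reverse words in a string. "
            "Test it. Return the test results and the final working code."
}

# Mirror the headers used in the curl command above.
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "X-Blaxel-Workspace": workspace,
        "X-Blaxel-Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# Uncomment to send the request once the agent is deployed:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["result"])
```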
Although deployed agents are private by default, it is possible to make an agent publicly available.
That’s it! You’re ready to start building and deploying Blaxel sandbox-backed agents with OpenAI Agents SDK on Blaxel.

Resources

Want more info on developing and deploying agents on Blaxel? Check out the following resources:

Use OpenAI Agents SDK with Blaxel Sandboxes

Build compute-capable agents backed by Blaxel sandboxes using OpenAI Agents SDK.

Deploy your agent code to Blaxel

Complete tutorial for deploying AI agents to Blaxel.

Manage environment variables

Complete tutorial for managing variables and secrets when deploying to Blaxel.
Last modified on April 15, 2026