Blaxel is a serverless platform that enables developers to push and run any AI agent in a production-critical environment in a single click. This tutorial demonstrates how to deploy your first AI agent on Blaxel.

Quickstart

Welcome! 👋 Make sure you have created an account on Blaxel (https://app.blaxel.ai) and created your first workspace.

Upon creating a workspace, Blaxel automatically adds a sandbox connection to a model API to get you started. Model APIs provide developers with a centralized interface that simplifies credential management and LLM provider access.
1. Install Blaxel CLI

2. Login
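
To authenticate the CLI with your workspace, log in from your terminal. As a minimal sketch, assuming the CLI's login command takes your workspace name and that your workspace is called my-workspace (replace with your own):

bl login my-workspace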

3. Install a package manager

4. Create your first AI agent

For Python, you will need uv installed; for TypeScript, you will need npm installed.
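
If you don't have uv yet, one way to install it (assuming you already have Python and pip available) is:

pip install uv

npm ships with Node.js, so installing Node.js gives you npm as well.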

Let’s initialize your first app. The following command creates a pre-scaffolded local repository ready for developing and deploying your agent on Blaxel.

bl create-agent-app my-agent

You can now use the Blaxel SDK to develop your agent’s core logic in /my-agent/src/agent.py (or /my-agent/src/agent.ts) and define your functions in /my-agent/src/functions — functions in this folder are automatically bound to the main agent.
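
For illustration only, here is a sketch of what a custom function dropped into /my-agent/src/functions could look like in Python. The file name, function name, and any decorators or registration required by the scaffolded template are assumptions; check the generated code for the exact conventions.

# /my-agent/src/functions/weather.py (hypothetical example)
# Functions placed in this folder are automatically bound to the main agent.

def get_weather(city: str) -> str:
    """Return a canned weather report so the agent can answer weather questions."""
    return f"The weather in {city} is sunny."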

5. Test and deploy your AI agent

Run the following command to serve your agent locally:

cd my-agent
bl serve
Add the hot-reload option (bl serve --hotreload) to enable live reload, so you can patch changes onto the agent server while it runs.

Query your agent locally by making a POST request to http://localhost:1338 with the following payload format: {"inputs": "Hello world!"}.
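
For example, a quick way to send that request from Python (a minimal sketch using the requests library, assuming the local server started by bl serve is listening on port 1338):

import requests

# Send a test request to the locally served agent.
response = requests.post(
    "http://localhost:1338",
    json={"inputs": "Hello world!"},
)
print(response.text)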

To push to Blaxel, run the following command. Blaxel will handle the build and deployment:

bl deploy

Your agent is now available behind a global endpoint 🌎. Read this guide on how to query it with HTTP requests.

6. Make a first inference

Run a first inference on your Blaxel agent with the following command:

bl chat my-agent

This gives you a chat-like interface where you can interact with your agent!

Alternatively, you can send requests to your agent by running bl run agent my-agent --data '{"inputs":"Hello world!"}'.

Next steps

You are ready to run AI with Blaxel! Here’s a curated list of guides to help you make the most of the Blaxel platform, but feel free to explore the product on your own!

Deploy agents

Complete guide for deploying AI agents on Blaxel.

Integrate and query agents

Complete guide for querying your AI agents on the Global Inference Network.

Manage policies

Complete guide for managing deployment and routing policies on the Global Inference Network.

Any question?

Although we designed this documentation to be as comprehensive as possible, you are welcome to contact support@blaxel.ai or the community on Discord with any questions or feedback you have.

Want to deploy Blaxel on-prem? Schedule a call with us.