Get started
Deploy your first AI agent worldwide in just 3 minutes.
Blaxel is a serverless platform that lets developers push and run any AI agent in a production-critical environment in a single click. This tutorial walks you through deploying your first AI agent on Blaxel.
Quickstart
Welcome! 👋 Before you begin, make sure you have created an account on Blaxel (https://app.blaxel.ai) and created a first workspace.
Install Blaxel CLI
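The exact installation command depends on your platform; the snippet below is a sketch assuming the CLI is distributed via a Homebrew tap (the tap and formula names are assumptions — verify them against the official installation instructions for your OS):

```shell
# Assumed Homebrew tap/formula; check Blaxel's install docs for your platform.
brew install blaxel/blaxel/blaxel

# Confirm the CLI is on your PATH.
bl --help
```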
Login
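Once the CLI is installed, authenticate it against your workspace. The workspace slug below is a placeholder; replace it with the one you created earlier:

```shell
# Log the CLI in to your Blaxel workspace
# (my-workspace is a placeholder for your workspace slug).
bl login my-workspace
```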
Create your first AI agent
Let’s initialize a first app. The following command creates a pre-scaffolded local repository ready for developing and deploying your agent on Blaxel.
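A sketch of the scaffolding step, assuming the `bl create-agent-app` subcommand and using `my-agent` as a placeholder project name (confirm the subcommand with `bl --help`):

```shell
# Scaffold a new agent app in ./my-agent
# (subcommand name is an assumption; see `bl --help`).
bl create-agent-app my-agent
cd my-agent
```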
You can now use the Blaxel SDK to develop your agent’s core logic in `/my-agent/src/agent.py` (or `/my-agent/src/agent.ts`) and define your functions in `/my-agent/src/functions`. Functions in this folder are automatically bound to the main agent.
Test and deploy your AI agent
Run the following command to serve your agent locally:
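A minimal sketch, assuming the standard `bl serve` subcommand (run `bl --help` to confirm the exact name and any hot-reload flags):

```shell
# Serve the agent locally from the project root.
bl serve
```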
Query your agent locally by making a POST request to `http://localhost:1338` with the following payload format: `{"inputs": "Hello world!"}`.
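For example, with curl (this assumes the local server from the previous step is running on port 1338):

```shell
# POST a test input to the locally served agent.
curl -X POST http://localhost:1338 \
  -H 'Content-Type: application/json' \
  -d '{"inputs": "Hello world!"}'
```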
To push to Blaxel, run the following command. Blaxel will handle the build and deployment:
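A sketch of the deployment step, assuming the standard `bl deploy` subcommand (confirm with `bl --help`):

```shell
# Build and deploy the agent to Blaxel from the project root.
bl deploy
```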
Your agent is now available behind a global endpoint 🌎. Read the integration guide below to learn how to query it over HTTP.
Make a first inference
Run a first inference on your Blaxel agent with the following command:
This gives you a chat-like interface where you can interact with your agent!
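The invocation below assumes the CLI exposes a chat-style subcommand for interactive inference (the subcommand name is an assumption; check `bl --help`), with `my-agent` as the agent name from earlier:

```shell
# Open an interactive chat session with the deployed agent
# (subcommand name is an assumption; see `bl --help`).
bl chat my-agent
```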
Alternatively, you can send requests to your agent by running `bl run agent my-agent --data '{"inputs":"Hello world!"}'`.
Next steps
You are ready to run AI with Blaxel! Here’s a curated list of guides to help you make the most of the Blaxel platform, but feel free to explore the product on your own!
Deploy agents
Complete guide for deploying AI agents on Blaxel.
Integrate and query agents
Complete guide for querying your AI agents on the Global Inference Network.
Manage policies
Complete guide for managing deployment and routing policies on the Global Inference Network.
Any question?
Although we designed this documentation to be as comprehensive as possible, you are welcome to contact support@blaxel.ai or the community on Discord with any questions or feedback you have.
Want to deploy Blaxel on-prem? Schedule a call with us.