Quickstart
You need npm installed to use the following command:

`bl new`
The scaffolded project's entry point is `index.ts`. While you typically won't need to modify this file, you can add specific logic there if needed. Your main work will focus on the `agent.ts` file. Blaxel's development paradigm lets you leverage its hosting capabilities without modifying your agent's core logic.
Requirements & limitations
Agents Hosting has few requirements or limitations:

- The only requirement to deploy an app on Agents Hosting is that it exposes an HTTP API server bound to `BL_SERVER_HOST` (for the host) and `BL_SERVER_PORT` (for the port). These two environment variables are required for the host and port combination.
- Deployed agents have a runtime limit after which executions time out. This timeout duration is determined by your chosen infrastructure generation. For the Mk 2 generation, the maximum timeout is 10 minutes.
- The synchronous endpoint has a timeout of 100 seconds for keeping the connection open when no data flows through the API. If your agent streams back responses, the 100-second timeout resets with each chunk streamed. For example, if your agent processes a request for 5 minutes while streaming data, the connection stays open. However, if it goes 100 seconds without sending any data (even while calling external APIs), the connection will time out.
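To illustrate the binding requirement, a minimal compliant server might look like the following sketch, using Node's built-in `http` module. The local fallback values (`0.0.0.0`, `1338`) are assumptions for local development, not documented Blaxel defaults:

```typescript
import http from "node:http";

// Blaxel injects these environment variables at deployment time.
// The fallbacks below are only for running the server locally.
const host = process.env.BL_SERVER_HOST ?? "0.0.0.0";
const port = Number(process.env.BL_SERVER_PORT ?? 1338);

const server = http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ status: "ok" }));
});

server.listen(port, host, () => {
  console.log(`Listening on ${host}:${port}`);
});
```

Any HTTP framework works the same way, as long as the listen call reads its host and port from these two variables.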
Accessing resources with Blaxel SDK
Blaxel SDK provides methods to programmatically access and integrate various resources hosted on Blaxel into your agent's code, such as model APIs, tool servers, sandboxes, batch jobs, or other agents. The SDK handles authentication, secure connection management, and telemetry automatically.

Connect to a model API
Blaxel SDK provides a helper to connect to a model API defined on Blaxel from your code. This allows you to avoid managing the connection with the model API yourself; credentials remain stored securely on Blaxel. The helper returns a model object for the `FRAMEWORK_NAME` specified in the import.
Available frameworks:

- LangChain/LangGraph: `langgraph`
- LlamaIndex: `llamaindex`
- Vercel AI: `vercel`
- Mastra: `mastra`
For example, here is how to connect to a model API named `my-model` in a LlamaIndex agent:
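A sketch of what this can look like, assuming the LlamaIndex flavor of the SDK exposes a `blModel` helper (check the SDK reference for the exact package name and signature):

```typescript
import { blModel } from "@blaxel/llamaindex";

// Retrieve the model API named "my-model" defined on Blaxel.
// Authentication and connection management are handled by the SDK.
const llm = await blModel("my-model");

// `llm` behaves like a regular LlamaIndex LLM instance, usable
// anywhere LlamaIndex expects one (agents, query engines, direct calls).
const response = await llm.complete({ prompt: "Hello!" });
console.log(response.text);
```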
Connect to tools
Blaxel SDK provides a helper to connect to pre-built or custom tool servers (MCP servers) hosted on Blaxel from your code. This allows you to avoid managing the connection with the server yourself; credentials remain stored securely on Blaxel. The following method retrieves all the tools discoverable in the tool server. The same frameworks are supported: `langgraph` (LangChain/LangGraph), `llamaindex` (LlamaIndex), `vercel` (Vercel AI), and `mastra` (Mastra).
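As a minimal sketch, assuming the LangGraph flavor of the SDK exposes a `blTools` helper (the name and signature are assumptions; see the SDK reference):

```typescript
import { blTools } from "@blaxel/langgraph";

// Retrieve every tool discoverable on the given tool server(s),
// already converted to framework-compatible tool objects.
const tools = await blTools(["my-tool-server"]);
```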
You can develop agents by mixing tools defined locally in your agent and tools defined as remote servers. Separating tools prevents monolithic designs, which makes maintenance easier in the long run. Let's look at a practical example combining remote and local tools. The code below uses two tools:

- `blaxel-search`: a remote tool server on Blaxel providing web search functionality (learn how to create your own MCP servers here)
- `weather`: a local tool that accepts a city parameter and returns a mock weather response (always "sunny")
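A sketch of that combination for the LangGraph flavor of the SDK (the `blModel`/`blTools` helper names and the model name `my-model` are assumptions; verify against the SDK reference):

```typescript
import { blModel, blTools } from "@blaxel/langgraph";
import { tool } from "@langchain/core/tools";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { z } from "zod";

// Local tool: accepts a city parameter and returns a mock weather response.
const weather = tool(
  async ({ city }: { city: string }) => `The weather in ${city} is sunny.`,
  {
    name: "weather",
    description: "Get the weather for a city",
    schema: z.object({ city: z.string() }),
  }
);

// Combine the remote tools from the "blaxel-search" server with the local one.
const agent = createReactAgent({
  llm: await blModel("my-model"),
  tools: [...(await blTools(["blaxel-search"])), weather],
});
```

Keeping `weather` local while `blaxel-search` stays a remote server means each can evolve and be redeployed independently.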
Connect to another agent (multi-agent chaining)
Rather than using a "quick and dirty" approach where you would combine all your agents and capabilities into a single deployment, Blaxel provides a structured development paradigm based on two key principles:

- Agents can grow significantly in complexity. Monolithic architectures make long-term maintenance difficult.
- Individual agents should be reusable across multiple projects.

Blaxel supports this approach: an agent can call another deployed agent through `blAgent().run()` rather than combining all functionality into a single codebase.
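As an illustration, a sketch of chaining two agents, assuming `blAgent` is exported from the core SDK package and that the request shape matches what the target agent expects (both are assumptions):

```typescript
import { blAgent } from "@blaxel/core";

// Invoke the separately deployed agent "my-first-agent" by name.
// The SDK resolves its endpoint and handles authentication.
const result = await blAgent("my-first-agent").run({
  inputs: "What's the weather in Paris?",
});
console.log(result);
```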
Customize the agent deployment
You can set custom parameters for an agent deployment (e.g. specify the agent name) in the `blaxel.toml` file at the root of your directory. Read the file structure section below for more details.
Deploy an agent
Learn how to deploy your custom AI agents on Blaxel as a serverless endpoint.
Instrumentation
Instrumentation happens automatically when workloads run on Blaxel. To enable telemetry, simply import the SDK in your project's entry point.

Template directory reference
Overview
package.json
The most notable entries here are the scripts. They are used for the `bl serve` and `bl deploy` commands.
The `scripts` are not all required, but with TypeScript all four of them are used:

- `start`: starts the server locally through the TypeScript command, to avoid having to build the project when developing.
- `build`: builds the project. This is done automatically when deploying.
- `prod`: starts the server the way it runs remotely, from the `dist` folder; the project needs to have been built beforehand.
- `dev`: same as `start`, but with hot reload. It's useful when developing locally, as each file change is reflected immediately.
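As an illustration, the scripts section of a TypeScript template might look like the following sketch (the exact commands and tools, such as `tsx` and `tsc`, are assumptions and depend on the template):

```json
{
  "scripts": {
    "start": "tsx src/index.ts",
    "build": "tsc",
    "prod": "node dist/index.js",
    "dev": "tsx watch src/index.ts"
  }
}
```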
blaxel.toml
This file is used to configure the deployment of the agent on Blaxel. The only mandatory parameter is `type`, so Blaxel knows which kind of entity to deploy. The other parameters are optional but allow you to customize the deployment.
- The `name`, `workspace`, and `type` fields are optional and serve as default values. Any `bl` command run in the folder will use these defaults rather than prompting you for input.
- The `agents`, `functions`, and `models` fields are also optional. They specify which resources to deploy with the agent. These resources are preloaded during build, eliminating runtime dependencies on the Blaxel control plane and dramatically improving performance.
- The `[env]` section defines environment variables that the agent can access via the SDK. Note that these are NOT secrets.
- The `[runtime]` section allows you to override agent deployment parameters: the timeout (in seconds) or the memory (in MB) to allocate.
- The `[[triggers]]` and `[triggers.configuration]` sections define ways to send requests to the agent. You can create both synchronous and asynchronous trigger endpoints (respectively `type = "http"` or `type = "http-async"`). You can also make them either private (default) or public (`authenticationType = "public"`). A private synchronous HTTP endpoint is always created by default, even if you don't define any trigger here.
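Putting those fields together, a `blaxel.toml` might look like the following sketch (field names as described above; the specific values, and the `DEFAULT_CITY` variable, are illustrative assumptions):

```toml
name = "my-agent"
workspace = "my-workspace"
type = "agent"

# Resources preloaded at build time
functions = ["blaxel-search"]
models = ["my-model"]

[env]
# Plain environment variables - NOT secrets
DEFAULT_CITY = "Paris"

[runtime]
timeout = 600  # seconds
memory = 1024  # MB

[[triggers]]
type = "http"

[triggers.configuration]
authenticationType = "public"
```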
Troubleshooting
Wrong port or host
Make sure your server binds to `BL_SERVER_HOST` and `BL_SERVER_PORT`. Blaxel automatically injects these variables during deployment.
For example, if your current server code looks something like this:
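The sketch below shows a hardcoded listen call of that kind, and the environment-based fix (the local fallback values are assumptions):

```typescript
import http from "node:http";

const server = http.createServer((req, res) => res.end("ok"));

// WRONG: hardcoded host and port. This works locally but fails on Blaxel,
// which assigns the host and port through BL_SERVER_HOST and BL_SERVER_PORT.
// server.listen(3000, "localhost");

// RIGHT: read the injected variables, with fallbacks for local runs.
const host = process.env.BL_SERVER_HOST ?? "0.0.0.0";
const port = Number(process.env.BL_SERVER_PORT ?? 1338);
server.listen(port, host);
```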