Changelog
Keep track of changes and updates on Beamlit.
Revisions & canary deployments
Added revisions for agents, model APIs and functions:
- Each build of an object creates a new immutable revision, which can be deployed to redirect request traffic to it
- Ability to deploy a new revision using blue-green strategy
- Ability to rollback to previous revision
- Ability to split a percentage of traffic to a second revision (canary deployments)
Beamlit is now Blaxel!
We’re excited to announce our official name change from Beamlit to Blaxel!
Sunsetting environments
We are discontinuing Beamlit environments. Stay tuned for an upcoming major update that will help you better manage your deployment lifecycles.
New pricing plans
Beamlit officially launched new pricing plans. Learn more about the pricing plans on our website, or login to the Beamlit console to start your free tier or upgrade.
DeepSeek integration
- Connect to DeepSeek models from Beamlit
HuggingFace integration
- Connect to public or private models from HuggingFace Inference API (serverless) and Inference Endpoints (dedicated)
- Deploy a model in HuggingFace Inference Endpoints (dedicated)
Azure AI Foundry integration
- New integration: Azure AI Services (for OpenAI models and others)
- New integration: Azure Marketplace (for Llama models and others)
9 new pre-built toolkits
Added 9 new pre-built function toolkits:
- Brave Search
- Google Maps
- Slack
- Linear
- AWS SES
- Cloudflare
- PostgreSQL
- AWS S3
- DALL·E
🚀 We launched Beamlit beta!
Beamlit Beta is now publicly available, with the following new features:
- New world-class infrastructure for minimal cold starts and maximum reliability
- World map showing the origin of requests to your AI agents
- Many additional charts
Agentic traces
- Added an Agent Observability suite, with traces of agents’ requests
- Added ability to launch a request in debug mode from the Playground to save the trace
Beamlit is currently in private alpha. Join our waitlist for access today.
Latency metric tracking
- Added a new set of metrics for your agents, functions and models: end-to-end latencies (average, p50, p90, p99)
Beamlit extension for VS Code
The Beamlit extension for Visual Studio Code is now available for download.
- View and retrieve your workspace’s resources directly from the VSCode IDE
New Beamlit console
- The Beamlit console has been completely reworked to center the experience around agents and their deployments
- Removed custom model deployments and renamed external models to “model APIs”
- Added statuses for deployments
- Added playbook to deploy an agent from local code in the Beamlit console
Improved chart visuals
- Improved the visual aspect of charts and added options to change the time window of screens
- The API reference for agent, model and function deployments has been updated
CLI support for running agents and functions
- You can now use the Beamlit CLI to run requests on agents (`bl run agent`) and functions (`bl run function`)
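As a quick sketch, running a request against a deployed agent or function might look like this. The resource names and the `--data` flag are illustrative assumptions, not documented syntax; check the CLI help (`bl run --help`) for the exact options.

```shell
# Hypothetical invocation sketch -- "my-agent", "my-function" and the
# --data flag are placeholders, not confirmed Beamlit CLI syntax.
bl run agent my-agent --data '{"input": "What is the weather in Paris?"}'
bl run function my-function --data '{"query": "weather paris"}'
```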
- Added new metric: total number of requests
- You can now develop (and run) your own AI agents locally with the Beamlit SDK.
- Use the CLI command `bl create-agent-app your-agent-name` to initialize a repository with the code skeleton to get started
- Use the CLI command `bl serve --local` to run your AI agent on your local machine
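Taken together, a minimal local development loop might look like this. The agent name is a placeholder, and the `cd` step assumes the scaffolding command creates a directory named after the agent:

```shell
# Scaffold a new agent project from the code skeleton
# ("my-agent" is a placeholder name)
bl create-agent-app my-agent

# Assumed: the command creates a directory named after the agent
cd my-agent

# Run the agent on your local machine for development
bl serve --local
```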
2 new integrations are available to connect to external AI model providers:
- Cohere
- xAI
You can now use a GitHub Action to directly deploy Beamlit objects (agents, functions, models) from your GitHub Workflows.
Happy Thanksgiving! You can now connect external model providers as integrations in your Beamlit workspace in order to leverage Beamlit’s accelerated Global Inference Network to unify calls to third-party APIs.
- OpenAI
- Anthropic
- Mistral AI
Build and run agents globally while centralizing model access, credentials and observability behind our production-ready global gateway. 🌐
Experience Beamlit first-hand and discover what it can do for you. Join the private alpha waitlist for Beamlit now.
Features
- Agents: run AI agents across multiple locations, so your consumers get the lowest latency and the highest availability.
- Models: connect or deploy generative AI models behind our global gateway, from public endpoints to custom fine-tuned models
- Functions: run serverless functions that provide your agents with the tools to interact with their environment
- Global Inference Network: make your agents available globally and locally with a network designed for AI inferences that optimizes for mission-critical latency while respecting your deployment, routing and cost policies.
- Environments: manage compliance at enterprise level by enforcing policies directly in your development life-cycle
- Policies: define global rules and strategies regarding your deployment placement, inference request routing, and hardware usage.