LangChain
Learn how to leverage Blaxel with LangChain and LangGraph.
LangChain is a composable framework to build LLM applications. It can be combined with LangGraph, a stateful orchestration framework that brings added control to agent workflows. You can deploy your LangChain or LangGraph projects to Blaxel with minimal code changes (and zero configuration), enabling you to use Serverless Deployments, Agentic Observability, Policies, and more.
Get started with LangChain on Blaxel
To get started with LangChain/LangGraph on Blaxel:
- if you already have a LangChain or LangGraph agent, adapt your code with Blaxel SDK commands to use Blaxel features at deployment and runtime
- clone one of our LangChain example templates and deploy it by connecting to your git provider via the Blaxel console.
- initialize an example project in LangChain using the Blaxel CLI command `bl create-agent-app`, then deploy it with `bl deploy`.
Explore our template gallery
Browse LangChain agents and deploy them with Blaxel.
Use Blaxel features with LangChain
Blaxel SDK commands use the LangGraph format by default when no framework is specified, so they integrate seamlessly with the LangChain ecosystem. All commands in this section are shown in Python syntax.
- `get_chat_model()`: returns a model API in LangChain ChatModel format.
- `get_functions()`: returns a list of tools in LangChain BaseTool format.
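For instance, the sketch below wires a Blaxel-provided chat model and tools into a LangGraph agent. The import paths, the model name, and whether `get_functions()` needs to be awaited are assumptions for illustration; adapt them to your Blaxel SDK version and the resources defined in your workspace.

```python
# A minimal sketch of a LangGraph agent built from Blaxel-provided resources.
# The module paths and the model name "my-model" below are assumptions for
# illustration; replace them with those from your Blaxel SDK and workspace.
from langgraph.prebuilt import create_react_agent

from blaxel.agents.chat import get_chat_model  # assumed module path
from blaxel.functions import get_functions     # assumed module path; may be sync in your SDK version


async def build_agent():
    model = get_chat_model("my-model")  # LangChain ChatModel served through Blaxel
    tools = await get_functions()       # list of LangChain BaseTool exposed by Blaxel
    return create_react_agent(model, tools)
```

The returned graph can then be run like any other LangGraph agent, for example with `await agent.ainvoke(...)`.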
To go further
Learn more about deploying LangChain and LangGraph agents on Blaxel with the following resources:
Deploy a LangChain agent
Complete guide for deploying a LangChain AI agent on Blaxel.