Jobs
Schedule and run batch processing tasks for your AI workflows.
Jobs allow you to run many AI tasks in parallel using batch processing.
Concepts
- Job: A code definition that specifies a batch processing task. Jobs can run multiple times within a single execution and accept optional input parameters.
- Execution: A specific instance of running a batch job at a given timestamp. Each execution consists of multiple tasks running in parallel.
- Task: A single instance of a job definition running as part of an execution.
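The relationship between these three concepts can be sketched in plain TypeScript (this is an illustration of the model only, not the Blaxel SDK): a job is the code definition, an execution runs it over a batch of inputs, and each parallel run of the definition is a task.

```typescript
// Illustrative sketch only — plain TypeScript, not the Blaxel SDK.
// A "job" is a code definition; an "execution" fans it out over a
// batch of inputs in parallel; each individual run is a "task".

type JobInput = { name: string };

// The job definition: the code that each task runs.
async function sayHello({ name }: JobInput): Promise<string> {
  return `Hello, ${name}!`;
}

// One execution: run one task per input, all in parallel.
async function execute(inputs: JobInput[]): Promise<string[]> {
  return Promise.all(inputs.map((input) => sayHello(input)));
}

execute([{ name: "Ada" }, { name: "Grace" }]).then((results) =>
  console.log(results.join(" "))
);
```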
Quickstart
You can quickly initialize a new job from scratch by using the CLI command `bl create-job`.
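For example (the job name here is illustrative; this requires the Blaxel CLI to be installed):

```shell
bl create-job my-job
```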
This will create a pre-scaffolded local directory where your entire code can be added. In the generated folder, you'll find a boilerplate job with multiple steps in the entrypoint file `index.ts`.
Start the job locally:
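For example, a local run might look like this (the job name and task payload are illustrative; check `bl run --help` for the exact flags available in your CLI version):

```shell
bl run job my-job --local --data '{"tasks": [{"name": "John"}, {"name": "Jane"}]}'
```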
Deploy a job with Blaxel CLI
This section assumes you have developed a job locally.
The Blaxel SDK allows you to connect to and orchestrate other resources (such as model APIs, tool servers, and multi-agents) during development. It also ensures telemetry, secure connections to third-party systems or private networks, smart global placement of workflows, and much more once jobs are deployed.
Serve locally
You can serve the job locally in order to make the entrypoint function (by default: `index.py` / `index.ts`) available on a local endpoint.
Run the following command to serve the job:
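Run this from the job's root directory (requires the Blaxel CLI):

```shell
bl serve
```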
Calling the provided endpoint will execute the job locally. Add the `--hotreload` flag to get live changes.
Deploy on production
You can deploy the job in order to make the entrypoint function (by default: `index.ts` / `index.py`) callable on a global endpoint. When deploying to Blaxel, you get a dedicated endpoint that enforces your deployment policies.
Run the following command to build and deploy a local job on Blaxel:
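From the job's root directory (requires the Blaxel CLI and an authenticated workspace):

```shell
bl deploy
```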
Run a job
Start a batch job execution by running:
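For example (the job name and task payload are illustrative; each object in the `tasks` array becomes one parallel task):

```shell
bl run job my-job --data '{"tasks": [{"name": "John"}, {"name": "Jane"}]}'
```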
You can cancel a job execution from the Blaxel Console or via API.
Retries
You can set a maximum number of retries per task in the job definition. Check out the reference for the `blaxel.toml` configuration file below.
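As a minimal sketch, the retry limit lives in the `[runtime]` section of `blaxel.toml` (the field name shown here is an assumption based on the runtime parameters described in the reference below):

```toml
[runtime]
maxRetries = 3
```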
Deploy with a Dockerfile
While Blaxel uses predefined, optimized container images to build and deploy your code, you can also deploy your job using your own Dockerfile.
Deploy using Dockerfile
Deploy resources using a custom Dockerfile.
Template directory reference
Overview
package.json
Here the most notable entries are the `scripts`, which are used by the `bl serve` and `bl deploy` commands.
Depending on what you do, not all of the scripts are required. With TypeScript, all of them are used.
- `start`: start the job locally through the TypeScript command, to avoid having to build the project when developing.
- `prod`: start the job remotely from the `dist` folder; the project needs to have been built beforehand.
- `build`: build the project. This is done automatically when deploying.
The remaining fields in package.json follow standard JavaScript/TypeScript project conventions. Feel free to add any dependencies you need, but keep in mind that devDependencies are only used during the build process and are removed afterwards.
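A hypothetical `scripts` block matching the descriptions above might look like this (the exact commands are assumptions — a `tsx`/`tsc` toolchain is shown for illustration; the generated template defines the real ones):

```json
{
  "scripts": {
    "start": "tsx index.ts",
    "prod": "node dist/index.js",
    "build": "tsc"
  }
}
```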
blaxel.toml
This file is used to configure the deployment of the job on Blaxel. The only mandatory parameter is `type`, so Blaxel knows which kind of entity to deploy. The others are not mandatory but allow you to customize the deployment.
- The `name`, `workspace`, and `type` fields are optional and serve as default values. Any `bl` command run in the folder will use these defaults rather than prompting you for input.
- The `policies` field is also optional. It allows you to specify a Blaxel policy to customize the deployment, for example deploying only in a specific region of the world.
- The `[env]` section defines environment variables that the job can access via the SDK. Note that these are NOT secrets.
- The `[runtime]` section allows you to override job execution parameters: the maximum number of concurrent tasks, the maximum number of retries for each task, the timeout (in seconds), and the memory (in MB) to allocate.
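Putting these fields together, a `blaxel.toml` for a job might look like the following sketch (all values are illustrative, and the `[runtime]` field names are assumptions based on the parameters described above — check the generated template for the exact names):

```toml
name = "my-job"
workspace = "my-workspace"
type = "job"

policies = ["eu"]

[env]
DEFAULT_CITY = "San Francisco"

[runtime]
maxConcurrentTasks = 10
maxRetries = 3
timeout = 900
memory = 1024
```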