The Burn Central CLI (burn) is the command-line tool for interacting with Burn Central, the centralized platform for experiment tracking, model sharing, and deployment for Burn users.
This CLI works in conjunction with the Burn Central SDK to provide a seamless workflow for:
- Running training jobs locally or remotely
- Managing experiments and tracking metrics
- Packaging and deploying models
- Integrating with compute providers
- Managing project configurations
Install from crates.io:

```sh
cargo install burn-central-cli
```

Or build and install from source:

```sh
git clone https://github.com/tracel-ai/burn-central-cli.git
cd burn-central-cli
cargo install --path crates/burn-central-cli
```

After installation, the `burn` command will be available in your terminal.
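To check that the binary is on your PATH, you can print the built-in help (the same `--help` flag used in the development section below):

```sh
burn --help
```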
- Burn Central Account: Create an account at central.burn.dev
- Rust: Version 1.87.0 or higher
- Burn Central SDK: Add the SDK to your Burn project (see Quick Start)
Add the SDK to your Cargo.toml:
```toml
[dependencies]
burn-central = "0.1.0"
```

Use the `#[register]` macro to make your training function discoverable:
```rust
use burn::tensor::backend::AutodiffBackend;
use burn_central::{
    experiment::ExperimentRun,
    macros::register,
    runtime::{Args, ArtifactLoader, Model, MultiDevice},
};

// `YourExperimentConfig` and `ModelArtifact` are your own types; see the SDK
// documentation for the traits they need to implement.
#[register(training, name = "mnist")]
pub fn training<B: AutodiffBackend>(
    client: &ExperimentRun,
    config: Args<YourExperimentConfig>,
    MultiDevice(devices): MultiDevice<B>,
    loader: ArtifactLoader<ModelArtifact<B>>,
) -> Result<Model<impl ModelArtifact<B::InnerBackend>>, String> {
    // Your training logic here...
    Ok(Model(model_artifact))
}
```

See the SDK documentation for complete integration details.
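The example above references `YourExperimentConfig` without defining it. As a rough sketch (the exact trait bounds `Args<T>` expects are defined by the SDK, and the field names here are illustrative, though `epochs` matches the `--override epochs=100` example later in this README), a plain serializable configuration struct along these lines should fit:

```rust
use serde::{Deserialize, Serialize};

// Illustrative only: the concrete bounds required by `Args<T>` come from the
// SDK; a serde-compatible struct is assumed here.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct YourExperimentConfig {
    pub epochs: usize,
    pub learning_rate: f64,
}
```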
Navigate to your Burn project directory and run:
```sh
burn init
```

This will:
- Link your local project to Burn Central
- Create or select a project on the platform
- Configure your local environment
```sh
burn login
```

This opens your browser to authenticate with Burn Central and stores your credentials locally.
```sh
burn train
```

The CLI will:
- Discover registered training functions in your project
- Prompt you to select a function (if multiple are found)
- Execute the training locally
- Send metrics, logs, and checkpoints to Burn Central in real time
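Putting the quick start together, a typical first session looks like this (the exact prompts depend on your account and project):

```sh
# Link the current directory to a project on the platform
burn init

# Authenticate with Burn Central (opens your browser)
burn login

# Discover registered functions and run training locally
burn train
```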
Run a training or inference job locally, or trigger remote execution:
```sh
# Run with interactive prompts
burn train

# Run a specific function
burn train mnist

# Run with argument overrides
burn train --override epochs=100
```

Package your project for deployment on remote compute providers:
```sh
burn package
```

This creates a deployable artifact containing your code, dependencies, and configurations.
Authenticate with the Burn Central platform:

```sh
burn login
```

Initialize or reinitialize a Burn Central project in the current directory:
```sh
# Interactive initialization
burn init
```

Unlink the current directory from Burn Central:
```sh
burn unlink
```

Display information about the currently authenticated user:
```sh
burn me
```

Display information about the current project:
```sh
burn project
```

The Burn Central CLI is organized as a Cargo workspace:
```
burn-central-cli/
├── crates/
│   ├── burn-central-cli/          # Main CLI binary
│   └── burn-central-workspace/    # Core library for project management
└── xtask/                         # Build utilities
```
The burn-central-workspace crate is a standalone library that provides:
- Project discovery and management
- Code generation and function discovery
- Job execution (local and remote)
- Client integration with Burn Central
- Compute provider integration
This library can be used independently in other applications. See the workspace README for detailed documentation.
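For example, another crate could pull it in as a path dependency (a sketch assuming a local checkout of this repository; adjust the path to your own layout):

```toml
[dependencies]
# Points at the crate inside a local clone of burn-central-cli.
burn-central-workspace = { path = "../burn-central-cli/crates/burn-central-workspace" }
```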
- Function Discovery: The CLI analyzes your Rust code to find functions annotated with `#[register]`
- Code Generation: Generates the necessary glue code to execute your functions
- Execution: Runs your training/inference locally or submits to remote compute
- Tracking: Integrates with the SDK to send metrics, logs, and checkpoints to Burn Central
- Management: Provides tools to manage projects, experiments, and deployments
The CLI works seamlessly with the Burn Central SDK. Here's how they connect:
- SDK Integration: Add the SDK to your project and use the `#[register]` macro
- CLI Discovery: The CLI finds your registered functions
- Execution: The CLI generates and runs the necessary code
- Tracking: The SDK sends data to Burn Central during execution
For detailed SDK usage, see the SDK README.
To run the CLI from source:

```sh
cargo run --bin burn -- --help
```

To run the test suite:

```sh
cargo test
```

For testing against a local Burn Central instance:
```sh
burn --dev train
```

This connects to http://localhost:9001 and uses separate development credentials.
Contributions are welcome! Please feel free to:
- Report issues or bugs
- Request new features
- Submit pull requests
- Improve documentation
Licensed under either of:
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.