CLI Reference
The Distil CLI is a command-line tool for fine-tuning compact language models using the distil labs platform. It enables you to train specialized models up to 70x smaller than teacher models while maintaining comparable accuracy, without requiring ML expertise.
Installation
Install the Distil CLI with a single command:
curl -fsSL https://cli-assets.distillabs.ai/install.sh | sh
Supported Platforms
The Distil CLI supports the following operating systems:
| Platform | Supported |
|---|---|
| Linux (x86_64) | Yes |
| macOS (Intel) | Yes |
| macOS (Apple Silicon) | Yes |
| Windows | No |
Claude Skill
Use our Claude Skill to train models directly from Claude Code or Claude.ai. The skill teaches Claude how to guide you through the entire training workflow.
Installation
Claude Code:
/plugin marketplace add https://github.com/distil-labs/distil-cli-skill
/plugin install distil-cli@distil-cli-skill
Claude.ai / Claude Desktop:
- Download the skill as a ZIP from its GitHub page.
- Go to claude.ai → Settings → Capabilities → Skills
- Click “Upload skill” and select the ZIP file
- Toggle the skill ON
Capabilities
| Environment | What Claude Can Do |
|---|---|
| Claude Code | Full end-to-end workflow: task selection, data preparation, running CLI commands, training, and deployment |
| Claude.ai | Task selection and data preparation: help choose the right task type and create data files. You run CLI commands yourself. |
Once installed, ask Claude to help you train a model:
“Help me train a classification model for customer support intent detection”
Claude will guide you through creating a model, preparing data files, uploading data, running teacher evaluation, training, and deployment.
Model Identifiers
When working with the Distil platform, you’ll encounter several types of identifiers. Understanding these is key to navigating the CLI effectively.
Model Name
The model name is a human-readable identifier you choose when creating a model. It helps you organize and recognize your models.
distil model create my-qa-model
Model names should be descriptive of the task or project (e.g., customer-support-classifier, product-qa-bot).
Model ID
The model ID is a unique identifier automatically assigned when you create a model. This is the primary identifier used in most CLI commands.
# Create a model and receive its ID
distil model create my-model-name
# Output: Model created with ID: d64ee301-76d2-4f06-8e7d-398e40c0d7de
# Use the model ID in subsequent commands
distil model show d64ee301-76d2-4f06-8e7d-398e40c0d7de
You can find your model IDs by listing all models:
distil model list
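For scripting, the model ID can be captured from the creation output. A minimal sketch, assuming the `Model created with ID: …` output format shown above (with `--output json`, a JSON parser would be more robust):

```shell
# Capture the model ID printed by `distil model create` for reuse.
# The sample line below mirrors the output format shown above; in a real
# script it would come from: create_output=$(distil model create my-model-name)
create_output="Model created with ID: d64ee301-76d2-4f06-8e7d-398e40c0d7de"
model_id=$(printf '%s\n' "$create_output" | sed -n 's/.*ID: \([0-9a-f-]*\)/\1/p')
echo "$model_id"
```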
Component IDs
Each model tracks multiple components through the training workflow. These component IDs are automatically managed but useful for debugging and API integration:
| Component | Description |
|---|---|
| Upload ID | Identifies a specific data upload. Created when you run upload-data. A model can have multiple uploads if you iterate on your data. |
| Teacher Evaluation ID | Identifies a teacher evaluation run. Created when you run run-teacher-evaluation. Shows how well the teacher model performs on your task. |
| Training ID | Identifies a training job. Created when you run run-training. Tracks the distillation process that creates your small model. |
View all component IDs for a model with:
distil model show <model-id>
CLI Commands
Authentication
| Command | Description |
|---|---|
| distil login | Authenticate with the distil labs platform. Opens a browser for login. |
| distil register | Create a new distil labs account. |
| distil whoami | Display the currently authenticated user. |
| distil logout | Log out from the platform and clear credentials. |
Model Management
| Command | Description |
|---|---|
| distil model create <name> | Create a new model with the specified name. Returns the model ID. |
| distil model list | List all your models with their IDs, names, and status. |
| distil model show <model-id> | Show detailed information about a specific model, including all component IDs. |
Data Upload
| Command | Description |
|---|---|
| distil model upload-data <model-id> --data <directory> | Upload all data files from a directory. Expects standard filenames (job_description.json, train.csv, test.csv, config.yaml). |
| distil model upload-data <model-id> --job-description <file> --train <file> --test <file> [--config <file>] [--unstructured <file>] | Upload data files individually with explicit paths. |
| distil model download-data <model-id> | Download the uploaded data files for a model. |
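With the `--data` form, a typical layout looks like this (the directory name is illustrative; the filenames are the standard ones the command expects):

```shell
# Standard filenames expected by `upload-data --data`:
#   my-qa-model-data/
#   ├── job_description.json
#   ├── train.csv
#   ├── test.csv
#   └── config.yaml
distil model upload-data d64ee301-76d2-4f06-8e7d-398e40c0d7de --data my-qa-model-data
```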
Traces Upload
Training from traces is a two-step process: first, traces are uploaded and stored as a PreparedTraces resource; then, the traces are processed to produce training and test data (an Upload). The upload-traces command performs both steps in one go. To re-run only the processing step with different parameters, use reprocess-traces.
Traces are an alternative to structured data uploads and consist of three files: a traces file (.jsonl), a job description (.json), and a config file (.json or .yaml).
| Command | Description |
|---|---|
| distil model upload-traces <model-id> --data <directory> | Upload all trace files from a directory. Expects standard filenames (traces.jsonl, job_description.json, config.json/config.yaml). |
| distil model upload-traces <model-id> --traces <file> --job-description <file> --config <file> | Upload trace files individually with explicit paths. |
Flags:
| Flag | Required | Description |
|---|---|---|
| --data | Yes* | Directory containing trace files (traces.jsonl, job_description.json, config.json/config.yaml). |
| --traces | Yes* | Path to traces file (.jsonl). |
| --job-description | Yes* | Path to job description file (.json). |
| --config | Yes* | Path to config file (.json or .yaml). |

*Provide either --data, or all three of --traces, --job-description, and --config.
See Training from traces for guidance on preparing trace files and trace processing configuration for available parameters.
Reprocess Traces
Reprocess previously uploaded traces with a new trace processing config. This is useful when you want to change how traces are processed without re-uploading the trace files. The command uses the most recently uploaded prepared traces for the model.
| Command | Description |
|---|---|
| distil model reprocess-traces <model-id> --trace-processing-config <file> | Reprocess traces for a model with a new trace processing config. |
Flags:
| Flag | Alias | Required | Description |
|---|---|---|---|
| --trace-processing-config | -t | Yes | Path to trace processing config file (.json or .yaml). |
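Putting the two steps together: upload once, then iterate on processing only. A sketch, assuming your trace files live in `./traces` and the new config is named `new-processing-config.yaml` (both illustrative):

```shell
# Step 1: upload traces and process them in one go.
distil model upload-traces d64ee301-76d2-4f06-8e7d-398e40c0d7de --data ./traces

# Step 2 (optional, repeatable): reprocess the same prepared traces with a
# different processing config, without re-uploading the trace files.
distil model reprocess-traces d64ee301-76d2-4f06-8e7d-398e40c0d7de -t new-processing-config.yaml
```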
Upload Status
| Command | Description |
|---|---|
| distil model upload-status <model-id> | Show the current upload status for a model, including processing state. |
| distil model upload-status <model-id> --output json | Output the upload status in JSON format. |
Teacher Evaluation
| Command | Description |
|---|---|
| distil model run-teacher-evaluation <model-id> | Start a teacher evaluation to validate that a large model can solve your task. |
| distil model teacher-evaluation <model-id> | Check the status and results of the teacher evaluation. |
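The typical sequence is to start the evaluation, then check back on it. A sketch:

```shell
# Kick off the teacher evaluation...
distil model run-teacher-evaluation d64ee301-76d2-4f06-8e7d-398e40c0d7de

# ...then check status and results (repeat until the run finishes).
distil model teacher-evaluation d64ee301-76d2-4f06-8e7d-398e40c0d7de
```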
Training
| Command | Description |
|---|---|
| distil model run-training <model-id> | Start training to distill knowledge into a compact model. |
| distil model training <model-id> | Check the status and results of the training job. |
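Training can take a while, so a simple polling loop is handy. A sketch only: the status strings matched here are assumptions, so adjust them to what `distil model training` actually prints (or parse `--output json` instead):

```shell
MODEL_ID=d64ee301-76d2-4f06-8e7d-398e40c0d7de
distil model run-training "$MODEL_ID"

# Poll every 5 minutes until the job reports a terminal state
# (assumed status words; check your CLI output).
until distil model training "$MODEL_ID" | grep -Eiq 'completed|failed'; do
  sleep 300
done
```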
Retuning
| Command | Description |
|---|---|
| distil model retune <model-id> --name <name> --student-model <model> --tuning-parameters <file> | Retune an existing model with new tuning parameters. Creates a new model based on a previously trained one. |
| distil model retune <model-id> --name <name> --student-model <model> --config <file> | Retune using a config file (only the tuning section is used). |
Flags:
| Flag | Alias | Required | Description |
|---|---|---|---|
| --name | -n | Yes | Name of the retuned model to be created. |
| --student-model | -s | Yes | Student model to use for retuning. |
| --tuning-parameters | -t | Yes* | Path to tuning parameters file (.json or .yaml). |
| --config | -c | Yes* | Path to config file (.json or .yaml) — only the tuning section will be used. |

*Provide either --tuning-parameters or --config.
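For example, to retune with new tuning parameters (the new model name and file path are illustrative; substitute a real student model for the placeholder):

```shell
distil model retune d64ee301-76d2-4f06-8e7d-398e40c0d7de \
  --name my-qa-model-v2 \
  --student-model <student-model> \
  --tuning-parameters tuning.yaml
```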
Model Deployment
| Command | Description |
|---|---|
| distil model deploy local <model-id> | Deploy a model locally (experimental). Requires llama-cpp on your machine. |
| distil model deploy remote <model-id> | Deploy a model to distil labs playground infrastructure. |
| distil model invoke <model-id> | Python client script to invoke a deployed model. |
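For a quick test, deploy and then invoke. A sketch, assuming the local deployment path (experimental, and requires llama-cpp per the table above):

```shell
# Serve the trained model locally, then call it via the invoke command.
distil model deploy local d64ee301-76d2-4f06-8e7d-398e40c0d7de
distil model invoke d64ee301-76d2-4f06-8e7d-398e40c0d7de
```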
Model Download
| Command | Description |
|---|---|
| distil model download <model-id> | Download your trained model. |
Global Options
| Option | Description |
|---|---|
| --output json | Output results in JSON format for scripting and automation. |
| --help | Display help information for any command. |
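`--output json` makes commands scriptable. A sketch using `jq` (a separate tool, not part of the CLI; the `.id` field name is an assumption about the JSON shape, so inspect the actual output first):

```shell
# List models as JSON and extract their IDs (field name assumed).
distil model list --output json | jq -r '.[].id'
```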
Update & Docs
| Command | Description |
|---|---|
| distil docs | Open the distil documentation in your default browser. |
| distil model update | Update the distil CLI to the latest version. |