Designed for ML Researchers. Local, on-prem, or in the cloud. Open source.
⬇️ Install for Individuals · 🏢 Install for Teams · 📚 Documentation · 🎬 Demo · 💬 Discord
Transformer Lab is an open-source machine learning platform that unifies the fragmented AI tooling landscape into a single, elegant interface. It is available in two editions:

| Edition | Best for |
|---|---|
| Transformer Lab for Individuals | Researchers and hobbyists working on a single machine. |
| Transformer Lab for Teams | Research labs scaling across GPU clusters. |
### 🧠 Foundation Models & LLMs
- Universal Support: Download and run Llama 3, DeepSeek, Mistral, Qwen, Phi, and more.
- Inference Engines: Support for MLX, vLLM, Ollama, and HuggingFace Transformers.
- Format Conversion: Seamlessly convert between HuggingFace, GGUF, and MLX formats.
- Chat Interface: Multi-turn chat, batched querying, and function calling support.
### 🚀 Training & Fine-tuning
- Unified Interface: Train on local hardware or submit tasks to remote clusters using the same UI.
- Methods: Full fine-tuning, LoRA/QLoRA, RLHF (DPO, ORPO, SIMPO), and Reward Modeling.
- Hardware Agnostic: Optimized trainers for Apple Silicon (MLX), NVIDIA (CUDA), and AMD (ROCm).
- Hyperparameter Sweeps: Define parameter ranges in YAML and automatically schedule grid searches.
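A sweep defined over parameter ranges expands to the Cartesian product of all values. As an illustrative sketch (the parameter names below are hypothetical examples, not Transformer Lab's actual sweep schema), grid expansion works like this:

```python
from itertools import product

# Hypothetical sweep ranges -- stand-ins, not Transformer Lab's real schema.
sweep = {
    "learning_rate": [1e-4, 5e-5],
    "lora_rank": [8, 16, 32],
}

def expand_grid(ranges):
    """Enumerate every combination of the given parameter ranges."""
    keys = list(ranges)
    return [dict(zip(keys, values)) for values in product(*ranges.values())]

runs = expand_grid(sweep)
print(len(runs))  # 2 learning rates x 3 ranks = 6 scheduled runs
```

Each resulting dict is one training run, which a scheduler can then queue independently.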
### 🎨 Diffusion & Image Generation
- Generation: Text-to-Image, Image-to-Image, and Inpainting using Stable Diffusion and Flux.
- Advanced Control: Full support for ControlNets and IP-Adapters.
- Training: Train custom LoRA adaptors on your own image datasets.
- Dataset Management: Auto-caption images using WD14 taggers.
### 📊 Evaluation & Analytics
- LLM-as-a-Judge: Use local or remote models to score outputs on bias, toxicity, and faithfulness.
- Benchmarks: Built-in support for EleutherAI LM Evaluation Harness (MMLU, HellaSwag, GSM8K, etc.).
- Red Teaming: Automated vulnerability testing for PII leakage, prompt injection, and safety.
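At its core, LLM-as-a-judge means prompting a model to grade an output and parsing a score from its reply. A minimal sketch of that pattern (the prompt template and `fake_judge` stub are illustrative, not Transformer Lab's internal evaluator):

```python
import re

# Hypothetical judge prompt -- not Transformer Lab's actual template.
JUDGE_PROMPT = (
    "Rate the following answer for faithfulness to the source on a 1-5 scale.\n"
    "Reply with 'Score: N' only.\n\nSource: {source}\nAnswer: {answer}"
)

def parse_score(reply: str) -> int:
    """Extract the integer score from a judge model's reply."""
    match = re.search(r"Score:\s*([1-5])", reply)
    if match is None:
        raise ValueError(f"Unparseable judge reply: {reply!r}")
    return int(match.group(1))

def fake_judge(prompt: str) -> str:
    """Stub standing in for a call to a local or remote judge model."""
    return "Score: 4"

prompt = JUDGE_PROMPT.format(source="The sky is blue.", answer="The sky is blue.")
print(parse_score(fake_judge(prompt)))  # -> 4
```

In practice the stub would be replaced by a call to whichever local or remote model you configure as the judge.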
### 🔌 Plugins & Extensibility
- Plugin System: Extend functionality with a robust Python plugin architecture.
- Lab SDK: Integrate your existing Python training scripts (`import lab`) to get automatic logging, progress bars, and artifact tracking.
- CLI: Power-user command line tool for submitting tasks and monitoring jobs without a browser.
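The SDK pattern here is to wrap an existing training loop with a tracking handle that records metrics and artifacts as a side effect. The `Run` class below is a self-contained stand-in to illustrate that shape; it is not the real `lab` API:

```python
import json
import time

class Run:
    """Illustrative experiment-tracking handle (NOT the real `lab` API)."""

    def __init__(self, name: str):
        self.name = name
        self.metrics = []           # one dict per logged step
        self.started_at = time.time()

    def log(self, step: int, **values):
        """Record named metric values for a training step."""
        self.metrics.append({"step": step, **values})

    def save_artifact(self, path: str, obj):
        """Persist an artifact (e.g. a config or results dict) as JSON."""
        with open(path, "w") as f:
            json.dump(obj, f)

run = Run("demo-finetune")
for step in range(3):
    loss = 1.0 / (step + 1)   # existing training loop, unchanged
    run.log(step, loss=loss)
print(run.metrics[-1])        # last logged metric
```

The appeal of this style is that the training loop itself stays untouched; the handle just observes it.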
### 🗣️ Audio Generation
- Text-to-Speech: Generate speech using Kokoro, Bark, and other state-of-the-art models.
- Training: Fine-tune TTS models on custom voice datasets.
```bash
curl https://lab.cloud/install.sh | bash
cd ~/.transformerlab/src
./run.sh
```

Open your browser to http://localhost:8338.
| Platform | Requirements |
|---|---|
| macOS | Apple Silicon (M1/M2/M3/M4) |
| Linux | NVIDIA or AMD GPU |
| Windows | NVIDIA GPU via WSL2 (setup guide) |
Transformer Lab for Teams runs as an overlay on your existing infrastructure. It does not replace your scheduler; it acts as a modern control plane for it.
To configure Transformer Lab to talk to Slurm or SkyPilot:
- Follow the Teams Install Guide.
- Configure your compute providers in the Team Settings.
- Use the CLI (`lab`) or Web UI to queue tasks across your cluster.
### Frontend

```bash
# Requires Node.js v22
npm install
npm start
```

### Backend (API)

```bash
cd api
./install.sh  # Sets up Conda env + Python deps
./run.sh      # Start the API server
```

### Lab SDK

```bash
pip install transformerlab
```

We are an open-source initiative backed by builders who care about the future of AI research. We welcome contributions! Please check our issues for open tasks.
AGPL-3.0 · See LICENSE for details.
```bibtex
@software{transformerlab,
  author = {Asaria, Ali and Salomone, Tony},
  title = {Transformer Lab: The Operating System for AI Research},
  year = 2023,
  url = {https://github.com/transformerlab/transformerlab-app}
}
```

Built with ❤️ by Transformer Lab in Canada 🇨🇦
