# Transformer X-Ray

Transformer X-Ray is an educational, research-style interpretability app that visualizes transformer internals step by step.

## Tech stack
- Backend: FastAPI + PyTorch
- Frontend: React + D3.js (Vite)
- Model: Tiny custom transformer (small dimensions for readability)
## Features

- Tokenization view (tokens + IDs)
- Embedding vectors + PCA 2D projection
- Positional encoding view
- Attention math pipeline (Q, K, V, dot-product, scaling, softmax, weighted sum)
- Multi-head toggling + re-run
- Residual stream monitor
- FFN internals (linear1, activation, linear2)
- Logit lens per layer
- Sampling simulator (temperature, top-k, top-p, entropy)
- Head behavior analyzer (entropy, focus, induction-like flag)
- Model comparison mode (tiny transformer vs optional distilgpt2)
- Step-by-step playback + autoplay
- Educational tooltips
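The attention math pipeline listed above can be sketched in a few lines of numpy. This is an illustrative single-head version, not the app's actual code; the names `Wq`, `Wk`, `Wv` and the tiny dimensions are our own assumptions:

```python
import numpy as np

def attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product attention over a sequence x: (seq, d_model)."""
    Q, K, V = x @ Wq, x @ Wk, x @ Wv               # project into query/key/value space
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # dot-product similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over keys
    return weights @ V, weights                    # weighted sum of values + attn map

# tiny dimensions for readability, in the spirit of the app's tiny transformer
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                        # 4 tokens, d_model = 8
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out, attn = attention(x, Wq, Wk, Wv)
```

Each row of `attn` is a probability distribution over the keys, which is exactly what the attention-map views render.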
## Project layout

- `backend/`: FastAPI app, model internals, endpoints
- `frontend/`: React + D3 visualization app
- `docs/`: architecture and endpoint docs
## Backend setup

```powershell
cd backend
python -m venv .venv
# Windows PowerShell
.\.venv\Scripts\Activate.ps1
pip install -r requirements.txt
uvicorn app.main:app --reload --port 8000
```

## Frontend setup

```powershell
cd frontend
npm install
npm run dev
```

Default frontend URL: http://localhost:5173
## Scripts (Windows)

```powershell
cd "c:\Users\harman deep singh\OneDrive\Desktop\codex transformer"
.\scripts\setup_backend.ps1
.\scripts\setup_frontend.ps1
```

Run each in separate terminals:

```powershell
.\scripts\run_backend.ps1
.\scripts\run_frontend.ps1
```

If the backend URL differs, set:

```
# frontend/.env
VITE_API_URL=http://127.0.0.1:8000
```

## API endpoints

- `POST /forward`
- `POST /attention`
- `POST /logits`
- `POST /activations`
- `POST /compare` (optional GPT-2 comparison)
- `GET /health`
See docs/API.md for request/response examples.
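A minimal client sketch for the `/forward` endpoint might look like the following. The payload field `text` is an assumption made here for illustration; check docs/API.md for the actual request schema:

```python
import json
from urllib import request

API_URL = "http://127.0.0.1:8000"  # or whatever VITE_API_URL points at

def build_forward_request(text: str) -> request.Request:
    """Build a POST /forward request. The "text" field is a hypothetical schema."""
    payload = json.dumps({"text": text}).encode("utf-8")
    return request.Request(
        f"{API_URL}/forward",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # Requires the backend to be running on port 8000.
    with request.urlopen(build_forward_request("hello world")) as resp:
        print(json.load(resp))
```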
## Tests

```powershell
cd backend
pytest
```

## Publishing to GitHub

```powershell
git init
git add .
git commit -m "Initial commit: Transformer X-Ray full-stack interpretability app"
git branch -M main
git remote add origin https://github.com/<your-username>/transformer-xray.git
git push -u origin main
```

## Notes

- GPT-2 comparison requires the `transformers` package and model download access.
- The tiny transformer is intentionally small and untrained by default for transparent visualization.
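The sampling simulator's controls (temperature, top-k, top-p, entropy) boil down to a short filtering pipeline. This is an illustrative numpy sketch under our own naming, not the app's implementation:

```python
import numpy as np

def sample_step(logits, temperature=1.0, top_k=0, top_p=1.0, rng=None):
    """One decoding step: temperature scaling, top-k / top-p filtering, softmax sample."""
    rng = rng or np.random.default_rng(0)
    logits = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-8)
    if top_k > 0:
        kth = np.sort(logits)[-top_k]                 # k-th largest logit (ties keep extras)
        logits = np.where(logits < kth, -np.inf, logits)
    probs = np.exp(logits - np.max(logits))           # softmax over surviving logits
    probs /= probs.sum()
    if top_p < 1.0:
        order = np.argsort(probs)[::-1]               # most probable first
        cum = np.cumsum(probs[order])
        cutoff = order[cum > top_p][1:]               # keep the token that crosses top_p
        probs[cutoff] = 0.0
        probs /= probs.sum()
    nonzero = probs[probs > 0]
    entropy = -np.sum(nonzero * np.log(nonzero))      # uncertainty of the final distribution
    token = rng.choice(len(probs), p=probs)
    return token, entropy
```

With `top_k=1` the distribution collapses to the argmax and the entropy drops to zero, which is the kind of effect the simulator is built to make visible.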