A privacy-first terminal built with Tauri that runs AI models locally — no telemetry, no cloud, no data leaving your machine.
Type commands naturally — the AI translates them:
"show me all large files" → find . -type f -size +100M -exec ls -lh {} \;
"what's using the most CPU?" → top -o cpu
"check git status and stage changes" → git status && git add .
AI slash commands:
- `/explain <command>` — explain what any command does
- `/fix` — analyse the last error and suggest a fix
- `/optimize` — suggest a more efficient alternative
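Before any of these reach the local model, the input has to be recognised as a slash command. A minimal sketch of that dispatch step (the enum and parser here are illustrative, not pH7Console's actual types):

```rust
// Hypothetical dispatcher for the three AI slash commands above.
#[derive(Debug, PartialEq)]
enum SlashCommand {
    Explain(String), // /explain <command>
    Fix,             // /fix
    Optimize,        // /optimize
}

fn parse_slash(input: &str) -> Option<SlashCommand> {
    let input = input.trim();
    // /explain carries an argument; the other two are bare keywords.
    if let Some(rest) = input.strip_prefix("/explain ") {
        Some(SlashCommand::Explain(rest.trim().to_string()))
    } else {
        match input {
            "/fix" => Some(SlashCommand::Fix),
            "/optimize" => Some(SlashCommand::Optimize),
            _ => None, // anything else goes to the normal command path
        }
    }
}

fn main() {
    assert_eq!(
        parse_slash("/explain tar -xzf file.tar.gz"),
        Some(SlashCommand::Explain("tar -xzf file.tar.gz".into()))
    );
    assert_eq!(parse_slash("/fix"), Some(SlashCommand::Fix));
}
```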
- Natural Language Commands — Type plain English; get shell commands
- Smart Completions — Context-aware Tab suggestions
- Error Assistance — AI automatically suggests fixes when commands fail
- Local LLM Processing — All inference runs on your machine; nothing leaves it
- Pattern Learning — Learns your workflows and adapts suggestions over time
- Multi-Session — `Cmd+T` new session, `Cmd+W` close, `Cmd+1–9` switch
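The Pattern Learning feature above can be sketched, under the assumption of a simple frequency-based approach (the real implementation may weight far richer context), as a tracker that ranks completions by how often you have run them:

```rust
// Minimal frequency-based sketch of pattern learning (assumed approach,
// not pH7Console's actual implementation).
use std::collections::HashMap;

#[derive(Default)]
struct PatternTracker {
    counts: HashMap<String, u32>,
}

impl PatternTracker {
    // Record one execution of a command.
    fn record(&mut self, cmd: &str) {
        *self.counts.entry(cmd.to_string()).or_insert(0) += 1;
    }

    // Suggest previously seen commands matching a prefix, most-used first.
    fn suggest(&self, prefix: &str) -> Vec<String> {
        let mut hits: Vec<_> = self
            .counts
            .iter()
            .filter(|(cmd, _)| cmd.starts_with(prefix))
            .collect();
        hits.sort_by(|a, b| b.1.cmp(a.1));
        hits.into_iter().map(|(cmd, _)| cmd.clone()).collect()
    }
}

fn main() {
    let mut t = PatternTracker::default();
    for cmd in ["git status", "git status", "git stash", "ls -la"] {
        t.record(cmd);
    }
    assert_eq!(t.suggest("git st"), vec!["git status", "git stash"]);
}
```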
- Rust 1.70+ — rustup.rs
- Node.js 18+ — nodejs.org
- RAM 4GB minimum (8GB recommended for AI models)
- Storage ~5GB free (for models and dependencies)
git clone https://github.com/EfficientTools/pH7Console.git
cd pH7Console
chmod +x setup.sh && ./setup.sh
npm run tauri dev

# Production build
npm run tauri build
# Universal macOS binary (Intel + Apple Silicon)
npm run tauri build -- --target universal-apple-darwin

Build outputs land in `src-tauri/target/release/bundle/`.
npm run lint # TypeScript/React linting
npm run type-check # TypeScript type checking
npm test # Frontend tests
cd src-tauri && cargo test # Rust backend tests
cd src-tauri && cargo fmt # Format Rust code
cd src-tauri && cargo clippy # Lint Rust code
npm run test:e2e # Integration tests

| Model | Size | RAM | Speed | Best for |
|---|---|---|---|---|
| Phi-3 Mini | 3.8 GB | 4–6 GB | 200–500 ms | Complex reasoning, code generation |
| Llama 3.2 1B | 1.2 GB | 2–3 GB | 100–200 ms | General commands, explanations |
| TinyLlama | 1.1 GB | 1.5–2 GB | 50–100 ms | Real-time completions |
| CodeQwen | 1.5 GB | 2–4 GB | 150–300 ms | Programming tasks, code analysis |
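One way to use the table above is to pick a model from available memory. The thresholds below are read off the RAM column, but the selection policy itself is an assumption for illustration, not pH7Console's documented behaviour:

```rust
// Hypothetical model picker based on the RAM figures in the table above.
#[derive(Debug, PartialEq)]
enum Model {
    Phi3Mini,   // 4–6 GB RAM, complex reasoning
    Llama32_1B, // 2–3 GB RAM, general commands
    TinyLlama,  // 1.5–2 GB RAM, real-time completions
}

fn pick_model(free_ram_gb: f32) -> Option<Model> {
    if free_ram_gb >= 6.0 {
        Some(Model::Phi3Mini)
    } else if free_ram_gb >= 3.0 {
        Some(Model::Llama32_1B)
    } else if free_ram_gb >= 2.0 {
        Some(Model::TinyLlama)
    } else {
        None // not enough memory to run any model comfortably
    }
}

fn main() {
    assert_eq!(pick_model(8.0), Some(Model::Phi3Mini));
    assert_eq!(pick_model(2.5), Some(Model::TinyLlama));
}
```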
- Frontend: React 18 + TypeScript + Tailwind CSS
- Backend: Rust + Tauri 2.0
- AI Runtime: Candle (Rust-native ML framework)
- Terminal: xterm.js + cross-platform PTY
Made with ❤️ by Pierre-Henry Soria. A passionate, enthusiastic problem-solving engineer. Also a true cheese 🧀, ristretto ☕️, and dark chocolate lover! 😋
pH7Console is generously distributed under the MIT license 🎉 Wishing you a happy, productive time! 🤠