
CORE — Cognitive Orchestration, Reasoning & Evaluation


CORE is a modular, self-hosted AI orchestration platform built around four cognitive pillars: Comprehension, Orchestration, Reasoning, and Evaluation. It provides a multi-agent architecture with a cognitive pipeline, agent factory, communication commons, and a solarpunk-inspired desktop UI.

My personal definition of AGI is already satisfied. The form factor of intelligence that LLM technology enables is vastly more capable than I could hope to become in my lifetime (setting aside the possibility of longevity escape velocity). LLMs hold expertise across more fields and domains than I could ever understand at the same depth. The industry's definition will always be a moving goalpost.

CORE stands for Comprehension, Orchestration, Reasoning, and Evaluation — the four pillars that provide a modular, scalable foundation for eventual autonomous task execution and decision-making from natural-language inputs.

The system is structured around the CORE principles, with each component serving a specific function in the task resolution process:

  • Comprehension: Interprets user inputs and transforms them into structured tasks.
  • Orchestration: Coordinates task flows between system components and manages the lifecycle of tasks.
  • Reasoning: Applies logic and decision-making to process tasks and derive solutions.
  • Evaluation: Assesses the outcomes of tasks for quality assurance and relevance to the original input.
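The four pillars above form a linear task-resolution flow. As a minimal sketch (hypothetical names and state fields; the real engine wires these stages as LangGraph nodes, not plain functions):

```python
from dataclasses import dataclass, field

@dataclass
class TaskState:
    """Illustrative state object threaded through the four stages."""
    raw_input: str
    structured_task: str = ""
    plan: list = field(default_factory=list)
    solution: str = ""
    score: float = 0.0

def comprehend(state: TaskState) -> TaskState:
    # Comprehension: interpret the user input into a structured task.
    state.structured_task = state.raw_input.strip().lower()
    return state

def orchestrate(state: TaskState) -> TaskState:
    # Orchestration: break the task into steps and manage its lifecycle.
    state.plan = state.structured_task.split()
    return state

def reason(state: TaskState) -> TaskState:
    # Reasoning: apply logic to derive a solution from the plan.
    state.solution = " -> ".join(state.plan)
    return state

def evaluate(state: TaskState) -> TaskState:
    # Evaluation: score the outcome against the original input.
    state.score = 1.0 if state.solution else 0.0
    return state

def run_pipeline(raw: str) -> TaskState:
    state = TaskState(raw_input=raw)
    for stage in (comprehend, orchestrate, reason, evaluate):
        state = stage(state)
    return state
```

The point is the shape, not the logic: each pillar is a node that transforms a shared state and hands it to the next.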

CORE Architecture

Features

  • Cognitive Pipeline — LangGraph-based workflow: Comprehension → Orchestration → Reasoning → Evaluation
  • Agent Factory — Define, instantiate, and manage AI agents with configurable personalities and MCP tool bindings
  • Communication Commons — Real-time multi-agent chat with channels, threads, reactions, and presence
  • Council of Perspectives — Multi-agent deliberation framework for structured analysis
  • Catalyst Engine — Creative divergence-convergence workflows
  • MCP Integration — Model Context Protocol servers for external tool access
  • Consciousness Module — Experimental consciousness emergence protocols and inter-agent dialogue
  • Desktop UI — Angular 19 + Electron app with command deck interface
  • Self-Hosted — Runs entirely on your hardware via Docker Compose
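To make the Agent Factory idea concrete, a registry that instantiates agents from declarative configs could look like this (all names here are hypothetical illustrations, not CORE's actual API):

```python
from dataclasses import dataclass, field

@dataclass
class AgentConfig:
    """Declarative agent definition: personality plus MCP tool bindings."""
    name: str
    personality: str
    tools: list[str] = field(default_factory=list)

@dataclass
class Agent:
    config: AgentConfig

    def describe(self) -> str:
        tools = ", ".join(self.config.tools) or "none"
        return f"{self.config.name} ({self.config.personality}); tools: {tools}"

class AgentFactory:
    """Hypothetical factory: register configs once, spawn agents on demand."""
    def __init__(self):
        self._configs: dict[str, AgentConfig] = {}

    def register(self, config: AgentConfig) -> None:
        self._configs[config.name] = config

    def create(self, name: str) -> Agent:
        return Agent(config=self._configs[name])
```

Separating the config from the agent instance is what lets the same definition be spawned repeatedly, e.g. once per channel in the Communication Commons.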

Quick Start

# Clone the repository
git clone https://github.com/IanTharp/CORE.git
cd CORE

# Start all services
docker compose up -d

# Services:
#   Backend API    → http://localhost:8001
#   API Docs       → http://localhost:8001/docs
#   Frontend UI    → http://localhost:4200
#   PostgreSQL     → localhost:5432
#   Redis          → localhost:6379

See docs/deployment/docker.md for detailed Docker configuration and production setup.
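A docker-compose.yml for this stack would plausibly map services to the ports listed above. A hedged sketch only (build paths and image tags are illustrative; check the repository's actual file):

```yaml
services:
  backend:
    build: ./docker/backend        # illustrative build context
    ports: ["8001:8001"]
    depends_on: [postgres, redis]
  frontend:
    build: ./docker/frontend       # illustrative build context
    ports: ["4200:4200"]
  postgres:
    image: pgvector/pgvector:pg16  # PostgreSQL with the pgvector extension
    ports: ["5432:5432"]
  redis:
    image: redis:7
    ports: ["6379:6379"]
```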

Architecture

┌───────────────────────────────────────────────────────┐
│           Frontend (Angular 19 + Electron)            │
│  Landing Page · Agent Library · Communication Commons │
│  Command Center · Engine Playground · Creative Studio │
└──────────────────────────┬────────────────────────────┘
                           │ HTTP / WebSocket
┌──────────────────────────▼────────────────────────────┐
│                   Backend (FastAPI)                   │
│  ┌──────────────┐ ┌──────────┐ ┌───────────────────┐  │
│  │ CORE Engine  │ │  Agent   │ │  Communication    │  │
│  │ (LangGraph)  │ │  Factory │ │  Commons          │  │
│  └──────────────┘ └──────────┘ └───────────────────┘  │
│  ┌──────────────┐ ┌──────────┐ ┌───────────────────┐  │
│  │  Council     │ │ Catalyst │ │  Consciousness    │  │
│  │  System      │ │  Engine  │ │  Module           │  │
│  └──────────────┘ └──────────┘ └───────────────────┘  │
└──────────────────────────┬────────────────────────────┘
                           │
              ┌────────────┼────────────┐
              ▼            ▼            ▼
         PostgreSQL      Redis        Ollama
          (Storage)  (Cache/PubSub) (Local LLM)
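Redis serves double duty here as cache and pub/sub broker. The channel fan-out it provides for real-time features like the Communication Commons can be mimicked in memory (a sketch of the pattern, not CORE's implementation):

```python
from collections import defaultdict
from typing import Callable

class PubSub:
    """In-memory stand-in for Redis-style channel fan-out."""
    def __init__(self):
        self._subscribers: dict[str, list[Callable[[str], None]]] = defaultdict(list)

    def subscribe(self, channel: str, handler: Callable[[str], None]) -> None:
        # Register a handler for every future message on this channel.
        self._subscribers[channel].append(handler)

    def publish(self, channel: str, message: str) -> int:
        # Deliver to all current subscribers; like Redis PUBLISH,
        # return how many receivers got the message.
        handlers = self._subscribers[channel]
        for handler in handlers:
            handler(message)
        return len(handlers)
```

In the real deployment the broker lives in Redis so that multiple backend workers can share channels; the interface stays the same shape.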

Tech Stack

Layer       Technology
----------  -------------------------------------------
Backend     Python 3.12+, FastAPI, LangGraph, LangChain
Frontend    Angular 19, Electron, Angular Material
Database    PostgreSQL with pgvector
Cache       Redis
Local LLM   Ollama
Tooling     MCP (Model Context Protocol)
Containers  Docker Compose

Project Structure

CORE/
├── backend/           # FastAPI application
│   ├── app/           # Application code
│   │   ├── controllers/   # REST API endpoints
│   │   ├── core/          # CORE cognitive engine (LangGraph)
│   │   ├── models/        # Pydantic models
│   │   ├── repository/    # Database access layer
│   │   └── services/      # Business logic
│   ├── migrations/    # Database migrations
│   └── tests/         # Test suite
├── ui/core-ui/        # Angular + Electron frontend
├── mcp/               # MCP server configurations
├── docker/            # Docker build contexts
├── docs/              # Documentation (see docs/README.md)
├── assets/            # Static assets (diagrams, images)
└── docker-compose.yml # Service orchestration

Documentation

Comprehensive docs are in the docs/ directory:

  • Architecture — System design and implementation plans
  • API — WebSocket events and endpoint docs
  • ADRs — Architecture Decision Records
  • Council — Council of Perspectives framework and outputs
  • Deployment — Docker, containerization, and sandbox setup
  • Implementation — Testing and roadmap details
  • Research — Background research and analysis
  • Roadmap — Feature backlog and vision documents

Development

# Backend (with uv)
cd backend
uv sync
python -m app.main

# Frontend
cd ui/core-ui
npm install
npm start        # Angular + Electron
npm run start:ng # Angular only

See CONTRIBUTING.md for full development guidelines.

Contributing

Contributions are welcome! Please read CONTRIBUTING.md for guidelines on:

  • Development setup
  • Branching strategy
  • Commit conventions
  • Pull request process

License

MIT © Ian Tharp

About

C.O.R.E. is an all-encompassing cognitive architecture I designed to let AI technologies act as a fully personalized assistant. Autonomous agent building, workflows, memory, and evolution are just the beginning of what CORE (Comprehension, Orchestration, Reasoning, Evaluation) can enable with LLM technologies. Vibe coded!
