
A local-first agent operating system built for people who want AI on their own terms. Open-source. Offline-capable. Privacy by architecture.
Cognithor is an agent operating system. It runs entirely on your machine, uses the local LLM of your choice, and manages the full stack a serious agent needs — memory, tool execution, channel adapters, safety gates, and a persistent knowledge graph.
The project ships 145 tools across twelve MCP modules, eighteen channel adapters (Telegram, Slack, Discord, iMessage, Matrix and more), nineteen supported LLM providers (local via Ollama / LM Studio / vLLM / llama.cpp, cloud via OpenAI / Anthropic / Google / Groq / Mistral), and a six-tier memory stack that distinguishes identity, core, knowledge, episodic, entities, and tactical storage.
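The six memory tiers can be pictured as distinct stores that incoming items are routed between. The tier names below come from the project description; the routing heuristic itself is a hypothetical sketch, not Cognithor's actual code.

```python
from enum import Enum, auto

class MemoryTier(Enum):
    """The six tiers named in the Cognithor memory stack."""
    IDENTITY = auto()   # who the agent is: persona, values
    CORE = auto()       # long-lived facts about the user
    KNOWLEDGE = auto()  # indexed documents and notes
    EPISODIC = auto()   # time-stamped conversation history
    ENTITIES = auto()   # people / places / things in the graph
    TACTICAL = auto()   # short-lived working state for tasks

def route(item: dict) -> MemoryTier:
    # Hypothetical routing: pick a tier from a "kind" tag on the item;
    # anything unrecognized falls back to short-lived tactical storage.
    tags = {
        "persona": MemoryTier.IDENTITY,
        "fact": MemoryTier.CORE,
        "doc": MemoryTier.KNOWLEDGE,
        "episode": MemoryTier.EPISODIC,
        "entity": MemoryTier.ENTITIES,
        "scratch": MemoryTier.TACTICAL,
    }
    return tags.get(item.get("kind"), MemoryTier.TACTICAL)

print(route({"kind": "doc"}).name)  # KNOWLEDGE
```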
Every request walks through the PGE Trinity: a Planner that decomposes intent into tool calls, a Gatekeeper that classifies risk against a four-level policy (green / yellow / orange / red), and an Executor that runs approved calls in a sandbox. Yellow risks require human-in-the-loop approval. Red risks escalate to a pending-review queue.
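The gating flow above can be sketched in a few lines. This is an illustrative model under stated assumptions, not Cognithor's real API: `classify` and `approve` are caller-supplied callbacks, and the handling of orange (which the source does not spell out) is assumed to fail closed.

```python
from enum import Enum

class Risk(Enum):
    GREEN = 1    # auto-approved
    YELLOW = 2   # requires human-in-the-loop approval
    ORANGE = 3   # named in the policy; handling assumed here
    RED = 4      # escalates to the pending-review queue

def gate(call: dict, classify, approve) -> str:
    """Classify a planned tool call, then approve, deny, or queue it."""
    risk = classify(call)
    if risk is Risk.GREEN:
        return "execute"
    if risk is Risk.YELLOW:
        # Human-in-the-loop: only run if a person signs off.
        return "execute" if approve(call) else "deny"
    if risk is Risk.RED:
        return "pending_review"
    return "deny"  # ORANGE and anything unclassified: fail closed
```

Failing closed on unknown classifications is the conservative default for a safety gate; the Executor would only ever see calls that returned "execute".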
The Evolution Engine runs in idle time. It re-indexes new content, summarizes long episodes, updates the entity graph, and surfaces meta-learnings back to the user. No cloud roundtrip. No telemetry. No subscription pricing.
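The idle-time loop can be sketched as a tick that runs maintenance jobs only when the machine is otherwise unoccupied. The four task names mirror the jobs listed above, but their bodies and the `is_idle` check are placeholders, not the engine's real implementation.

```python
# Hypothetical stand-ins for the engine's real maintenance jobs.
def reindex_new_content(): pass
def summarize_long_episodes(): pass
def update_entity_graph(): pass
def surface_meta_learnings(): pass

TASKS = [reindex_new_content, summarize_long_episodes,
         update_entity_graph, surface_meta_learnings]

def idle_tick(is_idle, tasks=TASKS):
    """Run one round of maintenance, but only while the machine is idle.
    Everything happens in-process: no network calls, no telemetry."""
    ran = []
    if is_idle():
        for task in tasks:
            task()
            ran.append(task.__name__)
    return ran
```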
It is not a chatbot. It is a personal operating system for AI.
Open-weight LLMs became genuinely useful on consumer hardware. Qwen, Llama, Mistral proved that a solo developer could host a serious model at home. The question was no longer whether, but how to build an operating system around them.
First commits. The PGE (Planner / Gatekeeper / Executor) architecture was established. A six-layer memory stack was built. Eighteen channel adapters went online. The project was originally named Jarvis.
145 tools across 12 MCP modules, 19 LLM providers, a Flutter Command Center, the Evolution Engine with autonomous learning and idle-loop meta-learning. The project was renamed Cognithor to make room for paid agent packs.
First paid pack shipped: Reddit Lead Hunter Pro. The pack infrastructure landed — MDX schemas, signed distribution, Gumroad checkout. A 70/30 creator marketplace was announced for Q4.
“Make the most capable AI assistant in the world run entirely on one person's computer.”
— the project, 2024
A public contract between the project and its users. These twelve lines are the non-negotiables — the things that would be broken first if the project ever turned into a SaaS or a venture play.
Every commit is visible. Every test runs in CI. Every release is signed.
Transparent revenue model. No venture capital, no ads, no telemetry data sold to third parties. The core is free; packs fund the maintainer; creators get the majority split.
The Cognithor agent OS is Apache 2.0. Clone it, fork it, ship your own distribution. You never pay to run it and never will.
Specialized agent packs are paid one-time purchases. They fund the maintainer, the test infrastructure, and the documentation effort.
Q4 2026: third-party creators will publish their own packs with a 70/30 revenue share. The project becomes sustainable without venture capital.
Cognithor is maintained by Alexander Söllner, an independent developer based in Germany, and contributors across the open-source community.
Source code, releases, the full commit history, and the issue tracker where bug reports land.
Community Q&A, pack ideas, release notifications, and informal conversation about the project.
Direct line for partnership, press, security disclosures, and anything that does not fit in a public thread.