CORE: A Unified Memory System for Your AI Applications
Summary
CORE by RedPlanetHQ is an open-source project designed to provide a persistent, unified memory layer for AI applications. It leverages a temporal knowledge graph to prevent context loss across various AI tools, ensuring LLMs retain past conversations, preferences, and project history. This system significantly enhances AI interactions by making context available across different sessions and platforms.
Introduction
CORE, developed by RedPlanetHQ, is an open-source project that builds a personal memory system for your AI applications. Large Language Models (LLMs) typically lose context when you switch tools or end a conversation; CORE addresses this by acting as a unified memory layer that keeps persistent context available to your LLM. This eliminates the need to re-explain project details, prevents lost conversations, and ensures your AI tools remember past discussions, preferences, and project history. At its heart, CORE uses a temporal knowledge graph to store and retrieve information.
Installation
CORE offers flexible deployment options, including self-hosting for complete data control and a convenient cloud service.
Self-Hosting:
For those who prefer to manage their own infrastructure, CORE can be self-hosted using Docker and Docker Compose. You'll need Docker (20.10.0+) and an OpenAI API key.
- Clone the repository: git clone https://github.com/RedPlanetHQ/core.git && cd core
- Configure OPENAI_API_KEY in core/.env.
- Start the service: docker-compose up -d
A complete self-hosting guide is available in the official documentation.
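Once the containers are up, you can sanity-check that the service is reachable. The sketch below is a minimal example, assuming the web service is exposed on http://localhost:3000; the actual port is defined in the repository's docker-compose.yml, so adjust accordingly.

```typescript
// verify-core.ts — quick reachability check for a self-hosted CORE instance.
// Assumption: the web service listens on http://localhost:3000 (check docker-compose.yml).
const CORE_URL = process.env.CORE_URL ?? "http://localhost:3000";

async function main() {
  try {
    const res = await fetch(CORE_URL);
    console.log(`CORE responded with HTTP ${res.status} at ${CORE_URL}`);
  } catch (err) {
    console.error(`Could not reach CORE at ${CORE_URL}:`, err);
    process.exitCode = 1;
  }
}

main();
```

Run it with a TypeScript runner such as npx tsx verify-core.ts (Node 18+ provides the global fetch) after docker-compose reports the containers as running.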
CORE Cloud:
Alternatively, you can use CORE Cloud for an instant setup without infrastructure management. Simply sign up at core.heysol.ai, visualize your memory graph, and connect to your preferred tools.
CORE integrates with a wide range of AI tools and platforms, including:
- CLIs: Claude Code CLI, Codex CLI, Gemini CLI, Copilot CLI
- IDEs: Cursor, VS Code, VS Code Insiders, Windsurf, Zed
- Coding Agents: Amp, Augment Code, Cline, Kilo Code, Kiro, Qwen Coder, Roo Code, Opencode, Copilot Coding Agent, Qodo Gen
- Terminals: Warp, Crush
- Desktop Apps: Claude Desktop, ChatGPT (via browser extension), Gemini (via browser extension), Perplexity Desktop
- Development Tools: Factory, Rovo Dev CLI, Trae
Refer to the official documentation for detailed installation instructions for each integration.
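Because these integrations all speak MCP, any MCP client can discover CORE's memory tools the same way. The sketch below uses the official TypeScript MCP SDK (@modelcontextprotocol/sdk) to connect to a CORE MCP endpoint and list the tools it exposes; the endpoint URL and authentication handling are placeholders, so consult the CORE documentation for the real values.

```typescript
// list-core-tools.ts — discover the tools a CORE MCP endpoint exposes.
// The endpoint URL below is a placeholder; CORE Cloud and self-hosted deployments
// publish their own MCP URLs (see the official docs).
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  const endpoint = new URL(process.env.CORE_MCP_URL ?? "https://example.invalid/mcp");
  const transport = new StreamableHTTPClientTransport(endpoint);
  const client = new Client({ name: "core-example-client", version: "0.1.0" });

  await client.connect(transport);
  const { tools } = await client.listTools();
  console.log("Tools exposed by CORE:", tools.map((t) => t.name));

  await client.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```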
Examples
CORE enhances your AI workflow by allowing you to interact with your persistent memory directly within your prompts.
Searching Core Memory:
To retrieve relevant context, simply add "search core memory" to your prompt:
What were the architecture decisions we made for the payment service last week? Search core memory
Adding to Core Memory:
To store new information or preferences, use "add to core memory":
Set up authentication for my API service using JWT tokens. Remember my preference for TypeScript strict mode. Add to core memory
CORE builds a temporal knowledge graph that remembers everything, making it available across various MCP-compatible tools.
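Programmatically, the same "search core memory" behaviour corresponds to calling a memory search tool over MCP. The sketch below is a hypothetical illustration: the tool name memory_search and its query argument are assumptions made for this example, not names confirmed by the CORE docs; list the tools first (as in the previous sketch) to find the real ones.

```typescript
// search-memory.ts — call a memory search tool over MCP.
// Both the endpoint URL and the tool name/arguments are assumptions for illustration;
// run listTools() against your CORE endpoint to see the actual tool names and schemas.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  const endpoint = new URL(process.env.CORE_MCP_URL ?? "https://example.invalid/mcp");
  const client = new Client({ name: "core-example-client", version: "0.1.0" });
  await client.connect(new StreamableHTTPClientTransport(endpoint));

  // Hypothetical tool name and arguments — adjust to what listTools() reports.
  const result = await client.callTool({
    name: "memory_search",
    arguments: { query: "architecture decisions for the payment service last week" },
  });

  console.log(JSON.stringify(result, null, 2));
  await client.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```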
Why Use CORE?
CORE stands out for several compelling reasons, addressing critical challenges in AI application usage:
- Persistent Memory Across Tools: Your conversations and decisions are unified across platforms like ChatGPT, Cursor, and Claude, creating a single, coherent memory graph.
- Temporal Knowledge Graph: Beyond simple storage, CORE tracks the "who, what, when, and why" behind every piece of information, offering full provenance and showing how decisions evolve over time.
- High Benchmark Accuracy: Achieves 88.24% average accuracy on the LoCoMo benchmark, significantly outperforming competitors in multi-hop and temporal reasoning tasks.
- Your Data, Your Control: As an open-source and self-hostable solution, CORE ensures you maintain complete ownership and control over your personal memory data.
Key Features include:
- Unified, Portable Memory: Seamlessly add and recall memory across numerous CLIs, IDEs, and desktop applications via MCP.
- Temporal + Reified Knowledge Graph: Understand the story behind every fact with rich relationships and full provenance.
- Browser Extension: Save conversations and content from popular platforms like ChatGPT, Grok, Gemini, Twitter, YouTube, and any webpage directly into your CORE memory.
- Chat with Memory: Interact with your knowledge graph by asking questions and receiving instant insights.
- Auto-Sync from Apps: Automatically capture relevant context from integrations like Linear, Slack, Notion, and GitHub.
- MCP Integration Hub: Connect apps like Linear, Slack, and GitHub to CORE once, then access their tools across all MCP clients with a single URL.
Links
- GitHub Repository: https://github.com/RedPlanetHQ/core
- Official Website: https://heysol.ai
- Documentation: https://docs.getcore.me
- Discord Community: https://discord.gg/YGUZcvDjUa