OpenMemory
OpenMemory is your personal memory layer for LLMs - private, portable, and open-source. Your memories live locally, giving you complete control over your data. Build AI applications with personalized memories while keeping your data secure.
Prerequisites
- Docker and Docker Compose
- Python 3.9+ (for backend development)
- Node.js (for frontend development)
- OpenAI API Key (required for LLM interactions; run cp api/.env.example api/.env, then set OPENAI_API_KEY to your key)
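The environment setup above can be sketched as a short shell session. This is a self-contained illustration: it creates a stand-in api/.env.example so the snippet runs anywhere; in the real repository that file already exists and you would skip the first two lines. The key value shown is a placeholder, not a real key.

```shell
# Stand-in for the repo's example file (skip these two lines in the real repo).
mkdir -p api
printf 'OPENAI_API_KEY=your-key-here\n' > api/.env.example

# Copy the example env file, then set your own OpenAI API key.
cp api/.env.example api/.env
sed -i 's/^OPENAI_API_KEY=.*/OPENAI_API_KEY=sk-your-key-here/' api/.env
```

The server reads api/.env at startup, so set the key before running the quickstart commands below.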
Quickstart
You can run the project using the following two commands:
make build # builds the mcp server and ui
make up # runs openmemory mcp server and ui
After running these commands, you will have:
- OpenMemory MCP server running at: http://localhost:8765 (API documentation available at http://localhost:8765/docs)
- OpenMemory UI running at: http://localhost:3000
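After the containers start, a quick way to confirm both services are reachable is a small curl-based check. This is a hedged sketch assuming the default ports from the quickstart above and that curl is installed; it only reports reachability, nothing more.

```shell
# Print "up" if the URL answers with a successful HTTP status, else "not reachable".
check() {
  if curl -sf -o /dev/null --max-time 2 "$2"; then
    echo "$1: up"
  else
    echo "$1: not reachable"
  fi
}

check "MCP server" http://localhost:8765/docs
check "UI"         http://localhost:3000
```

If either line reports "not reachable", re-check the output of make up and that ports 8765 and 3000 are free on your machine.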
Project Structure
- api/ - Backend APIs + MCP server
- ui/ - Frontend React application
Contributing
We are a team of developers passionate about the future of AI and open-source software. With years of experience in both fields, we believe in the power of community-driven development and are excited to build tools that make AI more accessible and personalized.
We welcome all forms of contributions:
- Bug reports and feature requests
- Documentation improvements
- Code contributions
- Testing and feedback
- Community support
How to contribute:
- Fork the repository
- Create your feature branch (git checkout -b openmemory/feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin openmemory/feature/amazing-feature)
- Open a Pull Request
Join us in building the future of AI memory management! Your contributions help make OpenMemory better for everyone.