---
title: Quickstart
icon: "terminal"
iconType: "solid"
---

<Snippet file="paper-release.mdx" />

## 🚀 Hosted OpenMemory MCP Now Available!

#### Sign Up Now - [app.openmemory.dev](https://app.openmemory.dev)

Everything you love about OpenMemory MCP, but with zero setup.

✅ Works with all MCP-compatible tools (Claude Desktop, Cursor...)
✅ Same standard memory ops: `add_memories`, `search_memory`, etc.
✅ One-click provisioning, no Docker required
✅ Powered by Mem0

Add shared, persistent, low-friction memory to your MCP-compatible clients in seconds.

### 🌟 Get Started Now

**Sign up and get your access key at [app.openmemory.dev](https://app.openmemory.dev)**

Example installation: `npx @openmemory/install --client claude --env OPENMEMORY_API_KEY=your-key`

## Getting Started with Hosted OpenMemory

The fastest way to get started is with our hosted version - no setup required:

### 1. Get your API key

Visit [app.openmemory.dev](https://app.openmemory.dev) to sign up and get your `OPENMEMORY_API_KEY`.

### 2. Install and connect to your preferred client

Example commands (replace `your-key` with your actual API key):

For Claude Desktop: `npx @openmemory/install --client claude --env OPENMEMORY_API_KEY=your-key`

For Cursor: `npx @openmemory/install --client cursor --env OPENMEMORY_API_KEY=your-key`

For Windsurf: `npx @openmemory/install --client windsurf --env OPENMEMORY_API_KEY=your-key`

That's it! Your AI client now has persistent memory across sessions.

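If you would rather not paste the key into every command, you can export it once and let the shell substitute it (a minimal sketch; the `--env` flag simply receives whatever value the shell expands):

```bash
# Store the key once in your shell session (add this line to ~/.bashrc or ~/.zshrc to persist it)
export OPENMEMORY_API_KEY=your-key

# Reuse it for any client; the shell expands the variable before npx runs
npx @openmemory/install --client cursor --env OPENMEMORY_API_KEY=$OPENMEMORY_API_KEY
```
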
## Local Setup (Self-Hosted)

Prefer to run OpenMemory locally? Follow the instructions below for a self-hosted setup.

## OpenMemory Easy Setup

### Prerequisites

- Docker
- OpenAI API Key

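Before continuing, you can confirm the Docker prerequisite is satisfied with the standard Docker CLI commands (a quick sanity check, nothing OpenMemory-specific):

```bash
# Confirm the Docker CLI is installed
docker --version

# Confirm the Docker daemon is running (this errors out if it is not)
docker info
```
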
You can quickly run OpenMemory with the following command:

```bash
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | bash
```

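If you prefer not to pipe a remote script straight into your shell, you can download and review it first (same script, just inspected before execution):

```bash
# Download the script, review it, then run it
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh -o run.sh
less run.sh
bash run.sh
```
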
You should set the `OPENAI_API_KEY` as a global environment variable:

```bash
export OPENAI_API_KEY=your_api_key
```

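Note that an `export` only lasts for the current shell session. To make the key available in future sessions, append it to your shell profile (a minimal sketch, assuming bash; use `~/.zshrc` for zsh):

```bash
# Persist the key for new shell sessions
echo 'export OPENAI_API_KEY=your_api_key' >> ~/.bashrc
source ~/.bashrc
```
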
You can also pass the `OPENAI_API_KEY` as a parameter to the script:

```bash
curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | OPENAI_API_KEY=your_api_key bash
```

This will start the OpenMemory server and the OpenMemory UI. Note that deleting the container also deletes the memory store.
We suggest following the instructions below to set up OpenMemory on your local machine with a more persistent memory store.

## Setting Up OpenMemory

Getting started with OpenMemory is straightforward and takes just a few minutes to set up on your local machine. Follow these steps:

### 1. Clone the Repository

```bash
# Clone the repository
git clone https://github.com/mem0ai/mem0.git
cd mem0/openmemory
```

### 2. Set Up Environment Variables

Before running the project, you need to configure environment variables for both the API and the UI.

You can do this in one of the following ways:

- **Manually**:
  Create a `.env` file in each of the following directories:
  - `/api/.env`
  - `/ui/.env`

- **Using `.env.example` files**:
  Copy and rename the example files:

  ```bash
  cp api/.env.example api/.env
  cp ui/.env.example ui/.env
  ```

- **Using the Makefile** (if supported):
  Run:

  ```bash
  make env
  ```

- #### Example `/api/.env`

  ```bash
  OPENAI_API_KEY=sk-xxx
  USER=<user-id> # The user id you want to associate the memories with
  ```

- #### Example `/ui/.env`

  ```bash
  NEXT_PUBLIC_API_URL=http://localhost:8765
  NEXT_PUBLIC_USER_ID=<user-id> # Same as the USER id set in the API environment file
  ```

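The user id must be identical in both files. If you find it convenient, you can generate both files from a single variable so the values cannot drift apart (a minimal sketch, assuming you are in `mem0/openmemory` and want the user id `alice`; substitute your real OpenAI key):

```bash
# Write (or overwrite) both .env files with a matching user id
USER_ID=alice

cat > api/.env <<EOF
OPENAI_API_KEY=sk-xxx
USER=$USER_ID
EOF

cat > ui/.env <<EOF
NEXT_PUBLIC_API_URL=http://localhost:8765
NEXT_PUBLIC_USER_ID=$USER_ID
EOF
```
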
### 3. Build and Run the Project

You can run the project using the following two commands:

```bash
make build # builds the mcp server and ui
make up # runs openmemory mcp server and ui
```

After running these commands, you will have:

- OpenMemory MCP server running at: http://localhost:8765 (API documentation available at http://localhost:8765/docs)
- OpenMemory UI running at: http://localhost:3000

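To verify that both services came up, you can list the running containers and probe the two endpoints above (the exact container names depend on the compose setup, so treat the `docker ps` output as indicative; a successful HTTP response means the service is reachable):

```bash
# The OpenMemory API and UI containers should appear here
docker ps

# The MCP server should answer on port 8765...
curl -I http://localhost:8765/docs

# ...and the UI on port 3000
curl -I http://localhost:3000
```
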
#### UI not working on http://localhost:3000?

If the UI does not start properly on http://localhost:3000, try running it manually:

```bash
cd ui
pnpm install
pnpm dev
```

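If `pnpm` itself is missing, one common way to get it is via npm before retrying the manual start:

```bash
# Install pnpm globally, then rerun pnpm install / pnpm dev
npm install -g pnpm
```
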
You can configure the MCP client using the following command (replace `username` with your username):

```bash
npx @openmemory/install local "http://localhost:8765/mcp/cursor/sse/username" --client cursor
```

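For example, if you set `USER=alice` in `api/.env`, the Cursor connection would presumably use that same id in the URL (`alice` is a hypothetical value, shown only to make the substitution concrete):

```bash
# Same command as above with the placeholder filled in
npx @openmemory/install local "http://localhost:8765/mcp/cursor/sse/alice" --client cursor
```
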
The OpenMemory dashboard will be available at http://localhost:3000. From here, you can view and manage your memories, as well as check the connection status of your MCP clients.

Once set up, OpenMemory runs locally on your machine, ensuring all your AI memories remain private and secure while being accessible from any compatible MCP client.

### Getting Started Today

- GitHub Repository: https://github.com/mem0ai/mem0/tree/main/openmemory