---
title: Custom Prompts
description: 'Enhance your product experience by adding custom prompts tailored to your needs'
icon: "pencil"
iconType: "solid"
---

## Introduction to Custom Prompts

Custom prompts allow you to tailor the behavior of your Mem0 instance to specific use cases or domains. By defining a custom prompt, you can control how information is extracted, processed, and stored in your memory system.

To create an effective custom prompt:

1. Be specific about the information to extract.
2. Provide few-shot examples to guide the LLM.
3. Ensure the examples follow the format shown below.

Example of a custom prompt:

```python Python
custom_prompt = """
Please only extract entities containing customer support information, order details, and user information.
Here are some few-shot examples:

Input: Hi.
Output: {{"facts" : []}}

Input: The weather is nice today.
Output: {{"facts" : []}}

Input: My order #12345 hasn't arrived yet.
Output: {{"facts" : ["Order #12345 not received"]}}

Input: I'm John Doe, and I'd like to return the shoes I bought last week.
Output: {{"facts" : ["Customer name: John Doe", "Wants to return shoes", "Purchase made last week"]}}

Input: I ordered a red shirt, size medium, but received a blue one instead.
Output: {{"facts" : ["Ordered red shirt, size medium", "Received blue shirt instead"]}}

Return the facts and customer information in a JSON format as shown above.
"""
```

```typescript TypeScript
const customPrompt = `
Please only extract entities containing customer support information, order details, and user information.
Here are some few-shot examples:

Input: Hi.
Output: {"facts" : []}

Input: The weather is nice today.
Output: {"facts" : []}

Input: My order #12345 hasn't arrived yet.
Output: {"facts" : ["Order #12345 not received"]}

Input: I am John Doe, and I would like to return the shoes I bought last week.
Output: {"facts" : ["Customer name: John Doe", "Wants to return shoes", "Purchase made last week"]}

Input: I ordered a red shirt, size medium, but received a blue one instead.
Output: {"facts" : ["Ordered red shirt, size medium", "Received blue shirt instead"]}

Return the facts and customer information in a JSON format as shown above.
`;
```

Here we initialize the custom prompt in the config:

```python Python
from mem0 import Memory

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 2000,
        }
    },
    "custom_prompt": custom_prompt,
    "version": "v1.1"
}

m = Memory.from_config(config_dict=config)
```

```typescript TypeScript
import { Memory } from 'mem0ai/oss';

const config = {
  version: 'v1.1',
  llm: {
    provider: 'openai',
    config: {
      apiKey: process.env.OPENAI_API_KEY || '',
      model: 'gpt-4-turbo-preview',
      temperature: 0.2,
      maxTokens: 1500,
    },
  },
  customPrompt: customPrompt,
};

const memory = new Memory(config);
```

### Example 1

In this example, we add a memory of a user ordering a laptop. As seen in the output, the custom prompt is used to extract the relevant information from the user's message.

```python Python
m.add("Yesterday, I ordered a laptop, the order id is 12345", user_id="alice")
```

```typescript TypeScript
await memory.add('Yesterday, I ordered a laptop, the order id is 12345', { userId: "user123" });
```

```json Output
{
  "results": [
    { "memory": "Ordered a laptop", "event": "ADD" },
    { "memory": "Order ID: 12345", "event": "ADD" },
    { "memory": "Order placed yesterday", "event": "ADD" }
  ],
  "relations": []
}
```

### Example 2

In this example, we add a memory of a user who likes going on hikes. This message is not relevant to the use case defined in the custom prompt, so no memory is added.
```python Python
m.add("I like going on hikes", user_id="alice")
```

```typescript TypeScript
await memory.add('I like going on hikes', { userId: "user123" });
```

```json Output
{
  "results": [],
  "relations": []
}
```

The custom prompt processes both user and assistant messages, extracting only the information that matches the defined format.
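The custom prompt asks the model to reply with a `{"facts": [...]}` JSON object, and Mem0 parses that response for you. If you ever need to post-process raw LLM output yourself (for example, when testing a prompt outside Mem0's pipeline), a minimal, hypothetical parser for this format might look like:

```python
import json

def parse_facts(llm_output: str) -> list[str]:
    """Parse the {"facts": [...]} JSON object the custom prompt asks the LLM to return."""
    data = json.loads(llm_output)
    facts = data.get("facts", [])
    if not isinstance(facts, list):
        raise ValueError("'facts' must be a list")
    return facts

print(parse_facts('{"facts" : ["Order #12345 not received"]}'))
# ['Order #12345 not received']
print(parse_facts('{"facts" : []}'))
# []
```

Note that `parse_facts` is an illustrative helper, not part of the Mem0 API; irrelevant messages (like the hiking example above) simply yield an empty `facts` list.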