## What Memory Is For

Memory is best suited for:

- Storing structured or semi-structured context that should persist over time
- Personalizing responses based on previous interactions
- Tracking progress in multi-step workflows
- Saving summaries instead of replaying full chat history every time
## Create a Memory Object

### Configure scoping

Optionally configure how the memory should be isolated. See Memory Scoping.

## Use Memory in an Agent

There are two ways to work with Memory in an agent:

- Memory Steps
- Memory Tools
### Memory Steps

Use Memory Steps when memory access should happen deterministically on every execution:

1. Open your agent in Agent Studio.
2. Drag a Memory Load or Memory Store step onto the canvas.
3. Open the step configuration and select the Memory Object.
4. Connect the step into your workflow.
5. In the AI Model step’s Instructions field, add `<memory value="Your Memory Name" />` exactly where you want the memory contents inserted.
- Place Memory Load near the start of the workflow
- Send both the user input and loaded memory into the AI Model step
- Place Memory Store after the AI Model step to save updated context
### Required: Reference Memory in Your Prompt

Use this syntax in the AI Model step’s Instructions field. The `value` attribute must match the exact name of the Memory Object, not its ID.

Example:
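As an illustration, for a Memory Object named "User Preferences" (a hypothetical name), the Instructions field might look like this, with the tag placed where the stored content should appear:

```
You are a helpful assistant. Apply the stored preferences below when formatting your reply.

<memory value="User Preferences" />
```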
The canvas connection shows data flow between steps, but it does not inject memory into the AI Model prompt automatically. The `<memory>` tag in the AI Model Instructions field is what makes the loaded memory available to the model.

## Load vs Store
### Memory Load

Retrieves the current contents of a Memory Object and passes them downstream.

### Memory Store

Writes data into a Memory Object. By default, Memory Store overwrites the full contents of the Memory Object. If you enable Append Text, new content is added to the existing memory instead of replacing it.

## Memory Scoping
Scoping determines how memory is isolated.

| Scope | isUserSpecific | isConversationScoped | Behavior |
|---|---|---|---|
| Global | false | false | One shared memory for all users and conversations |
| User-specific | true | false | Each user has a separate memory |
| Conversation-scoped | false | true | Each conversation has a separate memory |
| User + Conversation | true | true | Separate memory per user per conversation |
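As an illustrative sketch, a Memory Object scoped per user per conversation might be configured like this (the `name` value is hypothetical; only the `isUserSpecific` and `isConversationScoped` flags come from the table above):

```json
{
  "name": "User Preferences",
  "isUserSpecific": true,
  "isConversationScoped": true
}
```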
When both `isUserSpecific` and `isConversationScoped` are enabled, memory is isolated by the combination of user and conversation.

### Pass Identity for Scoped Memory

If your Memory Object uses user or conversation scoping, execution requests must include the relevant identifiers:

- `userId` or `externalUserId` for user-specific memory
- `conversationId` for conversation-scoped memory
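For example, a scoped execution request body might carry the identifiers alongside the input. This is a hypothetical sketch: the `input` field name and the identifier values are assumptions; only `userId` and `conversationId` are named in this guide:

```json
{
  "input": "What did we decide last time?",
  "userId": "user-123",
  "conversationId": "conv-456"
}
```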
## Common Use Cases

### User preferences

Store persistent settings such as tone, language, interests, or formatting preferences.

### Session state

Track progress in multi-step tasks such as onboarding, form completion, or guided workflows.

### Conversation summaries

Store a rolling summary of prior conversations instead of sending raw message history on every run.

### Shared project context

Store context that multiple users or conversations need in common, such as details about an ongoing project, using a global scope.
## Chat History vs Memory

For most chatbot use cases, start with the AI Model step’s built-in Include Chat History setting.

Use Chat History when:

- You want recent messages automatically included
- You do not need custom storage logic
- You want the fastest setup

Use Memory when:

- You want to persist summaries or extracted facts
- You need custom control over what gets stored
- You want memory to outlive a single conversation
- You need deterministic read/write behavior in a workflow
## Troubleshooting

### Memory is not persisting

Check the following:

- A Memory Store step is connected and actually executes
- The correct Memory Object is selected
- Scoped executions are passing the same `userId` and/or `conversationId`
### Memory is empty
Common causes:
- No prior write has occurred
- A Store step overwrote the memory with empty or partial content
- Scoped identifiers do not match the expected user or conversation
### Loaded memory is not affecting the AI Model output

This usually means the AI Model prompt does not include a `<memory>` tag.

### Wrong memory is being returned
Verify:
- Scoping is configured correctly
- The execution request includes the expected identifiers
- The same user and conversation values are used consistently across runs
## Best Practices
- Default to user-specific memory unless the data should truly be shared
- Keep memory concise to reduce token usage and latency
- Prefer storing summaries, facts, and state, not raw transcript dumps
- Load before Store when you need to preserve existing content
- Use Memory Steps for deterministic workflows
- Use Memory Tools when the LLM should decide when memory is relevant
- Always add a `<memory>` tag in the AI Model prompt when using Memory Load steps
## Recommended Pattern

For most agents, this is the safest starting pattern:

1. Load memory at the beginning of the workflow
2. Add `<memory value="Your Memory Name" />` in the AI Model step’s Instructions field where you want the memory to appear
3. Let the AI Model generate a response and/or updated state
4. Store the updated memory after the model step
