Advanced Memory Manager in Prompt Widget
Many creators use context to manage LLM chat memory, which can be complicated and redundant.
We've introduced a new, more convenient method for managing memory in the prompt widget that offers enhanced capabilities.
Here's an example of chatting with auto-memory using Claude 3 Haiku:
The key to this configuration is using `auto` in the `memory` field within the module config:
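A minimal module config enabling auto memory might look like the following sketch. Only the `memory: "auto"` setting is the documented key; the surrounding field names (`module_type`, `model`) are illustrative placeholders, not confirmed parts of the schema:

```json
{
  "module_type": "llm",
  "model": "claude-3-haiku",
  "memory": "auto"
}
```

With this in place, the platform stores and replays the chat history for this task automatically; you no longer pass the history through context yourself.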
We automatically manage your chat memory in our store, eliminating the need to store it again in context. In more detail, memory management treats each LLM task as the smallest unit. For example, if you use auto-memory in different states or different tasks, their memory management is separate and distinct—tasks do not share memory. Additionally, if you change the state name to which a task belongs or alter the order of the tasks array, memory may be lost after saving.
Additional Prompt
For users who want to use the fulfill mechanism or add prefixes and suffixes to user input, automatic memory management now includes a configuration to achieve this function.
The key is `prompt_addition`. It has four fields that allow you to create a more complex LLM chat system:
| Field | Type | Description |
| --- | --- | --- |
| `user_prompt_prefix` | string | Adds a prefix to the user's prompt |
| `user_prompt_suffix` | string | Adds a suffix to the user's prompt |
| `pre_messages` | `array<map>` | Adds messages before memory |
| `post_messages` | `array<map>` | Adds messages after memory and the user's prompt |
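A config using all four fields might look like the sketch below. The `role`/`content` message shape and the example strings are assumptions for illustration; only the `prompt_addition` key and its four fields come from the documentation above:

```json
{
  "memory": "auto",
  "prompt_addition": {
    "user_prompt_prefix": "User says: ",
    "user_prompt_suffix": "\n(Reply in one short paragraph.)",
    "pre_messages": [
      { "role": "system", "content": "You are a helpful assistant." }
    ],
    "post_messages": [
      { "role": "user", "content": "Remember to stay concise." }
    ]
  }
}
```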
If you set all of these fields, the request messages will be ordered as follows:

1. `pre_messages`
2. Memory (the stored chat history)
3. `user_prompt_prefix` + user's prompt + `user_prompt_suffix`
4. `post_messages`
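As a concrete illustration, assuming the example `prompt_addition` values above and a typical `role`/`content` message shape (the exact wire format is an assumption), the final request could resemble:

```json
[
  { "role": "system", "content": "You are a helpful assistant." },
  { "role": "user", "content": "earlier user message (from memory)" },
  { "role": "assistant", "content": "earlier bot reply (from memory)" },
  { "role": "user", "content": "User says: <current user input>\n(Reply in one short paragraph.)" },
  { "role": "user", "content": "Remember to stay concise." }
]
```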
Message Editing
With auto memory, you can delete or edit a single message, whether it was sent by the user or the bot, and its corresponding memory updates accordingly. There's no need to clear the entire context anymore.