# LLM Module

Most fields are compatible with the OpenAI API.

## LLMConfig

Field Name | JSON Type (Required/Optional) | Description | Example |
---|---|---|---|
model | "gpt-35-turbo-16k" \| "gpt-4-1106-preview" (Required) | Determines the OpenAI language model to be used. | "gpt-4-1106-preview" |
temperature | number (Optional) | Controls randomness in the model's responses. | 0.6 |
top_p | number (Optional) | Controls diversity via nucleus sampling. | 0.8 |
max_tokens | number (Optional) | Determines the maximum length of the model's response, in tokens. | 150 |
presence_penalty | number (Optional) | Influences how likely the model is to talk about new topics. | 0.6 |
frequency_penalty | number (Optional) | Penalizes tokens by how often they have already appeared, reducing verbatim repetition. | 0.5 |
memory | MemoryItem[] (Optional) | Stores information acquired across multiple rounds of dialog. | [{ role: 'user', content: 'Hello World' }] |
need_memory | boolean (Optional) | Determines whether memory is used. | true |
system_prompt | string \| Expression (Required) | Provides the system prompt for the model. | |
user_prompt | string \| Expression (Required) | Provides the user prompt for the model. | |
output_name | string (Required) | Determines the name of the module output; defaults to "reply". | "reply" |

## MemoryItem

Field Name | JSON Type (Required/Optional) | Description | Example |
---|---|---|---|
role | "user" \| "assistant" (Required) | Specifies the source of the memory item. | "user" |
content | string \| Expression (Required) | Specifies the actual value of the memory item. | "Hello World" |
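
Taken together, the two tables map naturally onto TypeScript types. The sketch below is derived only from the tables above; the `Expression` alias is a stand-in for the platform's expression syntax, which this page does not define.

```ts
// Stand-in for the platform's expression syntax (not defined on this page).
type Expression = string;

// One remembered turn of dialog (see the MemoryItem table).
interface MemoryItem {
  role: "user" | "assistant";   // source of the memory item
  content: string | Expression; // actual value of the memory item
}

// Configuration for the LLM module (see the LLMConfig table).
interface LLMConfig {
  model: "gpt-35-turbo-16k" | "gpt-4-1106-preview";
  temperature?: number;       // randomness of responses
  top_p?: number;             // nucleus-sampling diversity
  max_tokens?: number;        // maximum response length, in tokens
  presence_penalty?: number;  // encourages new topics
  frequency_penalty?: number; // discourages repetition
  memory?: MemoryItem[];      // dialog history across rounds
  need_memory?: boolean;      // whether memory is used
  system_prompt: string | Expression;
  user_prompt: string | Expression;
  output_name: string;        // name of the module output; defaults to "reply"
}
```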