LLM Widget
Most fields are compatible with the OpenAI API. Fields besides `widget_id` and `output_name`:
| Field | Type | Description | Example |
| --- | --- | --- | --- |
| `temperature` | number (Optional) | Controls randomness in the model's responses. | `0.6` |
| `top_p` | number (Optional) | Controls diversity via nucleus sampling. | `0.8` |
| `max_tokens` | number (Optional) | Maximum length of the model's response, in tokens. | `150` |
| `presence_penalty` | number (Optional) | Influences how likely the model is to talk about new topics. | `0.6` |
| `frequency_penalty` | number (Optional) | Controls how often the model uses infrequent words. | `0.5` |
| `memory` | MemoryItem[] (Optional) | Stores information acquired across multiple rounds of dialog. | `[{ role: 'user', content: 'Hello World' }]` |
| `need_memory` | boolean (Optional) | Determines whether memory is used. | `true` |
| `system_prompt` | string \| Expression (Required) | The system prompt provided to the model. | |
| `user_prompt` | string \| Expression (Required) | The user prompt provided to the model. | |
| `function_name` | string (Optional) | Name of the function, in LLMFunctionConfig. | `'get_current_weather'` |
| `function_description` | string (Optional) | Description of the function, in LLMFunctionConfig. | `'Get the current weather in a given location'` |
| `function_parameters` | FunctionParameter[] (Optional) | Parameters of the function, in LLMFunctionConfig. | `[{ name: 'city', type: 'string', description: 'The city, e.g. San Francisco, CA' }]` |
| `knowledge_base_token` | string (Optional) | Specifies the knowledge base the widget uses. | |
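
To make these fields concrete, here is a minimal TypeScript sketch of an LLM widget configuration. Only the field names and types come from the table above; the `MemoryItem` and `LLMConfig` interface names, and the treatment of `Expression` as a plain string, are assumptions for illustration.

```typescript
// Assumption: expressions are written as strings in the widget config.
type Expression = string;

interface MemoryItem {
  role: string;    // e.g. 'user'
  content: string;
}

interface LLMConfig {
  temperature?: number;
  top_p?: number;
  max_tokens?: number;
  presence_penalty?: number;
  frequency_penalty?: number;
  memory?: MemoryItem[];
  need_memory?: boolean;
  system_prompt: string | Expression; // required
  user_prompt: string | Expression;   // required
  knowledge_base_token?: string;
}

// Illustrative values, mirroring the examples in the table above.
const config: LLMConfig = {
  temperature: 0.6,
  top_p: 0.8,
  max_tokens: 150,
  need_memory: true,
  memory: [{ role: 'user', content: 'Hello World' }],
  system_prompt: 'You are a helpful assistant.',
  user_prompt: 'Hello World',
};
```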
FunctionParameter

| Field | Type | Description | Example |
| --- | --- | --- | --- |
| `name` | string (Required) | Name of the function parameter. | `'city'` |
| `type` | string (Required) | Type of the function parameter. | `'list'`, `'string'` or `'number'` |
| `description` | string (Required) | Description of the function parameter. | `'The city, e.g. San Francisco, CA'` |
To execute LLM function calls, specify the `function_name`, `function_description` and `function_parameters` fields. A function call can be seen as a formatting wrapper provided by the LLM (currently only OpenAI-series models are supported): parameters are passed in as JSON, and the result is returned as serialized JSON.
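
As a sketch, the function-call fields might be populated like this. The `FunctionParameter` shape follows the table above; the variable names are illustrative assumptions:

```typescript
// FunctionParameter as documented above.
interface FunctionParameter {
  name: string;                        // e.g. 'city'
  type: 'list' | 'string' | 'number';  // allowed parameter types
  description: string;
}

const functionParameters: FunctionParameter[] = [
  { name: 'city', type: 'string', description: 'The city, e.g. San Francisco, CA' },
];

// The three function-call fields of the widget config.
const functionCallFields = {
  function_name: 'get_current_weather',
  function_description: 'Get the current weather in a given location',
  function_parameters: functionParameters,
};
```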
For LLM function calls, the widget's output is a JSON object containing the results returned by the LLM; its key-value pairs are automatically written to the state machine output.
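
For instance, if the model fills the `city` parameter of the `get_current_weather` function sketched above, the widget output might look like this; the exact keys and values depend on what the LLM returns:

```typescript
// Hypothetical widget output for the function call above.
const widgetOutput: Record<string, unknown> = {
  city: 'San Francisco, CA', // illustrative value filled in by the LLM
};
// Each key-value pair (here, city -> 'San Francisco, CA') is written
// to the state machine output automatically.
```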