LLM Widget


Config

Most fields are compatible with the OpenAI API.

Fields besides widget_id

| Field Name | JSON Type (Required/Optional) | Description | Example |
| --- | --- | --- | --- |
| temperature | number (Optional) | Controls randomness in the model's responses in LLMConfig. | 0.6 |
| top_p | number (Optional) | Controls diversity via nucleus sampling in LLMConfig. | 0.8 |
| max_tokens | number (Optional) | Sets the maximum length of the model's response in LLMConfig. | 150 |
| presence_penalty | number (Optional) | Influences the model's likelihood of talking about new topics in LLMConfig. | 0.6 |
| frequency_penalty | number (Optional) | Controls how often the model uses infrequent words in LLMConfig. | 0.5 |
| memory | MemoryItem[] (Optional) | Stores information acquired across multiple rounds of dialog in LLMConfig. | [{ role: 'user', content: 'Hello World' }] |
| need_memory | boolean (Optional) | Determines whether memory usage is required in LLMConfig. | true |
| system_prompt | string / Expression (Required) | Provides the system prompt in LLMConfig. | 'Act as ...' |
| user_prompt | string / Expression (Required) | Provides the user prompt in LLMConfig. | '{{input_message}}' |
| function_name | string (Optional) | Specifies the name of the function in LLMFunctionConfig. | 'get_current_weather' |
| function_description | string (Optional) | Provides a description of the function in LLMFunctionConfig. | 'Get the current weather in a given location' |
| function_parameters | FunctionParameter[] (Optional) | Specifies the parameters of the function in LLMFunctionConfig. | [{name: 'city', type: 'string', description: 'The city, e.g. San Francisco, CA'}] |
| output_name | string (Required) | Determines the name of the module output in LLMConfig; defaults to 'reply'. | "reply" |

FunctionParameter

| Field Name | JSON Type (Required/Optional) | Description | Example |
| --- | --- | --- | --- |
| name | string (Required) | Specifies the name of the function parameter in FunctionParameter. | 'city' |
| type | string (Required) | Specifies the type of the function parameter in FunctionParameter. | 'list', 'string' or 'number' |
| description | string (Required) | Provides a description of the function parameter in FunctionParameter. | 'The city, e.g. San Francisco, CA' |

Function Call

If you want to execute LLM function calls, specify the function_name, function_description, and function_parameters fields.

Function calling can be seen as a formatting wrapper provided by the LLM (currently only OpenAI-series models are supported): parameters are supplied via JSON, and the model's output is serialized JSON.


For LLM function calls, the widget's output is a JSON object containing the results returned by the LLM; these results are automatically exposed to the state machine output as key-value pairs.
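As a sketch of these fields in practice, the configuration below reuses the get_current_weather example from the tables above; the widget_id and surrounding task structure mirror the full example in the next section and are illustrative assumptions, not a verified recipe.

```json
{
  "name": "llm_function_call_task",
  "module_type": "AnyWidgetModule",
  "module_config": {
    "widget_id": "1744214047475109888",
    "user_prompt": "{{input_message}}",
    "function_name": "get_current_weather",
    "function_description": "Get the current weather in a given location",
    "function_parameters": [
      // each entry follows the FunctionParameter shape above
      { "name": "city", "type": "string", "description": "The city, e.g. San Francisco, CA" }
    ]
  }
}
```

Given the key-value behavior described above, a parsed argument such as city should then be available downstream (e.g. as {{city}} in a later task or render block); verify the exact output names against your widget.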

Example

```json
{
  "id": "prompt_widget_template",
  "initial": "home_state",
  "states": {
    "home_state": {
      "inputs": {
        "input_message": {
          "type": "IM",
          "user_input": true
        }
      },
      "tasks": [
        {
          "name": "llm_widget_example_task",
          "module_type": "AnyWidgetModule",
          "module_config": {
            "widget_id": "1744214047475109888",
            "user_prompt": "{{input_message}}", // the text inputted into prompt widget, you can get it from user input or upper state
            "system_prompt": "Act as ...", // Optional field. You can input system prompt of bot.
            "top_p": 0.5, // Optional field. Default value is 0.5
            "temperature": 0.5, // Optional field. Default value is 0.5
            "frequency_penalty": 0, // Optional field. Default value is 0
            "presence_penalty": 0, // Optional field. Default value is 0
            "output_name": "result"
          }
        }
      ],
      "render": {
        "text": "{{result}}", // it's a string produced by prompt widget.
        "buttons": [
          {
            "content": "Chat Again",
            "description": "",
            "on_click": "rerun"
          }
        ]
      },
      "transitions": {
        "rerun": "home_state",
        "CHAT": "home_state"
      }
    }
  }
}
```
