OpenAI-Compatible (Beta)

The following parameters configure an OpenAI-Compatible LLM Agent.

OpenAI-Compatible LLM Connection Details

Endpoint URL

The base URL of the LLM provider's API. Example: https://api.openai.com/v1

Model ID

The specific version or name of the LLM to use. Example: openai/gpt-4o

Secret Access Key

The secret access key (API key) used to authenticate with the LLM provider.
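
For context, these three fields map onto a standard OpenAI-compatible chat completions request. The sketch below is illustrative only and assumes the provider follows the common /chat/completions convention; the endpoint, model, and key values are placeholders.

  import requests

  # Placeholder values standing in for the three connection fields above.
  ENDPOINT_URL = "https://api.openai.com/v1"   # Endpoint URL
  MODEL_ID = "openai/gpt-4o"                   # Model ID
  SECRET_ACCESS_KEY = "sk-placeholder"         # Secret Access Key

  # A minimal OpenAI-compatible chat completions call using the fields above.
  response = requests.post(
      f"{ENDPOINT_URL}/chat/completions",
      headers={"Authorization": f"Bearer {SECRET_ACCESS_KEY}"},
      json={
          "model": MODEL_ID,
          "messages": [{"role": "user", "content": "Hello"}],
      },
      timeout=30,
  )
  print(response.json()["choices"][0]["message"]["content"])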

User Prompt

A template for the user prompt sent to the model to request a particular type of response. Use the {{text://input.payload}} placeholder to include the payload of the incoming message in the prompt.

For example:

  • You are a helpful AI assistant. Please summarize the information within the <content> tags: <content> {{text://input.payload}}</content>

  • You are a senior customer support specialist. Score the comments in the <content> tags on a scale of 1 (very negative) to 5 (very positive): <content> {{text://input.payload}}</content>
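
To show what the {{text://input.payload}} placeholder does, the sketch below substitutes an incoming message payload into one of the example templates. It is a rough approximation of the templating step, not the Agent's actual implementation; the payload string is made up.

  def render_user_prompt(template: str, payload: str) -> str:
      # Replace the placeholder with the text of the incoming message payload.
      return template.replace("{{text://input.payload}}", payload)

  template = (
      "You are a helpful AI assistant. Please summarize the information "
      "within the <content> tags: <content>{{text://input.payload}}</content>"
  )
  incoming_payload = "Order 1042 shipped three days late and the customer asked for a refund."
  print(render_user_prompt(template, incoming_payload))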

Max Tokens

The max_tokens hyperparameter, which sets the maximum number of tokens the LLM can generate in its response.

Temperature

The temperature hyperparameter, which controls the randomness of the LLM's output. Valid values are between 0.0 and 1.0; lower values produce more deterministic output, while higher values produce more varied output.
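
Both hyperparameters are passed through to the provider with the rest of the request. The snippet below sketches where they would sit in an OpenAI-compatible request body; the field names follow the common convention and the values are examples only.

  # Illustrative OpenAI-compatible request body; values are examples only.
  request_body = {
      "model": "openai/gpt-4o",
      "messages": [
          {"role": "user", "content": "Summarize the meeting notes in two sentences."},
      ],
      "max_tokens": 512,    # Max Tokens: upper bound on the length of the response
      "temperature": 0.2,   # Temperature: 0.0 (most deterministic) to 1.0 (most random)
  }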

Event Broker Destination for AI Responses

Destination Type

Specifies whether the destination on the event broker service is a topic or a queue.

Destination Name

The name of the topic or queue to publish AI response messages to.

For the Beta release, you can publish only to queues.
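
To put the two destination fields in context, the sketch below shows the final step of the flow: publishing the LLM's response to the configured destination. The broker_client object and its publish_to_queue method are hypothetical stand-ins, not a real event broker API; only the queue path is shown because the Beta supports queues only.

  from dataclasses import dataclass

  @dataclass
  class AIResponseDestination:
      destination_type: str   # "queue" (the only supported type in the Beta) or "topic"
      destination_name: str   # name of the queue to publish AI response messages to

  def publish_ai_response(broker_client, destination: AIResponseDestination, ai_response: str) -> None:
      # broker_client and publish_to_queue are hypothetical placeholders for a real broker API.
      if destination.destination_type != "queue":
          raise ValueError("The Beta release supports publishing to queues only.")
      broker_client.publish_to_queue(destination.destination_name, ai_response)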