PromptQL Configuration
Introduction
Your PromptQlConfig is a metadata object that defines how PromptQL is configured for your project, including the LLM to use, the system instructions, and other settings.
Example PromptQlConfig
```yaml
kind: PromptQlConfig
version: v2
definition:
  llm:
    provider: openai
    model: o3-mini
  ai_primitives_llm:
    provider: openai
    model: gpt-4o
  system_instructions: |
    You are a helpful AI Assistant.
```
Metadata structure
PromptQlConfigV2
Definition of the configuration of PromptQL, v2
Key | Value | Required | Description |
---|---|---|---|
kind | PromptQlConfig | true | |
version | v2 | true | |
definition | PromptQlConfigV2 | true | Definition of the configuration of PromptQL for the project |
PromptQlConfigV2
Definition of the configuration of PromptQL for the project
Key | Value | Required | Description |
---|---|---|---|
systemInstructions | string / null | false | Custom system instructions provided to every PromptQL thread, allowing behavior to be tailored to the project's specific needs. |
llm | LlmConfig | true | Configuration of the LLM to be used for PromptQL |
aiPrimitivesLlm | LlmConfig / null | false | Configuration of the LLM to be used for AI primitives, such as classification, summarization, etc. |
featureFlags | object / null | false | Feature flags to be used for PromptQL |
LlmConfig
Configuration of the LLM to be used for PromptQL
One of the following values:
Value | Description |
---|---|
HasuraLlmConfig | Configuration settings for the Hasura-configured LLM |
OpenAiLlmConfig | Configuration settings for an OpenAI LLM |
AnthropicLlmConfig | Configuration settings for an Anthropic LLM |
AzureLlmConfig | Configuration settings for an Azure-provided LLM |
GeminiLlmConfig | Configuration settings for a Gemini LLM |
BedrockLlmConfig | Configuration settings for an AWS Bedrock-provided LLM |
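The `llm` and `aiPrimitivesLlm` fields can each use any of these variants independently. Below is an illustrative sketch that pairs the Hasura-configured LLM with an Anthropic model for AI primitives; the model name and environment variable are placeholders, and field names follow the example above and the reference tables below.

```yaml
kind: PromptQlConfig
version: v2
definition:
  # Main PromptQL LLM: the Hasura-configured default
  llm:
    provider: hasura
  # Separate LLM for AI primitives, here an Anthropic model
  ai_primitives_llm:
    provider: anthropic
    model: claude-3-5-sonnet-latest   # placeholder model name
    apiKey:
      valueFromEnv: ANTHROPIC_API_KEY # placeholder environment variable
```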
BedrockLlmConfig
Configuration settings for an AWS Bedrock-provided LLM
Key | Value | Required | Description |
---|---|---|---|
provider | bedrock | true | |
modelId | string | true | The specific AWS Bedrock model to use. |
regionName | string | true | The specific AWS Bedrock region to use. |
awsAccessKeyId | EnvironmentValue | true | The AWS access key ID to use for the AWS Bedrock API |
awsSecretAccessKey | EnvironmentValue | true | The AWS secret access key to use for the AWS Bedrock API |
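For illustration, a Bedrock-backed `llm` entry (under `definition`) might look like the sketch below, using the field names from the table above; the model ID, region, and environment variable names are placeholders.

```yaml
llm:
  provider: bedrock
  # Placeholder Bedrock model and region
  modelId: "anthropic.claude-3-5-sonnet-20240620-v1:0"
  regionName: us-east-1
  # AWS credentials are EnvironmentValues; here they reference environment variables
  awsAccessKeyId:
    valueFromEnv: AWS_ACCESS_KEY_ID
  awsSecretAccessKey:
    valueFromEnv: AWS_SECRET_ACCESS_KEY
```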
GeminiLlmConfig
Configuration settings for a Gemini LLM
Key | Value | Required | Description |
---|---|---|---|
provider | gemini | true | |
model | string / null | false | The specific Gemini model to use. If not specified, the default model will be used. |
apiKey | EnvironmentValue | true | The API key to use for the Gemini API |
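A Gemini-backed `llm` entry might look like the following sketch; the model name and environment variable name are placeholders.

```yaml
llm:
  provider: gemini
  # Optional: omit to use the default model
  model: gemini-1.5-pro
  apiKey:
    valueFromEnv: GEMINI_API_KEY
```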
AzureLlmConfig
Configuration settings for an Azure-provided LLM
Key | Value | Required | Description |
---|---|---|---|
provider | azure | true | |
apiVersion | string / null | false | The specific Azure API version to use. If not specified, the default version will be used. |
model | string / null | false | The specific Azure model to use. If not specified, the default model will be used. |
endpoint | string | true | The endpoint to use for the Azure LLM API |
apiKey | EnvironmentValue | true | The API key to use for the Azure API |
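An Azure-backed `llm` entry might look like the sketch below; the endpoint, API version, model, and environment variable name are placeholders.

```yaml
llm:
  provider: azure
  # Required endpoint for your Azure deployment (placeholder)
  endpoint: https://my-resource.openai.azure.com
  # Optional overrides; omit to use the defaults
  apiVersion: "2024-06-01"
  model: gpt-4o
  apiKey:
    valueFromEnv: AZURE_API_KEY
```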
AnthropicLlmConfig
Configuration settings for an Anthropic LLM
Key | Value | Required | Description |
---|---|---|---|
provider | anthropic | true | |
model | string / null | false | The specific Anthropic model to use. If not specified, the default model will be used. |
baseUrl | string / null | false | The base URL to use for the Anthropic API. If not specified, the default URL will be used. |
apiKey | EnvironmentValue | true | The API key to use for the Anthropic API |
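An Anthropic-backed `llm` entry might look like the following sketch; the model name and environment variable name are placeholders.

```yaml
llm:
  provider: anthropic
  # Optional: omit to use the default model
  model: claude-3-5-sonnet-latest
  # Optional: only needed when routing through a non-default endpoint
  # baseUrl: https://my-anthropic-proxy.example.com
  apiKey:
    valueFromEnv: ANTHROPIC_API_KEY
```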
OpenAiLlmConfig
Configuration settings for an OpenAI LLM
Key | Value | Required | Description |
---|---|---|---|
provider | openai | true | |
model | string / null | false | The specific OpenAI model to use. If not specified, the default model will be used. |
baseUrl | string / null | false | The base URL to use for the OpenAI API. If not specified, the default URL will be used. |
apiKey | EnvironmentValue | true | The API key to use for the OpenAI API |
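An OpenAI-backed `llm` entry might look like the sketch below; it extends the example at the top of this page with an explicit API key (the environment variable name is a placeholder).

```yaml
llm:
  provider: openai
  # Optional: omit to use the default model
  model: gpt-4o
  apiKey:
    valueFromEnv: OPENAI_API_KEY
```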
EnvironmentValue
Either a literal string or a reference to a Hasura secret
Must have exactly one of the following fields:
Key | Value | Required | Description |
---|---|---|---|
value | string | false | |
valueFromEnv | string | false | |
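For example, an `apiKey` field can be supplied either as a literal string or as a reference resolved from the environment, but not both:

```yaml
# Literal string (avoid for real secrets)
apiKey:
  value: my-api-key
---
# Reference resolved from the environment (e.g. a Hasura secret)
apiKey:
  valueFromEnv: OPENAI_API_KEY
```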
HasuraLlmConfig
Configuration settings for the Hasura-configured LLM
Key | Value | Required | Description |
---|---|---|---|
provider | hasura | true | |
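The Hasura-configured LLM requires no additional settings, so a complete configuration using it can be as small as the following sketch:

```yaml
kind: PromptQlConfig
version: v2
definition:
  llm:
    provider: hasura
```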