
PromptQL Configuration

Introduction

Your PromptQlConfig is a metadata object that defines how PromptQL is configured for your project. It specifies the LLM to use, the system instructions, and other settings.

Example PromptQlConfig

```yaml
kind: PromptQlConfig
version: v2
definition:
  llm:
    provider: openai
    model: o3-mini
  ai_primitives_llm:
    provider: openai
    model: gpt-4o
  system_instructions: |
    You are a helpful AI Assistant.
```

Metadata structure

PromptQlConfig

Definition of the configuration of PromptQL, v2

| Key | Value | Required | Description |
|-----|-------|----------|-------------|
| kind | PromptQlConfig | true | |
| version | v2 | true | |
| definition | PromptQlConfigV2 | true | Definition of the configuration of PromptQL for the project |

PromptQlConfigV2

Definition of the configuration of PromptQL for the project

| Key | Value | Required | Description |
|-----|-------|----------|-------------|
| systemInstructions | string / null | false | Custom system instructions provided to every PromptQL thread, allowing behavior to be tailored to the project's specific needs. |
| llm | LlmConfig | true | Configuration of the LLM to be used for PromptQL |
| aiPrimitivesLlm | LlmConfig / null | false | Configuration of the LLM to be used for AI primitives, such as classification, summarization, etc. |
| featureFlags | object / null | false | Feature flags to be used for PromptQL |

LlmConfig

Configuration of the LLM to be used for PromptQL

One of the following values:

| Value | Description |
|-------|-------------|
| HasuraLlmConfig | Configuration settings for the Hasura-configured LLM |
| OpenAiLlmConfig | Configuration settings for an OpenAI LLM |
| AnthropicLlmConfig | Configuration settings for an Anthropic LLM |
| AzureLlmConfig | Configuration settings for an Azure-provided LLM |
| GeminiLlmConfig | Configuration settings for a Gemini LLM |
| BedrockLlmConfig | Configuration settings for an AWS Bedrock-provided LLM |

BedrockLlmConfig

Configuration settings for an AWS Bedrock-provided LLM

| Key | Value | Required | Description |
|-----|-------|----------|-------------|
| provider | bedrock | true | |
| modelId | string | true | The specific AWS Bedrock model to use. |
| regionName | string | true | The specific AWS Bedrock region to use. |
| awsAccessKeyId | EnvironmentValue | true | The AWS access key ID to use for the AWS Bedrock API |
| awsSecretAccessKey | EnvironmentValue | true | The AWS secret access key to use for the AWS Bedrock API |
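A sketch of a Bedrock-backed llm block using the keys from the table above; the model ID, region, and environment-variable names are illustrative assumptions, not prescribed values:

```yaml
llm:
  provider: bedrock
  modelId: anthropic.claude-3-5-sonnet-20240620-v1:0 # illustrative model ID
  regionName: us-east-1 # illustrative region
  awsAccessKeyId:
    valueFromEnv: AWS_ACCESS_KEY_ID # assumed env var name
  awsSecretAccessKey:
    valueFromEnv: AWS_SECRET_ACCESS_KEY # assumed env var name
```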

GeminiLlmConfig

Configuration settings for a Gemini LLM

| Key | Value | Required | Description |
|-----|-------|----------|-------------|
| provider | gemini | true | |
| model | string / null | false | The specific Gemini model to use. If not specified, the default model will be used. |
| apiKey | EnvironmentValue | true | The API key to use for the Gemini API |
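A sketch of a Gemini llm block; the model name and environment-variable name are illustrative assumptions:

```yaml
llm:
  provider: gemini
  model: gemini-1.5-pro # optional; illustrative model name
  apiKey:
    valueFromEnv: GEMINI_API_KEY # assumed env var name
```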

AzureLlmConfig

Configuration settings for an Azure-provided LLM

| Key | Value | Required | Description |
|-----|-------|----------|-------------|
| provider | azure | true | |
| apiVersion | string / null | false | The specific Azure API version to use. If not specified, the default version will be used. |
| model | string / null | false | The specific Azure model to use. If not specified, the default model will be used. |
| endpoint | string | true | The endpoint to use for the Azure LLM API |
| apiKey | EnvironmentValue | true | The API key to use for the Azure API |
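A sketch of an Azure llm block; the endpoint, API version, model name, and environment-variable name are illustrative assumptions:

```yaml
llm:
  provider: azure
  endpoint: https://my-resource.openai.azure.com # illustrative endpoint
  apiVersion: 2024-02-01 # optional; illustrative version
  model: gpt-4o # optional; illustrative model name
  apiKey:
    valueFromEnv: AZURE_OPENAI_API_KEY # assumed env var name
```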

AnthropicLlmConfig

Configuration settings for an Anthropic LLM

| Key | Value | Required | Description |
|-----|-------|----------|-------------|
| provider | anthropic | true | |
| model | string / null | false | The specific Anthropic model to use. If not specified, the default model will be used. |
| baseUrl | string / null | false | The base URL to use for the Anthropic API. If not specified, the default URL will be used. |
| apiKey | EnvironmentValue | true | The API key to use for the Anthropic API |
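A sketch of an Anthropic llm block; the model name and environment-variable name are illustrative assumptions (baseUrl is omitted to use the default):

```yaml
llm:
  provider: anthropic
  model: claude-3-5-sonnet-latest # optional; illustrative model name
  apiKey:
    valueFromEnv: ANTHROPIC_API_KEY # assumed env var name
```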

OpenAiLlmConfig

Configuration settings for an OpenAI LLM

| Key | Value | Required | Description |
|-----|-------|----------|-------------|
| provider | openai | true | |
| model | string / null | false | The specific OpenAI model to use. If not specified, the default model will be used. |
| baseUrl | string / null | false | The base URL to use for the OpenAI API. If not specified, the default URL will be used. |
| apiKey | EnvironmentValue | true | The API key to use for the OpenAI API |
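A sketch of an OpenAI llm block; the model name and environment-variable name are illustrative assumptions (baseUrl is omitted to use the default):

```yaml
llm:
  provider: openai
  model: gpt-4o # optional; illustrative model name
  apiKey:
    valueFromEnv: OPENAI_API_KEY # assumed env var name
```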

EnvironmentValue

Either a literal string or a reference to a Hasura secret

Must have exactly one of the following fields:

| Key | Value | Required | Description |
|-----|-------|----------|-------------|
| value | string | false | |
| valueFromEnv | string | false | |
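The two mutually exclusive forms can be sketched as follows; the key values shown are illustrative assumptions:

```yaml
# As a literal string (generally avoided for secrets):
apiKey:
  value: my-api-key # illustrative literal

# Or as a reference to an environment variable / Hasura secret:
apiKey:
  valueFromEnv: OPENAI_API_KEY # assumed env var name
```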

HasuraLlmConfig

Configuration settings for the Hasura-configured LLM

| Key | Value | Required | Description |
|-----|-------|----------|-------------|
| provider | hasura | true | |
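Since provider is the only field, a Hasura-configured llm block reduces to a minimal sketch:

```yaml
llm:
  provider: hasura
```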