# Providers & Models

## Introduction
PromptQL supports configuring different LLM providers and models to tailor the experience to your application's needs. This gives you the flexibility to choose the most performant and cost-effective option for your use case, and the freedom to switch between providers and models depending on the task.

For example, you can use one model for conversational tasks and another for advanced AI primitives:
```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: openai
    model: o3-mini
  ai_primitives_llm:
    provider: openai
    model: gpt-4o
```
## `llm`

The `llm` configuration defines the LLM provider and model for conversational tasks in your application. In the example above, we're using `openai` as the provider with the `o3-mini` model.
## `ai_primitives_llm`

The `ai_primitives_llm` configuration defines the LLM provider and model for AI primitives in your application, which handle tasks such as program generation and execution. In the example above, we're using `openai` as the provider with the `gpt-4o` model.
## Available providers & models

### Anthropic

To use an Anthropic model, set the `provider` to `anthropic`. The following have been tested with PromptQL:

- `claude-3-5-sonnet-latest`
- `claude-3-7-sonnet-latest`
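Following the schema shown in the introduction, an Anthropic setup might look like this (using one of the tested models above):

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: anthropic
    model: claude-3-5-sonnet-latest
```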
### AWS Bedrock

To use a Bedrock-wrapped model, set the `provider` to `bedrock`. The following have been tested with PromptQL:

- Claude 3.5 Sonnet
- Claude 3.7 Sonnet

**NB:** For Bedrock models, you'll need to provide a `model_id` that resembles this string:

```
arn:aws:bedrock:<AWS region>:<AWS account ID>:inference-profile/us.anthropic.claude-3-5-sonnet-20241022-v2:0
```
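Putting that together, a Bedrock configuration might be sketched as follows. This assumes `model_id` is used in place of the `model` key for the `bedrock` provider; the ARN placeholders must be filled in with your own region and account ID:

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: bedrock
    # Inference-profile ARN for a Bedrock-wrapped Claude model
    model_id: arn:aws:bedrock:<AWS region>:<AWS account ID>:inference-profile/us.anthropic.claude-3-5-sonnet-20241022-v2:0
```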
### Google Gemini

To use a Google Gemini model, set the `provider` to `gemini`. The following have been tested with PromptQL:

- `gemini-1.5-flash`
- `gemini-2.0-flash`
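Following the same schema, a Gemini setup might look like this (using one of the tested models above):

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: gemini
    model: gemini-2.0-flash
```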
### Hasura

With Hasura, the default provider, no specific model is necessary in your configuration.
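Since no model is specified, a Hasura configuration reduces to a sketch like this:

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: hasura
    # No model key: it is not supported with the hasura provider
```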
### Microsoft Azure

To use an Azure foundational model, set the `provider` to `azure`.
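A minimal Azure sketch, following the same schema, might look like this. The `model` value here is a placeholder, not taken from the source; substitute the model name from your own Azure deployment:

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: azure
    # Placeholder: use the model name from your Azure deployment
    model: <your Azure model name>
```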
### OpenAI

To use an OpenAI model, set the `provider` to `openai`. The following have been tested with PromptQL:

- `o1`
- `o3-mini`
- `gpt-4o`
When using the `hasura` provider, the default model is `claude-3-5-sonnet-latest`. This is the recommended model for PromptQL program generation.
## Considerations

- The `model` key is not supported when using the `hasura` provider.
- The value for a `model` key is always in the dialect of the provider's API.
- If `ai_primitives_llm` is not defined, it defaults to the provider specified in the `llm` configuration.
- `system_instructions` are optional but recommended to customize the behavior of your LLM.
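As a sketch, `system_instructions` might sit alongside `llm` in the `definition` block. Both the placement and the prompt text below are assumptions for illustration, not taken from the source:

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: openai
    model: o3-mini
  # Assumed placement; the prompt text is purely illustrative
  system_instructions: |
    You are a helpful analyst for this application's data.
    Prefer concise answers and cite table names when relevant.
```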