PromptQL and Hasura DDN Glossary
PromptQL
PromptQL is a novel agent-based approach that enables high-trust LLM interaction with business data and systems. It composes tool calls and LLM tasks in a way that provides a high degree of explainability, accuracy, and repeatability for arbitrarily complex tasks.
Unlike traditional tool calling & RAG approaches that rely on in-context composition, PromptQL's planning engine generates and runs programs that compose tool calls and LLM tasks, providing:
- ~2x improvement in accuracy over traditional approaches
- Near-perfect repeatability as complexity increases
- Constant context size regardless of data volume
- Detailed execution plans for complete explainability
- Fine-grained authorization to protect sensitive data
PromptQL uses Hasura DDN's data layer, which provides out-of-the-box connectors to a wide range of data sources, introspecting your data and helping you build a high-quality real-time data product with semantic metadata and fine-grained entitlements.
Hasura Data Delivery Network (DDN)
Hasura DDN is a managed service that powers PromptQL. It provides the infrastructure needed to connect, unify, and serve data from multiple sources — securely and at scale.
Hasura DDN delivers a production-ready platform that removes the need to manage your own API or application server. It handles everything under the hood so that PromptQL can focus on understanding your questions and delivering accurate, real-time answers.
Hasura DDN is globally-distributed and always available. Its new runtime engine accesses metadata on a per-request basis, enabling better isolation, high scalability, and faster response times.
The managed control plane includes tools for metadata authoring, CI/CD, infrastructure management, and team collaboration. This separation of control and data planes (unlike earlier versions, where they were bundled together) allows you to manage your metadata in version control and deploy confidently - so PromptQL always has the context it needs to reason across your data sources and respond reliably.
PromptQL Programs
PromptQL programs are Python programs that read & write data via Python functions. They are generated by LLMs and represent the concrete implementation of a user's intended interaction with their business data. PromptQL programs can create, read, update, and analyze data across multiple sources.
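To make this concrete, here is a minimal, hypothetical sketch of what a generated program might look like. The helpers `run_query`, `summarize_text`, and `store_artifact` are illustrative stand-ins for the querying, primitive, and artifact APIs the runtime provides - they are not actual PromptQL SDK names.

```python
# A minimal sketch of what an LLM-generated PromptQL program could look
# like. The three helpers below are hypothetical stand-ins for the
# querying, AI-primitive, and artifact APIs the PromptQL runtime
# provides; the real names and signatures differ.

def run_query(sql: str) -> list[dict]:
    """Stand-in for the runtime's data-layer query function."""
    return [{"id": 1, "body": "App crashes on login"}]

def summarize_text(text: str) -> str:
    """Stand-in for a summarization primitive."""
    return text[:40]

def store_artifact(name: str, data: object) -> None:
    """Stand-in for the runtime's artifact store."""
    print(f"stored artifact {name!r}: {data!r}")

# Computational step: fetch open tickets through the data layer.
tickets = run_query("SELECT id, body FROM support_tickets WHERE status = 'open'")

# Cognitive step: summarize each ticket with an AI primitive.
summaries = [summarize_text(t["body"]) for t in tickets]

# Persist the result as structured memory for later turns.
store_artifact("open_ticket_summaries", summaries)
```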
PromptQL Primitives
PromptQL primitives are AI functions available as Python functions within PromptQL programs. They perform common AI tasks on data, such as classification, summarization, and extraction, allowing the composition of cognitive tasks with computational tasks.
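As an illustration, a primitive-style call can be composed with ordinary Python control flow. The `classify_text` stub below stands in for a real classification primitive, which would be served by an LLM at runtime:

```python
from collections import Counter

def classify_text(text: str, categories: list[str]) -> str:
    """Stand-in stub for a classification primitive. A real PromptQL
    primitive would route this to an LLM; the stub just returns the
    first category so the sketch runs."""
    return categories[0]

feedback = [
    "App crashes on login",
    "Love the new dashboard",
    "Billing page is slow",
]

# A cognitive task (classification) composed with a computational task
# (counting) inside ordinary Python control flow.
labels = [classify_text(t, ["bug", "praise", "performance"]) for t in feedback]
print(Counter(labels))
```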
PromptQL Artifacts
PromptQL artifacts are stores of data that can be referenced from PromptQL programs. PromptQL programs can create artifacts as structured memory, which helps overcome LLM context window limitations and ensures information consistency across interactions.
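A rough sketch of the idea, with an in-memory dictionary standing in for the runtime's artifact store (the function names here are hypothetical):

```python
# Hypothetical in-memory artifact store standing in for PromptQL's
# artifact APIs; real programs would call runtime-provided functions.
_artifacts: dict[str, object] = {}

def store_artifact(name: str, data: object) -> None:
    _artifacts[name] = data  # structured memory outside the LLM context

def get_artifact(name: str) -> object:
    return _artifacts[name]

# One program stores a large intermediate result once...
store_artifact("q3_revenue_by_region", {"EMEA": 1.2e6, "AMER": 3.4e6})

# ...and a later program (or later turn) references it by name instead
# of re-sending the data through the model's context window.
print(get_artifact("q3_revenue_by_region")["EMEA"])
```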
Query Plans
PromptQL builds dynamic query plans that can be modified during execution based on intermediate results or explicit instructions from the user. These plans incorporate control structures to handle different scenarios and can adjust retrieval strategies dynamically, overcoming limitations of rigid retrieval-based approaches.
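The sketch below illustrates the kind of branching such a plan can express; `fetch_orders` and `search_docs` are hypothetical stand-ins for data-layer calls:

```python
# Sketch of a dynamic query plan adjusting its strategy mid-execution.

def fetch_orders(customer_id: int) -> list[dict]:
    """Stand-in for a structured data-layer query."""
    return []  # pretend no structured orders were found

def search_docs(query: str) -> list[str]:
    """Stand-in for a fallback retrieval strategy."""
    return ["invoice-2024-03.pdf"]

orders = fetch_orders(42)

# Intermediate results steer the rest of the plan: if the structured
# source comes back empty, the plan switches retrieval strategies
# instead of failing the way a rigid retrieval pipeline would.
if orders:
    result = orders
else:
    result = search_docs("orders for customer 42")

print(result)
```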
Control plane
The control plane manages the configuration, orchestration, and coordination of the data plane elements, and provides tools to author metadata. It is responsible for setting up and managing the behavior, policies, and rules that govern how data is processed and forwarded in the data plane. It also oversees the overall behavior of the system: it manages endpoints, maintains configurations, handles authentication and access control mechanisms, and gathers analytics and metrics related to application usage.
Data plane
The data plane manages the actual handling of application requests and responses. It deals with the execution of application operations, the transmission of data between clients and endpoints, and the processing of data payloads. The following are the main components of a data plane:
Hasura v3 GraphQL Engine: The engine takes in an API request (e.g., a question for PromptQL), converts it into an intermediate representation that the connectors can handle, and creates a plan for the query execution across various data sources.
Hasura connectors: These handle the actual API execution. They accept the intermediate representation from the engine and use the most efficient mechanism to execute the query and fetch/mutate data from the underlying data source.
Supergraph
A supergraph is how PromptQL sees the big picture.
It connects multiple subgraphs - each representing a specific data domain - into a single graph that PromptQL can query across. This lets you organize your data by team or domain, without giving up the ability to ask questions that span everything.
The supergraph balances the benefits of modular development (independent teams, separate repositories, clear ownership) with a unified interface for querying. It keeps everything loosely coupled under the hood, but tightly integrated when you need answers.
PromptQL uses the supergraph to reason across connected domains and return accurate results - even when the data lives in different systems or is owned by different teams.
Subgraphs
A subgraph is a building block of the supergraph - and the unit PromptQL uses to understand each part of your data.
Each subgraph is a standalone module that includes metadata, permissions, and one or more data connectors. It defines a clear boundary around a specific data domain, like users, orders, or content. Subgraphs are independently developed, tested, and versioned - so teams can work in parallel without stepping on each other.
Fields in a subgraph can be scoped or prefixed to avoid conflicts. Relationships between subgraphs let PromptQL follow connections across domains when answering questions.
Globals subgraph
When you run the `ddn supergraph init` command, a `globals` subgraph is created by default.
This subgraph holds shared configuration that applies across your entire supergraph - things like API settings, auth rules, and compatibility options. It gives PromptQL the context it needs to understand how to route, secure, and execute your conversations correctly.
By default, these objects live in the `globals` subgraph, but you can move them to another subgraph if needed.
Hasura metadata
Hasura metadata defines the structure of your supergraph; it gives PromptQL the context it needs to ask the right questions, enforce permissions, and return meaningful answers.
Metadata is written declaratively and describes how to connect to your data sources, what the API should look like, and how different parts of your system relate to each other. It introduces key modeling constructs such as `Types`, `Models`, `Commands`, `Permissions`, and `Relationships` to represent your real-world domain in a way PromptQL can understand.
Beyond that, metadata also includes configuration for API security, caching, CI/CD, and deployment - giving your team full control over how the application behaves.
.hml (Hasura Metadata Language)
`.hml` is the file extension used for writing Hasura metadata. It shares the same syntax as `.yaml` and offers the same benefits: it's readable, supports nested data structures, allows for inline comments, and is portable across environments and teams.
PromptQL relies on `.hml` files to understand how your data is structured and how it can be queried and secured.
Supergraph modeling
Supergraph modeling is the process of defining how your data, operations, and relationships are represented in metadata.
By identifying and structuring the elements of your system - like types, models, permissions, and relationships - you create a clear, declarative blueprint. PromptQL uses this blueprint to answer questions, enforce rules, and coordinate across connected data sources.
Immutable-Build runtime system
Hasura DDN uses an immutable build system for your metadata. Each time you make a change, a new build is created. This means PromptQL always runs against a specific, versioned snapshot of your supergraph.
There's no shared state to manage, no long-running config to refresh. Every request is served using a clean, isolated runtime - making rollbacks, previews, and CI/CD workflows straightforward and reliable.
You can confidently ship changes to your API, knowing PromptQL will always use the exact version you deployed - and if something goes wrong, you can roll back just as easily.
Data Sources
Any external data source, database, or service that can be connected to Hasura DDN using a Data Connector agent.
Native Data Connector Specification (NDC Spec)
A standardized specification that allows you to extend the functionality of the Hasura server by providing web services that resolve new sources of data and business logic and help define the metadata structure for APIs in Hasura DDN. The specification defines types such as `collections`, `functions`, and `procedures` that help in describing the behavior of agents or connectors that connect to the underlying data source. It provides a framework and guidelines on the types of web service endpoints that a connector needs to implement.
Native Data Connectors
Data connectors are specialized agents that connect subgraphs to data sources, enabling communication in the data source's native language.
They integrate Hasura with various external data sources and services based on the native data connector specification.
Each subgraph can have multiple data connectors, and the same data source can be connected to different subgraphs via separate connector instances - allowing teams to work with the same source from different domain perspectives.
Data connectors are available for a wide variety of sources including databases, business logic functions, REST APIs, and GraphQL APIs. They can be official Hasura-verified connectors or custom-built connectors to integrate with other data sources.
Push-Down Capabilities
The ability in Hasura DDN to delegate certain query operations, including authorization, to the underlying data source. This can improve query optimization and performance, and is the reason data connectors in Hasura DDN are called 'native'.
Connector Hub
Refers to the public site where all Native Data Connectors for Hasura DDN are listed. Users can discover connectors, get more information about their specific features, and find documentation on how to use each connector with Hasura DDN.
Model
A model defines an entity in your API that maps directly to something in your underlying data source - like a table, collection, or resource exposed by a native data connector.
Models are central to how PromptQL understands your data. Each model includes a reference to its data type, API configuration, available arguments, and optional global ID settings. Models support select, insert, update, and delete operations.
Within a select operation, PromptQL can handle filtering, aggregation, pagination, and limiting - giving you full flexibility when asking questions.
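As an illustration of those select capabilities, the hypothetical helper below shows the shape of a filtered, ordered, paginated read against a model; it is not the real PromptQL API:

```python
# Hypothetical helper illustrating the select capabilities a model
# exposes (filtering, ordering, pagination, limiting).

def query_model(name: str, where: dict, order_by: str,
                limit: int, offset: int) -> list[dict]:
    """Stand-in for a data-layer select against a model."""
    print(f"SELECT from {name} WHERE {where} ORDER BY {order_by} "
          f"LIMIT {limit} OFFSET {offset}")
    return []

# Filtered, ordered, paginated read from a hypothetical 'orders' model.
rows = query_model(
    "orders",
    where={"status": "shipped"},
    order_by="created_at desc",
    limit=20,
    offset=40,
)
```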
Command
A command defines a piece of business logic that PromptQL can call as an action. It represents something you can do - a procedure, mutation, or custom operation - and returns a result.
Commands map directly to functions or procedures defined in your connected data sources. PromptQL uses commands when you want to go beyond querying and take meaningful action.
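A minimal sketch of the idea, with `run_command` as a hypothetical stand-in for however the runtime exposes commands:

```python
# Hypothetical sketch of invoking a command from a PromptQL program.

def run_command(name: str, args: dict) -> dict:
    """Stand-in: dispatches to a function or procedure defined in a
    connected data source."""
    print(f"invoking command {name!r} with {args}")
    return {"status": "ok"}

# A query reads data; a command takes action - for example, issuing a
# refund via a procedure defined in a connected data source.
result = run_command("issue_refund", {"order_id": 1234, "amount": 25.00})
print(result["status"])
```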
Relationships
A relationship connects two models - or a model and a command - and tells PromptQL how different pieces of your data are linked.
With relationships in place, PromptQL can traverse from one part of your system to another, combining data as it answers questions. Relationships can be defined within a single subgraph or across subgraphs - even if those subgraphs live in different repositories.
When working across subgraphs, there are a few structural differences to keep in mind; see the Hasura DDN documentation for details.
To make working with relationships easier, the Hasura VS Code extension offers auto-complete and validation as you author metadata.
Authorization Permissions
Authentication is managed at the supergraph level in Hasura DDN, as defined by the `AuthConfig` object, and cannot be customized at the subgraph level.
Authorization, however, is managed at the subgraph level. Permissions are metadata objects in Hasura DDN that define the access control or authorization rules on models and commands. They are defined within the context of those objects in their respective subgraphs and do not affect other subgraphs.
Authorization rules in one subgraph can also be defined to reference data in a foreign subgraph even if that subgraph is in another repository.
Collection
A collection is a Native Data Connector Spec object, which encapsulates part of a data source, providing standard querying capabilities.
Each collection is defined by its name, any collection arguments (used to parameterize the collection), the object type (a collection of fields) of its rows, and some additional metadata related to constraints.
Tracking a collection results in the creation of a `Model` object in Hasura metadata.
Function
A function is a Native Data Connector Specification object that can be invoked with arguments to get a result. It has no side effects and is thus "read-only". Unlike collections, functions do not describe constraints and do not have object types.
Tracking a function results in the creation of a `Command` supergraph object in Hasura metadata.
Procedure
A procedure is a Native Data Connector Specification object that defines an action implemented by the data connector; it can mutate data and have other side effects. Each procedure has arguments and a return type.
Tracking a procedure results in the creation of a `Command` object in Hasura metadata.
Supergraph config
Supergraph config tells Hasura DDN how to construct your supergraph. A config will contain information such as which subgraphs to include and which resources to use for the build.
Connector config
Connector config tells Hasura DDN how to build your connector. It contains information such as the type of connector and the location of the context files needed to build the connector.
DDN CLI (Command-Line Interface)
A tool in Hasura DDN that enables developers to interact with DDN from the command line. It supports various commands for creating builds, tracking objects, and deploying projects.
VS Code Extension
The Hasura VS Code extension provides:
- inline validation of Hasura DDN metadata without having to create builds
- code scaffolding snippets to quickly scaffold DDN metadata objects
- intelligent autocomplete with suggestions based on the current state of the DDN project
- go-to-definition for inter-related DDN metadata objects
- documentation on hover for any DDN metadata object
- a project tree in the sidebar showing all DDN projects and metadata objects in the currently opened folder
It is strongly recommended to install the VS Code extension while working with DDN projects; it can be downloaded from the VS Code extension marketplace.
Metadata build service
Creates builds from the metadata and makes them available to the Hasura v3 GraphQL Engine at the edge so it can serve the application. It also provides essential error handling for fast debugging and troubleshooting.
Control plane cloud API
This component is the underlying cloud service that enables creating builds, applying builds, and testing your API. We consider it the brain of the DDN Console and CLI - the component that drives most of the DX functionality.
PromptQL Console and Playground
An interface that provides tools for interacting with PromptQL, visualizing metadata, team collaboration, documentation, traces, and analytics.
The Playground is your primary method of interacting with and testing conversations before integrating the PromptQL API into your customer-facing application(s). It allows you to experiment with different queries, view the generated query plans, and observe how PromptQL interacts with your data sources in real-time.
Cloud PAT
This refers to a personal access token that Hasura Cloud creates automatically for you on every new project creation. This ensures that your application always has a security mechanism. The auto-generated PAT is included in the API header `cloud_pat`.
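For illustration, a client request might attach the token in that header as sketched below; the endpoint URL and payload shape are placeholders, and only the `cloud_pat` header name comes from the description above:

```python
import requests

# Hypothetical request sketch: the URL and JSON body are placeholders;
# only the cloud_pat header name is taken from the description above.
resp = requests.post(
    "https://<your-project>.example.com/api/query",  # placeholder endpoint
    headers={"cloud_pat": "<auto-generated-token>"},
    json={"message": "How many open tickets do we have?"},
)
print(resp.status_code)
```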