
Get Started with PromptQL and Databricks

Overview

This tutorial takes about twenty minutes to complete. You'll learn how to:

  • Set up a new Hasura DDN project to use with PromptQL
  • Connect it to Databricks
  • Generate Hasura metadata
  • Create a build
  • Run your first query with PromptQL

Additionally, we'll familiarize you with the steps and workflows necessary to iterate on your data source.

Prerequisites

Install the DDN CLI

Minimum version requirements

To use this guide, ensure you've installed or updated your DDN CLI to at least v2.28.0.
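
If the CLI is already installed, you can check which release you're on:

ddn version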

To install or update, run the installer script in your terminal:

curl -L https://graphql-engine-cdn.hasura.io/ddn/cli/v4/get.sh | bash

ARM-based Linux Machines

Currently, the CLI does not support installation on ARM-based Linux systems.

Install Docker

The Docker-based workflow helps you iterate and develop locally without deploying any changes to Hasura DDN, making the development experience faster and your feedback loops shorter. You'll need Docker Compose v2.20 or later.
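
Both Docker Engine and Compose can report their versions, so you can confirm you meet the requirement:

docker --version
docker compose version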

Validate the installation

You can verify that the DDN CLI is installed correctly by running:

ddn doctor

Tutorial

Step 1. Authenticate your CLI

Before you can create a new Hasura DDN project for PromptQL, you need to authenticate your CLI:
ddn auth login

This will launch a browser window prompting you to log in or sign up for Hasura DDN. After you log in, the CLI will acknowledge your login, giving you access to Hasura Cloud resources.

Step 2. Scaffold out a new local project

Next, create a new local project:
ddn supergraph init my-project --with-promptql && cd my-project

Once you move into this directory, you'll see your project scaffolded out for you. You can view the structure either by running ls in your terminal or by opening the directory in your preferred editor.
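
As a rough sketch, the scaffold looks something like this (exact files vary by CLI version):

my-project/
├─ app/            # your first subgraph; connector config and metadata will live here
├─ globals/        # project-wide metadata and settings
├─ compose.yaml    # Docker Compose file for your local services
├─ supergraph.yaml # ties the project's subgraphs together
└─ .env            # environment variables for local development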

Step 3. Initialize your connector

In your project directory, run:
ddn connector init my_connector -i

Select hasura/databricks (you can type to filter the list). Then, enter values for the following environment variables:

  • JDBC_URL
    Example: jdbc:databricks://<host>:<port>/default;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/warehouses/<warehouse-id>;UID=token;PWD=<access-token>;ConnCatalog=main;
    Description: You can construct the base of this using your Databricks UI under SQL Warehouses » <name-of-warehouse> » Connection details.
  • JDBC_SCHEMAS
    Example: default,public
    Description: A comma-separated list of schemas within the referenced catalog.
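
For reference, a fully assembled JDBC_URL follows the shape above; with hypothetical placeholder values it might look like this (substitute your own host, warehouse ID, and access token):

jdbc:databricks://dbc-a1b2c3d4-e5f6.cloud.databricks.com:443/default;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/warehouses/abc123def456;UID=token;PWD=<your-access-token>;ConnCatalog=main;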

Step 4. Introspect your source

Next, use the CLI to introspect your source:
ddn connector introspect my_connector

After running this, you should see a representation of your source's schema in the app/connector/my_connector/configuration.json file; you can view this using cat or open the file in your editor.
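
For example:

cat app/connector/my_connector/configuration.json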

Additionally, you can check which resources are available — and their status — at any point using the CLI:
ddn connector show-resources my_connector

Step 5. Add your resources

Add your resources to create metadata:
ddn model add my_connector "*"
ddn command add my_connector "*"
ddn relationship add my_connector "*"

Open the app/metadata directory and you'll find newly generated files ending in .hml. These Hasura Metadata Language (HML) files represent your data source to PromptQL as models, commands, and relationships.

Step 6. Add semantic information to your metadata (optional)

We highly recommend providing extra natural-language descriptions of the resources in your project so that the LLM can better understand your data and create appropriate query plans.

The description field can be added to Model, Command, and Relationship metadata objects to provide semantic context. See more about semantic information here.
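
For instance, a minimal sketch of a description on a generated model might look like this (the Orders model and its wording are hypothetical; match the structure of the .hml files the CLI generated for you):

kind: Model
version: v2
definition:
  name: Orders                 # hypothetical model generated from an "orders" table
  objectType: Orders
  source:
    dataConnectorName: my_connector
    collection: orders
  description: >
    One row per customer order. Each order belongs to a customer and has a
    status of pending, shipped, or delivered.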

Step 7. Create a new build

To create a local build, run:
ddn supergraph build local

The build is stored as a set of JSON files in engine/build.
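
If you're curious, you can list the generated files:

ls engine/build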

Step 8. Start your local services

Start your local Hasura DDN Engine and connector:
ddn run docker-start

Your terminal will be taken over by logs for the different services.
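
Under the hood, docker-start is a script alias defined in your project's context file (typically .hasura/context.yaml); the script there is authoritative, but it expands to something like:

# a sketch of what docker-start runs; check .hasura/context.yaml for the real script
HASURA_DDN_PAT=$(ddn auth print-pat) docker compose --env-file .env up --build --watch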

Step 9. Chat with your data

In a new terminal tab, open your local console:
ddn console --local

Once the PromptQL interface is open, ask a question about your data. For example:

Hi, what are some questions you can answer?

I can help you analyze data from what appears to be an e-commerce system. Here are some types of questions I can answer:

  1. Product-related:
    • Find popular products based on orders or reviews
    • Analyze product pricing across categories
    • Look up product details and descriptions
    • Find products from specific manufacturers or countries
  2. Order and Purchase Analysis:
    • Track order statuses
    • Analyze delivery times
    • Look at purchase patterns
    • Find abandoned carts
  3. Customer Behavior:
    • Analyze customer reviews and ratings
    • Look at shopping patterns
    • Track cart abandonment
    • Study product category preferences
  4. Business Metrics:
    • Calculate sales metrics
    • Analyze review scores
    • Track coupon usage
    • Monitor cart completion rates

Would you like me to explore any of these areas in particular?

Step 10. Iterate on your source's schema

If something changes in your data source's schema, you can iterate on your data model by following the steps in the iteration guide.
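
As a sketch, assuming only additive changes (new tables or columns), that loop reuses the commands you've already run:

ddn connector introspect my_connector   # re-read the source schema
ddn model add my_connector "*"          # regenerate metadata for new resources
ddn supergraph build local              # create a fresh local build
ddn run docker-start                    # restart your local services

The iteration guide covers the full workflow, including breaking changes.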

Next steps

Congratulations on completing your first Hasura PromptQL project with Databricks! 🎉

Here's what you just accomplished:

  • You started with a fresh project and connected it to Databricks.
  • You set up metadata to represent your database schema, which acts as the blueprint for PromptQL.
  • Then, you created a build — essentially compiling everything into a ready-to-use structure — and successfully ran your first PromptQL queries to learn about your data.

Now, you're equipped to connect and expose your data, empowering you to iterate and scale with confidence. Great work!

Take a look at our connector docs to learn more about how to use PromptQL with Databricks. Or, if you're ready, get started with adding custom business logic to get PromptQL to act on a user's behalf!