Version: v3.x (DDN)

Get Started with Hasura DDN and PostgreSQL

Overview

This tutorial takes about twenty minutes to complete. You'll learn how to:

  • Set up a new Hasura DDN project
  • Connect it to a PostgreSQL database
  • Generate Hasura metadata
  • Create a build
  • Run your first query
  • Create relationships
  • Mutate data

Additionally, we'll familiarize you with the steps and workflows necessary to iterate on your API.

This tutorial assumes you're starting from scratch. A PostgreSQL Docker image ships with the data connector you'll use shortly, letting you connect a locally-running database to Hasura, but you can just as easily follow along with data you've already seeded; Hasura will never modify your source schema.

Prerequisites

Install the DDN CLI

Simply run the installer script in your terminal:

curl -L https://graphql-engine-cdn.hasura.io/ddn/cli/v4/get.sh | bash

Install Docker

The Docker-based workflow helps you iterate and develop locally without deploying any changes to Hasura DDN, making the development experience faster and your feedback loops shorter. You'll need Docker Compose v2.27.1 or later.
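If you want to check that requirement programmatically, note that version strings compare correctly as integer tuples, not as plain strings. A small sketch (the `2.27.1` minimum comes from the prerequisite above; the sample versions are made up, and `docker compose version --short` is one way to obtain yours):

```python
def meets_minimum(version: str, minimum: str = "2.27.1") -> bool:
    """Return True if `version` is at least `minimum`.

    `version` is a dotted version string, e.g. the output of
    `docker compose version --short`.
    """
    # Compare as integer tuples: "2.9.0" < "2.27.1" numerically,
    # even though a plain string comparison would say otherwise.
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(version) >= parse(minimum)

print(meets_minimum("2.29.0"))  # True
print(meets_minimum("2.9.0"))   # False
```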

Validate the installation

You can verify that the DDN CLI is installed correctly by running:

ddn doctor

Tutorial

Step 1. Authenticate your CLI

Before you can create a new Hasura DDN project, you need to authenticate your CLI:
ddn auth login

This will launch a browser window prompting you to log in or sign up for Hasura DDN. After you log in, the CLI will acknowledge your login, giving you access to Hasura Cloud resources.

Step 2. Scaffold out a new local project

Next, create a new local project:
ddn supergraph init my-project && cd my-project

Once you move into this directory, you'll see your project scaffolded out for you. You can view the structure by either running ls in your terminal, or by opening the directory in your preferred editor.

Step 3. Initialize your PostgreSQL connector

In your project directory, run:
ddn connector init my_pg -i

From the dropdown, start typing PostgreSQL and hit enter to select it; then hit enter to advance through the remaining options, accepting the defaults.

The CLI will output something similar to this:

HINT To access the local Postgres database:
- Run: docker compose -f app/connector/my_pg/compose.postgres-adminer.yaml up -d
- Open Adminer in your browser at http://localhost:5143 and create tables
- To connect to the database using other clients use postgresql://user:password@local.hasura.dev:8105/dev

Step 4. Start the local PostgreSQL container and Adminer

Use the hint from the CLI output:
docker compose -f app/connector/my_pg/compose.postgres-adminer.yaml up -d

Run docker ps to see which port Adminer is running on. Then, navigate to the address below to access it:

http://localhost:<ADMINER_PORT>

Step 5. Create a table in your PostgreSQL database

Next, via Adminer select SQL command from the left-hand nav, then enter the following:
-- Create the table
CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  name TEXT NOT NULL,
  age INT NOT NULL
);

-- Insert some data
INSERT INTO users (name, age) VALUES ('Alice', 25);
INSERT INTO users (name, age) VALUES ('Bob', 30);
INSERT INTO users (name, age) VALUES ('Charlie', 35);

You can verify this worked by using Adminer to query all records from the users table:

SELECT * FROM users;

Step 6. Introspect your PostgreSQL database

Next, use the CLI to introspect your PostgreSQL database:
ddn connector introspect my_pg

After running this, you should see a representation of your database's schema in the app/connector/my_pg/configuration.json file; you can view this using cat or open the file in your editor.

Additionally, you can check which resources are available — and their status — at any point using the CLI:
ddn connector show-resources my_pg

Step 7. Add your model

Now, track the table from your PostgreSQL database as a model in your DDN metadata:
ddn model add my_pg "users"

Open the app/metadata directory and you'll find a newly-generated file: Users.hml. The DDN CLI will use this Hasura Metadata Language file to represent the users table from PostgreSQL in your API as a model.

Step 8. Create a new build

To create a local build, run:
ddn supergraph build local

The build is stored as a set of JSON files in engine/build.

Step 9. Start your local services

Start your local Hasura DDN Engine and PostgreSQL connector:
ddn run docker-start

Your terminal will be taken over by logs for the different services.

Step 10. Run your first query

In a new terminal tab, open your local console:
ddn console --local
In the GraphiQL explorer of the console, write this query:
query {
  users {
    id
    name
    age
  }
}
You'll get the following response:
{
  "data": {
    "users": [
      {
        "id": 1,
        "name": "Alice",
        "age": 25
      },
      {
        "id": 2,
        "name": "Bob",
        "age": 30
      },
      {
        "id": 3,
        "name": "Charlie",
        "age": 35
      }
    ]
  }
}

Step 11. Iterate on your PostgreSQL schema

Via Adminer, add a new table and insert some data to your PostgreSQL database:
-- Create the posts table
CREATE TABLE posts (
  id SERIAL PRIMARY KEY,
  user_id INT NOT NULL REFERENCES users(id) ON DELETE CASCADE,
  title TEXT NOT NULL,
  content TEXT NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Insert some seed data
INSERT INTO posts (user_id, title, content) VALUES
  (1, 'My First Post', 'This is Alice''s first post.'),
  (1, 'Another Post', 'Alice writes again!'),
  (2, 'Bob''s Post', 'Bob shares his thoughts.'),
  (3, 'Hello World', 'Charlie joins the conversation.');
Using Adminer, verify this by running the following query:
-- Fetch all posts with user information
SELECT
  posts.id AS post_id,
  posts.title,
  posts.content,
  posts.created_at,
  users.name AS author
FROM
  posts
JOIN
  users ON posts.user_id = users.id;

You should see a list of posts returned with the author's information joined from the users table.

Step 12. Refresh your metadata and rebuild your project

tip

The following steps are necessary each time you make changes to your source schema. This includes adding, modifying, or dropping tables.

Step 12.1. Re-introspect your data source

Run the introspection command again:
ddn connector introspect my_pg

In app/connector/my_pg/configuration.json, you'll see the schema updated to include operations for the posts table. In app/metadata/my_pg.hml, you'll see posts present in the metadata as well.

Step 12.2. Update your metadata

Add the posts model:
ddn model add my_pg "posts"

Step 12.3. Create a new build

Next, create a new build:
ddn supergraph build local

Step 12.4. Restart your services

Bring down the services by pressing CTRL+C and start them back up:
ddn run docker-start

Step 13. Query your new build

Head back to your console and query the posts model:
query GetPosts {
  posts {
    id
    title
    content
  }
}
You'll get a response like this:
{
  "data": {
    "posts": [
      {
        "id": 1,
        "title": "My First Post",
        "content": "This is Alice's first post."
      },
      {
        "id": 2,
        "title": "Another Post",
        "content": "Alice writes again!"
      },
      {
        "id": 3,
        "title": "Bob's Post",
        "content": "Bob shares his thoughts."
      },
      {
        "id": 4,
        "title": "Hello World",
        "content": "Charlie joins the conversation."
      }
    ]
  }
}

Step 14. Create a relationship

Since there's already a foreign key on the posts table in PostgreSQL, we can easily add the relationship:
ddn relationship add my_pg "posts"

You'll see a new metadata object added to the app/metadata/posts.hml file of kind Relationship explaining the relationship between posts and users.
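The generated object looks roughly like the sketch below; treat it as illustrative, since the CLI derives the exact names, types, and field mappings from your introspected schema:

```yaml
kind: Relationship
version: v1
definition:
  name: user              # the field exposed on posts in your API
  sourceType: Posts
  target:
    model:
      name: Users
      relationshipType: Object
  mapping:
    - source:
        fieldPath:
          - fieldName: userId
      target:
        modelField:
          - fieldName: id
```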

Step 15. Rebuild your project

As your metadata has changed, create a new build:
ddn supergraph build local
Bring down the services by pressing CTRL+C and start them back up:
ddn run docker-start

Step 16. Query using your relationship

Now, execute a nested query using your relationship:
query GetPosts {
  posts {
    id
    title
    content
    user {
      id
      name
      age
    }
  }
}
Which should return a result like this:
{
  "data": {
    "posts": [
      {
        "id": 1,
        "title": "My First Post",
        "content": "This is Alice's first post.",
        "user": {
          "id": 1,
          "name": "Alice",
          "age": 25
        }
      },
      {
        "id": 2,
        "title": "Another Post",
        "content": "Alice writes again!",
        "user": {
          "id": 1,
          "name": "Alice",
          "age": 25
        }
      },
      {
        "id": 3,
        "title": "Bob's Post",
        "content": "Bob shares his thoughts.",
        "user": {
          "id": 2,
          "name": "Bob",
          "age": 30
        }
      },
      {
        "id": 4,
        "title": "Hello World",
        "content": "Charlie joins the conversation.",
        "user": {
          "id": 3,
          "name": "Charlie",
          "age": 35
        }
      }
    ]
  }
}

Step 17. Add all commands

We'll track the available operations — for inserting, updating, and deleting — on our users and posts tables as commands.

Add all available commands:
ddn command add my_pg "*"

You'll see newly-generated metadata files in the metadata directory for your connector that represent insert, update, and delete operations.

As your metadata has changed, create a new build:
ddn supergraph build local
Bring down the services by pressing CTRL+C and start them back up:
ddn run docker-start

Step 18. Insert new data

Create a new post for Charlie:
mutation InsertSinglePost {
  insertPosts(
    objects: {
      content: "I am an expert in Bird Law and I demand satisfaction."
      title: "Charlie has more to say"
      userId: "3"
    }
  ) {
    returning {
      id
      title
      content
      user {
        id
        name
      }
    }
  }
}

You should see a response that returns your inserted data along with the id and name fields for the author.
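If you're scripting inserts rather than typing them into GraphiQL, passing the new row as a GraphQL variable saves you from hand-escaping quotes inside the document. A hedged Python sketch: the `insertPosts` field and argument shape mirror the mutation above, but the `PostsInsertInput` type name is hypothetical, so copy the real input type from the console's schema explorer.

```python
import json

def mutation_payload(title: str, content: str, user_id: int) -> dict:
    """Build a GraphQL request body for the insert, supplying the post
    as a variable so apostrophes in the content need no escaping."""
    # NOTE: the variable's type name below is a guess; check your
    # API's schema for the actual insert input type.
    mutation = """
    mutation InsertSinglePost($object: PostsInsertInput!) {
      insertPosts(objects: [$object]) {
        returning { id title content }
      }
    }
    """
    return {
        "query": mutation,
        "variables": {
            "object": {
                "title": title,
                "content": content,
                "userId": str(user_id),
            },
        },
    }

payload = mutation_payload(
    "Charlie has more to say",
    "I am an expert in Bird Law and I demand satisfaction.",
    3,
)
print(json.dumps(payload, indent=2))
```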

Next steps

Congratulations on completing your first Hasura DDN project with PostgreSQL! 🎉

Here's what you just accomplished:

  • You started with a fresh project and connected it to a local PostgreSQL database.
  • You set up metadata to represent your tables and relationships, which acts as the blueprint for your API.
  • Then, you created a build — essentially compiling everything into a ready-to-use API — and successfully ran your first GraphQL queries to fetch data.
  • Along the way, you learned how to iterate on your schema and refresh your metadata to reflect changes.
  • Finally, we looked at how to enable mutations and insert data using your new API.

Now, you're equipped to connect and expose your data, empowering you to iterate and scale with confidence. Great work!

Take a look at our PostgreSQL docs to learn more about how to use Hasura DDN with PostgreSQL. Or, if you're ready, get started with adding permissions to control access to your API.