Get Started with Hasura DDN and Amazon Redshift
Overview
This tutorial will guide you through setting up a Hasura DDN project with Amazon Redshift. You'll learn how to:
- Set up a new Hasura DDN project
- Connect it to a Redshift database
- Generate Hasura metadata
- Create a build
- Run your first query
- Create relationships
- Mutate data
Additionally, we'll familiarize you with the steps and workflows necessary to iterate on your API.
You'll also need:
- An AWS account
- A Redshift database with namespace
- An IAM user with access to the Redshift database
- The IAM user's credentials
Prerequisites
Install the DDN CLI
To use this guide, ensure you've installed/updated your CLI to at least v2.28.0.
macOS and Linux

Run the installer script in your terminal:

```sh
curl -L https://graphql-engine-cdn.hasura.io/ddn/cli/v4/get.sh | bash
```

Currently, the CLI does not support installation on ARM-based Linux systems.

Windows

- Download the latest DDN CLI installer for Windows.
- Run the `DDN_CLI_Setup.exe` installer file and follow the instructions. This will only take a minute.
- By default, the DDN CLI is installed under `C:\Users\{Username}\AppData\Local\Programs\DDN_CLI`.
- The DDN CLI is added to your `%PATH%` environment variable so that you can use the `ddn` command from your terminal.
Install Docker
The Docker-based workflow helps you iterate and develop locally without deploying any changes to Hasura DDN, making the development experience faster and your feedback loops shorter. You'll need Docker Compose v2.20 or later.
Validate the installation
You can verify that the DDN CLI is installed correctly by running:
```sh
ddn doctor
```
Tutorial
Step 1. Authenticate your CLI
```sh
ddn auth login
```
This will launch a browser window prompting you to log in or sign up for Hasura DDN. After you log in, the CLI will acknowledge your login, giving you access to Hasura Cloud resources.
Step 2. Scaffold out a new local project
```sh
ddn supergraph init my-project && cd my-project
```
Once you move into this directory, you'll see your project scaffolded out for you. You can view the structure by either running `ls` in your terminal, or by opening the directory in your preferred editor.
Step 3. Create a new Redshift database
In the AWS console, navigate to the Redshift page and create a new database called `hasura_demo` in your namespace. You can do so by clicking "Query data" in the top right. Then, click the "+ Create" button and select "Database". Choose the appropriate cluster or workgroup, name the database `hasura_demo`, and click "Create database".
Step 4. Create a table in your Redshift database
```sql
-- Create the table
CREATE TABLE users (
  id INT IDENTITY(1,1) PRIMARY KEY, -- Redshift has no SERIAL type; IDENTITY(1,1) auto-increments
  name TEXT NOT NULL,
  age INT NOT NULL
);

-- Insert some data
INSERT INTO users (name, age) VALUES ('Alice', 25);
INSERT INTO users (name, age) VALUES ('Bob', 30);
INSERT INTO users (name, age) VALUES ('Charlie', 35);
```

You can verify this worked by querying all records from the `users` table:

```sql
SELECT * FROM users;
```
Step 5. Initialize your Redshift connector
```sh
ddn connector init my_redshift -i
```

From the dropdown, select `hasura/redshift` (you can type to filter the list), then enter the value of the connection string.
| ENV | Example | Description |
|---|---|---|
| `JDBC_URL` | `jdbc:redshift://<host>:<port>/<database>?user=<username>&password=<password>` | The JDBC URL used to connect to the Amazon Redshift database. |
| `JDBC_SCHEMAS` | `public,app` | A comma-separated list of schemas to expose. Optional; this can also be included in the connection string. |
Hasura will never modify your source schema.
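When assembling the `JDBC_URL` yourself, note that the username and password live in the query string, so any special characters in them must be percent-encoded. A minimal sketch of building the URL safely; the host, credentials, and database name below are placeholder assumptions, not values from this tutorial:

```python
from urllib.parse import quote

# Placeholder connection details -- substitute your own Redshift endpoint
# and the credentials of your IAM user.
host = "my-workgroup.123456789012.us-east-1.redshift-serverless.amazonaws.com"
port = 5439
database = "hasura_demo"
user = "awsuser"
password = "p@ss/word!"  # contains characters that must be percent-encoded

# Percent-encode the credentials so the query string stays valid.
jdbc_url = (
    f"jdbc:redshift://{host}:{port}/{database}"
    f"?user={quote(user, safe='')}&password={quote(password, safe='')}"
)
print(jdbc_url)
# -> jdbc:redshift://my-workgroup.123456789012.us-east-1.redshift-serverless.amazonaws.com:5439/hasura_demo?user=awsuser&password=p%40ss%2Fword%21
```

Set the resulting string as the value of `JDBC_URL` when the CLI prompts you for it.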
Step 6. Introspect your Redshift database
```sh
ddn connector introspect my_redshift
```

After running this, you should see a representation of your database's schema in the `app/connector/my_redshift/configuration.json` file; you can view this using `cat` or by opening the file in your editor.

You can also list the resources discovered during introspection at any point:

```sh
ddn connector show-resources my_redshift
```
Step 7. Add your model
```sh
ddn model add my_redshift users
```

Open the `app/metadata` directory and you'll find a newly generated file: `Users.hml`. The DDN CLI will use this Hasura Metadata Language file to represent the `users` table from Redshift in your API as a model.
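For orientation, the generated file contains a `Model` object whose shape is roughly sketched below. The exact fields and values come from the generated file and may differ between CLI versions, so treat this as illustrative only:

```yaml
kind: Model
version: v1
definition:
  name: Users
  objectType: Users
  source:
    dataConnectorName: my_redshift
    collection: users
  graphql:
    selectMany:
      queryRootField: users
```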
Step 8. Create a new build
```sh
ddn supergraph build local
```

The build is stored as a set of JSON files in `engine/build`.
Step 9. Start your local services
```sh
ddn run docker-start
```
Your terminal will be taken over by logs for the different services.
Step 10. Run your first query
```sh
ddn console --local
```
```graphql
query {
  users {
    id
    name
    age
  }
}
```

```json
{
  "data": {
    "users": [
      {
        "id": 1,
        "name": "Alice",
        "age": 25
      },
      {
        "id": 2,
        "name": "Bob",
        "age": 30
      },
      {
        "id": 3,
        "name": "Charlie",
        "age": 35
      }
    ]
  }
}
```
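The console is a GraphiQL interface, but any HTTP client can run the same query against the locally running engine. A sketch using only Python's standard library; the endpoint below assumes the local engine's default address, which may differ in your setup:

```python
import json
import urllib.request

# Assumed default address of the locally running engine; adjust if yours differs.
ENDPOINT = "http://localhost:3280/graphql"

query = """
query {
  users {
    id
    name
    age
  }
}
"""

# Build a standard GraphQL-over-HTTP POST request.
payload = json.dumps({"query": query}).encode("utf-8")
request = urllib.request.Request(
    ENDPOINT,
    data=payload,
    headers={"Content-Type": "application/json"},
)

# Uncomment once the local services from the previous step are running:
# with urllib.request.urlopen(request) as response:
#     print(json.loads(response.read())["data"]["users"])
```

This is the same request the console issues under the hood, which makes it a convenient starting point for scripting against your API.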
Step 11. Iterate on your Redshift schema
```sql
-- Create the posts table
CREATE TABLE posts (
  id INT IDENTITY(1,1) PRIMARY KEY, -- Redshift has no SERIAL type; IDENTITY(1,1) auto-increments
  user_id INT NOT NULL REFERENCES users(id), -- foreign keys are informational only in Redshift; ON DELETE actions aren't supported
  title TEXT NOT NULL,
  content TEXT NOT NULL,
  created_at TIMESTAMP DEFAULT GETDATE() -- GETDATE() returns the current timestamp in Redshift
);

-- Insert some seed data
INSERT INTO posts (user_id, title, content) VALUES
  (1, 'My First Post', 'This is Alice''s first post.'),
  (1, 'Another Post', 'Alice writes again!'),
  (2, 'Bob''s Post', 'Bob shares his thoughts.'),
  (3, 'Hello World', 'Charlie joins the conversation.');

-- Fetch all posts with user information
SELECT
  posts.id AS post_id,
  posts.title,
  posts.content,
  posts.created_at,
  users.name AS author
FROM
  posts
JOIN
  users ON posts.user_id = users.id;
```
You should see a list of posts returned, with the author's information joined from the `users` table.
Step 12. Refresh your metadata and rebuild your project
The following steps are necessary each time you make changes to your source schema. This includes adding, modifying, or dropping tables.
Step 12.1. Re-introspect your data source
```sh
ddn connector introspect my_redshift
```

In `app/connector/my_redshift/configuration.json`, you'll see the schema updated to include operations for the `posts` table. In `app/metadata/my_redshift.hml`, you'll see `posts` present in the metadata as well.
Step 12.2. Update your metadata
```sh
ddn model add my_redshift "posts"
```
Step 12.3. Create a new build
```sh
ddn supergraph build local
```
Step 12.4. Restart your services
```sh
ddn run docker-start
```
Step 13. Query your new build
```graphql
query GetPosts {
  posts {
    id
    title
    content
  }
}
```

```json
{
  "data": {
    "posts": [
      {
        "id": 1,
        "title": "My First Post",
        "content": "This is Alice's first post."
      },
      {
        "id": 2,
        "title": "Another Post",
        "content": "Alice writes again!"
      },
      {
        "id": 3,
        "title": "Bob's Post",
        "content": "Bob shares his thoughts."
      },
      {
        "id": 4,
        "title": "Hello World",
        "content": "Charlie joins the conversation."
      }
    ]
  }
}
```
Step 14. Create a relationship
```sh
ddn relationship add my_redshift "posts"
```

You'll see a new metadata object of kind `Relationship` added to the `app/metadata/posts.hml` file, explaining the relationship between `posts` and `users`.
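For reference, the generated `Relationship` object looks roughly like the sketch below, mapping the `userId` field on posts to the `id` field on users. The exact field names come from the generated file, so treat this shape as illustrative:

```yaml
kind: Relationship
version: v1
definition:
  name: user
  sourceType: Posts
  target:
    model:
      name: Users
      relationshipType: Object
  mapping:
    - source:
        fieldPath:
          - fieldName: userId
      target:
        modelField:
          - fieldName: id
```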
Step 15. Rebuild your project
```sh
ddn supergraph build local
ddn run docker-start
```
Step 16. Query using your relationship
```graphql
query GetPosts {
  posts {
    id
    title
    content
    user {
      id
      name
      age
    }
  }
}
```

```json
{
  "data": {
    "posts": [
      {
        "id": 1,
        "title": "My First Post",
        "content": "This is Alice's first post.",
        "user": {
          "id": 1,
          "name": "Alice",
          "age": 25
        }
      },
      {
        "id": 2,
        "title": "Another Post",
        "content": "Alice writes again!",
        "user": {
          "id": 1,
          "name": "Alice",
          "age": 25
        }
      },
      {
        "id": 3,
        "title": "Bob's Post",
        "content": "Bob shares his thoughts.",
        "user": {
          "id": 2,
          "name": "Bob",
          "age": 30
        }
      },
      {
        "id": 4,
        "title": "Hello World",
        "content": "Charlie joins the conversation.",
        "user": {
          "id": 3,
          "name": "Charlie",
          "age": 35
        }
      }
    ]
  }
}
```
Step 17. Add all commands
We'll track the available operations for inserting, updating, and deleting rows on our `users` and `posts` tables as commands.

```sh
ddn command add my_redshift "*"
```

You'll see newly generated metadata files in the `metadata` directory for your connector that represent insert, update, and delete operations.

```sh
ddn supergraph build local
ddn run docker-start
```
Step 18. Insert new data
```graphql
mutation InsertSinglePost {
  insertPosts(
    objects: {
      content: "I am an expert in Bird Law and I demand satisfaction."
      title: "Charlie has more to say"
      userId: "3"
    }
  ) {
    returning {
      id
      title
      content
      user {
        id
        name
      }
    }
  }
}
```

You should see a response that returns your inserted data along with the `id` and `name` fields for the author.
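Update and delete commands were generated in Step 17 as well. The root field and argument names depend on your generated metadata, so the mutation below is an illustrative sketch; confirm the exact names in the console's schema explorer before running it:

```graphql
# Hypothetical field and argument names -- check your generated schema for the exact ones.
mutation UpdateSinglePost {
  updatePostsById(keyId: "4", updateColumns: { title: { set: "Hello Again, World" } }) {
    returning {
      id
      title
    }
  }
}
```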
Next steps
Congratulations on completing your first Hasura DDN project with Redshift! 🎉
Here's what you just accomplished:
- You started with a fresh project and connected it to a local Redshift database.
- You set up metadata to represent your tables and relationships, which acts as the blueprint for your API.
- Then, you created a build — essentially compiling everything into a ready-to-use API — and successfully ran your first GraphQL queries to fetch data.
- Along the way, you learned how to iterate on your schema and refresh your metadata to reflect changes.
- Finally, we looked at how to enable mutations and insert data using your new API.
Now, you're equipped to connect and expose your data, empowering you to iterate and scale with confidence. Great work!
Take a look at our Redshift docs to learn more about how to use Hasura DDN with Redshift. Or, if you're ready, get started with adding permissions to control access to your API.