Snowflake Connector
With this connector, Hasura allows you to instantly create a real-time GraphQL API on top of your data models in Snowflake. This connector supports Snowflake's functionalities listed in the table below, allowing for efficient and scalable data operations. Additionally, users benefit from all the powerful features of Hasura’s Data Delivery Network (DDN) platform, including query pushdown capabilities that delegate query operations to the database, thereby enhancing query optimization and performance.
This connector implements the Data Connector Spec.
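For example, once a table is tracked, a query against the generated API might look like the one below. The field names here are illustrative only and depend on your own tables and metadata (this sketch assumes an Album table like the one in the CHINOOK sample database used later in this guide):

# Illustrative only; actual field names depend on your tracked models
query {
  album(limit: 5) {
    albumId
    title
  }
}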
Features
Below, you'll find a matrix of all supported features for the Snowflake connector:
Feature | Supported | Notes |
---|---|---|
Native Queries + Logical Models | ✅ | |
Native Mutations | ❌ | |
Simple Object Query | ✅ | |
Filter / Search | ✅ | |
Simple Aggregation | ✅ | |
Sort | ✅ | |
Paginate | ✅ | |
Table Relationships | ✅ | |
Views | ✅ | |
Remote Relationships | ✅ | |
Custom Fields | ❌ | |
Mutations | ❌ | |
Distinct | ❌ | |
Enums | ❌ | |
Naming Conventions | ❌ | |
Default Values | ❌ | |
User-defined Functions | ❌ | |
Before you get started
- Create a Hasura Cloud account
- Install the CLI
- Install the Hasura VS Code extension
- Create a supergraph
- Create a subgraph
Using the connector
To use the Snowflake connector, follow these steps in a Hasura project: (Note: for more information on the following steps, please refer to the Postgres connector documentation here)
1. Init the connector
(Note: here and in the following steps we name the subgraph "my_subgraph" and the connector "my_snowflake")
ddn connector init my_snowflake --subgraph my_subgraph --hub-connector hasura/snowflake
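All of the connector-specific files referenced in the remaining steps live under the new connector directory. Assuming the names above, once you've worked through the steps below it will contain at least:

my_subgraph/
  connector/
    my_snowflake/
      connector.yaml                     # connector definition used by the CLI
      configuration.json                 # Snowflake metadata written by introspection (step 3)
      .env.local                         # your Snowflake credentials (step 2)
      docker-compose.my_snowflake.yaml   # runs the connector locally (step 5)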
2. Add your Snowflake credentials:
Add your credentials to my_subgraph/connector/my_snowflake/.env.local
JDBC_URL="jdbc:snowflake://ak1234.us-east-2.aws.snowflakecomputing.com/?user=<user>&password=<password>&db=CHINOOK&schema=PUBLIC&warehouse=COMPUTE_WH&role=ACCOUNTADMIN"
JDBC_SCHEMAS=PUBLIC
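If you need to expose more than one schema, JDBC_SCHEMAS is treated as a comma-separated list. The line below is a sketch and assumes your database also contains an ANALYTICS schema:

# Sketch only; assumes an ANALYTICS schema exists in your database
JDBC_SCHEMAS=PUBLIC,ANALYTICS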
3. Introspect your indices
ddn connector introspect --connector my_subgraph/connector/my_snowflake/connector.yaml
If you look at the configuration.json
for your connector, you'll see metadata describing your Snowflake mappings.
4. Create the Hasura metadata
ddn connector-link add my_snowflake --subgraph my_subgraph
The generated file has two environment variables — one for reads and one for writes — that you'll need to add to your
subgraph's .env.my_subgraph
file. Each key is prefixed by the subgraph name, an underscore, and the name of the
connector. Ensure the port value matches what is published in your connector's docker compose file.
MY_SUBGRAPH_MY_SNOWFLAKE_READ_URL=http://local.hasura.dev:8081
MY_SUBGRAPH_MY_SNOWFLAKE_WRITE_URL=http://local.hasura.dev:8081
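For reference, the generated DataConnectorLink picks these variables up via valueFromEnv entries. A trimmed sketch of the relevant section (the exact shape may vary between CLI versions) looks roughly like:

# Sketch only; your generated my_snowflake.hml may differ by CLI version
kind: DataConnectorLink
version: v1
definition:
  name: my_snowflake
  url:
    readWriteUrls:
      read:
        valueFromEnv: MY_SUBGRAPH_MY_SNOWFLAKE_READ_URL
      write:
        valueFromEnv: MY_SUBGRAPH_MY_SNOWFLAKE_WRITE_URL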
5. Start the connector's docker compose
Let's start our connector's docker compose file.
docker compose -f docker-compose.my_snowflake.yaml up
This starts our Snowflake connector on the specified port. We can navigate to the following address, with the port modified, to see the schema of our Snowflake data source:
http://localhost:8081/schema
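You can also fetch the same schema from the command line, adjusting the port if yours differs:

# Returns the connector's schema as JSON
curl http://localhost:8081/schema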
6. Include the connector in your docker compose
Kill the connector by pressing CTRL+C
in the terminal tab in which the connector is running.
Then, add the following inclusion to the docker-compose.hasura.yaml file in your project's root directory, taking care to modify the subgraph's name to match your own.
include:
  - path: my_subgraph/connector/my_snowflake/docker-compose.my_snowflake.yaml
Now, whenever running the following, you'll bring up the GraphQL engine, observability tools, and any connectors you've included:
HASURA_DDN_PAT=$(ddn auth print-pat) docker compose -f docker-compose.hasura.yaml watch
7. Update the new DataConnectorLink object
Finally, now that our DataConnectorLink
has the correct environment variables configured for the Snowflake connector,
we can run the update command to have the CLI look at the configuration JSON and transform it to reflect our database's
schema in hml
format. In a new terminal tab, run:
ddn connector-link update my_snowflake --subgraph my_subgraph
After this command runs, you can open your my_subgraph/metadata/my_snowflake.hml
file and see your metadata completely
scaffolded out for you 🎉
8. Import all your indices
You can do this in one convenience command.
ddn connector-link update my_snowflake --subgraph my_subgraph --add-all-resources
9. Create a supergraph build
Pass the local subcommand and specify ./engine in the root of the project as the output directory. This directory is used by the docker-compose file to serve the engine locally:
ddn supergraph build local --output-dir ./engine
You can now navigate to https://console.hasura.io/local/graphql?url=http://localhost:3000 to explore your supergraph and run queries against your Snowflake data.
License
The Hasura Snowflake connector is available under the Apache License 2.0.