Version: v3.x beta

Add Custom Business Logic

What's about to happen?

As we alluded to in the beginning of this guide, the remaining work on top of your existing data source modeling is authoring and maintaining custom code, or business logic. With Hasura, you can integrate — and even host — this business logic directly with Hasura DDN and your API.

Hasura handles business logic using the TypeScript data connector. This enables you to author your own custom code, written in TypeScript, and host it alongside your API. Using this connector, you can transform or enrich data before it reaches your consumers, or perform any other application business logic you may need.

You can then integrate this custom code as individual commands in your metadata. This process enables you to simplify client applications and speed up your backend development.

In this guide we will:

  • Initialize the hasura/nodejs data connector
  • Work with Node.js and npm to create a simple script
  • Add a new DataConnectorLink
  • Update the DataConnectorLink to track the function script as a command in our metadata
  • Create a new API build and test it


Step 1. Initialize the TypeScript connector

Let's begin by initializing the connector on our project. In the example below, you'll see some familiar flags with new values being passed to them. We'll call this connector my_ts and use the hasura/nodejs connector from the connector hub:

From the root of your project, run:
ddn connector init my_ts --subgraph my_subgraph --hub-connector hasura/nodejs

This will create the following file structure in a my_subgraph/connector/my_ts directory, with the functions.ts file being your connector's entrypoint:

├── .ddnignore
├── .env.local
├── .hasura-connector
├── connector.yaml
├── docker-compose.my_ts.yaml
├── functions.ts
├── package-lock.json
├── package.json
└── tsconfig.json

Step 2. Modify the connector's port

Add a port value to the .env.local file in your my_subgraph/connector/my_ts directory:
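For example (the variable name below is illustrative — check your generated .env.local for the exact key, and pick any port not already in use locally):

```sh
# my_subgraph/connector/my_ts/.env.local
# Hypothetical variable name — use the key your generated file already contains.
HASURA_CONNECTOR_PORT=9000
```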


Step 3. Add the DataConnectorLink

As with any other connector, we'll now need to create the DataConnectorLink, which will translate our TypeScript functions into commands that can be exposed as queries and mutations via our GraphQL API.

From the root of your project, run:
ddn connector-link add my_ts --subgraph my_subgraph

Then, update the values in your subgraph's .env.my_subgraph file to include this connector.

Don't forget to modify the port number

To avoid port conflicts, you should modify the port number in the .env.my_subgraph file to match the port number you set in the .env.local file.

These values are already referenced in my_subgraph/metadata/my_ts/my_ts.hml.
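As a sketch, the entries might look like the following (the variable names here are illustrative — match the keys referenced from my_subgraph/metadata/my_ts/my_ts.hml, and use the port you chose in .env.local):

```sh
# my_subgraph/.env.my_subgraph
MY_SUBGRAPH_MY_TS_READ_URL=http://local.hasura.dev:9000
MY_SUBGRAPH_MY_TS_WRITE_URL=http://local.hasura.dev:9000
```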

Step 4. Install npm package dependencies

Within our my_ts connector directory, let's install any necessary dependencies:

cd my_subgraph/connector/my_ts && npm i

Then, from the my_ts directory, run this command to use the included dotenv package to load environment variables from the .env.local file, start the connector, and watch for any changes:

npx dotenv -e .env.local -- npm run watch
Node.js version

The hasura/nodejs connector uses Node.js version >=20. Please make sure you have the correct version installed.

Step 5. Write business logic

In this simple example, we're going to transform a timestamp with timezone (e.g., "2024-03-14T08:00:00Z") into a nicely formatted version for humans, e.g., "8am, Thursday, March 14th, 2024."

We'll replace the default hello() function in our functions.ts file with the following:

/**
 * @readonly
 */
export async function formatTimezoneDate(dateString: string): Promise<string> {
  const date = new Date(dateString);

  const day = date.getDate();
  const nth = (d: number) => {
    if (d > 3 && d < 21) return "th";
    switch (d % 10) {
      case 1:
        return "st";
      case 2:
        return "nd";
      case 3:
        return "rd";
      default:
        return "th";
    }
  };

  const hours = date.getHours();
  const ampm = hours >= 12 ? "pm" : "am";
  const hour = hours % 12 || 12;

  const dayOfWeek = date.toLocaleString("en-US", { weekday: "long" });
  const month = date.toLocaleString("en-US", { month: "long" });
  const year = date.getFullYear();

  return `${hour}${ampm}, ${dayOfWeek}, ${month} ${day}${nth(day)}, ${year}.`;
}

As this is a Node.js project, you can install any dependency!
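As a quick sanity check, the ordinal-suffix helper used above can be exercised on its own. Here is a standalone copy (extracted purely for illustration) run against a few sample days of the month:

```typescript
// Standalone copy of the ordinal-suffix helper from functions.ts,
// extracted so it can be run and checked in isolation.
const nth = (d: number): string => {
  if (d > 3 && d < 21) return "th"; // covers 4th–20th, including 11th–13th
  switch (d % 10) {
    case 1:
      return "st";
    case 2:
      return "nd";
    case 3:
      return "rd";
    default:
      return "th";
  }
};

// Sample days of the month and their suffixes:
console.log([1, 2, 3, 11, 14, 21, 22, 23, 31].map((d) => `${d}${nth(d)}`).join(", "));
// → 1st, 2nd, 3rd, 11th, 14th, 21st, 22nd, 23rd, 31st
```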

Step 6. Track the new function

To add our function as a command, just as we added our individual tables earlier, we can use the following to generate the related Hasura metadata:

ddn connector-link update my_ts --subgraph my_subgraph --add-all-resources

Step 7. Create a new API build and test

Next, let's create a new build of our supergraph:

ddn supergraph build local --output-dir ./engine
Start your engines!

Want to test your supergraph? Don't forget to start your GraphQL engine and connectors using the following command.

From the root of your project, run:
HASURA_DDN_PAT=$(ddn auth print-pat) docker compose -f docker-compose.hasura.yaml watch

If you haven't included your connector(s) in your docker-compose.hasura.yaml, don't forget to start them as well.

You should see your command available, along with its documentation, in the GraphiQL explorer. You can then test your new command with a string such as: "2024-03-14T08:00:00Z".
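The generated GraphQL field typically mirrors the function name; assuming it does here (the field and argument names below are taken from the function signature — confirm them in GraphiQL's schema explorer), a query like this exercises the command:

```graphql
query FormatDate {
  formatTimezoneDate(dateString: "2024-03-14T08:00:00Z")
}
```

On a server running in UTC, this should return "8am, Thursday, March 14th, 2024." — note that getHours() uses the connector's local timezone, so the hour may differ on non-UTC machines.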


What did this do?

The commands above initialized a new TypeScript connector, installed dependencies, and created a new function to format a timestamp with timezone into a human-readable format. We then added this function to our metadata as a command, and created a new build of our supergraph.

Next Steps

You can also create relationships between types in your supergraph and your commands. This enables you to pair custom business logic with — for example — database tables, and then transform or enrich data before sending it back to your consumers.

You can learn more about creating these and other relationships on the next page, or you can learn about mutating data using the TypeScript connector.