
Custom Business Logic via Go

Introduction

The Go Connector can be added to any project to incorporate custom business logic directly into your supergraph.

This can be used to enrich data returned to clients, or to write custom mutations in Go.

Under the hood, the Go connector is a CLI plugin that generates code on top of an ndc-sdk-go boilerplate.

Prerequisites

If you've never used Hasura DDN, we recommend that you first go through the getting started guide. 😊

Functions or Procedures

Go functions exported from a file in the connector will then be available from your Hasura DDN GraphQL API in the form of either functions or procedures.

In Hasura metadata, functions are used for read operations. They will not modify the data in the database and can only be used to retrieve data.

Conversely, procedures are used for write operations. They can modify the data in the database and can be used to create, update, or delete data.

The distinction is important in metadata because it allows the system to know what types to expect for arguments and return values.
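
As a rough sketch of what this naming convention looks like in Go (the GetNote/SaveNote names below are purely illustrative, types.State comes from the scaffold the connector generates, and the full working template is shown in Step 2):

package functions

import (
    "context"

    "myproject/types" // illustrative import path; use the types package generated in your connector's module
)

// GetNoteArguments is the input type for the read-only query below.
type GetNoteArguments struct {
    ID int `json:"id"`
}

// GetNoteResult is the response type for the read-only query below.
type GetNoteResult struct {
    Note string `json:"note"`
}

// FunctionGetNote has a "Function" prefix, so it is exposed as a read-only query (a function).
func FunctionGetNote(ctx context.Context, state *types.State, arguments *GetNoteArguments) (*GetNoteResult, error) {
    return &GetNoteResult{Note: "a note"}, nil
}

// SaveNoteArguments is the input type for the mutation below.
type SaveNoteArguments struct {
    Note string `json:"note"`
}

// ProcedureSaveNote has a "Procedure" prefix, so it is exposed as a mutation (a procedure).
func ProcedureSaveNote(ctx context.Context, state *types.State, arguments *SaveNoteArguments) (*GetNoteResult, error) {
    return &GetNoteResult{Note: arguments.Note}, nil
}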

Add the Go connector to a project

Step 1. Initialize the Go connector

Go version

The hasura/go connector requires Go version >=1.21.0. Please make sure you have the correct version installed.

Let's begin by initializing the connector on our project. In the example below, you'll see some familiar flags with new values being passed to them. We assume you've already initialized a my_subgraph subgraph. We'll call this connector my_go and use the hasura/go connector from the connector hub:

From the root of your project, run:
ddn connector init -i --subgraph my_subgraph/subgraph.yaml \
--add-to-compose-file ./compose.yaml

# ? Hub Connector hasura/go
# ? Connector Name my_go
# ? Port 6197

In this command, we're passing a few important values.

Connector name

We're naming the connector my_go in this example, but you can call it whatever makes sense to you.

Hub Connector

We're specifying that this connector should be the hasura/go connector listed in the Connector Hub.

Port

We're specifying the port to run the connector on. This is important to avoid port collisions with other connectors or services which you might have running on your machine. Remember to use a different port for each connector you may have running.

Best practices

Importantly, a data connector can only connect to one data source.

The project will be kept organized with each data connector's configuration located in a relevant subgraph directory. In this example the CLI will create a my_subgraph/connector/my_go directory if it doesn't exist. You can also change this directory by passing a --dir flag to the CLI.

We recommend that the name of the connector and the directory in which its configuration is stored (my_go in this example) match, for the sake of convenience and clarity.

In subsequent steps, when running your connector locally, it's critical to ensure the port value matches the connection string you provide in your supergraph's .env file.

What did this do?

This command created the following file structure in a my_subgraph/connector/my_go directory, with Go files in the functions folder being your connector's entrypoint:

.
├── .ddnignore
├── .gitignore
├── .hasura-connector
│   └── ...
├── compose.yaml
├── functions
│   ├── hello.go
│   └── types.generated.go
├── types
│   └── connector.go
├── connector.generated.go
├── connector.go
├── main.go
├── go.mod
├── go.sum
├── Makefile
├── README.md
└── schema.generated.json

The command also adds the read and write URLs for the connection to the .env file for us.

Example .env file
MY_SUBGRAPH_MY_GO_READ_URL="http://local.hasura.dev:6197"
MY_SUBGRAPH_MY_GO_WRITE_URL="http://local.hasura.dev:6197"

These keys are already referenced in my_subgraph/metadata/my_go.hml.

Step 2. Write business logic

The template code that ships with the Go connector provides some simple examples in the functions/hello.go file to help explain how it works.

Functions whose names have a Function prefix are exposed as read-only queries. For example:

// A hello argument
type HelloArguments struct {
    Greeting string `json:"greeting"` // value argument will be required
    Count    *int   `json:"count"`    // pointer arguments are optional
}

// A hello result
type HelloResult struct {
    Reply string `json:"reply"`
    Count int    `json:"count"`
}

// FunctionHello sends a hello message
func FunctionHello(ctx context.Context, state *types.State, arguments *HelloArguments) (*HelloResult, error) {
    count := 1
    if arguments.Count != nil {
        count = *arguments.Count + 1
    }
    return &HelloResult{
        Reply: fmt.Sprintf("Hi! %s", arguments.Greeting),
        Count: count,
    }, nil
}

Functions whose names have a Procedure prefix are exposed as mutations. For example:

// A create author argument
type CreateAuthorArguments struct {
    Name string `json:"name"`
}

// A create author result
type CreateAuthorResult struct {
    ID   int    `json:"id"`
    Name string `json:"name"`
}

// ProcedureCreateAuthor creates an author
func ProcedureCreateAuthor(ctx context.Context, state *types.State, arguments *CreateAuthorArguments) (*CreateAuthorResult, error) {
    return &CreateAuthorResult{
        ID:   1,
        Name: arguments.Name,
    }, nil
}

The CLI plugin infers the function's third argument to generate the input argument schema, and its first return type to generate the response schema.
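
Because these are ordinary Go functions, you can also add validation or other custom logic and return an error, in which case the operation fails with that error. Below is a minimal sketch, reusing the CreateAuthor types from the template above and only the Go standard library ("errors" and "strings" would need to be added to the file's imports):

// ProcedureCreateAuthorChecked is a hypothetical variant of ProcedureCreateAuthor
// that rejects empty author names instead of silently accepting them.
func ProcedureCreateAuthorChecked(ctx context.Context, state *types.State, arguments *CreateAuthorArguments) (*CreateAuthorResult, error) {
    if strings.TrimSpace(arguments.Name) == "" {
        // Returning a non-nil error aborts the mutation.
        return nil, errors.New("name must not be empty")
    }
    return &CreateAuthorResult{
        ID:   1,
        Name: arguments.Name,
    }, nil
}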

See the documentation of the hasura-ndc-go plugin for more details.

Step 3. Track the new function

Whenever you make changes in the functions folder, introspect the connector to regenerate its schema and the related Hasura metadata.

ddn connector introspect my_go \
--subgraph my_subgraph/subgraph.yaml \
--add-all-resources

Step 4. Create a new API build and test

Next, let's create a new build of our supergraph:

ddn supergraph build local
Start your engines!

Want to test your supergraph? Don't forget to start your GraphQL engine using the following command.

From the root of your project, run:
ddn run docker-start

This reads the docker-start script from the context config at .hasura/context.yaml and starts your Hasura engine, any connectors, and observability tools.

You should see your command available, along with its documentation, in the GraphiQL explorer, which you can access at https://console.hasura.io/local/graphql?url=http://localhost:3000.

You can then test your new command with the following query:
query Hello {
  hello(greeting: "world") {
    reply
    count
  }
}
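
Based on the template's FunctionHello implementation, this query should return a reply of "Hi! world" and a count of 1.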
Privacy settings in some browsers

Your browser settings or privacy tools may prevent the Console from accessing your local Hasura instance. This could be due to features designed to protect your privacy and security. Should you encounter one of these issues, we recommend disabling these settings for the console.hasura.io domain.

Chrome and Firefox are the recommended browsers for the best experience with the Hasura Console including for local development.
