Version: v3.x

Enrich data with LLMs

Introduction

In this recipe, you'll learn how to interact with OpenAI's API to send prompts and receive responses. This can be used to integrate AI-driven features, such as generating text or completing tasks, into your API. In the example below, we'll hard-code a prompt and have OpenAI apply it to existing content in our supergraph.

Prerequisites

Before continuing, ensure you have:

- A Hasura DDN project with a supergraph and at least one model.
- A lambda connector (such as the TypeScript connector) initialized in your project.
- An OpenAI API key.

Recipe

Step 1. Write the function

In your connector's directory, install the OpenAI package:

```sh
npm install openai
```
In your functions.ts file, add the following:

```typescript
import OpenAI from "openai";

// We can store this in the project's .env file and reference it here
const OPENAI_API_KEY = "your_openai_api_key";

const client = new OpenAI({
  apiKey: OPENAI_API_KEY,
});

/**
 * @readonly
 */
export async function generateSeoDescription(input: string): Promise<string | null> {
  const response = await client.chat.completions.create({
    messages: [
      {
        role: "system",
        content:
          "You are a senior marketing associate. Take the product description provided and improve upon it to rank well with SEO.",
      },
      { role: "user", content: input },
    ],
    model: "gpt-4o",
  });

  return response.choices[0].message.content;
}
```
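Rather than hard-coding the key, you can read it from the connector's environment. A minimal sketch of a helper for this (the variable name `OPENAI_API_KEY` and the helper itself are assumptions; match them to your .env file):

```typescript
// Look up the API key in an environment map, failing loudly if it is missing.
// The helper and the variable name OPENAI_API_KEY are illustrative assumptions.
export function getOpenAiApiKey(env: Record<string, string | undefined>): string {
  const key = env.OPENAI_API_KEY;
  if (!key) {
    throw new Error("OPENAI_API_KEY is not set in the environment");
  }
  return key;
}
```

You would then call it as `getOpenAiApiKey(process.env)` and pass the result to the `OpenAI` constructor instead of the hard-coded string.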

Step 2. Track your function

To add your function, generate the related metadata that will link together any functions in your lambda connector's source files and your API:

```sh
ddn connector introspect <connector_name>
```

Then, you can generate an HML file for the function using the following command:

```sh
ddn command add <connector_name> "*"
```

Step 3. Create a relationship (optional)

Assuming the input argument's type matches that of a type belonging to one or more of your models, you can create a relationship to the command. This will enable you to make nested queries that will invoke your custom business logic using the value of the field from the related model!

Create a relationship in the corresponding model's HML file.

For example, if we have a Products model:

```yaml
---
kind: Relationship
version: v1
definition:
  name: optimizedDescription
  sourceType: Products
  target:
    command:
      name: generateSeoDescription
  mapping:
    - source:
        fieldPath:
          - fieldName: description
      target:
        argument:
          argumentName: input
```

Step 4. Test your function

Create a new build of your supergraph:

```sh
ddn supergraph build local
```

In your project's explorer, you should see the new function exposed as a type and should be able to make a query like this:
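As a sketch, assuming the command is exposed under the root field generateSeoDescription (the field name and example input here are illustrative):

```graphql
query OptimizeDescription {
  generateSeoDescription(
    input: "A stainless steel water bottle that keeps drinks cold for 24 hours."
  )
}
```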

If you created a relationship, you can make a query like this, too:
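For instance, assuming the Products model is exposed as a products root field with a description field (names here are illustrative), the relationship lets you fetch the AI-generated description alongside each product:

```graphql
query ProductsWithSeoDescriptions {
  products {
    description
    optimizedDescription
  }
}
```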

Wrapping up

In this guide, you learned how to send prompts to OpenAI's API and receive responses in your application. By leveraging lambda connectors with relationships, you can easily incorporate AI-driven capabilities into your existing supergraph.

Learn more about lambda connectors
