What’s new in PromptQL: Managed LLM keys, Collaboration and Automation with APIs
Over the past few weeks, we’ve introduced a series of exciting updates to PromptQL that enhance its power, accessibility, and collaboration capabilities. These enhancements are focused on simplifying workflows. Everything is now live and ready to explore!
Quickstart PromptQL with built-in LLM keys
Getting started with PromptQL just got a whole lot easier! Hasura now offers pre-configured LLM keys out of the box, so you can dive right in without the hassle of setting up your own API key.
You get $10 worth of free credits, giving you the freedom to explore PromptQL’s capabilities without any upfront commitment. This is a great way to get hands-on experience before deciding to purchase additional credits or integrate your own LLM API key from providers like Anthropic or OpenAI.
On your project’s PromptQL settings page, you can configure the LLM provider. We recommend Anthropic models (Claude 3.5 Sonnet); Hasura’s pre-configured built-in LLM key uses the same model behind the scenes.
Stay tuned for an upcoming blog post where we’ll compare different LLMs to help you choose the best fit for your PromptQL projects.
New Program API Endpoint
We recently launched the PromptQL Program API, which takes the power of PromptQL beyond the interactive Playground. Developers can now trigger PromptQL Programs via HTTP, allowing seamless integration of AI-driven query planning and execution into their existing workflows and external systems.
PromptQL Programs, which are Python-based scripts generated by LLMs, enable automation of complex tasks by accessing and manipulating data from multiple sources unified under Hasura DDN connectors. With the new API, these programs can be saved and executed on demand, making them invaluable for automating business workflows, enhancing analytics, and streamlining operations.
Key use cases include automating customer support workflows, e-commerce analytics, and GitHub integrations, such as classifying and summarizing issues using webhooks. Developers can secure API access through robust authentication mechanisms for production environments, ensuring only authorized users can interact with the endpoints.
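To make the integration concrete, here’s a minimal sketch of triggering a saved PromptQL Program over HTTP from Python. The endpoint URL, header name, and payload fields are illustrative assumptions rather than the documented API contract, so adapt them to your project’s actual Program API endpoint and authentication setup.

```python
# Sketch: invoke a saved PromptQL Program over HTTP.
# NOTE: the URL, header, and payload below are illustrative assumptions,
# not the documented PromptQL Program API contract.
import os
import requests

PROGRAM_ENDPOINT = "https://my-project.example.com/programs/summarize-issues/run"  # hypothetical

def run_program(payload: dict) -> dict:
    """Trigger a saved PromptQL Program and return its JSON result."""
    response = requests.post(
        PROGRAM_ENDPOINT,
        json=payload,
        headers={
            # Hypothetical auth header; use whatever authentication your
            # project is configured with (e.g. an admin secret or JWT).
            "Authorization": f"Bearer {os.environ['PROMPTQL_API_TOKEN']}",
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Example: ask the program to classify and summarize a newly opened
    # GitHub issue, mirroring the webhook use case described above.
    result = run_program({"issue_url": "https://github.com/example/repo/issues/123"})
    print(result)
```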
Public PromptQL Playground
With the Public PromptQL Playground, you can now make your entire project public, allowing users to explore and engage with all the threads in one place.
When you enable the Public PromptQL Playground, your project becomes accessible to anyone with a login.
You can enable it by heading to the DDN project settings page -> PromptQL -> Enable Public PromptQL Playground.
The URL for a public PromptQL project will look something like this:
We’re extending the Public PromptQL Playground with a new feature: Public Thread Sharing. Now you can share specific chat threads with anyone, quickly and effortlessly.
View a shared thread without logging in
Your shared thread is accessible via a public link, with no signup or login required for viewers. It’s as simple as sharing your insights with the world, whether it’s an interesting conversation or an impressive query result with artifacts.
Sharing a thread doesn’t reveal any identifying information about your project; only the content of the thread itself is visible. Plus, any messages added after sharing won’t appear in the public link, keeping the snapshot of your conversation intact.
Showcase Your Data Stories
Here’s your chance to share compelling data artifacts and workflows with your audience. Whether on Twitter, LinkedIn, or your favorite platform, let your threads inspire and inform others in the community.
Here’s me sharing some interesting data points about the H-1B program on LinkedIn, powered by PromptQL.
Saved and Shareable Prompts
Streamline collaboration and user onboarding with Saved and Shareable Prompts, a new feature for PromptQL projects. Whether you’re showcasing a project or guiding others to explore specific capabilities, this feature makes it effortless to share ready-made prompts.
How It Works
You can link your predefined prompts directly to a URL by appending a prompt query parameter. This allows users to jump straight into a conversation with the prompt auto-filled in the input chat box, ready to execute.
In this example, the prompt “find the top 10 results for hasura” is encoded in the URL and seamlessly loaded into the chat box when the link is opened. Special characters are automatically escaped, ensuring the prompt works without issues.
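As a quick illustration, here’s how you might construct such a link programmatically. The base playground URL below is a hypothetical placeholder for your public project URL; the key point is simply that the prompt text gets URL-encoded into the prompt query parameter.

```python
# Sketch: build a shareable PromptQL Playground link with a pre-filled prompt.
# The base URL below is a hypothetical placeholder for your public project URL.
from urllib.parse import urlencode

PLAYGROUND_URL = "https://promptql.example.com/public/my-project/playground"  # hypothetical

def shareable_prompt_link(prompt: str) -> str:
    """Return a playground link that opens with the prompt pre-filled."""
    # urlencode takes care of escaping spaces and special characters.
    return f"{PLAYGROUND_URL}?{urlencode({'prompt': prompt})}"

print(shareable_prompt_link("find the top 10 results for hasura"))
# -> https://promptql.example.com/public/my-project/playground?prompt=find+the+top+10+results+for+hasura
```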
Start using Saved & Shareable Prompts today and make it easier than ever for others to explore your public PromptQL projects.
Summary
PromptQL’s latest updates bring significant improvements to its usability and feature set. Here are the key updates:
Quickstart with Built-in LLM Keys: Get started effortlessly with pre-configured LLM keys and $10 worth of free credits, eliminating the need for an external API key setup.
Program API Endpoint: Trigger Python-based PromptQL Programs via HTTP, enabling smooth integration into existing workflows.
Public PromptQL Playground: Share projects and threads publicly, making collaboration and knowledge-sharing seamless.
Saved & Shareable Prompts: Easily guide users with predefined, shareable prompts, enabling smooth onboarding and collaborative exploration.
New to PromptQL? Check out our webinar, "Beyond RAG: Building Enterprise AI Assistants That Users Actually Trust", to learn how to leverage PromptQL’s latest features and build reliable, scalable AI assistants for enterprise use cases. Register now and take your AI-powered applications to the next level!