AI Configuration

Set up AI-powered natural language queries with your own OpenAI or Azure OpenAI API key.

Overview

GremlinStudio’s AI feature lets you describe what you want in plain English, and it translates your request into executable Gremlin queries. Instead of memorizing Gremlin syntax for complex traversals, you can type a natural language description and get a working query in seconds. This is a Pro feature that uses a Bring Your Own Key (BYOK) approach — you provide your own OpenAI or Azure OpenAI API key, and all translation happens through direct API calls from your machine.

Bring Your Own Key (BYOK)

GremlinStudio does not operate any AI servers. When you configure an API key, it is stored locally in the application’s SQLite database on your machine. When you submit a natural language query, GremlinStudio sends the request directly from your computer to your configured AI provider. No GremlinStudio servers are involved at any point in the process. This means you have full control over costs, rate limits, and data residency.

Setting Up Your API Key

  1. Open Settings by clicking the gear icon in the title bar.
  2. Go to the AI tab.
  3. Choose your AI provider: Azure OpenAI or OpenAI.
  4. Enter your API credentials (see the tables below).
  5. Click Save.

Azure OpenAI Setup

  • Endpoint: Your Azure OpenAI resource endpoint (e.g., https://my-resource.openai.azure.com/)
  • API Key: Your Azure OpenAI API key from the Azure portal
  • Deployment: The deployment name of your GPT model (e.g., gpt-4o)
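To see how the three fields fit together, here is a sketch of how an Azure OpenAI chat-completions URL and auth header are typically assembled from them. The `api-version` value is an assumption; use one supported by your resource:

```python
def azure_chat_url(endpoint: str, deployment: str,
                   api_version: str = "2024-02-01") -> str:
    """Combine the Endpoint and Deployment fields into a request URL.
    The api-version default here is an illustrative assumption."""
    base = endpoint.rstrip("/")
    return f"{base}/openai/deployments/{deployment}/chat/completions?api-version={api_version}"

def auth_headers(provider: str, api_key: str) -> dict:
    """Azure OpenAI authenticates with an 'api-key' header,
    while OpenAI expects a bearer token in 'Authorization'."""
    if provider == "azure":
        return {"api-key": api_key}
    return {"Authorization": f"Bearer {api_key}"}
```

Note the difference in authentication style between the two providers: this is why GremlinStudio asks for an endpoint and deployment name only in the Azure case.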

OpenAI Setup

  • API Key: Your OpenAI API key from platform.openai.com

Which Provider to Choose

Azure OpenAI is the best choice if you already have Azure resources, especially if your Cosmos DB instance is in the same Azure region. Running the AI provider in the same region as your database minimizes latency and keeps all traffic within the Azure network. Azure OpenAI also gives you enterprise features like virtual network integration, managed identity, and content filtering policies.

OpenAI is the simplest option if you want to get started quickly. Create an account at platform.openai.com, generate an API key, paste it into GremlinStudio, and you are ready to go. There is no need to provision Azure resources or configure deployments. This is a good choice for individual developers and small teams who want natural language Gremlin queries without additional infrastructure.

Both providers support GPT-4 class models, which produce the best results for Gremlin query generation. GPT-3.5 models also work but may struggle with complex traversal patterns.

Using Natural Language Queries

  1. Click the AI icon in the Query Toolbar (or the NL Query button) to open the natural language panel.
  2. Type a description of what you want to find in your graph.
  3. GremlinStudio sends your description to the AI provider along with context about your graph schema.
  4. The generated Gremlin query appears in the editor.
  5. Review the query, then click Execute to run it.
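Step 3 can be sketched as assembling a chat payload from your description and a schema summary. The system prompt wording and schema format below are illustrative assumptions, not GremlinStudio internals:

```python
def build_translation_payload(description: str, schema: dict,
                              model: str = "gpt-4o") -> dict:
    """Combine the user's plain-English request with a compact schema
    summary so the model can generate accurate labels and properties."""
    schema_summary = (
        f"Vertex labels: {', '.join(schema.get('vertex_labels', []))}. "
        f"Edge labels: {', '.join(schema.get('edge_labels', []))}."
    )
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Translate the user's request into a single Gremlin query. "
                        "Graph schema: " + schema_summary},
            {"role": "user", "content": description},
        ],
        "temperature": 0,  # deterministic output suits query generation
    }
```

Only the description and the schema summary appear in the payload; no vertex data or query results are included, which matches the privacy guarantees described below.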

Always review the generated query before executing it. The AI produces correct Gremlin syntax in most cases, but verifying the logic against your data model ensures you get the results you expect.

Example Prompts

Here are examples of natural language queries and the Gremlin they generate. These demonstrate the range of questions the feature can handle:

  • "Show me all friends of Alice" → g.V().has('person','name','Alice').out('knows')
  • "Find the shortest path between vertex A and B" → g.V('A').repeat(out().simplePath()).until(hasId('B')).path().limit(1)
  • "How many vertices of each type are there?" → g.V().groupCount().by(label)
  • "List all people older than 30" → g.V().hasLabel('person').has('age',gt(30))
  • "Who has the most connections?" → g.V().project('name','degree').by('name').by(both().count()).order().by(select('degree'),desc).limit(10)
  • "Find all paths between departments and projects" → g.V().hasLabel('department').out('manages').hasLabel('project').path()

The AI uses your graph’s schema information to generate accurate label names and property references. Running schema discovery before using natural language queries gives the AI better context and produces more precise results.

Tips for Better Results

  • Be specific about vertex labels and property names. Instead of “show me all people,” say “show me all vertices with label person.” The more precise your language, the more accurate the generated query.
  • Mention the graph structure. If you know the edge labels, include them: “find all person vertices connected by knows edges” produces better results than “find all connections.”
  • Use the Schema panel first. Open the Schema panel to see the vertex labels, edge labels, and property keys in your graph. Reference these exact names in your natural language queries for best results.
  • Start simple and refine. Begin with a straightforward query, review the generated Gremlin, then ask for modifications: “add a filter for age greater than 25” or “sort by name.”
  • Describe the output you want. Saying “give me a count” versus “list all vertices” helps the AI choose between aggregation steps and traversal steps.

Privacy and Data Security

GremlinStudio takes your privacy seriously. Here is exactly what happens when you use the AI feature:

  • Your API key stays local. It is stored in the application’s SQLite database on your machine and is only sent to your configured AI provider as an authentication header.
  • Only query text is sent to the AI provider. The natural language description you type and a summary of your graph schema are sent for translation. No vertex data, property values, connection strings, or query results are included.
  • No data is sent to GremlinStudio servers. The AI feature is a direct integration between your machine and your AI provider. There is no intermediary.
  • No telemetry of AI queries. Even if you have opted into telemetry, the content of your natural language queries and the generated Gremlin are never recorded. The only telemetry event is a simple counter that the NL feature was used, with no payload.
  • API keys are not exported. When you export settings or style rules, API keys are excluded from the export file.
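
The export rule in the last point can be sketched as a simple filter applied before settings are written to disk. The field names here are assumptions for illustration:

```python
# Hypothetical sensitive-field names; the real export logic may differ.
SENSITIVE_KEYS = {"api_key", "azure_api_key"}

def exportable_settings(settings: dict) -> dict:
    """Return a copy of the settings with credential fields stripped,
    so an exported file can be shared without leaking keys."""
    return {k: v for k, v in settings.items() if k not in SENSITIVE_KEYS}
```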

This BYOK architecture means your data governance policies are preserved. Your AI queries go directly to OpenAI or Azure OpenAI under your own account, subject to your own terms of service and data processing agreements with those providers.