Interacting with LLMs via GraphQL

Presentation for apidays Singapore 2025

Leonardo Losoviz
Gato GraphQL - Founder

About me
- Working with GraphQL since 2019
- Built my own GraphQL server
- Released Gato GraphQL in 2023 - gatographql.com
- Adding new features on an ongoing basis
- Currently: AI integrations


We'll explore how users can access AI services via the GraphQL schema
(not how to use AI to code resolvers)
Example 1: Interacting with AI services in app
1. Create data via AI (eg: write post content)
2. Execute operation (eg: store in DB)
[Diagram: JS (client) → GraphQL (server)]

Example 1: Interacting with AI services via GraphQL
1. Create data via AI (eg: write post content)
2. Execute operation (eg: store in DB)
[Diagram: GraphQL query → GraphQL (server)]
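For contrast, the client-orchestrated flow of Example 1 ("in app") can be sketched in TypeScript. Everything here is hypothetical (the mutation, the function names); the AI and GraphQL calls are injected as plain async functions, where a real app would issue HTTP requests to the AI provider and to the GraphQL endpoint:

```typescript
// Sketch of the client-orchestrated flow: two round trips, coordinated by JS.
type GenerateContent = (prompt: string) => Promise<string>;
type GraphQLRequest = (
  query: string,
  variables: Record<string, unknown>,
) => Promise<unknown>;

async function createPostViaAI(
  prompt: string,
  generateContent: GenerateContent,
  graphql: GraphQLRequest,
): Promise<unknown> {
  // 1. Create data via AI (eg: write post content)
  const content = await generateContent(prompt);
  // 2. Execute operation (eg: store in DB): a second round trip,
  // orchestrated by the client
  return graphql(
    "mutation CreatePost($content: String!) { createPost(content: $content) { id } }",
    { content },
  );
}
```

The point of the sketch is the orchestration cost: the client must hold the intermediate AI output and issue a second request, which is exactly what the "via GraphQL" variant removes.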
Example 2: Interacting with AI services in app
1. Query data (eg: posts)
2. Manipulate it via AI (eg: rewrite)
3. Execute operation (eg: store again in DB)
[Diagram: JS (client) → GraphQL (1st request) + GraphQL (2nd request)]

Example 2: Interacting with AI services via GraphQL
1. Query data (eg: posts)
2. Manipulate it via AI (eg: rewrite)
3. Execute operation (eg: store again in DB)
[Diagram: GraphQL query → GraphQL (1 request)]
Benefits
1. Reduced complexity
Execute tasks without a client
2. Performance
Execute tasks in 1 request
How to access AI services via GraphQL?
1. Predefined by schema owner
Via some field or directive in the schema
2. Unrestrained access
Via connection to provider's API endpoint
Predefined by schema owner
1. Create a field or directive for some specific task
Eg: translate content via field translate, or directive @translate
2. Connect to the LLM's API in the resolver
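As a sketch, a resolver for such a schema-owner-defined `translate` field could look as follows. This is not Gato GraphQL's implementation; the LLM call is injected rather than hardcoded, so the provider endpoint stays out of the resolver:

```typescript
// Hypothetical resolver for a field defined as:
//   translate(to: String!, text: String!): String!
// In production, callLLM would POST to the provider's completion endpoint.
type LLMCall = (prompt: string) => Promise<string>;

function makeTranslateResolver(callLLM: LLMCall) {
  return async (
    _root: unknown,
    args: { to: string; text: string },
  ): Promise<string> => {
    // Build the prompt from the field arguments and delegate to the LLM
    const prompt = `Translate the following text to "${args.to}": ${args.text}`;
    return callLLM(prompt);
  };
}
```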
query {
  post(id: 1) {
    content
  }
  translate(to: "es", text: "Some text to translate")
}
Issue: How to translate the post content?
query {
  post(id: 1) {
    content @translate(to: "es")
  }
}
Issue: How to store the translated content back in the DB?
We need a custom feature
@export: [RFC] exporting variables between queries
query FetchContent {
  post(id: 1) {
    content @export(as: "postContent")
  }
}

query AdaptContent @depends(on: "FetchContent") {
  translate(to: "es", text: $postContent) @export(as: "adaptedPostContent")
}

mutation StoreContent @depends(on: "AdaptContent") {
  updatePost(id: 1, content: $adaptedPostContent) {
    status
    errors {
      message
    }
    post {
      content
    }
  }
}
query FetchAndAdaptData {
  post(id: 1) {
    content
      @translate(to: "es")
      @export(as: "adaptedPostContent")
  }
}

mutation StoreContent @depends(on: "FetchAndAdaptData") {
  updatePost(id: 1, content: $adaptedPostContent) {
    status
    errors {
      message
    }
    post {
      content
    }
  }
}
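Neither @export nor @depends(on:) is part of the GraphQL spec, so a server supporting them must execute the operations in dependency order, feeding exported values forward as dynamic variables. A minimal sketch of that execution model (simplified to one dependency per operation; all names are illustrative, not a real server's API):

```typescript
// Operations run in dependency order; values @export-ed by one operation
// become variables available to the operations that depend on it.
type Operation = {
  name: string;
  dependsOn?: string;
  // Receives the variables exported so far; returns newly exported variables
  run: (vars: Record<string, unknown>) => Record<string, unknown>;
};

function executeDocument(operations: Operation[]): Record<string, unknown> {
  const vars: Record<string, unknown> = {};
  const done = new Set<string>();
  const pending = [...operations];
  while (pending.length > 0) {
    // Pick any operation whose dependency (if any) has already run
    const idx = pending.findIndex((op) => !op.dependsOn || done.has(op.dependsOn));
    if (idx === -1) throw new Error("Circular @depends(on:) chain");
    const [op] = pending.splice(idx, 1);
    Object.assign(vars, op.run(vars)); // merge the exported variables
    done.add(op.name);
  }
  return vars;
}

// Usage: mirrors the FetchContent → AdaptContent chain above
const vars = executeDocument([
  { name: "FetchContent", run: () => ({ postContent: "Hello" }) },
  {
    name: "AdaptContent",
    dependsOn: "FetchContent",
    run: (v) => ({ adaptedPostContent: `[es] ${v.postContent}` }),
  },
]);
```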
Lesson 1
Check if your GraphQL server supports @export
Unrestrained access
1. For a specific LLM
1. Create field to connect to that LLM: sendChatGPTRequest: String!
2. Query it to connect to LLM API
{
  "id": "chatcmpl-B9MHDbslfkBeAs8l4bebGdFOJ6PeG",
  "object": "chat.completion",
  "created": 1741570283,
  "model": "gpt-4o-2024-08-06",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "The image shows a wooden boardwalk path running through a lush green field or meadow. The sky is bright blue with some scattered clouds, giving the scene a serene and peaceful atmosphere. Trees and shrubs are visible in the background.",
        "refusal": null,
        "annotations": []
      },
      "logprobs": null,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 1117,
    "completion_tokens": 46,
    "total_tokens": 1163,
    "prompt_tokens_details": {
      "cached_tokens": 0,
      "audio_tokens": 0
    },
    "completion_tokens_details": {
      "reasoning_tokens": 0,
      "audio_tokens": 0,
      "accepted_prediction_tokens": 0,
      "rejected_prediction_tokens": 0
    }
  },
  "service_tier": "default",
  "system_fingerprint": "fp_fc9f1d7035"
}
ChatGPT: Chat completion object
query FetchData {
  post(id: 1) {
    content @export(as: "prompt")
  }
}

query AdaptData($openAIAPIKey: String!) @depends(on: "FetchData") {
  sendChatGPTRequest(input: {
    apiKey: $openAIAPIKey,
    model: "gpt-4o",
    messages: [
      {
        role: "system",
        content: "You are a language translator"
      },
      {
        role: "user",
        content: $prompt
      }
    ]
  }) @export(as: "adaptedPostContent")
}

mutation StoreContent @depends(on: "AdaptData") {
  updatePost(id: 1, content: $adaptedPostContent) {
    status
    errors {
      message
    }
    post {
      content
    }
  }
}
Issue: How to generate the prompt, combining static + dynamic data?
We need a custom directive
@prepend: Create the prompt combining static + dynamic data
query FetchData {
  post(id: 1) {
    content
      @prepend(text: "Please translate from English to Spanish: ")
      @export(as: "prompt")
  }
}

query AdaptData($openAIAPIKey: String!) @depends(on: "FetchData") {
  sendChatGPTRequest(input: {
    apiKey: $openAIAPIKey,
    model: "gpt-4o",
    messages: [
      {
        role: "system",
        content: "You are a language translator"
      },
      {
        role: "user",
        content: $prompt
      }
    ]
  }) @export(as: "adaptedPostContent")
}

mutation StoreContent @depends(on: "AdaptData") {
  updatePost(id: 1, content: $adaptedPostContent) {
    status
    errors {
      message
    }
    post {
      content
    }
  }
}
Lesson 2
Check if your GraphQL server allows creating custom directives
Unrestrained access
2. For any LLM
1. Create field to send an HTTP request: jsonHTTPRequest: JSON!
2. Query it to connect to LLM API
3. Navigate the response and extract data
(Different APIs return responses with different structures)
{
  "content": [
    {
      "text": "Hi! My name is Claude.",
      "type": "text"
    }
  ],
  "id": "msg_013Zva2CMHLNnXjNJJKqJ2EF",
  "model": "claude-3-7-sonnet-20250219",
  "role": "assistant",
  "stop_reason": "end_turn",
  "stop_sequence": null,
  "type": "message",
  "usage": {
    "input_tokens": 2095,
    "output_tokens": 503
  }
}
Claude: Messages object
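The ChatGPT and Claude response objects shown above nest the generated text under different paths (choices.0.message.content vs content.0.text), which is why a generic HTTP field needs a way to navigate the response. A sketch of that navigation, walking a dotted key path (illustrative helper, not a real server's code):

```typescript
// Different LLM APIs nest the generated text differently:
//   ChatGPT: choices[0].message.content
//   Claude:  content[0].text
// A generic extractor walks a dotted key path through the JSON response.
function extractByPath(value: unknown, path: string): unknown {
  return path
    .split(".")
    .reduce<any>((acc, key) => (acc == null ? undefined : acc[key]), value);
}

const chatGPTResponse = { choices: [{ message: { content: "Hola" } }] };
const claudeResponse = { content: [{ text: "Hola" }] };

extractByPath(chatGPTResponse, "choices.0.message.content"); // "Hola"
extractByPath(claudeResponse, "content.0.text"); // "Hola"
```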
query FetchData {
  post(id: 1) {
    content
      @prepend(text: "Please translate from English to Spanish: ")
      @export(as: "prompt")
  }
}

query AdaptData($openAIAPIKey: String!) @depends(on: "FetchData") {
  jsonHTTPRequest(input: {
    url: "https://api.openai.com/v1/chat/completions",
    method: POST,
    options: {
      auth: {
        password: $openAIAPIKey
      },
      json: {
        model: "gpt-4o",
        messages: [
          {
            role: "system",
            content: "You are a language translator"
          },
          {
            role: "user",
            content: $prompt
          }
        ]
      }
    }
  })
  # This will not work, as the response is a JSON object, not a String
  @export(as: "adaptedPostContent")
}

mutation StoreContent @depends(on: "AdaptData") {
  updatePost(id: 1, content: $adaptedPostContent) {
    status
    errors {
      message
    }
    post {
      content
    }
  }
}
Issue: How to navigate and extract the actual data?
We need a custom directive
@underJSONObjectProperty: Navigate the response to the desired entry
query FetchData {
  post(id: 1) {
    content
      @prepend(text: "Please translate from English to Spanish: ")
      @export(as: "prompt")
  }
}

query AdaptData($openAIAPIKey: String!) @depends(on: "FetchData") {
  chatGPTResponse: jsonHTTPRequest(input: {
    url: "https://api.openai.com/v1/chat/completions",
    method: POST,
    options: {
      auth: {
        password: $openAIAPIKey
      },
      json: {
        model: "gpt-4o",
        messages: [
          {
            role: "system",
            content: "You are a language translator"
          },
          {
            role: "user",
            content: $prompt
          }
        ]
      }
    }
  })
  @underJSONObjectProperty(by: { key: "choices.0.message.content" })
  @export(as: "adaptedPostContent")
}

mutation StoreContent @depends(on: "AdaptData") {
  updatePost(id: 1, content: $adaptedPostContent) {
    status
    errors {
      message
    }
    post {
      content
    }
  }
}
Lesson 3
The custom directives that you can create with your GraphQL server must be powerful and flexible
Conclusion
Accessing AI services directly within the GraphQL query can reduce the complexity of the application, and improve performance
The GraphQL spec currently doesn't provide all the required features (eg: @export)
Check if your GraphQL server provides the required features, or allows adding them as custom features
Thanks! 👋