Discussing groq query size limits and solutions in Sanity.io

2 replies
Last updated: Feb 7, 2025
Seems like I've hit a limit for the size of the GROQ query for listenQuery(). Are there any more details about this? Can something be done?
AI Update

Yes, there is a URL length limit that affects listenQuery() (the Listen API), and this is a known constraint when working with real-time updates in Sanity.

The Problem

The Listen API uses HTTP GET requests to establish real-time subscriptions to content changes. Since the GROQ query is passed as a URL parameter, you're subject to standard URL length limitations - typically around 16KB for most servers and browsers. This becomes problematic when you have:

  • Complex GROQ queries with multiple conditions
  • Large arrays of values in your query parameters
  • Queries with extensive projections or joins

The Listen API endpoint follows the pattern:

https://<project-id>.api.sanity.io/v2021-06-07/data/listen/<dataset>?query=<GROQ-query>

Since everything gets encoded in the URL, lengthy queries will hit these limits.
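
If you want to check whether a query is likely to hit that ceiling before subscribing, a rough estimate of the encoded length is usually enough. This is only a sketch (the ~15K figure comes from later in this thread, and the assumption that each parameter lands in the URL as roughly `$name=<encoded JSON>` is mine):

// Rough, unofficial arithmetic to estimate how much URL budget a listen query
// will consume. `manySlugs` is a placeholder for a large parameter array.
const manySlugs = ['slug-1', 'slug-2' /* ...hundreds more in practice... */]
const query = '*[_type == "post" && slug.current in $slugs]'
const params = { slugs: manySlugs }

const encodedLength =
  encodeURIComponent(query).length +
  Object.entries(params).reduce(
    // assume each parameter appears in the URL as roughly `$name=<encoded JSON>`
    (sum, [key, value]) =>
      sum + key.length + 2 + encodeURIComponent(JSON.stringify(value)).length,
    0
  )

if (encodedLength > 15000) {
  console.warn(`Encoded query is ~${encodedLength} bytes and may exceed the Listen API limit`)
}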

Workarounds and Solutions

1. Simplify Your Listen Query

The Listen API doesn't support joins or complex projections anyway - these get ignored. Your listen query should be as minimal as possible, focusing only on identifying which documents to watch:

// Keep it simple: just identify which documents to watch.
// Note: client.listen() returns an observable; events only start flowing once you subscribe.
const observable = client.listen('*[_type == "post" && references($authorId)]',
  { authorId: 'some-author-id' }
)
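
Since joins and projections are ignored by the listener, a common follow-up pattern is to run the full projection only when an event arrives. A sketch using the same configured client as above (the author-> join and field names are illustrative):

const sub = client
  .listen('*[_type == "post" && references($authorId)]', { authorId: 'some-author-id' })
  .subscribe(async (update) => {
    // The event only identifies the changed document; fetch the projection separately.
    const doc = await client.fetch(
      '*[_id == $id][0]{title, "authorName": author->name}',
      { id: update.documentId }
    )
    console.log('Post changed:', doc)
  })

// Later, when the watcher is no longer needed:
// sub.unsubscribe()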

2. Use GROQ Parameters

Instead of embedding large values directly in your query string, use GROQ parameters to keep the query itself shorter:

// Instead of: *[_type == "post" && slug.current in ["slug1", "slug2", "slug3"...]]
// Use parameters:
client.listen('*[_type == "post" && slug.current in $slugs]', { slugs: arrayOfSlugs })

Parameters are passed as separate query-string values rather than being interpolated into the GROQ string, so the query itself stays shorter, but the full URL (query plus parameters) is still subject to the same overall limit.
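
If the parameter payload itself is what's too large, one pragmatic option is to split it across several smaller listeners. This is a sketch of my own suggestion rather than anything built into the client; `allSlugs` stands in for your full list:

const CHUNK_SIZE = 100
// `allSlugs` is assumed to exist elsewhere and may be too large for one query.
const subscriptions = []
const handleUpdate = (update) => {
  console.log('Changed document:', update.documentId)
}

for (let i = 0; i < allSlugs.length; i += CHUNK_SIZE) {
  const slugs = allSlugs.slice(i, i + CHUNK_SIZE)
  subscriptions.push(
    client
      .listen('*[_type == "post" && slug.current in $slugs]', { slugs })
      .subscribe(handleUpdate)
  )
}

// Clean up every listener when done:
// subscriptions.forEach((sub) => sub.unsubscribe())

Each chunk opens its own listener connection, so keep the number of chunks small.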

3. Consider the Live Content API

For many real-time use cases, the newer Live Content API (generally available as of March 2025) is a better alternative. It's designed for modern frameworks like Next.js, Nuxt, SvelteKit, and Astro, and handles real-time updates more efficiently without the same URL constraints. Using defineLive in Next.js:

import { defineLive } from 'next-sanity/live'

// `client` is your configured @sanity/client instance; the tokens are viewer
// tokens that defineLive uses for server-side and in-browser live fetching.
export const { sanityFetch, SanityLive } = defineLive({
  client,
  browserToken: token,
  serverToken: token,
})
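
On the consuming side, the pieces are typically wired up by calling sanityFetch in server components and rendering SanityLive once (for example in the root layout). A sketch based on the next-sanity docs; the '@/sanity/live' path is just a stand-in for wherever you export the helpers above:

// app/posts/page.jsx (illustrative path)
import { sanityFetch } from '@/sanity/live'

export default async function PostsPage() {
  const { data: posts } = await sanityFetch({ query: '*[_type == "post"]{title}' })
  return (
    <ul>
      {posts.map((post) => (
        <li key={post.title}>{post.title}</li>
      ))}
    </ul>
  )
}

// Somewhere in app/layout.jsx, render <SanityLive /> once so the live
// event stream is wired up for the whole app.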

The Live Content API uses a different architecture that doesn't have the same URL length limitations as the older Listen API.

4. Use Sanity Functions for Server-Side Events

If you're trying to react to content changes server-side, consider using Sanity Functions with event handlers instead of the Listen API. Functions can respond to content changes without URL length constraints and run within Sanity's serverless environment. This is the modern, recommended approach for server-side automation and event handling.

Bottom Line

Unfortunately, there's no way to increase the URL length limit itself - it's an HTTP constraint. Your best options are to:

  1. Simplify your listen queries and use parameters effectively
  2. Migrate to the Live Content API for client-side real-time needs
  3. Use Sanity Functions for server-side event handling

Both modern approaches (Live Content API and Functions) are more robust and don't have the same limitations as the older Listen API.

The limit here is 16K, and we add in a 'buffer' of 1.2K, so the actual limit including headers is ~15K. This error is typically thrown when the request URL exceeds that limit. Would it be possible to break down your fragment into something smaller?
Yes, it could be possible. But I ended up solving it with a regular client.fetch().
But thanks for the details.
🙂 Really good to know the exact limits. 🙂
