Warning: Data for Page Exceeds the Threshold of 128 kB

11 replies
Last updated: Aug 17, 2022
Hi all,
I'm getting the following:
Warning: data for page "/numeros" is 1.16 MB which exceeds the threshold of 128 kB, this amount of data can reduce performance.

I have a query that outputs 137 blog results in a grid. Only 6 are displayed at a time on first load, with a "load more" button displaying 6 additional articles on every click using .slice().
Is there a better (more performant) way of going about this, such that I don't need to query all the data on page load?
AI Update

You're hitting a common performance issue with Next.js when loading too much data at once! The good news is there are several better approaches than fetching all 137 blog posts upfront and using client-side .slice().

Better Approaches

1. Offset-Based Pagination with GROQ Slices

Instead of fetching all posts and slicing on the client, use GROQ's slice syntax to fetch only what you need:

*[_type == "post"] | order(publishedAt desc) [0...6]

For subsequent loads, you can use offset-based pagination:

*[_type == "post"] | order(publishedAt desc) [6...12]  // next 6
*[_type == "post"] | order(publishedAt desc) [12...18] // next 6

Implement this with Next.js API routes or Server Actions (App Router) that accept a page or offset parameter, so each "load more" click fetches fresh data from Sanity rather than loading everything upfront.
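As a rough sketch of that idea (the function and parameter names here are illustrative, not from the thread), the offset math can live in a small helper that your API route then hands to the Sanity client:

```javascript
// Hypothetical helper: builds a paginated GROQ query from an offset and a
// page size. The slice bounds [start...end] mirror the examples above.
function buildPostsQuery(offset, limit) {
  const start = Math.max(0, offset)
  const end = start + limit
  return `*[_type == "post"] | order(publishedAt desc) [${start}...${end}]`
}

// In a Next.js API route or Server Action you would pass the query to your
// configured Sanity client (assumed to exist as `client`):
// const posts = await client.fetch(buildPostsQuery(Number(req.query.offset) || 0, 6))
```

Each "load more" click then requests the next offset instead of slicing a pre-loaded array.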

2. Cursor-Based Pagination (More Robust)

For better performance and consistency (especially if content changes between loads), use cursor-based pagination with _id or _createdAt:

*[_type == "post" && _createdAt < $lastCreatedAt] 
  | order(_createdAt desc) [0...6]

Pass the last item's _createdAt value as $lastCreatedAt for the next batch. This prevents duplicate or skipped items if posts are added/removed between loads.
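A minimal sketch of that cursor logic, assuming `_createdAt` as the cursor field (all names are illustrative):

```javascript
// Hypothetical sketch of cursor-based pagination: given the last post of the
// previous batch, build the query and params for the next batch.
function nextBatch(lastPost, limit = 6) {
  if (!lastPost) {
    // First page: no cursor yet.
    return {
      query: `*[_type == "post"] | order(_createdAt desc) [0...${limit}]`,
      params: {},
    }
  }
  // Subsequent pages: everything older than the last item we rendered.
  return {
    query: `*[_type == "post" && _createdAt < $lastCreatedAt] | order(_createdAt desc) [0...${limit}]`,
    params: { lastCreatedAt: lastPost._createdAt },
  }
}

// Usage with a configured Sanity client (illustrative):
// const { query, params } = nextBatch(posts[posts.length - 1])
// const more = await client.fetch(query, params)
```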

3. Use SWR or useSWRInfinite

If you're using client components, useSWRInfinite works great for "load more" patterns:

import useSWRInfinite from 'swr/infinite'

// A simple JSON fetcher for SWR
const fetcher = (url) => fetch(url).then((res) => res.json())

const getKey = (pageIndex, previousPageData) => {
  // The previous page came back empty, so we've reached the end
  if (previousPageData && !previousPageData.length) return null
  return `/api/posts?page=${pageIndex}&limit=6`
}

const { data, size, setSize } = useSWRInfinite(getKey, fetcher)

Your /api/posts route would query Sanity with the appropriate GROQ pagination.
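A sketch of how that route might map `page`/`limit` onto a GROQ slice; `pageToSlice` is the pure part, and the commented-out handler (assuming a pre-configured Sanity `client`) shows where it plugs in:

```javascript
// Hypothetical: convert useSWRInfinite's page index into GROQ slice bounds.
function pageToSlice(page, limit) {
  const start = page * limit
  return [start, start + limit]
}

// export default async function handler(req, res) {
//   const page = Number(req.query.page) || 0
//   const limit = Number(req.query.limit) || 6
//   const [start, end] = pageToSlice(page, limit)
//   const posts = await client.fetch(
//     `*[_type == "post"] | order(publishedAt desc) [${start}...${end}]`
//   )
//   res.status(200).json(posts)
// }
```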

4. Move to getServerSideProps or Server Components

If you're using getStaticProps, consider switching to getServerSideProps for pages with frequently changing content, or use React Server Components with the App Router. This fetches data on-demand rather than bundling it all into the page payload.

Why This Matters

That 1.16 MB warning means Next.js is serializing all 137 posts into the initial HTML payload, which:

  • Slows down page load and Time to Interactive (TTI)
  • Hurts SEO and Core Web Vitals
  • Wastes bandwidth for users who may never click "load more"

By fetching only 6 posts initially and loading more on-demand, you'll likely reduce that payload to under 128 kB and significantly improve performance. The combination of GROQ pagination and proper data fetching patterns (API routes, Server Actions, or getServerSideProps) will give you a much more performant solution.

It may be helpful to check out this article on pagination with GROQ!
Thank you! I'll give this a read and let you know if I run into any hurdles
Hey! Sorry for the delayed reply, something came up.
So just to make sure I'm understanding the implementation correctly, the groq query formation for loading more posts would be done within the component it will be used in, rather than with something like getStaticProps?

In other words: if I'm currently querying ALL articles of type numeros in getStaticProps, I would instead query only the initial set of results displayed on page load (6 "numero" articles, for example). Then, to load an additional subset of posts after a button is clicked, I would use an async function with a GROQ query within my component to retrieve that data?
Exactly. 🙂
Your static data contains the first 6, and then the rest is done on the client.
Another thing you should consider before bothering with client-side pagination is reducing the amount of data you query for each of your posts.
Since your posts are displayed in a grid, I assume they need very little information: title, image URL, and perhaps a short excerpt? Even with 137 posts, that shouldn't amount to 1 MB+. My gut feeling is that you fetch too much data for each post, perhaps even their entire content?

Try fetching only what’s needed, and try again to see how much data is really needed for that entire page.
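For example, a projection that returns only the fields the grid renders might look like this (the `mainImage` and `excerpt` field names are assumptions about your schema):

```groq
// Only the fields the card grid actually needs:
*[_type == "numeros"] | order(publishedAt desc) [0...6] {
  title,
  "imageUrl": mainImage.asset->url,
  excerpt
}
```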
Thanks!! I reduced the number of fields I'm querying and brought it down to 176 kB. This still exceeds the Next.js "threshold" though -- do you think it's really worth refactoring the code for this extra bit of performance?
Probably not at this stage. 🙂
You can always do it when you have more posts.
thank you :)) you're always such a great help!
💚
