Migration API Mutation Error: Request Body Exceeds Limit of 4000000 Bytes

11 replies
Last updated: May 9, 2022
We are migrating all of our data from the old CMS to Sanity via API mutations, and some of the resources are throwing errors “Request body exceeds limit of 4000000 bytes”. Most of those are portable text. Any workaround suggestions?
AI Update

You're hitting Sanity's 4MB request body limit for Mutation API requests, a hard limit listed in Sanity's documented technical limits. When migrating data with large portable text fields, here are your workaround options:

1. Split Large Documents Into Smaller Pieces

Break your large documents into smaller pieces before importing. For portable text specifically, you can:

  • Split long portable text arrays into multiple fields or related documents
  • Use transactions to batch multiple smaller mutations together (keeping each transaction under 4MB)
  • Process documents in batches with proper rate limiting (25 requests/second max)

Example using the Sanity client with batching:

const {default: PQueue} = require('p-queue')

// Allow at most 25 requests per 1000ms window to stay under the rate limit.
// Note: p-queue needs both `interval` and `intervalCap` for rate limiting;
// `interval` alone has no effect.
const queue = new PQueue({
  concurrency: 1,
  interval: 1000,
  intervalCap: 25
})

// Split large documents and queue them
largeDocs.forEach(doc => {
  queue.add(() => client.create(doc))
})
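
Before queueing, you can also size-check your batches so each request body stays safely under the 4MB cap. A minimal sketch (the helper name and the 3.5MB safety margin are my own choices, not from Sanity's docs):

```javascript
// Group mutations into batches whose serialized size stays under a byte
// limit, measuring the actual JSON payload with Buffer.byteLength.
function chunkMutations(mutations, maxBytes = 3500000) {
  const batches = []
  let current = []
  for (const mutation of mutations) {
    const candidate = [...current, mutation]
    const size = Buffer.byteLength(JSON.stringify({mutations: candidate}))
    if (size > maxBytes && current.length > 0) {
      // Adding this mutation would overflow the batch; start a new one.
      batches.push(current)
      current = [mutation]
    } else {
      current = candidate
    }
  }
  if (current.length > 0) batches.push(current)
  return batches
}
```

Each resulting batch can then be committed as its own transaction (or queued as above), keeping every request body under the limit.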

2. Use the NDJSON Import Tool

The CLI import tool (sanity dataset import) is designed for bulk operations and may handle large documents more efficiently than direct API mutations:

sanity dataset import my-data.ndjson production --replace

However, individual documents still can't exceed the 32MB document size limit.

3. Compress Your Requests

While not explicitly documented, you could try using gzip compression with the Content-Encoding: gzip header on your mutation requests. This might help if your portable text is highly compressible, though the 4MB limit likely applies to the uncompressed payload.

4. Restructure Your Content Model

Consider if your portable text fields are too large and should be split into:

  • Multiple related documents with references
  • Paginated content blocks
  • Separate content fragments
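
As a rough illustration of the fragment approach: split the long Portable Text array into chunks, store each chunk on its own document, and replace the field on the parent with an array of references. The `_type` name and chunk size below are hypothetical; adapt them to your schema:

```javascript
// Split a long Portable Text array into fragment documents plus an array
// of references to put on the parent document in place of the raw blocks.
function splitPortableText(parentId, blocks, blocksPerFragment = 50) {
  const fragments = []
  for (let i = 0; i < blocks.length; i += blocksPerFragment) {
    fragments.push({
      _id: `${parentId}-fragment-${i / blocksPerFragment}`,
      _type: 'article.fragment', // hypothetical type name
      body: blocks.slice(i, i + blocksPerFragment),
    })
  }
  const refs = fragments.map(f => ({_type: 'reference', _ref: f._id, _key: f._id}))
  return {fragments, refs}
}
```

Each fragment document can then be created in its own request, keeping every payload small regardless of total article length.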

Key Constraints to Remember

  • Maximum mutation request body: 4MB
  • Maximum document size: 32MB
  • API rate limit: 25 mutations/second per IP
  • Transaction payload: Keep under 500KB for best performance

The transactions documentation recommends keeping transaction payloads reasonable and using throttling libraries like p-throttle for bulk operations.

If you're doing a one-time migration, the NDJSON import approach is your best bet. For ongoing operations, you'll need to architect your content to stay within these limits.

Ooooh, tricky. 😅
I wonder if dedicated support could help you bypass them maybe?
You could try doing it in multiple smaller passes, but this could be tricky depending on what your schema looks like.
Can block content / portable text even be broken down?
I already reduced the batch sizes to 1 entry per request, so I am not sure how I would go about it. I guess patching the portable text array would be an option, but it’s a super complex migration logic, which iterates through a gazillion nodes, migrates assets et al.
With transactions, I am not sure how to safely split the transaction to ensure that item references do not throw errors.
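
(Side note on splitting transactions without reference errors: Sanity lets you mark a reference as weak with `_weak: true`, which removes the requirement that the target document already exists when the reference is written. A rough sketch, with a hypothetical helper name — you could weaken references during import and strengthen them afterwards:)

```javascript
// Recursively copy a document, marking every reference object weak so
// transactions can be committed in any order during a migration.
function weakenReferences(value) {
  if (Array.isArray(value)) return value.map(weakenReferences)
  if (value && typeof value === 'object') {
    const copy = {}
    for (const [key, val] of Object.entries(value)) {
      copy[key] = weakenReferences(val)
    }
    if (copy._type === 'reference') copy._weak = true
    return copy
  }
  return value
}
```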
Maybe json-ld import is better for this? 4mb is a lot though
What’s a json-ld import?
I have a ton of markdown that needs to be transformed into portable text, not sure what json-ld has to do with it
Looks like there are some long raw svgs, will try to optimize that part
Ok. It’s me being stupid. There was no cleanup happening after each page was imported, so it kept adding mutations to the transaction stack for every new page until it reached the limit, which, now that I think about it, is huge. False alarm.
