How to efficiently patch 2,500 documents with 5,000 images using transactions?

2 replies
Last updated: Nov 29, 2025
Hi all, just wondering about a best practice approach to patch about 2,500 documents with nearly 5,000 images. As the documents already exist, using a patch would normally be the approach. The following example is perfect, as the then callback allows me to patch the relevant document with the freshly uploaded image... though I think that'll smash my API usage, so I'm assuming I'd need to amend it to handle Multiple mutations in a transaction?
AI Update

Great question! Yes, you're absolutely right that batching your mutations into transactions is the way to go for patching 2,500 documents with 5,000 images. Let me break down the best practices:

Use Transactions to Batch Your Mutations

Instead of individual API calls for each patch, you should group multiple mutations into transactions. This dramatically reduces your API usage and avoids rate limits. Here's the approach:

// Run with: npx sanity exec <script-path> --with-user-token
import {getCliClient} from 'sanity/cli'

const client = getCliClient()

// Batch your patches into transactions
const batchSize = 100 // Adjust based on your needs
const documents = [...] // Your 2,500 documents

for (let i = 0; i < documents.length; i += batchSize) {
  const batch = documents.slice(i, i + batchSize)
  
  const transaction = client.transaction()
  
  batch.forEach(doc => {
    transaction.patch(doc._id, patch =>
      // uploadedImageReference should be a full image value, e.g.
      // {_type: 'image', asset: {_type: 'reference', _ref: assetId}}
      patch.set({ imageField: doc.uploadedImageReference })
    )
  })
  
  await transaction.commit()
}

Key Best Practices & Limits

Based on Sanity's documentation on transactions and bulk operations best practices:

  1. Transaction size limit: Keep each transaction payload under 500kB. With 5,000 images across 2,500 documents, you'll likely want batches of 50-100 documents per transaction.

  2. Rate limiting: Maximum 25 requests per second. Use a throttling library like p-throttle to manage this:

import pThrottle from 'p-throttle'

const throttle = pThrottle({
  limit: 25,
  interval: 1000 // 25 requests per second
})

const throttledCommit = throttle(async (transaction) => {
  return await transaction.commit()
})
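
In the batching loop above, you'd then call await throttledCommit(transaction) in place of await transaction.commit(), so the commits stay within that limit.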

  3. Upload images first, patch second: Upload all your images concurrently (with rate limiting using p-limit), collect the asset references, then batch patch the documents. This is covered in the migration course on uploading assets performantly.
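
As a rough sketch of that upload step (the files array of {documentId, path} pairs and the uploadAll name are hypothetical, and the concurrency of 5 is just a starting point):

import {getCliClient} from 'sanity/cli'
import pLimit from 'p-limit'
import {createReadStream} from 'node:fs'
import {basename} from 'node:path'

const client = getCliClient()

// Cap concurrent uploads so you stay well under the rate limit
const limit = pLimit(5)

// files is a hypothetical array of {documentId, path} pairs mapping each
// local image file to the document it should be attached to
async function uploadAll(files) {
  return Promise.all(
    files.map(file =>
      limit(async () => {
        const asset = await client.assets.upload(
          'image',
          createReadStream(file.path),
          {filename: basename(file.path)},
        )
        // Keep the asset _id so the later patch can set
        // {_type: 'image', asset: {_type: 'reference', _ref: asset._id}}
        return {documentId: file.documentId, assetId: asset._id}
      }),
    ),
  )
}

The documentId/assetId pairs this returns can then feed straight into the batched patch transactions shown earlier.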

Consider Using the Migration CLI

For a cleaner approach, use Sanity's CLI migration tool which handles batching automatically:

npx sanity@latest migration create

The migration tool provides:

  • Automatic batching of mutations into transactions
  • Built-in rate limit handling
  • Dry-run mode for testing
  • Progress feedback while the migration runs

This approach follows general migration principles of idempotency and graceful error handling, which are crucial when you're dealing with thousands of documents.
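
For a feel of what that looks like, here's a minimal sketch of a migration file using the defineMigration, at, and set helpers from sanity/migrate; the document type, field name, and the asset lookup map are placeholders you'd adapt to your own content:

import {defineMigration, at, set} from 'sanity/migrate'

// Hypothetical lookup from document _id to the previously uploaded asset _id,
// e.g. built from the output of the upload script above
const assetIdByDocumentId = new Map([
  ['someDocumentId', 'image-abc123-2000x1500-jpg'],
])

export default defineMigration({
  title: 'Set image references on existing documents',
  documentTypes: ['article'], // hypothetical document type

  migrate: {
    document(doc) {
      const assetId = assetIdByDocumentId.get(doc._id)
      // Skipping documents with no matching asset keeps the migration idempotent
      if (!assetId) return

      return at('imageField', set({
        _type: 'image',
        asset: {_type: 'reference', _ref: assetId},
      }))
    },
  },
})

You'd run it with npx sanity migration run <migration-id>, which defaults to a dry run; add --no-dry-run once the output looks right.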
