
How to bulk delete 20,000 documents of a specific type in Sanity?

4 replies · Last updated: Nov 29, 2025

Hi, what's the best way to bulk delete a particular document type? I'm in a scenario where one type currently has about 20,000 test documents. I'd like to completely replace those documents with new versions and remove unused ones, so the easiest way would be to clear it all out and do a bulk re-import. I've been trying different approaches based on links from this forum, mainly https://www.sanity.io/schemas/delete-documents-by-filter-698e1f26 , but that script doesn't handle clearing out 20,000 entries well. It works about half the time, and when it does, I can only delete around 100 documents at a time; with 20,000 entries, that would take 200 calls. I could build in a loop, but I'd hit the rate limit pretty quickly.
Example query: *[ _type == 'users' ][0...999]
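For reference, the linked script deletes whatever a GROQ query matches in a single mutation call, which is why clearing 20,000 documents at ~100 per call implies around 200 calls. A minimal sketch of that approach (the `callsNeeded` helper and the batch size of 100 are illustrative, not part of the original script; `client` is assumed to be a configured @sanity/client instance):

```javascript
// Sketch of the delete-by-filter approach from the linked script.
// `client` is assumed to be an already-configured @sanity/client
// instance (not shown here).
async function deleteByQuery(client) {
  // client.delete accepts a {query} selection and deletes every
  // document the query matches in a single mutation call
  await client.delete({query: "*[_type == 'users'][0...99]"})
}

// How many calls a fixed per-call batch size implies for a given total
function callsNeeded(total, perCall) {
  return Math.ceil(total / perCall)
}

console.log(callsNeeded(20000, 100)) // → 200
```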
AI Update

For bulk deleting 20,000 documents of a particular type in Sanity, here's the most efficient approach:

The Challenge

You're hitting two main limitations:

- A single delete call only removes the documents its query matches, so clearing 20,000 documents roughly 100 at a time means around 200 calls
- Looping those calls quickly runs into API rate limits

Recommended: Sanity CLI Migration Tool

The Sanity CLI migration tool is designed for exactly this scenario and handles batching and rate limiting automatically:

npx sanity@latest migration create deleteUsers

Then create a migration script:

import {defineMigration, del} from 'sanity/migrate'

export default defineMigration({
  title: 'Delete all users',
  // documentTypes already limits the migration to 'users' documents,
  // so no separate filter is needed
  documentTypes: ['users'],

  migrate: {
    document(doc) {
      // Returning a delete mutation removes the document
      return del(doc._id)
    }
  }
})

Run it (the CLI performs a dry run by default, so pass --no-dry-run when you're ready to commit the deletions):

npx sanity migration run deleteUsers --no-dry-run

Benefits:

- Batching and rate limiting are handled for you
- Dry-run by default, so you can preview what will be deleted
- Progress is reported as the migration runs
Alternative: Paginated Delete Script

If you need more control, paginate by _id and delete in fixed-size batches, keeping each query well under the 10,000-result limit:

import {getCliClient} from 'sanity/cli'

const client = getCliClient()

async function deleteInBatches() {
  // Keep batches modest: very large transactions can hit mutation
  // size limits
  const batchSize = 1000
  let lastId = ''
  let hasMore = true

  while (hasMore) {
    // order(_id) makes the _id cursor reliable
    const ids = await client.fetch(
      `*[_type == "users" && _id > $lastId] | order(_id) [0...${batchSize}]._id`,
      {lastId}
    )

    if (ids.length === 0) {
      hasMore = false
      break
    }

    // Delete the whole batch in one transaction
    const transaction = ids.reduce(
      (tx, id) => tx.delete(id),
      client.transaction()
    )

    await transaction.commit()

    lastId = ids[ids.length - 1]
    console.log(`Deleted ${ids.length} documents. Last ID: ${lastId}`)

    // Optional: pause between transactions to respect rate limits
    await new Promise(resolve => setTimeout(resolve, 1000))
  }
}

deleteInBatches()
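If you'd rather fetch larger pages of ids but still keep each transaction small, a chunking helper works. This is a sketch, not part of the Sanity client API; `client` and the batch size of 500 are assumptions:

```javascript
// Split an array into fixed-size chunks so each Sanity transaction
// stays small. Illustrative helper, not part of the Sanity client API.
function chunk(items, size) {
  const out = []
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size))
  }
  return out
}

// Usage sketch: commit deletions 500 ids at a time.
// `client` is assumed to be a configured Sanity client.
async function deleteIdsInChunks(client, ids) {
  for (const batch of chunk(ids, 500)) {
    const tx = batch.reduce((t, id) => t.delete(id), client.transaction())
    await tx.commit()
  }
}

console.log(chunk(['a', 'b', 'c', 'd', 'e'], 2).length) // → 3
```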

Key Points

The migration tool is your best bet here: it's built for exactly this kind of bulk operation and will handle the 20,000 documents without you having to manage rate limits or batching yourself.
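Since the end goal in the question is a bulk re-import after clearing the type, one option is a batched createOrReplace loop. This is a sketch under assumptions: `client` is a configured Sanity client, the documents come from a local array with `_id` and `_type` set, and the batch size of 100 is arbitrary (the CLI alternative is `sanity dataset import` with an NDJSON file):

```javascript
// Sketch: bulk re-import documents in batched transactions after the
// old ones are deleted. `client` is assumed to be a configured Sanity
// client; `docs` an array of documents with _id and _type set.
function toBatches(docs, size) {
  const batches = []
  for (let i = 0; i < docs.length; i += size) {
    batches.push(docs.slice(i, i + size))
  }
  return batches
}

async function importDocs(client, docs) {
  for (const batch of toBatches(docs, 100)) {
    const tx = batch.reduce(
      // createOrReplace overwrites any existing document with the same _id
      (t, doc) => t.createOrReplace(doc),
      client.transaction()
    )
    await tx.commit()
  }
}

// The pure batching logic is easy to check: 250 docs at 100 per
// batch should yield 3 batches
const sample = Array.from({length: 250}, (_, i) => ({
  _id: `user-${i}`,
  _type: 'users',
}))
console.log(toBatches(sample, 100).length) // → 3
```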


