How to bulk delete 20,000 documents of a specific type in Sanity?

4 replies
Last updated: Jun 16, 2022
Hi, what's the best way to bulk delete a particular document type? I'm in a scenario where one type currently has about 20,000 test documents. I'd like to completely replace those documents with new versions and remove the unused ones, so the easiest route would be to clear everything out and do a bulk re-import. I've been trying different approaches based on links from this forum, mainly https://www.sanity.io/schemas/delete-documents-by-filter-698e1f26 , but that script doesn't cope well with clearing out 20,000 entries. It only works about half the time, and when it does, I can only delete around 100 documents at a time; at that rate, 20,000 entries would take 200 calls. I could wrap it in a loop, but I'd hit the rate limit pretty quickly.
Example query: *[ _type == 'users' ][0...999]
AI Update

For bulk deleting 20,000 documents of a particular type in Sanity, here's the most efficient approach:

The Challenge

You're hitting two main limitations:

  1. 10,000 document limit per delete query - a delete-by-query mutation can only operate on up to 10,000 documents at a time (sketched below)
  2. Rate limits - 25 mutations/second per IP and a maximum of 100 concurrent mutations per dataset
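
For context, this is roughly what a delete-by-query call with @sanity/client looks like - a minimal sketch, with placeholder project config and this thread's 'users' type:

import {createClient} from '@sanity/client'

const client = createClient({
  projectId: '<your-project-id>', // placeholder
  dataset: 'production',
  apiVersion: '2022-06-16',
  token: process.env.SANITY_TOKEN, // needs a token with delete permissions
  useCdn: false,
})

// Deletes every document matched by the query - but query-based deletes
// are capped at 10,000 documents per mutation, so 20,000 documents
// always require multiple calls.
await client.delete({query: `*[_type == "users"][0...999]`})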

Recommended: Sanity CLI Migration Tool

The Sanity CLI migration tool is designed for exactly this scenario and handles batching and rate limiting automatically:

npx sanity@latest migration create deleteUsers

Then edit the generated migration script:

import {defineMigration, del} from 'sanity/migrate'

export default defineMigration({
  title: 'Delete all users',
  // Restricts the migration to documents of type 'users'
  documentTypes: ['users'],

  migrate: {
    // Return a delete mutation for every matched document
    // (returning nothing would leave the document untouched)
    document(doc) {
      return del(doc._id)
    },
  },
})

Run it - migrations execute as a dry run by default, so add --no-dry-run when you're ready to actually delete:

npx sanity migration run deleteUsers
npx sanity migration run deleteUsers --no-dry-run

Benefits:

  • Automatically batches mutations to avoid rate limits
  • Built-in dry-run mode by default
  • Shows progress and handles errors gracefully
  • Can handle all 20,000 documents without manual pagination

Alternative: Paginated Delete Script

If you need more control, paginate by _id to work within the 10,000 document limit:

import {getCliClient} from 'sanity/cli'

const client = getCliClient()

async function deleteInBatches() {
  let lastId = ''
  let hasMore = true

  while (hasMore) {
    // Page through the type by _id. The explicit order(_id) keeps the
    // cursor-style pagination stable, and '' sorts before every real _id,
    // so the first pass starts from the beginning.
    const ids = await client.fetch(
      `*[_type == "users" && _id > $lastId] | order(_id) [0...10000]._id`,
      {lastId}
    )

    if (ids.length === 0) {
      hasMore = false
      break
    }

    // Delete the whole page in a single transaction
    const transaction = ids.reduce(
      (tx, id) => tx.delete(id),
      client.transaction()
    )

    await transaction.commit()

    lastId = ids[ids.length - 1]
    console.log(`Deleted ${ids.length} documents. Last ID: ${lastId}`)

    // Optional: pause between batches to stay under the rate limits
    await new Promise((resolve) => setTimeout(resolve, 1000))
  }
}

deleteInBatches().catch((err) => {
  console.error(err)
  process.exit(1)
})
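
To run a standalone script like this from inside your studio project, the CLI's exec command works (the filename here is just an example):

npx sanity exec deleteUsers.js --with-user-token

The --with-user-token flag lets getCliClient() authenticate with your logged-in user's token, which the deletes require.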

Key Points

  • The 10,000 document limit applies to delete-by-query operations specifically
  • The migration tool is the recommended modern approach and handles all the complexity
  • For manual scripts, paginate by _id (not array slicing) for better performance - see the query pattern below
  • With 20,000 documents and rate limits, expect the full deletion to take a few minutes
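
Isolating that pagination pattern, the query from the script above looks like this - $lastId is the last _id seen in the previous batch (an empty string on the first pass):

*[_type == "users" && _id > $lastId] | order(_id) [0...10000]._id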

The migration tool is definitely your best bet here - it's built exactly for this type of bulk operation and will handle the 20,000 documents efficiently without you having to worry about rate limits or batching logic.

I don’t think there is a better approach than this script. Run it in a loop for a while until you have no remaining documents, I’d say.
Ok, thank you user F. Will keep trying then. Is there a plan to build something into Sanity directly? Clearing a table should be a fast/easy few clicks. I saw a plugin was being developed that enables bulk deletes, but it's still in alpha.
If you have 20,000 documents to delete, maybe it's better to just export the dataset without those documents, then work from that dataset?
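
For reference, that export-filter-reimport flow could look something like this with the CLI - a rough sketch; the file layout inside the export archive may differ, and jq is assumed to be available:

# Export the dataset (a tar.gz containing an ndjson file plus assets)
npx sanity dataset export production production.tar.gz
tar -xzf production.tar.gz

# Keep everything except the 'users' documents
jq -c 'select(._type != "users")' data.ndjson > filtered.ndjson

# Import the filtered documents into a fresh dataset
npx sanity dataset create cleaned
npx sanity dataset import filtered.ndjson cleaned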
user M - ah, I see, yes, that could work as an alternative, taking the opposite approach. I'll give it a try next time.
For the current example, after some persistence and looping with a cooldown to prevent rate limiting, I was able to delete the 20,000 entries, and I have already re-imported the updated data successfully.

Of note, this use case of clearing out a document type is probably pretty common, so something built in to do it easily would be very helpful, even if that requires some sort of extra safeguard to prevent accidental wipes. Something I could've done with a few clicks in phpMyAdmin or MySQL Workbench took a few hours of piecing together code for a similarly common operation. Suggestion for the future.

Thanks much again Kitty and RD for your responses. Appreciate it!
