How to bulk delete 20,000 documents of a specific type in Sanity?
For bulk deleting 20,000 documents of a particular type in Sanity, here's the most efficient approach:
The Challenge
You're hitting two main limitations:
- 10,000 document limit per delete query - When using delete by query, you can only operate on up to 10,000 documents at a time
- Rate limits - 25 mutations/second per IP and max 100 concurrent mutations per dataset
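To make the first limit concrete, here's a minimal, framework-free sketch (plain TypeScript; the function name and constant are ours for illustration, not Sanity API) of splitting 20,000 ids into batches that each stay within the per-query cap:

```ts
// Illustrative only: the cap comes from the limits above, the names are ours.
const QUERY_DELETE_CAP = 10_000 // max documents per delete-by-query

// Split a list of document ids into batches that each stay under the cap.
function batchIds(ids: string[], batchSize = QUERY_DELETE_CAP): string[][] {
  const batches: string[][] = []
  for (let i = 0; i < ids.length; i += batchSize) {
    batches.push(ids.slice(i, i + batchSize))
  }
  return batches
}

// 20,000 documents need at least two delete operations.
const allIds = Array.from({length: 20_000}, (_, i) => `user-${i}`)
console.log(batchIds(allIds).length) // 2
```

Both approaches below are, at heart, doing exactly this batching for you.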
Recommended Solution: Use the CLI Migration Tool
The Sanity CLI migration tool is specifically designed for this scenario and handles batching/rate limiting automatically:
Create the migration scaffold:

```sh
npx sanity@latest migration create deleteUsers
```

Then create a migration script:

```ts
import {defineMigration} from 'sanity/migrate'

export default defineMigration({
  title: 'Delete all users',
  documentTypes: ['users'],
  // This will delete all documents of type 'users'
  filter: '_type == "users"',
  migrate: {
    document() {
      // Return null to delete the document
      return null
    }
  }
})
```

Run with:

```sh
npx sanity migration run deleteUsers
```

This runs as a dry run by default; pass `--no-dry-run` to actually apply the deletions.

Benefits:
- Automatically batches mutations to avoid rate limits
- Built-in dry-run mode by default
- Shows progress and handles errors gracefully
- Can handle all 20,000 documents without manual pagination
Alternative: Paginated Delete Script
If you need more control, paginate by _id to work within the 10,000 document limit:
```ts
import {getCliClient} from 'sanity/cli'

const client = getCliClient()

async function deleteInBatches() {
  let lastId = ''

  while (true) {
    // Fetch the next batch of up to 10,000 ids, ordered by _id so that
    // keyset pagination is stable. Every _id sorts after the empty string,
    // so the first iteration needs no special case.
    const ids: string[] = await client.fetch(
      `*[_type == "users" && _id > $lastId] | order(_id) [0...10000]._id`,
      {lastId}
    )
    if (ids.length === 0) break

    // Delete the whole batch in a single transaction
    const transaction = ids.reduce(
      (tx, id) => tx.delete(id),
      client.transaction()
    )
    await transaction.commit()

    lastId = ids[ids.length - 1]
    console.log(`Deleted ${ids.length} documents. Last ID: ${lastId}`)

    // Optional: pause between batches to stay well under the rate limits
    await new Promise((resolve) => setTimeout(resolve, 1000))
  }
}

deleteInBatches()
```

Key Points
- The 10,000 document limit applies to delete-by-query operations specifically
- The migration tool is the recommended modern approach and handles all the complexity
- For manual scripts, paginate by `_id` (not array slicing) for better performance
- With 20,000 documents and rate limits, expect the full deletion to take a few minutes
The migration tool is definitely your best bet here - it's built exactly for this type of bulk operation and will handle the 20,000 documents efficiently without you having to worry about rate limits or batching logic.
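As a rough sanity check on the timing claim above, here's some illustrative arithmetic (our own worst-case assumption of one request per mutation; the numbers come from the limits section):

```ts
// Why batching matters (illustrative arithmetic only).
const RATE_LIMIT_PER_SEC = 25 // mutation requests per second, per IP
const TOTAL_DOCS = 20_000
const BATCH_SIZE = 10_000

// One request per document: the rate limit alone would dominate.
const unbatchedSeconds = TOTAL_DOCS / RATE_LIMIT_PER_SEC
console.log(unbatchedSeconds) // 800 seconds, i.e. roughly 13 minutes

// One transaction per 10,000-document batch: only a couple of requests total.
const batchedRequests = Math.ceil(TOTAL_DOCS / BATCH_SIZE)
console.log(batchedRequests) // 2
```

Batching mutations into transactions is what keeps the real-world runtime down to minutes rather than the quarter hour the raw rate limit would imply.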