Bulk deleting posts with referential integrity protection in Sanity.io

10 replies
Last updated: Feb 28, 2023
Trying to bulk delete some posts, but the mutation fails. I guess it's because these documents have references, but can I override that?

export async function deleteContent() {
  // `client` is a configured @sanity/client instance and `queue` a promise
  // queue (e.g. a p-queue instance) defined elsewhere
  const posts = await client.fetch(`
    *[_type == 'work'][0...500]
    { title, _id, _rev, content, workContent }
  `)

  posts.forEach((post) => {
    queue.add(() =>
      client
        .delete(post._id)
        .then(() => {
          console.log('Post deleted')
        })
        .catch((err) => {
          console.error('Delete failed: ', err.message)
        })
    )
  })
}
AI Update

No, you cannot override or force delete documents that have strong references from other documents. This is by design for referential integrity protection in Sanity.

However, you have several options to handle this situation:

1. Use Weak References (Schema Change)

If you want to allow deletion regardless of references, you can modify your schema to use weak references by adding weak: true:

{
  name: 'someField',
  type: 'reference',
  to: [{type: 'work'}],
  weak: true  // Allows deletion even if referenced
}

Important: Just changing this property won't automatically update existing references in your data store.
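
If you take this route, existing documents also have to be patched so their stored reference values carry _weak: true. A minimal sketch, assuming client is a configured @sanity/client instance and the field is named someField as in the schema above:

// Sketch: mark existing references stored in `someField` as weak so the
// referenced documents can be deleted. Assumes `client` is a configured
// @sanity/client instance; `someField` matches the schema example above.
const referencingIds = await client.fetch(`*[defined(someField._ref)]._id`)

for (const id of referencingIds) {
  await client
    .patch(id)
    .set({ 'someField._weak': true })
    .commit()
}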

2. Delete Only Unreferenced Documents

You can filter your query to only delete documents that don't have incoming references:

const posts = await client.fetch(`
  *[_type == 'work' && count(*[references(^._id)]) == 0][0...500]
  { title, _id, _rev }
`)

3. Identify Which Documents Are Blocking Deletion

To see which documents are preventing deletion, catch the error and query for referencing documents:

posts.forEach((post) => {
  queue.add(() => 
    client
      .delete(post._id)
      .catch((err) => {
        console.error('Delete failed: ', err.message)
        client.fetch('*[references($id)]._id', { id: post._id })
          .then((results) => {
            console.log('The following documents reference this document:')
            results.forEach((id) => console.log('  ', id))
          })
      })
  )
})

4. Delete References First

The proper approach is to delete or update the documents that reference your target documents first, then delete the target documents themselves.
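
A minimal sketch of that approach, assuming client is a configured @sanity/client instance (the referencing documents are deleted here; you could instead patch them to remove the reference):

// Sketch: delete the documents that reference the target, then the target
// itself, in one transaction so the dataset never ends up with dangling
// strong references. Assumes `client` is a configured @sanity/client instance.
async function deleteWithReferences(targetId) {
  const referencingIds = await client.fetch('*[references($id)]._id', { id: targetId })

  const tx = client.transaction()
  referencingIds.forEach((id) => tx.delete(id))
  tx.delete(targetId)

  return tx.commit()
}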

Why There's No Force Delete

The strong reference protection is intentional—it prevents orphaned references and maintains data integrity in your content lake. If you truly need to bulk delete documents with references, you'll need to handle the referencing documents first, either by:

  • Deleting them
  • Updating them to remove the references
  • Converting the references to weak references

If you're trying to roll back mutations (as mentioned in the linked discussion), consider using the unset operation to remove specific fields rather than deleting entire documents.

It’s helpful to see what error message you get, but if it’s referential integrity protection, then you either have to delete the incoming references or add _weak: true to the reference object(s).
Delete failed:  Mutation(s) failed with 1 error(s)
so not very helpful 🙂
I just want to quickly get rid of these posts, unfortunate that there's no --force option or similar 🤔
Yeah, I get it. I guess it would be nice to have a convenience method in the client or something.
To quickly just query docs without incoming references, you can at least do something like this:
*[_type == "work" && count(*[references(^._id)]) == 0]
And this is a bit brittle, because you risk being rate limited, but it will at least log out the docs that block you:
.catch((err) => {
  console.error('Delete failed: ', err.message)
  client.fetch('*[references($id)]._id', { id: post._id }).then((results) => {
    console.log('The following documents reference this document:')
    results.forEach((id) => console.log(' ', id))
  })
})
Wait, no, you’re adding the promise to the queue, so it will probably be fine.
Thanks!
The reason I'm asking is that I ran some mutations on my dataset. Now I want to roll those back, but importing my backed-up dataset does not restore the previous state, because I guess Sanity wants to make sure I don't lose data.
I have populated some fields which I now want unpopulated 🙂
Aha, I see. You can maybe also adapt this script to “undo” the mutation: https://www.sanity.io/schemas/rename-a-field-across-documents-5cd6f5f0
Ah, yeah, unsetting my field was a good solution. Thanks!
