
client.fetch in custom validation rules and API request impact

3 replies · Last updated: Nov 29, 2025

Hi. I see that a custom validation rule may be triggered several times when editing a document.
Is it OK to have a client.fetch inside the custom validation rule of a document?

Dumb question: does it count as an API request?
AI Update

Not a dumb question at all! Yes, using client.fetch() inside a custom validation rule is technically possible, but there are important considerations:

Does it count as an API request?

Yes, absolutely. When you use client.fetch() (typically accessed via getClient() from the validation context), it makes a real API request to your Content Lake. Each of these requests counts toward your project's API usage and is subject to rate limits.

According to Sanity's technical limits documentation, there are several API rate limits in place, and each fetch triggered from validation counts against them.

Should you do it?

While it works, you should be very cautious about this approach: validation can run many times while a document is being edited, so a fetch inside a rule can fire repeatedly and quickly add up against your quota and rate limits.

If you must use it:

validation: Rule => Rule.custom(async (value, context) => {
  const client = context.getClient({apiVersion: '2024-01-01'})
  // This WILL count as an API request and affect your rate limits
  const result = await client.fetch('*[_type == "reference"][0]')
  
  if (/* your validation logic */) {
    return true
  }
  return 'Validation error message'
})

The general recommendation is to keep validation rules lightweight and avoid external API calls when possible. Real-time validation should be fast and responsive for a good editing experience. If you need complex validation with data fetching, consider moving that logic to a Sanity Function that runs on document save/publish events instead.
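If you do keep the fetch inside a validation rule, one way to soften the cost is to memoize queries so that repeated validation passes reuse a single in-flight request instead of each hitting the API. A minimal sketch, assuming a client with a fetch method; the cachedFetch helper and its cache are hypothetical, not part of Sanity's API:

```typescript
// Hypothetical helper: cache the promise per query so that repeated
// validation runs for the same query share one API request.
const cache = new Map<string, Promise<unknown>>()

interface FetchClient {
  fetch(query: string): Promise<unknown>
}

function cachedFetch(client: FetchClient, query: string): Promise<unknown> {
  if (!cache.has(query)) {
    // First time we see this query: fire the real request and remember it.
    cache.set(query, client.fetch(query))
  }
  // Subsequent calls reuse the stored promise instead of fetching again.
  return cache.get(query)!
}
```

In a real Studio you would also want to expire or key the cache on the value being validated, so stale results don't mask genuine validation errors.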


