Sanity client.fetch("count(*)") hangs in Remix loader with Express

3 replies
Last updated: Dec 2, 2025
Hello, I'm having a really hard time figuring out why my client.fetch("count(*)") hangs. I'm using Remix 1.12 with Express, Node 16.17.1, and @sanity/client 6.1.2. Here is the code I use directly in the loader:
export const loader = async ({ request, params }: LoaderArgs) => {
  const user = await getUser(request);
  const projectId = "4lfuaunh";
  const dataset = "production";

  const client = createClient({ projectId, dataset, useCdn: true, apiVersion: "2021-10-21" });
  try {
    console.log("getting info");
    const data = await client.fetch(`count(*)`);
    console.log(data);
    console.log("info obtain");
    return data;
  } catch (err) {
    console.error(err);
  }
  const items = await getCalculatorInfo()
  return { user };
};
AI Update

I can see why client.fetch("count(*)") is hanging in your setup. The issue is related to using @sanity/client v6 with Node 16.17.1.

The Core Problem: Missing fetch Implementation

@sanity/client v6 relies on the global fetch API, but Node.js 16 doesn't ship it enabled by default; fetch only became available out of the box in Node 18. When the client tries to make a request without a fetch implementation available, it can hang indefinitely rather than throwing a clear error.
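You can confirm this quickly in the affected Node process with a small diagnostic (a sketch, not part of the original answer):

```javascript
// Diagnostic: report the Node version and whether a global fetch exists.
// On Node 16 without a polyfill this reports fetch as missing;
// on Node 18+ it reports fetch as available.
const nodeMajor = Number(process.versions.node.split('.')[0]);
const hasGlobalFetch = typeof globalThis.fetch === 'function';
console.log(`Node ${process.versions.node}, global fetch: ${hasGlobalFetch ? 'available' : 'missing'}`);
```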

Solution 1: Upgrade Node.js (Recommended)

The best long-term solution is to upgrade to Node.js 18 or higher (preferably Node 20+, as that's what Sanity Studio v4 requires). This gives you native fetch support and aligns with current Sanity tooling requirements.

Node 16 reached end-of-life in September 2023, so you're missing security updates and compatibility with modern tooling.

Solution 2: Add a Fetch Polyfill

If you can't upgrade Node immediately, you need to polyfill fetch globally before creating your Sanity client:

// At the top of your loader file or entry point
import fetch from 'node-fetch';
if (!globalThis.fetch) {
  globalThis.fetch = fetch;
}

import { createClient } from '@sanity/client';

export const loader = async ({ request, params }: LoaderArgs) => {
  const user = await getUser(request);
  const projectId = "4lfuaunh";
  const dataset = "production";

  const client = createClient({ 
    projectId, 
    dataset, 
    useCdn: true, 
    apiVersion: "2021-10-21" 
  });
  
  try {
    console.log("getting info");
    const data = await client.fetch(`count(*)`);
    console.log(data);
    console.log("info obtained");
    return data;
  } catch (err) {
    console.error(err);
    throw err; // Re-throw so Remix can handle the error properly
  }
};

First install the polyfill:

npm install node-fetch@2

Note: Use node-fetch@2 for CommonJS compatibility with older Node versions, or node-fetch@3 if you're using ESM.
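If your project is ESM, one option is to load node-fetch@3 lazily via dynamic import(), which also works from CommonJS call sites. This is a sketch; `ensureFetch` is a hypothetical helper name, and it assumes node-fetch@3 is installed:

```javascript
// Hypothetical helper (sketch): install a global fetch only when one is
// missing, loading node-fetch@3 through dynamic import() so the polyfill
// is never touched on Node 18+ where a native fetch already exists.
async function ensureFetch() {
  if (typeof globalThis.fetch !== 'function') {
    const { default: fetch } = await import('node-fetch');
    globalThis.fetch = fetch;
  }
}
```

Call `await ensureFetch()` once in your server entry point (or at the top of the loader) before `createClient(...)`.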

Solution 3: Downgrade the Client

As a temporary workaround, you could use @sanity/client v5 or earlier, which may have better Node 16 compatibility:

npm install @sanity/client@5

Additional Debugging Tips

If the polyfill doesn't resolve it completely:

  1. Test with useCdn: false to rule out CDN-specific issues
  2. Add a timeout to see if it's truly hanging or just very slow:
    const timeoutPromise = new Promise((_, reject) => 
      setTimeout(() => reject(new Error('Request timeout')), 10000)
    );
    const data = await Promise.race([client.fetch(`count(*)`), timeoutPromise]);
  3. Try a simpler query like *[_type == "post"][0..1] to see if count(*) specifically is the issue
  4. Check your error handling - notice I changed your catch block to re-throw the error so Remix can handle it properly
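The timeout check from step 2 can be packaged as a small reusable helper. This is a sketch; `withTimeout` is not part of the Sanity client, it wraps any promise:

```javascript
// Reject if the wrapped promise hasn't settled within `ms` milliseconds,
// and clear the timer either way so the process can exit cleanly.
// Usage: const data = await withTimeout(client.fetch('count(*)'), 10000, 'count query');
function withTimeout(promise, ms, label = 'request') {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`${label} timed out after ${ms}ms`)), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
```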

The root cause is almost certainly the Node 16 + @sanity/client v6 combination lacking fetch. Start with the polyfill approach if you can't upgrade immediately, but plan to upgrade Node.js soon for better long-term compatibility and security.
