Timeout error when exporting dataset, investigation ongoing.

8 replies
Last updated: May 4, 2020
Hello, I get a timeout when I try to export the dataset. It says 180000ms, which should be 3 minutes, but it only takes a few seconds before the timeout error appears. Is this a known issue?
I’ve done this before and it has worked fine
🙂


Exporting dataset "production" to "/Users/thomas/dev/bkk/deko/containers/cms/exports/20200504/production.tar.gz"
✔ Exporting documents... (494/494)
⠙ Downloading assets... (54/447)(node:51103) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
(node:51103) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
⠦ Downloading assets... (62/447)(node:51103) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
(node:51103) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
⠧ Downloading assets... (62/447)(node:51103) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
(node:51103) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
⠇ Downloading assets... (77/447)(node:51103) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
⠋ Downloading assets... (77/447)(node:51103) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
✔ Downloading assets... (447/447)
⠏ Adding assets to archive...events.js:174
throw er; // Unhandled 'error' event
^

Error: Read timeout: No data received on socket for 180000 ms
    at TLSSocket.response.connection.setTimeout (/Users/thomas/dev/bkk/deko/containers/cms/node_modules/@sanity/export/lib/requestStream.js:76:32)
    at Object.onceWrapper (events.js:277:13)
    at TLSSocket.emit (events.js:194:15)
    at TLSSocket.Socket._onTimeout (net.js:440:8)
    at ontimeout (timers.js:436:11)
    at tryOnTimeout (timers.js:300:5)
    at listOnTimeout (timers.js:263:5)
    at Timer.processTimers (timers.js:223:10)
Emitted 'error' event at:
    at emitErrorNT (internal/streams/destroy.js:82:8)
    at process._tickCallback (internal/process/next_tick.js:63:19)
AI Update

This is a known issue with the @sanity/export package when dealing with datasets that have a large number of assets. The error you're seeing isn't actually about the 180-second timeout being too short – it's about the way the export tool handles concurrent asset downloads.

The MaxListenersExceededWarning messages are the key clue here. The export tool is trying to download many assets simultaneously, which creates too many event listeners and eventually causes the socket connection to hang while "Adding assets to archive", triggering the timeout error.
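For illustration, here is a minimal Node sketch of the pattern behind that warning. This is not the export tool's actual code, just the same shape: many concurrent operations each registering a listener on one shared emitter.

const { EventEmitter } = require('events')

// Stand-in for the shared socket the export tool downloads through.
const socket = new EventEmitter()

// Each "concurrent download" adds its own 'timeout' listener to the same
// emitter; the 11th crosses Node's default cap of 10 and prints the warning.
for (let i = 0; i < 11; i++) {
  socket.once('timeout', () => console.log(`download ${i} timed out`))
}

// Calling socket.setMaxListeners(50) before the loop would silence the
// warning, but it only hides the buildup – it doesn't reduce the number of
// in-flight downloads holding the socket open.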

Here are several solutions you can try:

1. Export without assets first (quickest workaround)

If you primarily need the document data and can handle assets separately:

sanity dataset export production production-docs-only.ndjson --no-assets
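If you later need those documents in another dataset, the CLI's import command accepts the exported file. A sketch, assuming a target dataset named staging:

sanity dataset import production-docs-only.ndjson staging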

2. Update your Sanity CLI

Make sure you're running the latest version of the Sanity CLI, as there have been improvements to the export functionality:

npm install -g @sanity/cli@latest

3. Adjust Node.js memory and event emitter limits

You can increase Node.js limits when running the export:

NODE_OPTIONS="--max-old-space-size=4096" sanity dataset export production production.tar.gz
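There is no NODE_OPTIONS flag for the listener limit itself, but Node's --require flag (which NODE_OPTIONS does accept) can preload a one-line script that raises the process-wide default before the CLI starts. A sketch, assuming a file named raise-listeners.js in the directory you run the export from:

// raise-listeners.js – preloaded before the CLI's own code runs.
// Lifts Node's default cap of 10 listeners per event. This quiets the
// MaxListenersExceededWarning but does not change download concurrency.
require('events').defaultMaxListeners = 50

NODE_OPTIONS="--require ./raise-listeners.js --max-old-space-size=4096" sanity dataset export production production.tar.gz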

4. Export during off-peak hours

Network conditions and API load can affect asset downloads. Try running the export during off-peak hours when there's less network congestion.

5. Contact Sanity support for large datasets

If you have a particularly large number of assets (like your 447 assets), the Sanity documentation notes that you may want to contact support for assistance with large dataset exports.

6. Export documents and assets separately

You could also manually download assets using the Export API with custom pagination logic to avoid overwhelming the connection.
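A minimal sketch of that approach, assuming the public ndjson export endpoint (https://<projectId>.api.sanity.io/v1/data/export/<dataset>) and a read token in SANITY_READ_TOKEN. PROJECT_ID is a placeholder, assets are downloaded strictly one at a time, and error handling such as non-200 responses is omitted for brevity:

const fs = require('fs')
const https = require('https')

const PROJECT_ID = 'yourProjectId' // placeholder – replace with your project ID
const DATASET = 'production'
const TOKEN = process.env.SANITY_READ_TOKEN

// Stream every document as newline-delimited JSON into a local file.
function exportDocuments(outPath) {
  const url = `https://${PROJECT_ID}.api.sanity.io/v1/data/export/${DATASET}`
  return new Promise((resolve, reject) => {
    https.get(url, { headers: { Authorization: `Bearer ${TOKEN}` } }, (res) => {
      res.pipe(fs.createWriteStream(outPath)).on('finish', resolve).on('error', reject)
    }).on('error', reject)
  })
}

// Fetch one URL to disk; used below to pull assets sequentially instead of
// letting dozens of concurrent downloads pile listeners onto one socket.
function download(url, dest) {
  return new Promise((resolve, reject) => {
    https.get(url, (res) => {
      res.pipe(fs.createWriteStream(dest)).on('finish', resolve).on('error', reject)
    }).on('error', reject)
  })
}

// Asset documents carry their file URL in the url field.
async function downloadAssets(ndjsonPath) {
  fs.mkdirSync('assets', { recursive: true })
  const lines = fs.readFileSync(ndjsonPath, 'utf8').split('\n').filter(Boolean)
  for (const line of lines) {
    const doc = JSON.parse(line)
    if (doc._type === 'sanity.imageAsset' || doc._type === 'sanity.fileAsset') {
      await download(doc.url, `assets/${doc._id}`) // sequential on purpose
    }
  }
}

exportDocuments('documents.ndjson')
  .then(() => downloadAssets('documents.ndjson'))
  .then(() => console.log('done'))
  .catch((err) => { console.error(err); process.exit(1) })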

The fact that this worked before but doesn't now suggests either the number of assets has grown, or there may have been network/infrastructure changes. The export tool's concurrent download mechanism can be fragile with larger asset collections, which is why the --no-assets flag is often the most reliable workaround for immediate needs.
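If you would rather keep a single export but tame the concurrency, the export can also be driven from a short Node script via the @sanity/export package. The option names below – in particular assetConcurrency – are assumptions based on the package's documented options, so verify them against the version you have installed:

const exportDataset = require('@sanity/export')
const sanityClient = require('@sanity/client')

exportDataset({
  client: sanityClient({
    projectId: 'yourProjectId', // placeholder – replace with your project ID
    dataset: 'production',
    token: process.env.SANITY_READ_TOKEN,
    useCdn: false
  }),
  dataset: 'production',
  outputPath: './production.tar.gz',
  assets: true,
  assetConcurrency: 2 // assumption: throttles simultaneous asset downloads
})
  .then(() => console.log('export finished'))
  .catch((err) => { console.error(err); process.exit(1) })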

Hi User, thanks for reporting! I’ve shared this internally for the team to investigate, as we’ve seen a few more reports of this warning lately. Have you tried running it locally outside of the container? Also, what versions of Sanity are you on? =>
sanity versions
And what node version in this case?
And could you DM me your project ID? 🙂
╰─ sanity versions
@sanity/cli 1.149.9 (up to date)
@sanity/base 1.149.10 (up to date)
@sanity/color-input 1.149.7 (up to date)
@sanity/components 1.149.10 (up to date)
@sanity/core 1.149.12 (up to date)
@sanity/default-layout 1.149.10 (up to date)
@sanity/default-login 1.149.11 (up to date)
@sanity/desk-tool 1.149.10 (up to date)
@sanity/rich-date-input 1.149.7 (up to date)
@sanity/vision 1.149.0 (up to date)
Thanks, User 👍 🙂
Node version: v10.15.3
Upgraded to Node v12.16.3 and now it worked 👍
Thanks for following up in DM, User
