
CLI dataset export error: MaxListenersExceeded and Read timeout on assets

2 replies · Last updated: Nov 29, 2025

Hi everyone, I just checked out the updated plugins page after being away for several months, and it looks really exciting! I wish you all the best with Sanity; I really like what you're doing.
I need help with the CLI dataset export error shown below. The export command errors out after it finishes downloading assets, and because of the error the resulting archive is unextractable.

I also want to ask how one would go about duplicating a dataset. When I export the production dataset, I get an assets folder, an assets.json, and the NDJSON file. I can import the NDJSON with the CLI import command, but I couldn't find any info on how to import the assets.

Many thanks!


Exporting documents... (204/204)
Downloading assets... (29/590)(node:12634) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
Downloading assets... (36/590)(node:12634) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
Downloading assets... (48/590)(node:12634) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
Downloading assets... (72/590)(node:12634) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
Downloading assets... (94/590)(node:12634) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
Downloading assets... (111/590)(node:12634) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
Downloading assets... (116/590)(node:12634) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
Downloading assets... (208/590)(node:12634) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
Downloading assets... (590/590)
Adding assets to archive...events.js:174
      throw er; // Unhandled 'error' event
      ^

Error: Read timeout: No data received on socket for 180000 ms
    at TLSSocket.response.connection.setTimeout (/home/deniz/dev/enteria/sanity/node_modules/@sanity/export/lib/requestStream.js:76:32)
    at Object.onceWrapper (events.js:277:13)
    at TLSSocket.emit (events.js:194:15)
    at TLSSocket.Socket._onTimeout (net.js:440:8)
    at ontimeout (timers.js:436:11)
    at tryOnTimeout (timers.js:300:5)
    at listOnTimeout (timers.js:263:5)
    at Timer.processTimers (timers.js:223:10)
Emitted 'error' event at:
    at emitErrorNT (internal/streams/destroy.js:82:8)
    at process._tickCallback (internal/process/next_tick.js:63:19)
AI Update

Thanks for the kind words about Sanity! Let me help you with both the export error and the dataset duplication question.

Fixing the Export Error

The MaxListenersExceededWarning and Read timeout error you're seeing is a known issue that occurs when exporting datasets with many assets (you have 590 assets). The error happens during the final "Adding assets to archive" step when the process times out after 180 seconds without receiving data.

Here are a few approaches to work around this:

- Simply retry the export: the read timeout is often transient, and a second run may complete successfully.
- Upgrade to the latest @sanity/cli, since newer releases may include fixes to asset-download handling.
- Export documents only with the --no-assets flag, so your content is at least backed up while you troubleshoot the asset downloads.

If none of these work and you have an Enterprise plan, consider using Cloud Clone, which duplicates datasets server-side without needing to download anything locally.
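As a rough CLI sketch of the export workarounds (the --no-assets flag is documented for sanity dataset export; the dataset and file names here are just placeholders):

```shell
# Retry the full export; the 180s read timeout is often transient
sanity dataset export production production.tar.gz

# Fallback: export documents only, skipping the asset downloads
sanity dataset export production documents-only.tar.gz --no-assets
```

Note that an archive produced with --no-assets will import document references to assets, but not the asset files themselves.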

Importing Assets - Good News!

When you successfully export a dataset as a .tar.gz archive, it contains:

- data.ndjson – all of your documents
- assets.json – a manifest mapping asset documents to their files
- folders (e.g. images/) containing the actual asset binaries

The sanity dataset import command handles assets automatically when you import the .tar.gz file:

sanity dataset import production.tar.gz development

This single command imports both the documents AND the assets - you don't need to do anything special or import them separately. The CLI takes care of uploading all the assets to the target dataset.

If you only have the NDJSON file (from an export with --no-assets), then only the document references will be imported, but the actual asset files won't be available in the new dataset.
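When importing into a dataset that already has content, the import command also lets you choose how to handle documents that already exist; a minimal sketch, assuming a recent CLI (check sanity dataset import --help for your version):

```shell
# Default: the import fails if a document with the same ID already exists
sanity dataset import production.tar.gz development

# Skip documents that already exist in the target dataset
sanity dataset import production.tar.gz development --missing

# Overwrite documents that already exist in the target dataset
sanity dataset import production.tar.gz development --replace
```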

Complete Dataset Duplication Workflow

For a complete dataset duplication:

# 1. Export everything (documents + assets)
sanity dataset export production production-backup.tar.gz

# 2. Create a new dataset (if it doesn't exist)
sanity dataset create development

# 3. Import everything into the new dataset
sanity dataset import production-backup.tar.gz development

Alternatively, check out the Cross Dataset Duplicator plugin which lets you duplicate datasets directly from within Studio without using the CLI - super handy for content editors who aren't comfortable with command-line tools.
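If you want to try the plugin route, installation is roughly as follows (the package name @sanity/cross-dataset-duplicator is assumed; check the plugin's README for the current name and setup steps):

```shell
# Install the plugin into your Studio project
npm install @sanity/cross-dataset-duplicator

# Then register it in the `plugins` array of your Studio configuration
# as described in the plugin's README.
```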

Hope this helps! If you're still hitting the timeout issue consistently, it might be worth reaching out to Sanity support as they may be able to help with large dataset exports or investigate if there's a deeper issue.
