Possible EventEmitter memory leak detected during dataset export, followed by a JSON termination error; the bug was identified and fixed in a new release.

14 replies
Last updated: Jun 12, 2020
Exporting dataset "dev" to "/tmp/sanity_export_module.tar.gz"
⠴ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
⠙ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
⠴ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
⠇ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
⠴ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
⠏ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
⠙ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
⠸ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
⠧ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
⠼ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
⠧ Exporting documents... (17/?)Unhandled rejection: SyntaxError: Unexpected end of JSON input ({"_createdAt":"2020-06-10T17:02:35Z","_id":"MmjVXHtlNsZV9S0PqvTXd5","_rev":"MmjVXHtlNsZV9S0PqvTXZo","_type":"mux.videoAsset","_updatedAt":"2020-06-10T17:02:35Z","assetId":"go2Jg936QgKew702LqdShzPI400rE4wbhh","data":{"aspect_ratio":"25:56","created_at":159)
    at JSON.parse (<anonymous>)
    at DestroyableTransform.module.exports [as mapper] (/###/node_modules/@sanity/export/lib/tryParseJson.js:5:17)
    at DestroyableTransform.flush [as _flush] (/###/node_modules/split2/index.js:44:21)
    at DestroyableTransform.prefinish (/###/node_modules/readable-stream/lib/_stream_transform.js:138:10)
    at DestroyableTransform.emit (events.js:311:20)
    at prefinish (/###/node_modules/readable-stream/lib/_stream_writable.js:619:14)
    at finishMaybe (/###/node_modules/readable-stream/lib/_stream_writable.js:627:5)
    at afterWrite (/###/node_modules/readable-stream/lib/_stream_writable.js:492:3)
    at onwrite (/###/node_modules/readable-stream/lib/_stream_writable.js:483:7)
    at WritableState.onwrite (/###/node_modules/readable-stream/lib/_stream_writable.js:180:5)
⠋ Exporting documents... (17/?)
AI Update

This error occurs when the sanity dataset export command receives a corrupted or truncated JSON response from the Sanity API. The "Unexpected end of JSON input" message indicates that a document (in your case, a mux.videoAsset document) was cut off mid-stream during export, leaving the parser with an incomplete JSON string.
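
To make the failure mode concrete, here is a minimal Node.js sketch (illustrative only, not the export tool's actual code) showing how a newline-delimited JSON stream that is cut off mid-document produces exactly this SyntaxError when the final partial line is parsed:

// Illustrative sketch: an NDJSON stream that ends mid-document.
// The last line is truncated, so JSON.parse throws
// "Unexpected end of JSON input" -- the same error seen in the export log.
const lines = [
  '{"_id":"a","_type":"post","title":"First"}',
  '{"_id":"b","_type":"mux.videoAsset","data":{"aspect_ratio":"25:56","created_at":159', // cut off mid-stream
]

for (const line of lines) {
  try {
    const doc = JSON.parse(line)
    console.log('parsed', doc._id)
  } catch (err) {
    console.error('failed to parse truncated line:', err.message)
  }
}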

Good news: This was a known bug that has been fixed!

As the thread below shows, this exact issue was identified and resolved in version 1.149.17 (released June 2020). The fix not only resolved the error but also significantly improved export performance.

Quick Solutions

1. Update your Sanity CLI (most important):

npm install -g @sanity/cli@latest

This should resolve the issue entirely. If you run the export from within a studio project, also run sanity upgrade so the project's @sanity/core and related packages are at 1.149.17 or later, then try your export again.

2. Export without assets (temporary workaround):

sanity dataset export dev --no-assets dev-documents-only.tar.gz

As you discovered, the --no-assets flag works because the issue typically occurs while streaming large asset data. This exports only your documents and skips asset files (a programmatic equivalent is sketched after this list).

3. Use --no-compress (if you need assets):

sanity dataset export dev --no-compress

This can sometimes help with streaming issues.
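
If you script your exports, the same --no-assets workaround can be applied through the @sanity/export package, which the CLI uses under the hood. This is a rough sketch only; the option names (outputPath, assets, drafts, compress) are taken from memory of the package's README, so verify them against the version you have installed:

// Rough sketch of a programmatic export with assets skipped.
// Option names are assumptions -- check the @sanity/export README.
const exportDataset = require('@sanity/export')
const sanityClient = require('@sanity/client')

const client = sanityClient({
  projectId: 'your-project-id', // hypothetical placeholder
  dataset: 'dev',
  token: process.env.SANITY_TOKEN,
  useCdn: false,
})

exportDataset({
  client,
  dataset: 'dev',
  outputPath: '/tmp/dev-documents-only.tar.gz',
  assets: false, // equivalent of --no-assets
  drafts: true,
  compress: true,
})
  .then(() => console.log('Export complete'))
  .catch((err) => console.error('Export failed:', err.message))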

About the MaxListenersExceededWarning

The MaxListenersExceededWarning you're seeing indicates that the export process is attaching many timeout listeners to a single TLS socket, typically because many concurrent requests share one keep-alive connection. The warning itself isn't fatal, but it's a symptom of the underlying bug that was causing the JSON parsing failure.
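
For context, the threshold is simply Node's default of 10 listeners per event on a single EventEmitter. A minimal sketch of how the warning is triggered (and how setMaxListeners raises the limit, which only hides the symptom rather than fixing the underlying bug):

// Minimal sketch: Node warns once more than 10 listeners for the same
// event are attached to one emitter, which is what happens when many
// concurrent requests add 'timeout' handlers to a shared TLS socket.
const { EventEmitter } = require('events')

const socketLike = new EventEmitter()

process.on('warning', (warning) => {
  console.warn(warning.name, warning.message) // MaxListenersExceededWarning ...
})

for (let i = 0; i < 11; i++) {
  socketLike.on('timeout', () => {})
}

// Raising the limit silences the warning but does not address why so many
// listeners pile up in the first place:
// socketLike.setMaxListeners(20)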

If Problems Persist After Updating

  • Check your Sanity CLI version with sanity versions
  • Try exporting during off-peak hours when API load is lower
  • For very large datasets on Enterprise plans, consider using Cloud Clone which duplicates datasets server-side
  • Contact Sanity support if you continue experiencing issues

The export file that was created is likely incomplete and shouldn't be used for restoration without verification.
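
One way to verify an export before relying on it is to extract the archive and check that every line of the documents file parses as JSON. The sketch below assumes the tarball contains a data.ndjson file (the layout Sanity exports have used); adjust the path if your archive differs:

// Hedged verification sketch: confirm every line of the exported NDJSON
// parses cleanly. Run after extracting the tarball, e.g.:
//   tar -xzf /tmp/sanity_export_module.tar.gz -C /tmp/export-check
const fs = require('fs')
const readline = require('readline')

async function verify(ndjsonPath) {
  const rl = readline.createInterface({
    input: fs.createReadStream(ndjsonPath),
    crlfDelay: Infinity,
  })

  let lineNo = 0
  for await (const line of rl) {
    lineNo++
    if (!line.trim()) continue // skip blank lines
    try {
      JSON.parse(line)
    } catch (err) {
      console.error(`Line ${lineNo} is not valid JSON: ${err.message}`)
      process.exitCode = 1
    }
  }
  console.log(`Checked ${lineNo} lines`)
}

verify('/tmp/export-check/data.ndjson') // assumed filename -- adjust as needed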

({"_createdAt":"2020-06-10T17:02:35Z","_id":"MmjVXHtlNsZV9S0PqvTXd5","_rev":"MmjVXHtlNsZV9S0PqvTXZo","_type":"mux.videoAsset","_updatedAt":"2020-06-10T17:02:35Z","assetId":"go2Jg936QgKew702LqdShzPI400rE4wbhh","data":{"aspect_ratio":"25:56","created_at":159)
That JSON is terminated early
It's not terminated in the dataset I'm exporting from, though.
sanity --dataset dev documents query '*[_id=="MmjVXHtlNsZV9S0PqvTXd5"]'
shows the full document. Also, if I delete this document and run the export again, it reports a similar error with a different document. It seems to me the stream is getting truncated and the parser doesn’t have the full document to parse
I also never hit these errors if I use the --no-assets option on the export command, which would suggest it's not the documents themselves that are the issue, but more likely the retrieval of the assets
it could be some string encoding/escaping issue that is really specific to an asset in your dataset
That would be my hunch
Sounds like a bug on our end. I'll have a look in a few hours
Actually, which version of @sanity/core are you running?
sanity versions should tell you
sanity versions
@sanity/cli                     1.149.13 (latest: 1.149.16)
@sanity/base                    1.149.13 (latest: 1.149.16)
@sanity/block-content-to-react     2.0.7 (up to date)
@sanity/color-input              1.149.7 (latest: 1.149.16)
@sanity/components              1.149.13 (latest: 1.149.16)
@sanity/core                    1.149.13 (latest: 1.149.16)
@sanity/default-layout          1.149.13 (latest: 1.149.16)
@sanity/default-login           1.149.11 (latest: 1.149.16)
@sanity/desk-tool               1.149.13 (latest: 1.149.16)
@sanity/image-url               0.140.19 (up to date)
@sanity/vision                   1.149.0 (latest: 1.149.16)
Thanks Espen
I've identified a bug that might be causing this - will see if I can get a fix out
That’s great Espen! Thanks for looking into it.
Ok, got a new release out - let me know if that solves your problem!

https://github.com/sanity-io/sanity/releases/tag/v1.149.17
Wow. Not only did that fix my issue, but my export is significantly faster now too. That’s awesome! Thanks Espen!
That's great to hear! Happy to help 🙂
