Possible EventEmitter memory leak detected during dataset export; JSON termination error; bug identified and fixed in a new release.

14 replies
Last updated: Jun 12, 2020
Exporting dataset "dev" to "/tmp/sanity_export_module.tar.gz"
⠴ Exporting documents... (17/?)(node:68190) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 timeout listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
[the same warning repeated ten times as the spinner advanced]
⠧ Exporting documents... (17/?)Unhandled rejection: SyntaxError: Unexpected end of JSON input ({"_createdAt":"2020-06-10T17:02:35Z","_id":"MmjVXHtlNsZV9S0PqvTXd5","_rev":"MmjVXHtlNsZV9S0PqvTXZo","_type":"mux.videoAsset","_updatedAt":"2020-06-10T17:02:35Z","assetId":"go2Jg936QgKew702LqdShzPI400rE4wbhh","data":{"aspect_ratio":"25:56","created_at":159)
    at JSON.parse (<anonymous>)
    at DestroyableTransform.module.exports [as mapper] (/###/node_modules/@sanity/export/lib/tryParseJson.js:5:17)
    at DestroyableTransform.flush [as _flush] (/###/node_modules/split2/index.js:44:21)
    at DestroyableTransform.prefinish (/###/node_modules/readable-stream/lib/_stream_transform.js:138:10)
    at DestroyableTransform.emit (events.js:311:20)
    at prefinish (/###/node_modules/readable-stream/lib/_stream_writable.js:619:14)
    at finishMaybe (/###/node_modules/readable-stream/lib/_stream_writable.js:627:5)
    at afterWrite (/###/node_modules/readable-stream/lib/_stream_writable.js:492:3)
    at onwrite (/###/node_modules/readable-stream/lib/_stream_writable.js:483:7)
    at WritableState.onwrite (/###/node_modules/readable-stream/lib/_stream_writable.js:180:5)
⠋ Exporting documents... (17/?)
Jun 12, 2020, 12:31 AM
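For context on the two things happening in the log above: the MaxListenersExceededWarning is non-fatal noise (Node warns once more than the default 10 listeners pile up on a single emitter, here 'timeout' listeners on a TLSSocket), while the unhandled SyntaxError is the real failure. The stack trace shows the export stream being split into lines by split2 and each line handed to JSON.parse; if the stream is cut off mid-document, the final partial line fails with exactly this error. A minimal sketch of that failure mode, with made-up document ids, not the actual @sanity/export code:

// Assumption for illustration: an NDJSON stream whose last line is truncated,
// e.g. because the connection dropped mid-document.
const split2 = require('split2');
const { Readable } = require('stream');

const truncated = Readable.from([
  '{"_id":"a","_type":"doc"}\n',
  '{"_id":"b","_type":"mux.videoAsset","data":{"aspect_ratio":"25:56"' // cut off before the closing braces
]);

truncated
  .pipe(split2(JSON.parse)) // on stream end, split2 flushes the partial last line to the mapper...
  .on('data', (doc) => console.log('parsed', doc._id))
  .on('error', (err) => console.error(err.message)); // ...and JSON.parse throws "Unexpected end of JSON input"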
({"_createdAt":"2020-06-10T17:02:35Z","_id":"MmjVXHtlNsZV9S0PqvTXd5","_rev":"MmjVXHtlNsZV9S0PqvTXZo","_type":"mux.videoAsset","_updatedAt":"2020-06-10T17:02:35Z","assetId":"go2Jg936QgKew702LqdShzPI400rE4wbhh","data":{"aspect_ratio":"25:56","created_at":159)
That JSON is terminated early
Jun 12, 2020, 12:32 AM
It's not terminated in the dataset I'm exporting from, though.
sanity --dataset dev documents query '*[_id=="MmjVXHtlNsZV9S0PqvTXd5"]'
shows the full document. Also, if I delete this document and run the export again, it reports a similar error with a different document. It seems to me the stream is getting truncated and the parser doesn't have the full document to parse.
Jun 12, 2020, 12:37 AM
I also never hit these errors if I use the --no-assets option on the export command, which would suggest it's not the documents themselves that are the issue, but more likely the retrieval of the assets.
Jun 12, 2020, 12:40 AM
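For reference, the workaround mentioned above would look something like this (dataset name and output path taken from the log at the top of the thread):

sanity dataset export dev /tmp/sanity_export_module.tar.gz --no-assets

It skips downloading asset files entirely, so the resulting archive contains documents only.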
It could be some string encoding/escaping issue that is really specific to an asset in your dataset.
Jun 12, 2020, 12:45 AM
That would be my hunch
Jun 12, 2020, 12:45 AM
Sounds like a bug on our end. I'll have a look in a few hours
Jun 12, 2020, 1:59 AM
Actually, which version of @sanity/core are you running? sanity versions should tell you.
Jun 12, 2020, 2:20 AM
sanity versions
@sanity/cli                     1.149.13 (latest: 1.149.16)
@sanity/base                    1.149.13 (latest: 1.149.16)
@sanity/block-content-to-react     2.0.7 (up to date)
@sanity/color-input              1.149.7 (latest: 1.149.16)
@sanity/components              1.149.13 (latest: 1.149.16)
@sanity/core                    1.149.13 (latest: 1.149.16)
@sanity/default-layout          1.149.13 (latest: 1.149.16)
@sanity/default-login           1.149.11 (latest: 1.149.16)
@sanity/desk-tool               1.149.13 (latest: 1.149.16)
@sanity/image-url               0.140.19 (up to date)
@sanity/vision                   1.149.0 (latest: 1.149.16)
Jun 12, 2020, 2:43 AM
Thanks Espen
Jun 12, 2020, 2:43 AM
I've identified a bug that might be causing this - will see if I can get a fix out
Jun 12, 2020, 2:56 AM
That’s great Espen! Thanks for looking into it.
Jun 12, 2020, 2:57 AM
Ok, got a new release out - let me know if that solves your problem!

https://github.com/sanity-io/sanity/releases/tag/v1.149.17
Jun 12, 2020, 3:40 AM
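To pick up the fixed release, the studio's @sanity/* dependencies would be bumped from inside the studio folder; assuming the standard Sanity CLI upgrade flow of that era:

sanity upgrade

and then re-running the export.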
Wow. Not only did that fix my issue, but my export is significantly faster now too. That’s awesome! Thanks Espen!
Jun 12, 2020, 3:51 AM
That's great to hear! Happy to help 🙂
Jun 12, 2020, 4:01 AM
