Stream an object to reconstruct it on the frontend
Hello :wave:
First of all, thanks for all the hard work on this library. Understanding how to use it properly takes some time, but it is worth the investment.
Together with @justinberiot, we wanted to use it for a particular use case. In our context, the JSON is not large (the complete data fits in memory without any issue). The difficulty is rather that it can take a long time to receive it in full, and that it can be a deeply nested structure.
Our objective is a frontend that rebuilds the object incrementally, as soon as some information is available. We managed to do so thanks to your library. Our code is available in the repo davidfou/stream-json-object.
Code snippet
// Available at https://github.com/davidfou/stream-json-object/blob/main/gif/index.mjs
import Stream from "node:stream";
import timers from "node:timers/promises";
import util from "node:util";
import StreamJSON from "stream-json";
import fetch from "node-fetch";
import _ from "lodash";
// Advance the array index at the end of the path once the current element is complete;
// object keys are left unchanged.
const updatePath = (path) => {
  const lastPathValue = path[path.length - 1];
  if (typeof lastPathValue === "number") {
    return [...path.slice(0, -1), lastPathValue + 1];
  }
  return path;
};
await Stream.promises.pipeline(
  // Fetch the JSON document and forward its raw bytes
  async function* () {
    const response = await fetch(
      "https://raw.githubusercontent.com/davidfou/stream-json-object/main/demo/solar_system.json"
    );
    if (!response.ok) {
      throw new Error("Oops");
    }
    yield* response.body;
  },
  // Parse the byte stream into tokens; keys and values are emitted whole (packed) rather than chunked
  StreamJSON.parser({
    streamKeys: false,
    streamValues: false,
  }),
  // Simulate some latency
  async function* (source) {
    for await (const chunk of source) {
      await timers.setTimeout(50);
      yield chunk;
    }
  },
  // Track the current path and turn parser tokens into { key, value } events
  async function* (source) {
    let path = [];
    for await (const chunk of source) {
      const lastPathValue = path[path.length - 1];
      switch (chunk.name) {
        case "startArray":
          path = [...path, 0];
          break;
        case "endArray":
          path = path.slice(0, -1);
          if (lastPathValue === 0) {
            // No element was emitted, so the array is empty
            yield { key: path, value: [] };
          }
          path = updatePath(path);
          break;
        case "startObject":
          path = [...path, null];
          break;
        case "endObject":
          path = path.slice(0, -1);
          if (lastPathValue === null) {
            // No key was emitted, so the object is empty
            yield { key: path, value: {} };
          }
          path = updatePath(path);
          break;
        case "keyValue":
          // Replace the last path segment with the current object key
          path = [...path.slice(0, -1), chunk.value];
          break;
        default:
          // Scalar value: emit it at the current path
          yield {
            key: path,
            value:
              chunk.name === "numberValue"
                ? parseFloat(chunk.value)
                : chunk.value,
          };
          path = updatePath(path);
      }
    }
  },
  // Events sent by a backend and reconstruction of the object on the frontend
  async function* (source) {
    let out = null;
    let i = 0;
    for await (const chunk of source) {
      if (out === null) {
        // Initialise the output as an array or an object depending on the first key
        out = typeof chunk.key[0] === "number" ? [] : {};
      }
      out = _.set(out, chunk.key, chunk.value);
      i += 1;
      console.clear();
      console.log(
        util.inspect(out, { depth: null, colors: true, breakLength: 171 })
      );
    }
    console.log("object recreated in %i steps", i);
  }
);
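
To make the mechanism easier to follow, here is a small made-up example (illustrative only, not the actual solar_system.json payload) of the { key, value } events the path-tracking stage produces, together with the same reconstruction logic as the last pipeline stage:

// Illustrative, hand-written events for a tiny document such as
// {"name":"Sun","moons":[{"name":"Io"},{"name":"Europa"}],"rings":[]}
import _ from "lodash";

const events = [
  { key: ["name"], value: "Sun" },
  { key: ["moons", 0, "name"], value: "Io" },
  { key: ["moons", 1, "name"], value: "Europa" },
  { key: ["rings"], value: [] },
];

// Same reconstruction as the last stage of the pipeline above
let out = null;
for (const { key, value } of events) {
  if (out === null) {
    out = typeof key[0] === "number" ? [] : {};
  }
  out = _.set(out, key, value);
}

console.log(out);
// { name: 'Sun', moons: [ { name: 'Io' }, { name: 'Europa' } ], rings: [] }

Each event is self-contained, so the backend can forward them one by one and the frontend can apply them as they arrive.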

Here is what we wonder about:
- The library is excellent at parsing large JSON files, but it took us some time to build the code above. Could what we've made be valuable for your library?
- Do you have an opinion on the way we use the library? Could we achieve something similar but more maintainable by using it more appropriately?
- Would another library be better suited to our problem?
- Any feedback is welcome :pray: