Because JSON.parse blocks the thread it's in, and JS is single threaded [1].
So even if you put it behind a promise, when that promise actually runs, it will block the thread.
In essence, using promises (or callbacks or timeouts or anything else like that) allows you to delay the thread-blocking, but once the code hits `JSON.parse`, no other javascript will run until it completes. And since no other javascript will run, the UI is entirely unresponsive during that time as well.
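To make that concrete, here's a tiny example (`hugeJsonString` is just a stand-in for the big payload): wrapping the parse in a promise changes when it runs, not where it runs.

```js
function parseLater(jsonString) {
  return new Promise((resolve) => {
    // Still synchronous, still on the main thread: clicks, scrolling and
    // rendering all stall until JSON.parse returns.
    resolve(JSON.parse(jsonString));
  });
}

parseLater(hugeJsonString).then((data) => {
  console.log('parsed, but the page was frozen the whole time');
});
```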
[1] Technically there are web workers, and I looked into them to try to solve this problem. Unfortunately any complex objects sent to or from a worker need to be serialized (no pass-by-reference is allowed, except for a very small subset of "C style" arrays called TypedArrays). So while you could technically send the string to a worker and have the worker call `JSON.parse` on it to get an object, when you pass that object back the javascript engine needs to do an "implicit" `JSON.stringify` in the worker and then a `JSON.parse` in the main thread, which made it entirely useless for my use case.
But continuing with that same thought process, I very nearly went for an architecture that used a web-worker, did the `JSON.parse` in the worker, then exposed methods that could be called from the main thread to get small amounts of data out of the worker as needed. Something like `worker.getProperty('foo.bar.baz')` which would only take the parsing hit for very small subsets of the data at a time. But ultimately the oboe.js solution was simpler and faster at runtime.
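Something like this rough sketch (names made up; this isn't oboe.js or what I ended up shipping): the worker parses once, and only small values ever cross the structured-clone boundary.

```js
// ---- worker.js (hypothetical) ----
// Parse once in the worker; answer small lookups from the main thread.
let data = null;

self.onmessage = (e) => {
  const { id, type, payload } = e.data;
  if (type === 'load') {
    data = JSON.parse(payload);   // the expensive parse, off the main thread
    self.postMessage({ id, result: true });
  } else if (type === 'get') {
    // Walk a dotted path like 'foo.bar.baz' and send back just that value.
    const result = payload.split('.').reduce((obj, key) => obj && obj[key], data);
    self.postMessage({ id, result });
  }
};

// ---- main.js (hypothetical) ----
const worker = new Worker('worker.js');
const pending = new Map();
let nextId = 0;

worker.onmessage = (e) => {
  pending.get(e.data.id)(e.data.result);
  pending.delete(e.data.id);
};

const call = (type, payload) =>
  new Promise((resolve) => {
    const id = nextId++;
    pending.set(id, resolve);
    worker.postMessage({ id, type, payload });
  });

const loadIntoWorker = (jsonText) => call('load', jsonText);
const getProperty = (path) => call('get', path);  // e.g. getProperty('foo.bar.baz')
```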
Another trick most people don’t realize: not only is the fetch API asynchronous, but `response.json()` does the conversion in a background thread and doesn’t block the UI.
If you have a large JSON object, you can use the fetch API to work with it. If you need to cache it, use the Cache Storage API. Unlike localStorage, which will freeze the UI, Cache Storage won’t.
It’s slightly slower since it needs to talk to another thread, but who cares, as long as the UI stays responsive in the meantime.
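A rough sketch of the whole thing (URL and cache name are placeholders):

```js
async function loadBigJson() {
  const cache = await caches.open('big-json-cache');      // arbitrary cache name
  let response = await cache.match('/big-data.json');
  if (!response) {
    response = await fetch('/big-data.json');
    await cache.put('/big-data.json', response.clone());  // clone: put() consumes the body
  }
  // Reading from Cache Storage is async, unlike the synchronous localStorage API.
  return response.json();
}
```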
I'm guessing with Oboe.js you solved this by capturing a stream(?) of JSON but only parsing relevant chunks as they appear and match the selector? Or do you simply load the larger chunks at once (either by a request or embedding JSON into the template server side) instead of streaming?
I could see the value in this for sure. I currently have a problem where some users end up with thousands of objects embedded in the view, via Rails calling toJSON() inside a <script> tag, and it’s creating far too much weight on the frontend. I’ve been considering fetching it via a simple REST request instead.
Node suffers from the same issues, but it's generally not as noticeable in most cases. A similar situation in Node would cause the server to not be able to respond to any other requests during the `JSON.parse` execution. But in the Node world, you have more options for how to get around those problems (like load balancing requests among several Node processes).
But both server-side and client-side JS use the same system: the event loop. It's basically a message queue of events that stack up; the JS engine grabs the oldest event in the queue, one at a time, and runs it to completion. Anything "async" just throws a new event onto that queue to be processed later. The secret sauce is that any IO is done "outside" the JS execution, so other events can be processed while the IO is waiting to complete.
Take a look at this link, or search up the JS event-loop if you want to get a better explanation. It's deceptively simple.
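A quick way to see that run-to-completion behaviour for yourself:

```js
// The timeout callback is queued almost immediately, but it can't run
// until the current task (the busy loop) finishes.
setTimeout(() => console.log('timer fired'), 0);

const start = Date.now();
while (Date.now() - start < 2000) {
  // busy-wait, standing in for a long JSON.parse
}
console.log('long task done');  // logs first; 'timer fired' only shows up ~2s late
```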
Yes, the Node.js JavaScript runtime is based on V8, the same engine that runs in Chrome. JavaScript is single-threaded, so anything that is not I/O-bound will block the main thread. If you don't want to block the thread because you have a long-running calculation/parsing task, you can use worker threads[1]. This will run your task in a separate thread and not block the main one.
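A minimal sketch of what that looks like (file names made up), though as the reply below points out, the parsed result still has to be copied back to the main thread:

```js
// parse-worker.js (runs in a separate thread)
const { parentPort, workerData } = require('worker_threads');
parentPort.postMessage(JSON.parse(workerData));

// main.js
const { Worker } = require('worker_threads');

function parseInWorker(jsonString) {
  return new Promise((resolve, reject) => {
    const worker = new Worker('./parse-worker.js', { workerData: jsonString });
    worker.once('message', resolve);
    worker.once('error', reject);
  });
}
```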
And not to beat a dead horse, but worker threads again wouldn't work in this exact situation, even in Node.js. They suffer from the same problems that web workers do: they use the structured clone algorithm to send data between workers (with the exception of TypedArrays), and therefore would hang the "main thread" just as long as if you did the `JSON.parse` directly in it.
It's a really annoying problem, and I'm actually really happy to see that many others have the exact same thoughts I had at the time, and that I wasn't just missing something obvious!
I generally use that as an example when explaining to people why Node isn't a great fit for a lot of workloads. They have to use these features internally, but you as the user with a CPU-intensive job don't have access to those features.
Maybe the worker could parse the JSON to build an index and then send over just the index. The main thread could then use the index to access small substrings of the original giant JSON string, parse those and cache the result?
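Something along those lines could look like this (hypothetical; it assumes the worker can report character offsets, which needs a small custom scanner since `JSON.parse` doesn't expose positions):

```js
// index: { someKey: [startOffset, endOffset], ... } built in the worker and
// sent back. The main thread only ever parses small slices of the big string.
const parsedCache = new Map();

function getValue(bigJsonString, index, key) {
  if (parsedCache.has(key)) return parsedCache.get(key);
  const [start, end] = index[key];
  const value = JSON.parse(bigJsonString.slice(start, end));
  parsedCache.set(key, value);
  return value;
}
```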
what if you used multiple async ajax requests to load different parts of the UI in place of loading all 50MB at once? could that be what the OP meant by "promiseS"?
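i.e. something like this, where each smaller chunk gets fetched and parsed on its own (endpoints and render functions are made up):

```js
const renderUsers  = (data) => console.log('users', data.length);   // placeholder
const renderOrders = (data) => console.log('orders', data.length);  // placeholder

const sections = [
  { url: '/api/users',  render: renderUsers },
  { url: '/api/orders', render: renderOrders },
];

sections.forEach(({ url, render }) => {
  fetch(url)
    .then((res) => res.json())  // each response is parsed independently
    .then(render);
});
```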
Promises are a way to deal with async code. Parsing JSON is synchronous and CPU-bound, so promises offer no benefit. And since web pages are single-threaded[0], there isn't really any way you can parse JSON in the background and wait on the result.
[0]: There is now the Web Workers API which does allow you to run code in the background. I've never used it, but I have heard that it has a pretty high overhead since you have to communicate with it through message passing, so it's possible you wouldn't actually gain anything by using it to parse a large JSON object.
Promises still run on "the main thread", so a CPU-intensive task in a promise is still going to block things. You could use a promise if you delegated your CPU task to another process or to some C code that did actual threading.
side note: legit question, I don't do web/app dev