I have a table of 100 rows x 10 columns mostly of numbers which change 10 times a second (think trading).
The source data sits in an array of objects.
On each update, I need to update the table cells with the latest values from the source array. The content of each cell can change, and possibly its CSS class too (from red it becomes blue, for example).
I'm using Vue/Bootstrap-Vue table at the moment, but it's quite slow.
Is Svelte suitable for this? Other options to quickly update a <table> from data in an array of objects, one object per row?
It looks like svelte.dev and the REPL are up and running again, so here's a quick example of a 100x10 table updating every value 100 times per second, with a red/blue class being applied to each cell depending on its value.
It seems to do pretty well! I turned on the FPS meter in Chrome and scrolled around a bunch, and it seems smooth, reporting around ~50fps. It did get my CPU going pretty good, though.
Works now - works well for me too, good FPS, looks pretty smooth.
I would say his next problem is going to be getting the data fast enough to feed the UI; a nicely packed websocket protocol would work well, I'd guess.
No matter which framework you use, this is an iterative approach and will be inefficient. The optimized (or at least faster) way to handle this would be to attach a unique ID to each cell and offer a direct callback to it via the data.
E.g. store the entry for "GOOG" in a hash table; when GOOG updates, call into that entry directly - "symbols[symbol].update(data)" - then re-render just that cell's DOM element.
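To illustrate that pattern (all names here are made up, and a plain object stands in for the real `<td>` element so the sketch is self-contained - in a page you'd store the actual cell node):

```javascript
// Per-symbol dispatch: each tick goes straight to its cell via a Map,
// with no list iteration or framework diffing in between.
const cells = new Map(); // symbol -> { el, lastValue }

function registerCell(symbol, el) {
  cells.set(symbol, { el, lastValue: NaN });
}

function onTick(symbol, value) {
  const cell = cells.get(symbol); // O(1) lookup
  if (!cell || value === cell.lastValue) return; // skip no-op updates
  cell.el.textContent = String(value);
  // First tick compares against NaN, so it renders red by default.
  cell.el.className = value >= cell.lastValue ? 'blue' : 'red';
  cell.lastValue = value;
}

// Plain object standing in for a <td> element:
const googCell = { textContent: '', className: '' };
registerCell('GOOG', googCell);
onTick('GOOG', 101.5);
onTick('GOOG', 99.0); // dropped below the last value -> red
```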
Svelte keeps a direct reference to DOM elements and mutates them when possible, if you do it right you can probably achieve vanilla-levels of performance.
For a realtime view of stock symbols, it's highly likely most of them will change on every tick. And a hashmap lookup is not free.
You really have to profile the code to figure out what's expensive. Maybe there is no scaling wrt. the number of nodes that change because repainting/relayouting is more expensive than updating the DOM.
They are basically free with this little data. 1k cells updating 10 times a second means you have around 100 microseconds per cell; you can do thousands of lookups in that time. The only things that could even come close to costing 100 microseconds are if you accidentally re-flow the HTML on each cell update, re-render the HTML on each cell update, send an HTTP request on each cell update, walk through all the data for each cell, or if the framework you're using is extremely inefficient or doing other unnecessary work. Each of those is easily fixable by writing the HTML and JavaScript by hand instead of using frameworks.
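A quick Node sketch of that arithmetic, just to show the order of magnitude (timings vary by machine, but lookups come in far under the per-cell budget):

```javascript
// 1000 cells updated 10x/s = 10,000 updates/s, i.e. ~100 µs of budget each.
// Time 10,000 Map lookups to see how little of that budget a lookup uses.
const table = new Map();
for (let i = 0; i < 1000; i++) table.set('sym' + i, i);

const t0 = process.hrtime.bigint();
let sum = 0;
for (let n = 0; n < 10000; n++) sum += table.get('sym' + (n % 1000));
const t1 = process.hrtime.bigint();

const perLookupNs = Number(t1 - t0) / 10000;
console.log(perLookupNs.toFixed(0) + ' ns per lookup'); // typically well under 1 µs
```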
At 10 times a second you might be hitting the browser’s table layout algorithm speed limits. I assume you’ve given it table-layout: fixed, and fixed-width columns?
It would certainly reduce it. With table-layout: fixed, the default behaviour is that every cell is the same width as the others. With the normal table layout algorithm, the browser attempts to rebalance as every new cell's content is added. That makes for a very flexible layout that automatically scales to best fit the content, but it has to flush and redo the layout for the whole table every time a column width is calculated to need to change.
It doesn’t even have to be a problem like that to be a performance problem. For example, imagine a five row table where the last column always has 1 digit in the first row, 2 in the second, 3 in the third etc. When rendering (without table-layout:fixed) the browser will check the last cell, see one digit, and size that column appropriately. It’ll then carry on rendering the second row, see two digits, then work out the new size for the column and rerender the entire table. Repeat for rows three, four and five.
Not a problem for a single layout pass, but a performance problem if the data changes every frame.
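For reference, the fix discussed above looks something like this (the class name is illustrative):

```css
table.ticker {
  table-layout: fixed; /* column widths come from the first row / <col>, not cell content */
  width: 100%;
}
table.ticker td {
  white-space: nowrap;
  overflow: hidden;        /* content that outgrows its fixed column gets clipped… */
  text-overflow: ellipsis; /* …rather than forcing a relayout of the whole table */
}
```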
Good rec! Perspective was purpose built for this precise use case at JPMC, and has some unique features such as streaming Apache Arrow support, which can pretty easily handle datasets in the low millions of rows, at thousands of updates/second.
Exactly. Stuff like this is extremely simple to write in vanilla js, with a pointer to each of the dom objects you need to interact with. There's no point in using a framework (but of course you can encapsulate your vanilla js in a component to interface with whatever framework you're using).
Isn't the benefit of React/Vue the shadowDOM and not needing to do full page prints with every change? Honest question, I'm just going off what I hear everyone else saying so the vanilla JS, while the code may be more efficient, sounds less efficient since you'll need to re-print the entire page with every change.
> sounds less efficient since you'll need to re-print the entire page with every change.
Huh? No, not at all. As far as I understand, React has algorithms that replace only the html that changed in a dom subtree (and that is called virtual dom, not shadow dom, which is a different concept).
But if you already know exactly what has changed and where to change it in the page, there is no need for more complex algorithms to kick in. Just take the pointer to your div or cell and change the content.
Bottom line: React is written in vanilla JavaScript; it can't be faster than vanilla.
> React is using Virtual DOM, which is not free. It is fast, but it is not free.
Of course, that's what I meant- maybe it wasn't clear. React can't be faster than vanilla js, it's written in vanilla js after all.
The whole virtual dom's purpose is to calculate the smallest possible update when you don't know (and don't care) what exactly has changed in your view. But that calculation of course has a cost. And if you know very well what changed and where, like in the case of the GP, nothing can be faster than changing it directly.
I am surprised how many people I interview say the same thing - pure JS has to re-render the page and React doesn't, including people who have a lot of JS experience.
Changing the DOM doesn't cause a full re-render, the browser will optimize the repaints to only the parts that have changed.
People don't understand how the DOM works, and it's become a great way to filter out people.
"People don't understand how the DOM works" x 1000. Many front end devs I have worked with are not aware that the DOM IS A TREE (and implications of this)!
100 rows x 10 columns is about 1000 elements minimum. HN has an obsession with the fastest JS framework, but most of the time it's not the JS where the most time is spent, it's the render->paint cycle of the browser.
I bet if those 100 rows aren't even in the viewport you can get a big perf boost by only rendering a subset and rendering more when the user scrolls.
We've gotten ~30x boost on our tables with 100s of rows by using a progressive rendering algorithm.
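A minimal sketch of the windowing math behind that kind of virtualization, assuming a fixed row height (function and parameter names are made up):

```javascript
// Given the scroll position, compute which rows actually need DOM nodes.
// `overscan` renders a few extra rows above/below the viewport to avoid
// blank flashes while scrolling.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows, overscan = 5) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const last = Math.min(
    totalRows - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan
  );
  return { first, last };
}

// 100 rows of 24px in a 600px viewport: only ~31 rows need to exist at once.
const r = visibleRange(0, 600, 24, 100);
console.log(r); // { first: 0, last: 30 }
```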
Virtual DOM - preact/react/vue/snabbdom etc. - is usually fast for most things. You can diff about a million things in a 10ms timeframe. Rendering a million things is an entirely different ballgame.
That's what we did too with a Vue table library: made our own sorting algorithm, showed the first 100 results async, and continued with the rest of the results in the background until the user cancels or changes the sorting entirely.
Just out of interest, what's the reason you need to update the values of the cells 10x per second? I get that that's the frequency the data is updating (maybe even faster) but is it useful for a person to see the data updating at that frequency?
Let's say you slowed it down to 1x per second, you'd increase max possible latency of an updated value being displayed by ~900ms - is that enough time to be important in your application?
Clicking on a table cell initiates/closes trades. Even if you can't react in 100ms, it's useful to see how fast the values change (how volatile the market is). 1s is definitely too slow.
This is also a very good use case to actually dive under the abstraction a little and use hooks or lifecycle methods to directly manipulate the table.
You still gain a lot of benefits from your component framework this way (composability, life cycle management, easy interop with the rest of the app). It’s more work that you wouldn’t want to do all the time, of course.
You can probably implement a more efficient diffing algorithm than Vue's general one. When new data comes in, patch the objects that have changed instead of overwriting the whole array.
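A rough, framework-agnostic sketch of that idea - in Vue the `rows` array would live in reactive state, and because only changed fields are mutated, reactivity only fires for cells that actually changed (names are illustrative):

```javascript
// Patch changed fields into existing row objects instead of replacing the
// array, keyed by a symbol -> array-index map built once up front.
function patchRows(rows, index, updates) {
  for (const u of updates) {
    const row = rows[index.get(u.id)];
    if (!row) continue; // unknown symbol: ignore
    for (const key of Object.keys(u)) {
      if (key !== 'id' && row[key] !== u[key]) row[key] = u[key]; // only real changes
    }
  }
}

const rows = [
  { id: 'GOOG', bid: 100, ask: 101 },
  { id: 'AAPL', bid: 180, ask: 181 },
];
const index = new Map(rows.map((r, i) => [r.id, i]));

patchRows(rows, index, [{ id: 'GOOG', bid: 100.5 }]); // only GOOG.bid mutates
```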
No framework can fully eliminate the underlying performance sinkhole that is the DOM. If you have to mutate, you will pay the price, period. The smartest framework can only help you eliminate redundant mutations.