Chatroom is slow? I know, I’ll just use list virtualization!

Well now you have two problems.

List virtualization is hard, dear reader. So hard. “Pfft, I can build that in an hour,” I thought until I tried.

It’s been 3 days. Last night, I dreamed of scrollbars and mouse wheel events.

Hold up, what’s list virtualization?

List virtualization is a performance improvement technique for large lists or tables. Whenever your UI becomes slow because you’re rendering too much stuff, you can use list virtualization to make it fast again.

We’ve been hit with this problem recently at Yup. When our sessions get long, like hundreds of messages between tutor and student, tutors start complaining about the UI feeling sluggish.

They can’t type, they can’t talk, it’s like teaching through a paper bag. Terrible.

We don’t know why it suddenly became such a big problem, but 600 milliseconds to add a message to a chatroom is just too much. It needs to be fixed.

That’s where list virtualization comes in.

Instead of keeping the whole hundreds-of-messages chatroom rendered, you render just 30 messages. Or 40, or 10. Whatever the number, you render a small subset of the messages.

And that gives you a performance boost. Fewer DOM nodes to deal with, faster rendering times. Especially on slow computers.

The problem may have gotten worse because of the Spectre and Meltdown fixes, which hit Intel processors with up to 30% slowdowns. When you’re already using a slow $300 computer… yeah.

How to virtualize a list, in theory

So, how do you virtualize a list?

Virtualizing a list is simple in theory. You maintain a window and move it around your list.

A windowSize variable tells you how many nodes you’re rendering, and a windowIndex variable tells you where to start rendering. Then you .slice your data array and render away.
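
In plain JavaScript the idea is something like this sketch. The messages array is a hypothetical stand-in for your data; only windowIndex and windowSize come from the description above.

    // Minimal sketch of the windowing idea. `messages` is a hypothetical
    // array of chat messages; windowIndex and windowSize are as described above.
    var windowSize = 30;   // how many messages stay rendered at once
    var windowIndex = 0;   // where in the full list the rendered window starts

    function visibleMessages(messages) {
      // slice is non-destructive, so the full messages array stays intact
      return messages.slice(windowIndex, windowIndex + windowSize);
    }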

I was doing it in Backbone, so it seemed super tricky. We had complex logic in place to append and remove and prepend messages to the list as necessary.

Think things like

  1. Find the previous message with $("li:nth-child(n)")
  2. Manipulate with $("ul").append
  3. Find the correct scroll position

Messy.
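
For flavor, that style of code looks roughly like this. It’s a reconstruction, not our actual code; renderMessage and the selectors are made up.

    // Rough sketch of the manual DOM manipulation style, not the real thing.
    // `renderMessage` and the selectors are hypothetical.
    function prependOlderMessage(message) {
      var $list = $("ul.messages");
      var $first = $list.find("li:nth-child(1)");       // 1. find the current first message
      var oldHeight = $list[0].scrollHeight;

      $(renderMessage(message)).insertBefore($first);   // 2. manipulate the DOM directly

      // 3. keep the viewport anchored on whatever the user was reading
      var newHeight = $list[0].scrollHeight;
      $list.scrollTop($list.scrollTop() + (newHeight - oldHeight));
    }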

After a few hours of that approach, I gave up.

Screw DOM manipulation, it’s 2018 and the DOM is fast. Re-render it.

To my surprise, throwing away the list and re-rendering for every message insertion or scroll event works friggin’ great. Like really seriously great. Even in Backbone where you get none of React’s diffing magic.

I didn’t run proper benchmarks, but on my 2017 MBP at home, re-rendering a 30-element list happens in 10 to 15 milliseconds.

  1. $("ul").html("")
  2. elements.slice(windowIndex, windowIndex + windowSize)
  3. loop and append
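
Spelled out as a hedged sketch (renderMessage and the messages array are placeholders, not our production code), those three steps look like this:

    // Hedged sketch of the throw-it-away-and-re-render approach.
    // `messages` and `renderMessage` are placeholders.
    function renderWindow(messages, windowIndex, windowSize) {
      var $list = $("ul.messages");

      $list.html("");                                    // 1. empty the list

      messages
        .slice(windowIndex, windowIndex + windowSize)    // 2. take the visible window
        .forEach(function (message) {
          $list.append(renderMessage(message));          // 3. loop and append
        });
    }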

That re-renders in just 10 milliseconds on my 15″ MBP, 30 milliseconds on my 13″ MBP. Both running RoR and sidekiq and webpack and the rest in the background.

The 13″ is so much slower because it’s only dual core. I’ll run a proper benchmark soon because I’m curious.

My point is that re-rendering is fast and easy. You can totally get away with re-rendering on every mousewheel event if your windowSize is small enough.

I now had a virtualized list. Worked great.

Except the UX is confusing. You get to the edge of the scrollbar and it just keeps going and going. Wat?

Where things get tricky

This strange scrolling UX is where things go belly up and life gets hard. Users don’t expect to hit the edge of the scrollbar before the edge of the content, you see.

So I tried a couple things and haven’t really figured this out. I think I’m on the right path, but I’m also starting to lose my mind.

Watch this gif. Scrolling up works great, but then you start scrolling down and everything goes topsy-turvy.

The version in this gif tries to compensate for a few things.

  1. Uses an offset from the edge to start adding messages before you hit the edge
  2. Adjusts this offset to your scrolling speed
  3. Adjusts windowIndex delta to your scrolling speed

Adding messages is basically just moving the windowIndex. That part is easy.

But the mousewheel is a tricky beast. On my mouse, the smallest deltaY is 4 pixels. That’s okay.

The biggest deltaY I’ve seen is around 3000 pixels. In a single event call.
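
My current wheel handler is roughly in this shape. It reuses the names from the sketches above, and the constants are made-up tuning values, not the final numbers.

    // Rough sketch of the wheel handling. EDGE_OFFSET and MAX_STEP are
    // made-up tuning values, not what ended up in production.
    // `messages`, `windowIndex`, `windowSize`, and `renderWindow` are
    // assumed to be in scope from the sketches above.
    var EDGE_OFFSET = 100;   // px from the scrollbar edge where the window starts moving
    var MAX_STEP = 10;       // cap on how many messages one wheel event can move

    $("ul.messages").on("wheel", function (event) {
      var deltaY = event.originalEvent.deltaY;   // anywhere from ~4px to ~3000px per event

      // turn the pixel delta into "how many messages to shift the window by",
      // clamped so one aggressive flick can't skip half the chat
      var step = Math.max(1, Math.min(MAX_STEP, Math.round(Math.abs(deltaY) / 100)));

      var nearTop = this.scrollTop < EDGE_OFFSET;
      var nearBottom =
        this.scrollTop + this.clientHeight > this.scrollHeight - EDGE_OFFSET;

      if (deltaY < 0 && nearTop) {
        // scrolling up near the edge: slide the window towards older messages
        windowIndex = Math.max(0, windowIndex - step);
        renderWindow(messages, windowIndex, windowSize);
      } else if (deltaY > 0 && nearBottom) {
        // scrolling down near the edge: slide the window towards newer messages
        windowIndex = Math.min(messages.length - windowSize, windowIndex + step);
        renderWindow(messages, windowIndex, windowSize);
      }
    });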

So that’s what I’ve been dealing with. I wish I could just use react-virtualized, but I can’t because “we’ll have time to rewrite later”. That rewrite can’t come soon enough 😅

Update: OMG I DID IT

It’s not as smooth as Slack, but it’s good enough for now.

Wasn’t that complicated after all. Just a bunch of maths with magic numbers.
