Avraam Mavridis

Implementing backpressure for smoother user experience in low-end devices

If you are building applications that consume real-time data, you may have faced a situation where the component or service that consumes the data cannot keep up with the volume or speed at which it is produced: the producer module of the system emits data faster than the consumer module can process it.

The consumer tries to keep up by increasing the amount of system resources it uses (CPU, memory). That can be fine on high-end devices where resources are not limited, but on low-end devices it can lead to battery drain or a janky user experience.

PULL VS PUSH STRATEGY

If you have designed your system with a pull strategy, where the consumer asks the producer for data whenever it feels ready to process it (or at specified intervals), you can most of the time solve the problem by increasing the interval between two data pulls. Imagine that you have a web application that sends GET requests to a backend endpoint every 50ms and updates the UI with some fancy animations. There could be a situation where the process that updates the UI is hanging because it is slower than the process that requests and processes the data. In cases like that, we can increase the interval, e.g. to 200ms; the UI will be less "real-timish", but at least it will be smoother.

// Pull every 200ms instead of every 50ms: the UI updates less often,
// but the consumer is never asked to do more work than it can handle.
setInterval(function(){
  axios.get('some-data-endpoint')
       .then(function(response){
           updateUI(response.data)
        })
}, 200)

If your system is not, or cannot be, based on a pull strategy and needs to operate in a push-based fashion, where the producer pushes data to the consumer, you have to take a different path to solve the problem. Imagine a scenario where your web app uses WebSockets and the server pushes real-time events to the UI (e.g. financial transactions).

socket.on('message', updateUI);

In these situations, the way to solve the problem is usually by establishing a backpressure mechanism.

BACKPRESSURE

The Reactive Manifesto has a better definition of backpressure than anything I could probably write:

When one component is struggling to keep-up, the system as a whole needs to respond in a sensible way. It is unacceptable for the component under stress to fail catastrophically or to drop messages in an uncontrolled fashion. Since it can’t cope and it can’t fail it should communicate the fact that it is under stress to upstream components and so get them to reduce the load. This back-pressure is an important feedback mechanism that allows systems to gracefully respond to load rather than collapse under it. The back-pressure may cascade all the way up to the user, at which point responsiveness may degrade, but this mechanism will ensure that the system is resilient under load, and will provide information that may allow the system itself to apply other resources to help distribute the load.

There are two ways to achieve backpressure, and we have to choose based on the needs of our application: the loss-less strategy and the lossy strategy.

LOSS-LESS VS LOSSY

In the lossy strategy, we skip values until a certain amount of time has passed or until an event occurs (e.g. a mouse click). In this case we process only the most recent value(s) and accept that we may lose some values along the way. This is usually fine when the data is not critical.

Lossy Strategy: Values are discarded and never arrive at the Observer. Example: mouse positions sampled over a period of time; the app uses the latest position and ignores the previous ones.

Loss-less Strategy: Values are stacked and emitted in batches. Example: real-time data from a socket using a buffer operator; the app processes the data in batches.
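
To make the lossy case more concrete, here is a minimal sketch (using the modern pipeable-operator syntax rather than the older chaining style used in the rest of this post) that samples mouse positions instead of handling every single mousemove event:

import { fromEvent } from 'rxjs';
import { sampleTime } from 'rxjs/operators';

// mousemove can fire dozens of times per frame; sampleTime(100) keeps only
// the latest position seen in each 100ms window and discards the rest.
fromEvent(document, 'mousemove')
  .pipe(sampleTime(100))
  .subscribe(event => {
    console.log(event.clientX, event.clientY);
  });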

Example

To demonstrate how we can implement backpressure, I created a small example using RxJS and WebSockets. Our dummy app connects to a remote socket server that pushes cryptocurrency price data and updates the UI. First, let's create a stream:

function getStream(){
  // Connect to the socket server and subscribe to a couple of price feeds
  const socket = io.connect('streamer.cryptocompare.com');
  const subscription = ['ID-1', 'ID-2'];
  socket.emit('SubAdd', { subs: subscription });

  // Wrap the socket messages in an Observable so we can apply Rx operators
  return Rx.Observable.create(function(observer){
    socket.on('m', function(data){
      observer.next(data);
    })
  })
}

Then I created a simple React component that subscribes to the stream and updates the UI whenever a message arrives:

class App extends Component {
  state = {
    messages: []
  };

  componentDidMount() {
    const stream$ = getStream();
    stream$.subscribe(m => {
      this.setState({
        messages: this.state.messages.concat(m)
      })
    })
  }

  ...
  ...

  render() {
    return (
      <ul>
        {
          this.state.messages.map(msg => <li key={msg.id}>{msg.label}</li>)
        }
      </ul>
    );
  }
}

I ran the application and started measuring its performance. As you can see from the following gif, even on my high-end device the frame rate drops significantly when I try to scroll, and the UI experience is terrible:

USING BACKPRESSURE

There are various operators that can help us achieve backpressure:

  • sample()
  • throttleFirst()
  • buffer()
  • window()

Let’s see a few of them using marble diagrams.

SAMPLING

In sampling, we periodically glance at the sequence of emitted values and use the last value emitted during each period:

Sampling is a lossy backpressure strategy.
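
A small sketch of the idea, assuming the pipeable sampleTime operator from recent RxJS versions:

import { interval } from 'rxjs';
import { sampleTime } from 'rxjs/operators';

// The source emits 0, 1, 2, ... every 100ms. sampleTime(500) looks at the
// stream every 500ms and forwards only the latest value seen in that window,
// so roughly every fifth value survives and the rest are dropped.
interval(100)
  .pipe(sampleTime(500))
  .subscribe(v => console.log(v));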

THROTTLEFIRST

throttleFirst is similar to sampling, but instead of using the last emitted value, we use the first value emitted during the specified period:

throttleFirst is a lossy backpressure strategy.
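
throttleFirst is the RxJava name; in RxJS the closest equivalent (to the best of my knowledge) is throttleTime, which by default lets the first value through and then ignores the rest of the window:

import { fromEvent } from 'rxjs';
import { throttleTime } from 'rxjs/operators';

// The first click is handled immediately; any further clicks during the next
// 1000ms are dropped, then the window resets.
fromEvent(document, 'click')
  .pipe(throttleTime(1000))
  .subscribe(() => console.log('handled click'));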

BUFFER

With buffer we can collect the emitted items into batches, and the consumer can then decide whether to process one particular item from each batch or some combination of those items.

buffer is a loss-less backpressure strategy.
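
A sketch using the time-based variant, bufferTime:

import { interval } from 'rxjs';
import { bufferTime } from 'rxjs/operators';

// Nothing is dropped: everything emitted during each one-second window is
// collected into an array and handed to the consumer as a single batch.
interval(100)
  .pipe(bufferTime(1000))
  .subscribe(batch => console.log(batch)); // e.g. [0, 1, ..., 8], [9, ..., 18], ...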

WINDOW

With window we can specify how many items we want to collect before closing and emitting the batch.

window is a loss-less backpressure strategy.
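
A sketch using windowCount, which closes each window after a fixed number of items; each window is itself an Observable, so here I flatten it back into arrays just to log them:

import { interval } from 'rxjs';
import { windowCount, mergeMap, toArray } from 'rxjs/operators';

// Every 10 items the current window completes and a new one opens; toArray
// turns each completed window into a plain array for the consumer.
interval(100)
  .pipe(
    windowCount(10),
    mergeMap(win => win.pipe(toArray()))
  )
  .subscribe(batch => console.log(batch));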

Example applying backpressure

To apply backpressure in our example, the only thing we have to do is add sampling using the sample operator:

class App extends Component {
  state = {
    messages: []
  };

  componentDidMount() {
    const stream$ = getStream();
    // sample(500) lets at most one message through every 500ms, so the UI
    // re-renders at most twice per second (sampleTime(500) in newer RxJS)
    stream$.sample(500).subscribe(m => {
      this.setState({
        messages: this.state.messages.concat(m)
      })
    })
  }

  render() {
    return (
      <ul>
        {
          this.state.messages.map(msg => <li key={msg.id}>{msg.label}</li>)
        }
      </ul>
    );
  }
}

Summary

Backpressure is a useful technique for achieving smooth user experiences, even for users who do not have powerful devices. Unfortunately, most browsers do not expose the hardware characteristics of the user’s machine (probably for privacy/security reasons), so as developers we either have to do browser sniffing and guess the device’s capabilities, or try to find the sweet spot that offers an enjoyable experience for all of our users.
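
One rough heuristic (just a sketch; the thresholds are arbitrary assumptions) is to pick the sampling interval based on whatever the browser does expose, such as navigator.hardwareConcurrency:

// Devices that report fewer CPU cores get a longer sampling interval.
// navigator.hardwareConcurrency is widely supported; the thresholds below
// are guesses and should be tuned for your own app.
const cores = navigator.hardwareConcurrency || 2;
const SAMPLE_INTERVAL = cores >= 8 ? 250 : cores >= 4 ? 500 : 1000;

stream$.sample(SAMPLE_INTERVAL).subscribe(updateUI); // sampleTime(...) in newer RxJS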

Top comments (2)

Paul Frischknecht

I would not consider any of these strategies to be a form of backpressure. Backpressure negotiates a data bandwidth that the receiver is comfortable with. None of those strategies you presented will reduce the bandwidth at which the server sends events.

It could still overwhelm you, especially if you use buffering or window...

There needs to be some kind of communication back to the sender. One simple approach is to just use polling/pulling instead of having the server push events.

Of course, even there, either the server has to wait or potentially buffer "past" data for a potentially very long time (as for example Kafka does).

Cesar Martinez

Thank you for sharing, I found it clear and helpful. Would be nice to see examples of implementations of other strategies. I appreciate that starting point though.