
Debunking the Myth: Hardware vs Software Latency

February 25th, 2026

A common misconception is that hardware encoders and decoders inherently provide lower latency than their software counterparts. But recent tests we performed challenge this notion: our C-100 software decoder outperformed two popular hardware decoders by nearly 50 milliseconds.

In this blog post, we’ll explore the details of our latency test and debunk the myth about hardware superiority in low-latency encoding and decoding.

What is Latency and Why is it Important?

(Skip this section if you already understand latency.)

Latency refers to the delay between the moment a video source is captured and the moment it is displayed to the viewer. Many use cases in professional video transport demand the lowest latency possible, whether for production reasons (such as remote production) or to keep the delay short for viewers of sports and news events.

At 60 frames per second (fps), each frame lasts roughly 16.7ms, so a reduction of 50ms translates to three frames of video. A delay of that size can significantly affect timing, making it critical for broadcasters to minimise latency wherever possible.
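The frame arithmetic above can be sketched in a few lines. This is an illustrative helper, not part of any product code; the function name is ours:

```python
def latency_in_frames(latency_ms: float, fps: float) -> float:
    """Express a latency figure as a number of video frames at a given frame rate."""
    frame_duration_ms = 1000.0 / fps  # one frame lasts 1000/fps milliseconds
    return latency_ms / frame_duration_ms

# At 60 fps, a 50 ms saving is exactly three full frames.
print(latency_in_frames(50, 60))  # → 3.0
```

The same helper shows why the stakes rise with frame rate: at 120 fps the same 50ms saving is six frames.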

The Myth of Software Having High Latency

Unfortunately, because the streaming industry operates with inherently high latencies, there is a widespread belief that all software encoding and decoding is high latency. But just because the streaming and webcasting industry lacks an understanding of how to build low-latency pipelines doesn't mean all software encoding and decoding is high latency.

So we decided to demonstrate this.

The Side-by-Side Latency Test

To demonstrate hardware vs software latencies, we conducted a side-by-side latency test comparing our C-100 software-based decoders with two well-known hardware decoders. The video below shows the results.

Here’s how to interpret the video:

Top left quadrant: Displays the time of the source content, i.e. the timestamp attached to the content being viewed.

Top right quadrant: Shows the result using our C-100 software-based decoder, indicating the source time of the content currently on screen. This value will necessarily be lower than the source time on the left, since content cannot be displayed before it is sourced. A value closer to the source time indicates lower latency.

Bottom row: Displays the source time for the content decoded by the hardware-based decoders from two prominent vendors. As with the top right quadrant, these values reflect how long ago the displayed content was sourced. A lower value means the content took longer to arrive, implying higher latency.
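The way such a test is read can be sketched as a subtraction: each screen shows the source timestamp of the frame currently displayed, and subtracting that from the live source clock gives that path's latency. The sketch below uses invented readings purely for illustration; the function name and values are ours, not measurements from the video:

```python
def path_latency_ms(source_clock_ms: int, displayed_timestamp_ms: int) -> int:
    """Latency of one decode path: how long ago the frame now on screen was sourced."""
    return source_clock_ms - displayed_timestamp_ms

# Hypothetical readings from a side-by-side test (illustrative only):
source_clock = 10_000    # current time shown on the source screen
software_shown = 9_900   # software path displaying a frame sourced 100 ms ago
hardware_shown = 9_850   # hardware path displaying an older frame

print(path_latency_ms(source_clock, software_shown))  # → 100
print(path_latency_ms(source_clock, hardware_shown))  # → 150
```

Under this reading, a lower on-screen timestamp means a larger subtraction result, which is exactly why the bottom-row screens with lower values indicate higher latency.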

The results were telling. As shown in the video, our C-100 software-based decoder achieved a latency nearly 50ms lower than that of the two hardware decoders.

How We Achieve This

A key factor in our performance advantage is our in-house SDI card, which gives us full control over the entire stack, from SDI data in to SDI data out. This level of control lets us build a highly optimised pipeline for maximum performance and minimal latency.

We take great care in our design to ensure latency does not accumulate. We know low latency is critical for our customers, and we are here to deliver latencies as low as 100ms end-to-end.
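The point about accumulation can be made concrete with a latency budget: every stage of the chain contributes delay, and the end-to-end figure is their sum. The stages and numbers below are invented for illustration and do not describe any actual product pipeline:

```python
# Illustrative end-to-end latency budget (all values are made up for this sketch):
budget_ms = {
    "capture (1 frame @ 60 fps)": 16.7,
    "encode": 20.0,
    "network + jitter buffer": 30.0,
    "decode": 16.6,
    "display (1 frame @ 60 fps)": 16.7,
}

total = sum(budget_ms.values())
print(round(total, 1))  # → 100.0
```

A budget like this shows why careless design is costly: letting any single stage buffer even one extra frame adds a visible chunk to the total.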

Beyond Hardware vs. Software

When striving for low latency, the conversation should extend beyond a simplistic hardware versus software debate. The real differentiator is how highly optimised the processing pipeline is. Our test demonstrates that software-defined processing can not only match but surpass traditional hardware options.

Leveraging a robust, well-designed architecture can yield exceptional results, regardless of whether hardware or software is being used. Our test debunks the myth that hardware encoders and decoders are inherently superior in terms of latency.

Learn more about our (software-based!) low-latency encoders and decoders.