Introduction

NotchLC was created to be a GPU-powered codec that provides the quality needed for an intermediary codec as well as the performance required for a playback codec.

It brings the equivalent of 10-bit accuracy in a scrubbable codec (with 12 bits of luma data) that is extremely fast to encode and decode, with a compression ratio of 4:1 to 8:1.

Software support

NotchLC is currently supported in:

Quality Levels & Bit Rates

Rather than specifying target bitrates and ending up with undetermined quality outcomes, NotchLC takes the reverse approach. When encoding, you set a quality level, which is essentially an allowable 'error level' for the encode. The encoder then searches for the best-fit compression in each block that meets this quality threshold. Because the error is bounded by design, you can be confident in the quality of the resulting encoded video.
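NotchLC's actual encoder internals are not published, but the per-block search described above can be sketched in miniature. The `quantize` stand-in and the fixed ladder of candidate levels below are assumptions for illustration only; the real codec uses its own block compression scheme.

```python
import numpy as np

# Illustrative sketch only: NotchLC's real encoder is not public.
# This mimics the idea of searching, per block, for the most
# aggressive compression that still meets a fixed error threshold.

def quantize(block: np.ndarray, levels: int) -> np.ndarray:
    """Crudely quantize a block to a given number of levels
    (a stand-in for a real per-block compression scheme)."""
    lo, hi = block.min(), block.max()
    if hi == lo:
        return block.copy()
    step = (hi - lo) / (levels - 1)
    return np.round((block - lo) / step) * step + lo

def encode_block(block: np.ndarray, max_error: float) -> np.ndarray:
    """Try the cheapest (fewest-level) candidate first; accept the
    first one whose worst-case error is within the threshold."""
    for levels in (4, 8, 16, 32, 64, 128, 256):
        candidate = quantize(block, levels)
        if np.abs(candidate - block).max() <= max_error:
            return candidate
    return block.copy()  # fall back to storing the block losslessly

# A smooth 8x8 block is accepted at an aggressive setting; a noisy
# block forces the search toward more levels to stay under the bound.
rng = np.random.default_rng(0)
smooth = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
noisy = rng.random((8, 8))
for name, block in (("smooth", smooth), ("noisy", noisy)):
    out = encode_block(block, max_error=0.02)
    print(name, float(np.abs(out - block).max()))
```

Either way, the error of every block is guaranteed to sit below the chosen threshold, which is the property that makes the quality predictable.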
Typically, our users see compression ratios between 4:1 and 8:1 compared with raw video. As a general rule of thumb, NotchLC (at the Optimal quality level) tends to be around 30% larger than HAP Q, for a significant quality uplift.
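Combining these ratios with the 28 bits per pixel described in the bit-depth section below gives a rough data-rate estimate. This is back-of-envelope arithmetic, not a guarantee; real file sizes vary with content.

```python
# Rough data-rate estimate: 28 bits per pixel uncompressed (12-bit
# luma + two 8-bit chroma channels), divided by the compression ratio.

def notchlc_data_rate_mb_s(width: int, height: int, fps: float,
                           ratio: float) -> float:
    """Approximate data rate in MB/s for a given compression ratio."""
    raw_bytes_per_frame = width * height * 28 / 8
    return raw_bytes_per_frame * fps / ratio / 1e6

# 1080p60 at the two ends of the typical 4:1 to 8:1 range:
for ratio in (4.0, 8.0):
    rate = notchlc_data_rate_mb_s(1920, 1080, 60, ratio)
    print(f"{ratio:.0f}:1 -> {rate:.0f} MB/s")
```

For 1080p60 this works out to roughly 54 to 109 MB/s across the typical ratio range, which is why NotchLC content is usually played from fast local storage.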

NotchLC provides five quality levels:

  • Good
  • Very Good
  • Excellent
  • Optimal
  • Best

Optimal is your go-to quality level and fits most day-to-day purposes: it is the sweet spot of file size and quality. If you use Best, you may see significantly larger file sizes for only marginal gains in quality.

Luma, Chroma, Bit Depths & Compression

The LC in NotchLC stands for Luma & Chroma. NotchLC breaks colour data down into luma and chroma (YUV). 12 bits of depth are assigned to the luma data, as in many scenarios this is where bit depth is most perceivable. 8 bits are assigned to each of the U and V channels, for a total of 28 bits per pixel. Alpha data is stored at 8-bit depth.
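The document does not specify which RGB-to-YUV transform NotchLC uses, but the luma/chroma split it describes can be illustrated with the standard BT.709 matrix, shown here purely as an example of separating brightness (Y) from colour difference (U, V).

```python
# Example luma/chroma split. The BT.709 coefficients below are a
# standard transform, used here for illustration; NotchLC's exact
# internal transform is not specified in this document.

def rgb_to_yuv_bt709(r: float, g: float, b: float):
    """RGB in [0, 1] -> (Y, U, V); Y in [0, 1], U/V centred on 0."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b  # luma: weighted brightness
    u = (b - y) / 1.8556                      # chroma: blue difference
    v = (r - y) / 1.5748                      # chroma: red difference
    return y, u, v

# Pure white carries only luma: Y = 1, U = V = 0.
print(rgb_to_yuv_bt709(1.0, 1.0, 1.0))

# Per-pixel storage as described above: 12 (Y) + 8 (U) + 8 (V) bits.
print(12 + 8 + 8, "bits per pixel")
```

Because the eye is more sensitive to brightness detail than to colour detail, spending the extra bits on Y rather than on U and V is where the depth is most perceivable.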