Texture compression

As of the October 2022 release, Lightning supports texture compression. Compression is a very common technique in computer science, used to save bandwidth, memory and storage. It is used all over the place; in fact, between me typing this article and you reading it, a slew of compression algorithms have been at work, from data transmission to the way the data is stored on the server. Chances are there are bits compressed here and there for optimization.

So what does that mean for Lightning? One of the key constraints in Lightning is how many textures, images and other graphical elements can be stored in the graphics processing unit's (GPU) memory. Everything you see on the screen needs to be created on the CPU, uploaded to the GPU and processed to be drawn on screen. Every individual item consists of instructions on how to draw it and the actual content (if any) in buffers.

In Lightning's case images, such as poster art or banners, typically consume most of that memory. But aren't images already compressed using JPEG and PNG compression techniques? Yeah! They are! However, regardless of the image format you use, every image gets uploaded to the GPU as uncompressed RGB(A) data. That's right: no matter what compression or image extension the source uses, the browser will decode and decompress it, and by the time it reaches the OpenGL texture upload APIs we're talking raw RGB(A) buffers. That means that storing images on the GPU is basically one large bucket of colored RGB(A) pixels per image. In turn, the rendering pipeline understands those buffers and lets you create a scene where your picture is rendered on screen.

As you can imagine, that takes up a substantial amount of memory: effectively you are storing a full RGB(A) "pixel bucket" for each picture you want to render on screen. This is where texture compression comes in... imagine storing the same buffers, but instead of raw pixels we store a compressed, much smaller dataset in memory. The GPU supports hardware-accelerated decompression of that data, so the performance impact is very limited while you can store far more images (up to 6 times as many!) in the same amount of available graphics memory.
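To put some numbers on that, here is a minimal sketch (plain JavaScript, figures are illustrative) comparing raw RGBA storage with a 4-bits-per-pixel compressed format, the rate used by ETC1, for a 1080p poster image. The theoretical ratio is 8x; real-world gains such as the "up to 6 times" above depend on format choice and overhead:

```javascript
// Size of an uncompressed RGBA texture: width * height * 4 bytes
// (one byte per channel: red, green, blue, alpha).
function rawRgbaBytes(width, height) {
  return width * height * 4;
}

// Size at a compressed rate of 4 bits per pixel (e.g. ETC1).
function compressedBytes(width, height, bitsPerPixel = 4) {
  return (width * height * bitsPerPixel) / 8;
}

const raw = rawRgbaBytes(1920, 1080);    // 8294400 bytes ≈ 7.9 MiB
const etc = compressedBytes(1920, 1080); // 1036800 bytes ≈ 1.0 MiB
console.log(raw / etc);                  // prints 8
```

Note that ETC1 has no alpha channel, so part of the 8x ratio comes from dropping alpha entirely.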

So how does this work? It starts with two things:

The texture container is used for transport: it carries the correct headers, endianness and so on, so the browser can understand what is inside. The container needs to be parsed, and a device typically supports one or more containers for compressed textures.

The more important part is the actual compression scheme. This is a very bespoke feature of a GPU, and different devices will provide different levels of support. The level of support is tied to the OpenGL version the device supports:

ETC stands for Ericsson Texture Compression and it's a lossy compression algorithm developed by Ericsson Research (code-named iPACKMAN). It takes 4x4 blocks of pixel data and compresses each block into a single 64-bit word.

(Image: texture compression)

ETC is part of the standard OpenGL specification, and the GPU provides hardware-accelerated decompression of the 64-bit words stored in memory. ETC2 is backwards compatible with ETC1 but adds support for an alpha channel (RGBA versus just RGB).
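The block structure makes the storage cost easy to reason about: one 64-bit word per 4x4 block works out to 64 / 16 = 4 bits per pixel, versus 32 bits per pixel for raw RGBA. A small sketch of that arithmetic (the rounding to multiples of 4 is how block formats handle dimensions that aren't block-aligned):

```javascript
// ETC stores each 4x4 pixel block as one 64-bit word (8 bytes).
// Texture dimensions are rounded up to a multiple of 4 before blocking.
function etc1Bytes(width, height) {
  const blocksX = Math.ceil(width / 4);
  const blocksY = Math.ceil(height / 4);
  return blocksX * blocksY * 8;
}

console.log(etc1Bytes(1920, 1080)); // 480 * 270 blocks * 8 bytes = 1036800
```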

As of the October 2022 release, Lightning supports ETC1 in a PVR or KTX texture container. This will be extended in the future with more containers and more compression schemes. Support for the various containers and compression schemes may vary depending on the OpenGL driver implemented on the device you're using. In our experience the .pvr container is best supported on the devices we commonly target.

Using compressed textures

Okay, so we've got support in Lightning. Now how do we use it? Ideally you'd want most, if not all, textures compressed. Currently the Lightning project does not provide any CLI or packaging functionality to transcode source images into compressed textures, so for now this is something you will have to set up yourself.

We've experimented with a tool called PVRTexTool, which is available for all major operating systems. The tool allows you to easily create a compressed texture in various formats from a source image:


PVRTexTool supports various OpenGL ES 1.0, 2.0 and 3.0 presets with different compression schemes. Once compressed, it allows you to save the image in various containers such as .pvr, .ktx and .dds.

But compressing all images by hand is tedious? Yeah, we agree. For now the best we can offer is the PVRTexTool CLI, which lets a developer automatically generate compressed textures based on the presets that work for them. In the future it would be nice to add such an option to the Lightning Command Line tool (and of course it is open source, so if you can't wait, don't hold back).

How about my device?

Yeah, texture compression stands or falls with your device supporting it. We gotchu! There are two test examples developed as part of this release. One is inside the Lightning source: https://github.com/rdkcentral/Lightning/tree/master/examples/texture-compression

This is a simple example page with .ktx and .pvr containers.
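If you'd rather check support programmatically, the browser exposes compressed-texture support through standard WebGL extensions. This snippet is not part of Lightning itself, just a generic sketch; in a browser you would call it with a real context, e.g. `detectCompressedTextureSupport(document.createElement('canvas').getContext('webgl'))`:

```javascript
// Returns the compressed-texture WebGL extensions the given context supports.
// These are the standard WebGL extension names for the common schemes.
function detectCompressedTextureSupport(gl) {
  const extensions = [
    'WEBGL_compressed_texture_etc1', // ETC1
    'WEBGL_compressed_texture_etc',  // ETC2 / EAC
    'WEBGL_compressed_texture_s3tc', // S3TC / DXT (mostly desktop GPUs)
    'WEBGL_compressed_texture_astc', // ASTC
  ];
  // getExtension() returns null when the extension is unavailable.
  return extensions.filter((name) => gl.getExtension(name) !== null);
}
```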

If you'd rather have something straightforward without dealing with source code, we've also added a test case to our Strike benchmarking tool. You can find the tool here: https://strike.lightningjs.io

In the bottom right corner there are 3 compatibility tests you can run; click this icon for the compression compatibility test:

(Image: Strike icon)

This should bring you to a screen that looks something like this:

(Image: Strike test)

Happy testing!

-The Lightning team.