Texture Compression Mode ETC1S

@BunBunBun Would it be possible for us to get a project with just the atlases so that we can experiment a bit ourselves? It would really help us get to the bottom of this. You can share with bjorn@defold.se or mathias@defold.se


Nope, I tested in blank projects with atlases only, to save time.

Yes, this happens when I do “Project -> Bundle -> HTML5 application”,
also in Release mode.

12 atlases with different sizes (most of them 2048x2048)

Done! :slight_smile:


Thanks! :heart:


I found it’s possible to control quality for ETC1S compression. What settings are you using in this experimental build?

Because this example from @BunBunBun looks almost as good as the webp_lossy example. I’m just curious whether it’s possible to tweak it a bit to get a slightly better result.


We currently only use the “quality” setting for ETC1S:

Also note that there is the “basisu” command line tool here, which allows you to control these parameters.
Please try them locally to find a set of parameters that works as a replacement.


I think I found the reason for the bad quality of ETC1S. It’s because of premultiplied alpha!
Here is an example:


I tested these 3 options in Texture Packer when creating the atlas from PNG images:
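To illustrate why premultiplied alpha can hurt compression quality (my own sketch, not Defold’s actual pipeline): premultiplying scales each RGB channel by alpha/255 and rounds to an integer, so at low alpha many distinct input colors collapse into the same stored value before the encoder ever sees them.

```python
# Sketch: precision loss from premultiplied alpha at low alpha values.
# This is an illustration of the general effect, not Defold's code.

def premultiply(channel: int, alpha: int) -> int:
    """Scale an 8-bit color channel by alpha/255 and round, as premultiplication does."""
    return round(channel * alpha / 255)

alpha = 16  # a mostly-transparent pixel
distinct = {premultiply(r, alpha) for r in range(256)}
print(len(distinct))  # → 17: only 17 distinct values survive out of 256 inputs
```

At alpha 16, all 256 possible channel values are squeezed into just 17 stored values, which the encoder then has to work with; that information is gone before ETC1S compression even starts.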

All the tests:
ETC1S -comp_level 6.zip (3.4 MB)


I would like to play a bit more with that, but I’m not sure which parameters for the webp encoder correspond to WEBP_LOSSY_NORMAL, WEBP_LOSSY_FAST, WEBP_LOSSY_HIGH and WEBP_LOSSY_BEST


Thanks for the experiments @AGulev! Indeed, in your pictures it looks better without “premultiply alpha”,
but this is the result (built in Defold version sha1: aa82e11a5e6619fb8634d90df3d42e494678685b):


This is an example with ETC1S -comp_level 5 (the previous one was 6).
I simply use:

./basisu -comp_level 5 test-keep-transperent-pixels.png

ETC1S -comp_level 5.zip (1.0 MB)


@AGulev The previous WebP settings are here:


@Mathias_Westerdahl about the build time, @AGulev found an interesting thing: if you add the path to the atlases here, building is faster:

UASTC_HIGH: ~6m before, ~4m43s after

For ETC1S_HIGH, it’s 2:46 vs 1:32; I don’t know how that works :hushed:

Bob.jar builds “too many” files, e.g. separate .png files.
For example, if you specify that test.atlas uses A.png, bob will build both as textures, and if the texture profile matches, it compresses both.

I recommend specifying “*.atlas” in the pattern.
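For reference, a texture profile path entry restricted to atlases might look like this (a sketch; the path and profile name are hypothetical, adapt them to your own .texture_profiles file):

```
path_settings {
    path: "/assets/**/*.atlas"
    profile: "etc1s_high"
}
```

Matching on *.atlas means only the atlas textures are compressed, so the loose .png source files don’t get compressed a second time.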


Much better now!

It would be interesting to know whether LZ4 takes a lot of time and whether Defold applies it to ETC1S, because as I understand it, it’s useless for ETC1S.

LZ4 shouldn’t take much time, no.
It’s generally the texture compression algorithm that takes time.
You can compare with turning off the texture compression (or use the debug property “project.compress_archive = false”).

Sure, it’s unnecessary to compress the ETC1S data, since it’s already compressed, but we currently have no way of distinguishing between UASTC and ETC1S compressed textures.
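The point about double compression is easy to demonstrate with any general-purpose compressor. Here is a small sketch using zlib from the Python standard library as a stand-in for LZ4 (an assumption on my part: the principle, though not the exact ratios, carries over):

```python
# Sketch: re-compressing already-compressed (high-entropy) data gains almost nothing,
# while raw, redundant pixel data compresses very well.
# zlib is used here as a stand-in for LZ4; the effect is the same in kind.
import os
import zlib

incompressible = os.urandom(1 << 20)  # stands in for ETC1S-coded texel data
compressible = b"\x00" * (1 << 20)    # stands in for flat, uncompressed pixels

print(len(zlib.compress(incompressible)))  # close to 1 MB: essentially no gain
print(len(zlib.compress(compressible)))    # a few KB: huge gain
```

This is why archive-level LZ4 helps raw textures but is wasted effort on .basis files.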


Please note that this is a comparison of sizes and quality, nothing more. I don’t take all aspects into account.

BasisU and WebP are totally different; both have their pros and cons. I did these tests because some HTML5 games prioritize disk space (download size) over memory or decoding speed. That’s important for some games, but not all.

I use the newest basisu version at the moment (latest master, 427ebe62ebd6c7fed40994b1f963e66aa3993fce).

So, it seems like UASTC has too good quality for the goal we want to achieve, but ETC1S has too low quality for our goal =)

I tested everything with textures without premultiplied alpha (I’m not sure what causes the quality issue when we use it; that’s a separate thing for another investigation).

It seems like for the best quality of ETC1S we should use:

./basisu -comp_level 6 test.png -q 255 -linear -max_endpoints 16128 -max_selectors 16128

Also, I played a bit with RDO for UASTC, but even with extreme values the result was far from the goal.

So, I kept only the one with the smallest size + decent quality:

./basisu -uastc -uastc_rdo_m test.png -uastc_rdo_l 5 -uastc_rdo_d 8192 -uastc_level 0

I tested two WebP versions: the newest WebP (1.2) and the WebP we previously used in Defold (0.5).

cwebp -q 75 test.png -o test.webp


cwebp -q 90 test.png -o test.webp

I didn’t measure decompression and compression speed or other metrics, but I noticed that the WebP 1.2 decoder is about twice as big as the WebP 0.5 decoder, JFYI.


webp vs basis.zip (2.5 MB)


PNG files with backgrounds like this (just an example; the forum converts it to JPEG, please download the archive):

with backgrounds.zip (1.6 MB)



The compressed and uncompressed sizes are the same… see attached images.

ETC1S: doesn’t work for me at all, the build just stops (180 displayed a message, something like “encoding error”; 181 is silent).


This seems strange.

2048 x 2048 x 1 = 4 MB, which is the expected size of the .basis file it produces. However, it’s very unexpected that the LZ4 compression does not compress the .basis file.
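The size arithmetic above, spelled out (assuming roughly 1 byte per pixel for the coded data, which is an approximation, not an exact basisu figure):

```python
# Rough expected .basis size for a 2048x2048 texture at ~1 byte/pixel.
# "MB" here means MiB (2**20 bytes); the 1 byte/pixel figure is approximate.
width = height = 2048
bytes_per_pixel = 1
size_mb = width * height * bytes_per_pixel / 2**20
print(size_mb)  # → 4.0
```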

Could you share that image with me?


I’m wondering why Defold includes a JPEG/PNG decompressor but doesn’t have an option to compress RGBA atlases into PNG, and RGB atlases into PNG/JPEG?

JPEG can provide a good balance between quality and size for RGB images, and there are many compression libraries like Guetzli. PNG can be compressed very well, too.

It’s a good option if a developer doesn’t care about the in-memory size of textures.


I’m wondering why Defold includes a JPEG/PNG decompressor but doesn’t have an option to compress RGBA atlases into PNG, and RGB atlases into PNG/JPEG?

I think the primary intended use for the JPEG/PNG decompressors was to support downloading dynamic content on the fly on the web.

I can’t say for certain why we didn’t use .png as the format for lossless image compression. I think the issue only came up when we wanted images as small as possible. By then WebP existed and produced much smaller textures for lossless images. It also supports lossy compression. So, as a source package, I think it was a better deal.

I think both time (decompression) and quality were also considered when not choosing JPEG. (I don’t think I have those design documents anymore.)

In the near future, I believe we’ll have one core functionality (Basis Universal), and other compression strategies as pipeline extensions (WebP etc.).