PNG decoding - Why are red and blue swapped?

Hi!
Looking at png_decoder.c, I can see how a PNG file is decoded by a call to lodepng_decode32.

What I don’t understand is why png_decoder then swaps red and blue (when using 32-bit color depth):

static void convert_color_depth(uint8_t * img, uint32_t px_cnt)
{
#if LV_COLOR_DEPTH == 32
    lv_color32_t * img_argb = (lv_color32_t*)img;
    lv_color_t c;
    lv_color_t * img_c = (lv_color_t *) img;
    uint32_t i;
    for(i = 0; i < px_cnt; i++) {
        c = LV_COLOR_MAKE(img_argb[i].red, img_argb[i].green, img_argb[i].blue);
        img_c[i].red = c.blue;
        img_c[i].blue = c.red;
    }
#endif
}

How is LittlevGL’s raw image format different from the “RGBA” format that lodepng generates? And why did you choose this different raw image format for LittlevGL?

Thanks!

Amir

Hi,

See this comment.

LittlevGL uses an ARGB format, which is the most common one I’ve seen so far. PNG stores RGBA in big-endian order; when that is interpreted as little-endian it becomes ABGR (compared to LittlevGL’s ARGB).


To make it clearer:


    lv_color_t c = LV_COLOR_MAKE(0x11,0x22,0x33);
    uint8_t * b = (uint8_t *)&c;
    printf("0x%02x, 0x%02x, 0x%02x, 0x%02x\n", b[0], b[1], b[2], b[3]);
    printf("0x%08x\n", c.full);
Output:

    0x33, 0x22, 0x11, 0xff
    0xff112233

So when it is read as a uint32_t it looks like ARGB, but because of the little-endian interpretation it is stored in memory as B, G, R, A.
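To show the same thing from the PNG side, here is a minimal standalone sketch (a placeholder, not the actual png_decoder code) that converts lodepng’s R, G, B, A byte order into the B, G, R, A order a 32-bit lv_color_t uses on a little-endian CPU; only red and blue have to move:

    #include <stdint.h>
    #include <stddef.h>

    /* Illustration only: lodepng emits each pixel as the bytes R, G, B, A,
     * while a 32-bit lv_color_t on a little-endian CPU is stored as B, G, R, A,
     * so an in-place conversion just swaps the R and B bytes. */
    static void rgba_to_bgra_inplace(uint8_t * buf, size_t px_cnt)
    {
        size_t i;
        for(i = 0; i < px_cnt; i++) {
            uint8_t * px = &buf[i * 4];
            uint8_t r = px[0];
            px[0] = px[2];    /* blue moves to byte 0 */
            px[2] = r;        /* red moves to byte 2 */
            /* green (px[1]) and alpha (px[3]) are already in place */
        }
    }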

The current color format is compatible with the Linux frame buffer (/dev/fb0), so I thought it should be a good choice. (It might be different on other machines, but there have been no complaints so far.)
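For example, with LV_COLOR_DEPTH 32 the rendered pixels can be copied straight into a memory-mapped 32 bpp /dev/fb0 with no per-pixel conversion. This is only a sketch against the v6.0 display driver API; fbp and fb_hor_res are placeholders assumed to be set up by the usual mmap-based framebuffer init:

    #include <stdint.h>
    #include <string.h>
    #include "lvgl/lvgl.h"

    /* Placeholders: fbp points at the mmap'ed /dev/fb0 memory and
     * fb_hor_res is the framebuffer's horizontal resolution in pixels. */
    static uint32_t * fbp;
    static uint32_t fb_hor_res;

    /* With LV_COLOR_DEPTH == 32 the lv_color_t byte layout (B, G, R, A on a
     * little-endian CPU) matches a typical 32 bpp framebuffer, so each row of
     * the rendered area can be memcpy'd as-is. */
    static void fb_flush_cb(lv_disp_drv_t * disp_drv, const lv_area_t * area, lv_color_t * color_p)
    {
        int32_t w = area->x2 - area->x1 + 1;
        int32_t y;
        for(y = area->y1; y <= area->y2; y++) {
            memcpy(&fbp[y * fb_hor_res + area->x1], color_p, w * sizeof(lv_color_t));
            color_p += w;
        }
        lv_disp_flush_ready(disp_drv);    /* tell LVGL the flushing is done */
    }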

I see, thanks for the explanation @kisvegabor!

I tried taking the data produced by lodepng_decode32 without any manipulation. As expected, blue and red are swapped. But what about the alpha?

See the image below - I would expect a transparent background instead of a black background.

[image: screenshot showing the PNG rendered on a black background instead of a transparent one]

Any idea?

I’ve tested it on dev-6.0 and it’s working for me. Here is the v6.0-compatible code:
png_decoder_test.c (125.1 KB)

I haven’t tried it on v5.3 for a while, but it should work the same way.

Oh - it was my mistake.
I used LV_IMG_CF_TRUE_COLOR instead of LV_IMG_CF_TRUE_COLOR_ALPHA.
Now it works just fine. Thanks!
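In case it helps someone else, this is roughly what the image descriptor looks like with the alpha format (just a sketch against the v6.0 API; the buffer, size, and names are placeholders):

    #include "lvgl/lvgl.h"

    /* Placeholder: decoded_buf is the (already converted) output of
     * lodepng_decode32; the 64x64 size is just an example. */
    extern const uint8_t decoded_buf[];

    static const lv_img_dsc_t png_img = {
        .header.always_zero = 0,
        .header.w = 64,
        .header.h = 64,
        .header.cf = LV_IMG_CF_TRUE_COLOR_ALPHA,   /* keep the alpha channel */
        .data_size = 64 * 64 * 4,                  /* 4 bytes/px at 32-bit color depth */
        .data = decoded_buf,
    };

    void show_png(void)
    {
        lv_obj_t * img = lv_img_create(lv_scr_act(), NULL);
        lv_img_set_src(img, &png_img);
    }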

Glad to hear that! 🙂

When I compiled this demo, I got an error that LV_IMG_FORMAT_TRUE_COLOR_ALPHA was undeclared. I can’t find this macro in lvgl.

It’s defined here: https://github.com/littlevgl/lvgl/blob/7c90b84560aa8cea64c1f8f08b4398d3fb29e7da/src/lv_draw/lv_img_decoder.h#L83

Wow, thanks for your reply.
But is LV_IMG_FORMAT_TRUE_COLOR_ALPHA equal to LV_IMG_CF_TRUE_COLOR_ALPHA? Anyway, I’ll give it a try.

I think CF stands for “Color Format”; all the members of that enum start with LV_IMG_CF_.