Weird background on GIFs


I am getting a weird background on my GIFs as I cycle through them. It is worth noting that I am currently converting my GIFs to C arrays with a CF_RAW color format using LVGL’s online image converter tool. Sometimes, the background flashes white before turning transparent again. Other times, it flashes a different color. I have also observed cases where a white background persists, but the bounding box around the image is transparent.

What MCU/Processor/Board and compiler are you using?

ESP32 Lyra-T/gcc

What LVGL version are you using?


What do you want to achieve?

I would like to be able to play a GIF all the way through and have it retain its transparent background.

What have you tried so far?

  • Experimenting with different GIFs
  • Trying different variations of the CF_RAW color format when using the online converter.
  • Using the LVGL simulator on Eclipse (same issue)
  • Loading a GIF from file (same issue)

Code to reproduce

    SemaphoreHandle_t xGuiSemaphore = get_semaphore();

    // Init the gif object one time
    if(!gif) {
        gif = lv_gif_create(lv_scr_act());
    }

    // Take the GUI mutex before touching LVGL objects
    while (xSemaphoreTake(xGuiSemaphore, portMAX_DELAY) != pdTRUE) {}

    lv_gif_set_src(gif, &mario_example);
    lv_obj_align(gif, LV_ALIGN_CENTER, 0, 0);

    xSemaphoreGive(xGuiSemaphore);

Screenshot and/or video

In these two examples, a colored background flashes before the GIF reverts to its original transparent state.


It’s harder to tell with the ghost, but you can see the outline of the bounding box near the bottom of the screen.


Is this issue solved? I’m having the same issue with my GIF loading — it flashes a green background at the very beginning.

I’ve also seen this problem. It looks to be related to how transparent backgrounds are handled: when the gif decoder is initialized, the canvas is filled with the background color at full opacity. It looks like consider LV_COOR_DEPTH to lower memory usage · lvgl/lv_lib_gif@596cb32 · GitHub was the commit that introduced this change.

Note that the “return to background color” disposal method seems to switch on whether the last graphical control extension had transparency set to 1. lv_lib_gif/gifdec.c at master · lvgl/lv_lib_gif · GitHub

Interestingly, the 89a gif specification is rather sparse in describing how the global Background Color Index and local Transparency Color Indexes should interact. Background Color is described as:

The Background Color is the color used for those pixels on the screen that are not covered by an image.

While the Transparency Index is described as:

The Transparency Index is such that when encountered, the corresponding pixel of the display device is not modified and processing goes on to the next pixel.

It seems somewhat open to interpretation how transparent backgrounds should be handled. The de-facto standard with modern gif decoders (e.g. in web browsers) seems to be to treat the background as transparent by default.

@kisvegabor do you have thoughts on this one? It seems like at one point in time, gifdec did handle background as transparent by default.

To save memory and for the sake of simplicity, I’ve started thinking about supporting only the global color table. That way the decoding would be much simpler and we could use a single INDEXED_8BIT color format (meaning width × height bytes of RAM usage).

What do you think about it?