Transparency in image files


I’m currently porting a project from LVGL v6.1.2 to v7.4.
In this project, we use png images with alpha layer.

I found that the alpha channel used for transparency is not handled correctly if LV_ANTIALIAS is set to 0:
In the function _lv_blend_map, a clipping is applied to the mask values, which contain the alpha channel values in the case of an image.
Maybe a specific value in the lv_draw_mask_res_t enum would help.


What MCU/Processor/Board and compiler are you using?

NXP LPC54628 / IAR for ARM V8.50.1.24811
Microsoft Visual Studio Community 2019 Version 16.6.5 for simulation

What do you experience?

If the flag LV_ANTIALIAS is set to 0, transparency in images is not handled correctly: transparency is forced to 0% or 100%, depending on whether the alpha value is greater than 128.

I’ve tested with lv_ex_img_1() and it works for me. Is there anything special in your configuration?

Hello Gabor,

Thank you for your interest in my problem.
I looked at lv_ex_img_1() and, yes, my configuration is different:

  • My image is stored in a file in PNG format and I use the lodepng library to decode it.
  • In my decoder_info callback, I set the color format field of the image header to LV_IMG_CF_RAW_ALPHA, since I’m using an image with an alpha channel.
  • My configuration uses LV_COLOR_DEPTH = 16.

My problem is reproducible both on my development board (LPCXpresso546xx) and with the Visual Studio simulator and is independent of using png_decoder or not.

I saw that, when the image is decoded, the alpha channel values are put into the mask_buf used by _lv_blend_map() (lv_draw_img.c:412 and lv_draw_img.c:432). In this function, LV_ANTIALIAS is checked to “round” (set to 0 or 0xFF) the values of the mask (lv_draw_blend.c:226-235), but rounding the alpha channel destroys transparency effects.
I checked the image img_cogwheel_argb.png used in your example: although it has an alpha channel, partial transparency is used on only a few pixels, so the issue is hard to detect there.
If you put a background color behind the image, you can see small differences between the images. The differences are more visible with a gradient color.
I’m sending you the PNG I used in my example and 2 screenshots of my VS simulation (LV_ANTIALIAS=0/1). Here is the code to generate this example:

lv_obj_t * img1 = lv_img_create(lv_scr_act(), NULL);
/* Opaque blue background behind the image makes the alpha differences visible */
lv_obj_set_style_local_bg_color(img1, LV_IMG_PART_MAIN, LV_STATE_DEFAULT, LV_COLOR_BLUE);
lv_obj_set_style_local_bg_opa(img1, LV_IMG_PART_MAIN, LV_STATE_DEFAULT, LV_OPA_COVER);
lv_img_set_src(img1, &test_alpha);
lv_obj_align(img1, NULL, LV_ALIGN_CENTER, 0, 0);

I generated test_alpha.c using the online image converter tool, setting the color format to true color with alpha.

Best regards

The PNG itself seems to have gotten lost. Could you try re-uploading it?

Here is the png source file


Originally the file is a gradient from blue to yellow. I added an alpha channel and used blue as the “transparent color”. For this reason I put a blue background behind my image.

Oh got it. And it’s really an issue.

The problem is that the alpha channel of images and the other masks are stored in a common buffer and, as you already found, disabling antialiasing simply rounds the values in this buffer to 0 or 255.

Unfortunately, I can’t give you a good workaround for this. It’d be possible to handle antialiasing at an earlier stage of drawing, but it’d require a huge update to the drawing functions.

I was thinking about optimizing the drawing functions but it’s just a plan now. Anyway, I’ll keep this in mind.

So, I’ve just come across this issue as well. I switched to LV_ANTIALIAS = 1 and that did solve the problem. Do you suppose I will see a performance impact from this switch with certain UI elements, or should it be imperceptible?

Unless things have changed, the difference is negligible with v7, as the rendering engine always uses antialiasing internally.