Color conversion with lv_canvas_rotate

Description

I noticed that the resulting colors are different when I draw a bitmap with lv_img_create versus lv_canvas_rotate on a device with 8-bit color depth.

What MCU/Processor/Board and compiler are you using?

PC Simulator

What do you want to achieve?

Create an animation on a transparent layer. The layer is defined as transparent and placed on layer_top so I can superimpose objects on all screens. The problem is that the resulting color is different when the bitmap is rotated.

What have you tried so far?

I tried source bitmaps with different colors. The results when using the rotate function are:

  1. solid green source (RGB 0,255,0): resulting color on screen is RGB 0,180,0
  2. solid white source (RGB 255,255,255): resulting color on screen is RGB 180,180,85

(BTW, these numbers seem to be the ones used in the lv_color_to32 function.)

Am I doing something wrong?
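
To double-check where these values come from, here is a small standalone sketch. It is not LVGL code: the RGB332 packing and the 36/36/85 expansion factors are my assumption of what the 8-bit build does, based on the numbers above. It shows that 180,180,85 corresponds to raw channel values 5,5,1, so the channels apparently lose levels somewhere before the expansion:

#include <stdint.h>
#include <stdio.h>

/* Hypothetical helpers mimicking an RGB332 round trip:
   pack to 3-3-2 bits, expand back with the 36/36/85 factors. */
static uint8_t pack_rgb332(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint8_t)(((r >> 5) << 5) | ((g >> 5) << 2) | (b >> 6));
}

static void expand_rgb332(uint8_t c, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (uint8_t)(((c >> 5) & 0x7) * 36);  /* 255 / 7 = ~36 */
    *g = (uint8_t)(((c >> 2) & 0x7) * 36);
    *b = (uint8_t)((c & 0x3) * 85);         /* 255 / 3 = 85 */
}

int main(void)
{
    uint8_t r, g, b;

    /* Plain quantization of white survives almost intact... */
    expand_rgb332(pack_rgb332(255, 255, 255), &r, &g, &b);
    printf("white round trip: %u,%u,%u\n", r, g, b);  /* 252,252,255 */

    /* ...while the color actually seen on screen matches raw channels 5,5,1. */
    expand_rgb332((uint8_t)((5 << 5) | (5 << 2) | 1), &r, &g, &b);
    printf("observed result:  %u,%u,%u\n", r, g, b);  /* 180,180,85 */
    return 0;
}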

Code to reproduce

// Create clock screen
scrClock = lv_obj_create(NULL, NULL);

// Load clock background
dial = lv_img_create(scrClock, NULL);
lv_img_set_src(dial, &clock);
lv_obj_set_pos(dial, 0, 0);

layerOSD = lv_canvas_create(lv_layer_top(), NULL);
static lv_color_t cbuf[LV_CANVAS_BUF_SIZE_TRUE_COLOR_CHROMA_KEYED(320, 300)];
lv_canvas_set_buffer(layerOSD, cbuf, 320, 300, LV_IMG_CF_TRUE_COLOR_CHROMA_KEYED);

/*Transparent background*/
lv_canvas_fill_bg(layerOSD, LV_COLOR_TRANSP);
    
lv_obj_set_pos(layerOSD, 0, 0);
dial = lv_img_create(layerOSD, NULL);
// altHand is an image of a white hand with LV_COLOR_TRANSP background
lv_img_set_src(dial, &altHand);
lv_obj_set_pos(dial, 160, 150);  // This shows the hand in white
lv_canvas_rotate(layerOSD, &altHand, 0, 147, 22, 13, 132);  // This shows the image as yellowish (RGB = 180,180,85)
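
As a quick check (just a sketch: the coordinates 200,80 are arbitrary and only assumed to land on the rotated hand, and the shifts assume the usual 0xAARRGGBB layout of the value returned by lv_color_to32), the canvas pixel can be read back right after the rotation and dumped:

lv_color_t px = lv_canvas_get_px(layerOSD, 200, 80);
uint32_t px32 = lv_color_to32(px);
printf("canvas pixel after rotate: R=%u G=%u B=%u\n",
       (unsigned)((px32 >> 16) & 0xFF),
       (unsigned)((px32 >> 8) & 0xFF),
       (unsigned)(px32 & 0xFF));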

Screenshot and/or video


OK, I’ve been browsing the code a bit and found a modification that makes my white color bright again…

In lv_color.h I changed the lv_color_mix function to avoid mixing colors when the color depth is 8 bits, as follows:

#if LV_COLOR_DEPTH == 1
ret.full = mix > LV_OPA_50 ? c1.full : c2.full;
#elif LV_COLOR_DEPTH == 8
/* 8-bit: copy c1 unchanged instead of mixing, to avoid quantization artifacts */
LV_COLOR_SET_R(ret, LV_COLOR_GET_R(c1));
LV_COLOR_SET_G(ret, LV_COLOR_GET_G(c1));
LV_COLOR_SET_B(ret, LV_COLOR_GET_B(c1));
LV_COLOR_SET_A(ret, 0xFF);
#else
/*LV_COLOR_DEPTH == 16 or 32*/
LV_COLOR_SET_R(ret, (uint16_t)((uint16_t) LV_COLOR_GET_R(c1) * mix + LV_COLOR_GET_R(c2) * (255 - mix)) >> 8);
LV_COLOR_SET_G(ret, (uint16_t)((uint16_t) LV_COLOR_GET_G(c1) * mix + LV_COLOR_GET_G(c2) * (255 - mix)) >> 8);
LV_COLOR_SET_B(ret, (uint16_t)((uint16_t) LV_COLOR_GET_B(c1) * mix + LV_COLOR_GET_B(c2) * (255 - mix)) >> 8);
LV_COLOR_SET_A(ret, 0xFF);
#endif

This solves my problem, but I’m not sure whether it makes sense for general use…

Hi,

From your solution it seems this happens simply because there are not enough colors at 8-bit color depth to mix two colors accurately, so artifacts can appear.

The other problem might be that lv_color_mix is inaccurate: the mix parameter is in the 0…255 range, but >> 8 divides by 256. This can make a noticeable difference at lower color depths, especially when you mix two identical colors or when mix is 0 or 255.
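
To illustrate: with c1 = c2 = 255 and mix = 255, (255 * 255 + 255 * 0) >> 8 gives 254 instead of 255, so even mixing a color with itself drifts. One way to avoid that (a sketch of the kind of correction meant here, not necessarily the exact dev-7.0 code) is to divide by 255 with rounding:

/* Per-channel mix that divides by 255 with rounding instead of >> 8 (a sketch). */
static inline uint8_t mix_channel(uint8_t ch1, uint8_t ch2, uint8_t mix)
{
    return (uint8_t)(((uint16_t)ch1 * mix + (uint16_t)ch2 * (255 - mix) + 127) / 255);
}

/* mix_channel(255, 255, 255) == 255, whereas the >> 8 version yields 254. */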

I’ve fixed the inaccurate calculation in the dev-7.0 branch.

Thank you! :+1: