Question about the color formats

I am working with the LVGL MicroPython binding and I am a bit unclear on how the color modes work.

I wrote a driver for the st7796 interface and it does work using RGB565. The colors are off, and I believe I know why. I want to get the driver working with RGB666. That should be possible because the interface needs to receive 3 bytes of data (R, G and B) and the display ignores the lower 2 bits of each byte. To get this to work I compiled LVGL with the color depth set to 32, and this is where I get fuzzy on what needs to be done. I am using the flush method in the ili9XXX class as a kind of guide to see what is going on. I see it uses the size of lv_color_t to calculate the length of the buffer. Since I set the color depth to 32 bits, that means there is an alpha channel in the buffer. This interface only accepts RGB, no alpha. Is there a way to get LVGL to populate the buffer with a 24-bit color depth?

I understand that I will more than likely have to adjust the flush method in the C code to handle the buffer size being a multiple of 3 instead of 4. That is not such a big deal to do. I just need to know how to stop LVGL from putting the alpha channel into that buffer. I am surprised that LV_COLOR_DEPTH cannot be set to 24 or even 18 (RGB666).
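For what it's worth, here is a minimal sketch of what stripping the alpha byte in the flush path could look like. It assumes LVGL's little-endian lv_color32_t byte layout of B, G, R, A (check your build's configuration), and the function name is my own:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical helper: repack a 32-bit BGRA pixel buffer (LVGL's
 * lv_color32_t layout on little-endian builds) into a tightly packed
 * 3-byte-per-pixel R,G,B buffer, dropping the alpha byte.
 * Adjust the byte offsets if your color depth/swap settings differ. */
void argb8888_to_rgb888(const uint8_t *src, uint8_t *dst, size_t px_count)
{
    for (size_t i = 0; i < px_count; i++) {
        dst[3 * i + 0] = src[4 * i + 2]; /* R */
        dst[3 * i + 1] = src[4 * i + 1]; /* G */
        dst[3 * i + 2] = src[4 * i + 0]; /* B */
    }
}
```

The repacking is one extra pass over each flushed area, so it does cost time, but the inner loop is just byte copies and compiles down to very little.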

Any help is greatly appreciated.

Hi,

LVGL v8 (the current version) doesn’t support a 24-bit color format, so you need to convert either the 16-bit or 32-bit data to 24 (or 18) bits before sending it to the display.
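In case it helps, here is one sketch of such a conversion: expanding 16-bit RGB565 pixels into the 3-byte R,G,B stream an 18-bit panel expects (the panel ignores the low 2 bits of each byte). The function name and the trick of replicating the high bits into the low bits are my own choices, not an LVGL API:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical helper: expand 16-bit RGB565 pixels to 3 bytes per
 * pixel for a display running in 18-bit (RGB666) mode. Replicating
 * the top bits into the low bits spreads each channel over the full
 * 0..255 range instead of leaving the low bits zero. */
void rgb565_to_rgb666(const uint16_t *src, uint8_t *dst, size_t px_count)
{
    for (size_t i = 0; i < px_count; i++) {
        uint16_t px = src[i];
        uint8_t r5 = (px >> 11) & 0x1F;
        uint8_t g6 = (px >> 5) & 0x3F;
        uint8_t b5 = px & 0x1F;
        dst[3 * i + 0] = (uint8_t)((r5 << 3) | (r5 >> 2)); /* R */
        dst[3 * i + 1] = (uint8_t)((g6 << 2) | (g6 >> 4)); /* G */
        dst[3 * i + 2] = (uint8_t)((b5 << 3) | (b5 >> 2)); /* B */
    }
}
```

Note this doesn't add any real color information: the panel still only sees the 5/6/5 bits that came out of LVGL's render buffer.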

v9 will support it but it’s still heavily under development.

Yeah, I saw that in another post. I got the colors sorted out for RGB565, which is good. Having to strip out that alpha channel would slow things down a considerable amount. It doesn’t look horrible using RGB565, but I would imagine it would look a lot better using RGB666. In order to utilize RGB888, an 8- or 16-channel interface would be needed, and that just chews up a lot of GPIOs.