I am working with the LVGL MicroPython binding and I am a bit lost on how the color modes work.
I wrote a driver for the ST7796 interface, and it does work using RGB565, but the colors are off and I believe I know why. I want to get the driver working with RGB666. This should be possible because the interface expects 3 bytes of data per pixel (R, G and B) and the controller ignores the lower 2 bits of each byte. To get this working I compiled LVGL with the color depth set to 32, and this is where I get fuzzy on what needs to be done. I am using the flush method in the ili9XXX class as a kind of guide to see what is going on. I see it uses the size of lv_color_t to calculate the length of the buffer. Since I set the color depth to 32 bits, that means there is an alpha channel in the buffer, but this interface only accepts RGB and no alpha. Is there a way to get LVGL to populate the buffer with a 24-bit color depth?
I understand that I will more than likely have to adjust the flush method in the C code to handle the buffer size being a multiple of 3 instead of 4. That is not a big deal to do; I just need to know how to stop LVGL from putting the alpha channel into that buffer. I am surprised that LV_COLOR_DEPTH cannot be set to 24 or even 18 (RGB666).
Any help is greatly appreciated.