Setting 16-bit color value in set_px_cb?

Description

I would like to set the pixel passed into set_px_cb to a 16-bit color value. It seems that “buf” passed into the callback is typed as (uint8_t *)… Why isn’t it (lv_color_t *)?

I recast it to (lv_color_t *) to set the color, but I’m not sure why it was uint8_t to begin with.

What MCU/Processor/Board and compiler are you using?

IMXRT1052

What do you want to achieve?

Set pixel to incoming color unchanged in set_px_cb

What have you tried so far?

Recast buf to lv_color_t *.

Code to reproduce

lv_color_t *vdb_px_p = (lv_color_t*)buf;

Screenshot and/or video

N/A

I think it’s because the only case where you would want to use set_px_cb is for tiny systems where you want to compress as many bits into the VDB as possible.

You shouldn’t need to do this. Why can’t you set LV_COLOR_DEPTH to 16?

Because I am using this callback for my halftone fill, so every other pixel is just kept as-is.

The buf parameter in set_px_cb is a raw byte array.
In some cases you may want to use a different color format
than the LV_COLOR_DEPTH set in lv_conf.h.

For example, you could use a 16-bit color plus an 8-bit alpha (3 bytes per pixel).
In that case you have to address buf in 3-byte steps yourself; casting it to lv_color_t * would not give the correct color-with-alpha layout.


Thanks for the explanation, I just wanted to make sure I wasn’t missing something!