lv_disp_is_true_double_buf() returns an incorrect value


<littlevGL 6.1>
In `bool lv_disp_is_true_double_buf(lv_disp_t * disp)`, `scr_size = disp->driver.hor_res * disp->driver.ver_res;`. Does this need to be multiplied by the bytes per pixel?

When the frame buffer is initialized, `driver.buffer->size` is width × height × bytes per pixel, so `lv_disp_is_true_double_buf()` always returns false even when double buffering is set.


```c
/**
 * Check the driver configuration if it's TRUE double buffered (both `buf1` and `buf2` are set and
 * `size` is screen sized)
 * @param disp pointer to a display to check
 * @return true: double buffered; false: not double buffered
 */
bool lv_disp_is_true_double_buf(lv_disp_t * disp)
{
    uint32_t scr_size = disp->driver.hor_res * disp->driver.ver_res;

    if(lv_disp_is_double_buf(disp) && disp->driver.buffer->size == scr_size) {
        return true;
    } else {
        return false;
    }
}
```

No, because we store buffer sizes in pixels, not bytes, as you can see from this sample of how to initialize a buffer and register it.

I could be wrong, but I don't think so. Have a look at `lv_disp_buf_init`; it stores the buffer's size in pixels.

This function is checking whether you are using true double buffering, meaning that both buffers’ sizes (in pixels) must be equivalent to the size (in pixels) of your display itself.

It might be worthwhile double-checking your code to make sure that is true. If it still isn’t working, please send a code sample of your buffer initialization/registration.

Got it, thank you.