DMA Usage Problem with Two Buffers


I am using the FRDM-K32L3A6 board. The library runs successfully with SPI polling. It also runs with DMA, but I currently have to busy-wait for the flush to finish.

What MCU/Processor/Board and compiler are you using?

FRDM-K32L3A6 Board, MCUXpresso

What LVGL version are you using?


What do you want to achieve?

Fully DMA-driven flushing, without busy-waiting in a while() loop.

What have you tried so far?

Code to reproduce

Add a code snippet which can run in the simulator. It should contain only the relevant code that compiles without errors when separated from your main code base.

With the following code, it works:

void my_flush_cb(lv_disp_drv_t * disp_drv, const lv_area_t * area, lv_color_t * color_p)
{
    /* Put all pixels to the screen at once.
     * Can be done by DMA as well. */
//    disp_p = disp_drv;

    uint16_t * color = (uint16_t *) color_p;

    lcdDrawMultiPixels(area->x1, area->y1, area->x2, area->y2, color);

    /* IMPORTANT!!!
     * Inform the graphics library that you are ready with the flushing */
    lv_disp_flush_ready(disp_drv);
}

However, if I move the flush-ready notification into the SPI callback, the screen is not filled correctly. I assume the buffers get mixed up:

void LPSPI_MasterUserCallback(LPSPI_Type *base, lpspi_master_edma_handle_t *handle, status_t status, void *userData)
{
    isTransferCompleted = true;
    lv_disp_flush_ready(&disp_drv);
}

Screenshot and/or video

If possible, add screenshots and/or videos about the current state.

Do you use screen-sized buffers or smaller ones?

I use two buffers, each 240 × 30 pixels at 2 bytes per pixel (240 in X, 30 in Y, 16-bit color).

I see, then it looks good. I don’t think the buffers are messed up. I almost always use this buffering mode and see no issue like this.

Can you send an image about a messed up screen?

This is when lv_disp_flush_ready(&disp_drv); is inside my_flush_cb():

And this one is when lv_disp_flush_ready(&disp_drv); is inside the SPI callback:

I see. The issue is probably with the buffer initialization. Please show how you initialized lv_disp_buf_t.

static lv_disp_drv_t disp_drv;
//lv_disp_drv_t * disp_p;

//buffer definitions
static lv_color_t buf1[240 * 30 * 2];
static lv_color_t buf2[240 * 30 * 2];

void main(void)
{
    static lv_disp_buf_t disp_buf;

    uint32_t size_in_px = 240 * 30;

    lv_disp_buf_init(&disp_buf, buf1, buf2, size_in_px);

//  lv_disp_drv_t disp_drv;                 /*A variable to hold the drivers. Can be local variable*/
    lv_disp_drv_init(&disp_drv);            /*Basic initialization*/
    disp_drv.buffer = &disp_buf;            /*Set an initialized buffer*/
    disp_drv.flush_cb = my_flush_cb;        /*Set a flush callback to draw to the display*/
    disp_drv.monitor_cb = display_monitor;

    lv_disp_t * disp;
    disp = lv_disp_drv_register(&disp_drv); /*Register the driver and save it*/
}

“Unfortunately”, all looks good. What if you use only 1 buffer?

It shouldn’t matter, but since buf1 and buf2 are declared with 240 * 30 * 2 elements, size_in_px = 240 * 30; should be size_in_px = 240 * 30 * 2; so that it matches the actual element count of the arrays.