Set Resolution at Runtime

Is it possible to set the screen resolution at runtime?

The issue we have is that we build LittlevGL as a common library which our projects link against, dynamically creating the objects they need. However, we now need to target a different resolution, and from what I can tell this is only defined in lv_conf.h, which is part of the pre-built library. Can anyone suggest another way of doing this?

I believe in v6 it's not using lv_conf.h for the resolution.
For example, I use this:

    /*Set the resolution of the display*/
    disp_drv.hor_res = 160;
    disp_drv.ver_res = 128;

So I re-read the link above, and it specifically mentions that it uses lv_conf.h for the resolution by default. But if you don't use the defines, you can pass the resolution as an argument?


Thanks for the reply @mhel.

I am just trying this in the v6 Visual Studio simulator but am not sure where to put it. I assume it should go in the hal_init function where disp_drv is initialised. But if I put it just after lv_disp_buf_init() then it crashes. If I put it after lv_disp_drv_register() it does not crash, but the resolution does not change.

I was curious, so I tried it myself and it works.

I made a simple test like this.

int main(int argc, char *argv[]) {
	if(argc < 3) {
		printf("missing arguments\n");
		return 1;
	}
	int x = atoi(argv[1]);
	int y = atoi(argv[2]);
	printf("H:= %d, V:= %d\n", x, y);

	/*LittlevGL init*/


and here's the display init, taken from the example.

static void display_init(const int x, const int y) {

    /*Linux frame buffer device init*/

    /*Add a display to LittlevGL for the frame buffer driver*/
    static lv_disp_buf_t disp_buf;
    static lv_color_t buf_1[LV_HOR_RES_MAX * LV_VER_RES_MAX];            /*A screen sized buffer*/
    static lv_color_t buf_2[LV_HOR_RES_MAX * LV_VER_RES_MAX];            /*Another screen sized buffer*/
    lv_disp_buf_init(&disp_buf, buf_1, buf_2, LV_HOR_RES_MAX * LV_VER_RES_MAX);   /*Initialize the display buffer*/
    lv_disp_drv_t disp_drv;
    lv_disp_drv_init(&disp_drv);            /* Basic initialization */
    /*Set the resolution of the display*/
    disp_drv.hor_res = x;
    disp_drv.ver_res = y;

    /*Used to copy the buffer's content to the display*/
    disp_drv.flush_cb = fbdev_flush;
    /*Set a display buffer*/
    disp_drv.buffer = &disp_buf;

I did, however, get an error when declaring buf_1 and buf_2 like this:

 static lv_color_t buf_1[x*y]; 

because the array size is not a compile-time constant.

Yes, this is what I have, but mine is using the Windows Visual Studio version. It actually crashes in the Monitor.c file, so I think this is an issue specific to the Windows version.

I get the exception below within the flush function.

Exception thrown at 0x00007FFFFD44127E (vcruntime140d.dll) in visual_studio_2017_sdl_x64.exe: 0xC0000005: Access violation writing location 0x00007FF6811E7000.

This appears to be a conflict between the default resolution set in the lv_conf.h file and the values you assign to the disp_drv.hor_res and disp_drv.ver_res fields. It also appears that the Windows form that gets created uses the default values.

Therefore the current workaround is to set the default values to something bigger than you will be using (800x640, for example) and then dynamically set the resolution to whatever you want, as long as it is no greater than the defaults.
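In other words, the workaround amounts to a config fragment like this (the exact values are just examples, and the define names are the lv_conf.h maximums the simulator sizes its window from):

```c
/* lv_conf.h: make the compile-time maximums at least as large as any
 * resolution you will select at runtime */
#define LV_HOR_RES_MAX  800
#define LV_VER_RES_MAX  640
```

Then at runtime disp_drv.hor_res/disp_drv.ver_res can be set to anything up to those maximums without crashing the Windows simulator.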

It just means it's a bit nasty to look at, as you get black padding to the right and left.

I have tested mine on the actual device though. It's a nanopi neo2 with a 128x160 LCD attached via SPI.
I am not able to get the demos to run on this since I don't have a need for all the features they use.

@TheGrovesy While LittlevGL itself supports dynamically changing the resolution, I’m not sure that the SDL driver was updated to accommodate that yet. I think that is why you are having issues.