Working with a non-embedded GPU is a little bit different.
With GLES (or another high-level GPU API) we can’t mix GPU and software rendering (at least not in a trivial way). Therefore all of LVGL’s draw functions would have to be reimplemented for the given GPU. It’s theoretically possible, but it needs a large amount of work.
I think the GPU drawers could be hidden behind the same API as the current LVGL draw functions (e.g. lv_draw_rect), so the LVGL draw part could be swapped out for something else.
However, it just came to my mind that on a more powerful system the drawn objects could be buffered. Right now LVGL redraws a button every time it’s scrolled, although the button itself didn’t change, only its position. If the drawn button were buffered as an image, and “rebuffered” only when its style or size changes, then during scrolling only its buffered image would need to be drawn.
There is no such concept now, but we are thinking about making objects “bufferable” to allow rotating, scaling, etc. The two ideas have similar roots.
I just remembered: there is the well-known Dear ImGui library, which has bindings for almost all targets. You can get inspiration and some ideas from it. From its README:
> Integrating Dear ImGui within your custom engine is a matter of 1) wiring mouse/keyboard/gamepad inputs 2) uploading one texture to your GPU/render engine 3) providing a render function that can bind textures and render textured triangles.