Standalone binary .mpy module for lvgl (so we can "just use it" in micropython)

It’s really easy to build C code into a .mpy file which MicroPython users can simply “import”. No need to rebuild the entire MicroPython kernel or anything! (Instructions, tested and working great, here: Native machine code in .mpy files — MicroPython latest documentation.)

It’s also easy for most reasonable programmers to build LCD screen output and touch input “low level” drivers using the same method.

I took a stab at trying to make your code build this way, without much luck… your codebase is massive.

It would be AWESOME if you released a single “lvgl.mpy” module for a few assorted MCUs (esp32 etc), along with a sample LCD and touch “xxx.mpy” display companion driver, which lvgl.mpy uses to access the hardware.

As it stands right now - there’s no easy-to-use display solution for any micropython MCU’s - making this separate and modular would totally fix that.

In LVGL v9 we will have drivers right in the LVGL repository. We could only get started with it later than I anticipated, however we are already working on the first display controller drivers.

They will have an interface like lv_ili9341_create(320, 240, send_command_cb, send_colors_cb);

So these drivers are platform independent; the callbacks will be the bridge between the display controllers and the MCU. If we enable all display controllers when we compile LVGL (we will have ~10 in the end) and provide some examples for ESP, STM, etc., I think it will be easy enough to get started with.

Something like:

# Init your SPI
my_spi_init(mosi_pin, clk_pin, cs_pin, clock_speed)

def send_command(disp, cmd, cmd_len, data, data_len):
    # send the command + data via SPI
    pass

def send_colors(disp, area, px_buf):
    # send the colors via SPI
    pass

disp = lv.ili9341(320, 240, send_command, send_colors)


disp = lv.st7789(320, 240, send_command, send_colors)

What do you think?

Not sure if you’re showing me an application that uses the display, or driver code to run it?

Whoever sets up the MCU board (i.e. the factory selling it) needs to install ili9341.mpy onto that board, but NAME it Display.mpy (same for Touch.mpy)

If the user later changes the screen, they simply copy the new screen driver (e.g. st7789.mpy) on as Display.mpy

The overall goal is that solution authors can write code which “just works” without having to deal with whatever display hardware is on the MCU board.

Advanced users doing custom stuff (e.g. both LCDs at once) can use the second one as-is (import st7789 as Display2)

Another helper driver is probably needed, to deal with hardware contention - application authors should not have to concern themselves with pins, SPI details and data rates, I2C ports, etc. - but there does need to be a way for different drivers to negotiate with each other (e.g. so multiple devices using different clocks know how to use individual clocks when needed).
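To make the negotiation idea concrete, here is a minimal sketch of a shared-bus arbiter in plain Python. Everything here (SharedBus, acquire) is a hypothetical name for illustration, not an existing MicroPython or LVGL API:

```python
class SharedBus:
    """Arbitrates one physical bus (e.g. SPI) between several drivers,
    each of which may want its own clock rate."""

    def __init__(self, max_hz):
        self.max_hz = max_hz      # hardware ceiling for this bus
        self.current_hz = None    # rate the bus is currently programmed to

    def acquire(self, wanted_hz):
        # Clamp each driver's request to what the bus supports, and
        # reconfigure only when the effective rate actually changes.
        hz = min(wanted_hz, self.max_hz)
        if hz != self.current_hz:
            self.current_hz = hz  # real code would reprogram the peripheral here
        return hz


display_bus = SharedBus(max_hz=40_000_000)
display_bus.acquire(80_000_000)  # display asks for 80 MHz, gets clamped to 40 MHz
display_bus.acquire(10_000_000)  # touch controller negotiates down to 10 MHz
```

The point is that each driver states what it wants and the helper decides what the bus actually does, so application code never sees clock rates at all.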

I’m actually pretty good at all this stuff (40 years of experience!), so if there’s an opportunity for me to contribute a fully-sane multiplatform hardware driver ecosystem, I’m happy to put in the work. I’ve had some pretty amazing results making LCD screen drivers work faster too (my best result so far was slightly more than a 100x speedup from an atmega328p on a small transflective screen)

Actually, we could really use some help with the MicroPython binding. :slightly_smiling_face:

How could your suggestion work in practice? Will we have a CI action which creates the required files as an artifact?


I would recommend building one or two hardware LCD display drivers and a couple of touch drivers (and, if you do sound too, at least one audio driver; and since we’re doing that, we may as well build a few LED drivers also, to control onboard LED/flashlight (esp32cam)/neopixel/etc. components and LCD backlights (with dimming) in a portable, cross-platform way), all of which ship as .mpy binaries. But far more important than that: meticulously document the process, so these drivers serve as the examples and starting point for everyone else who wants their own hardware to work with lvgl.

Before doing the above, put some decent thought into the best way to implement and support cross-platform necessities so that lvgl end users never have to think about “pins” or “spi” or “i2c” or “pwm” or any hardware-specific mess at all (this isn’t too hard - just a lot of thinking, with feedback from others with a variety of hardware usage experience). For example, the command to “turn on the LED”, or to “make the backlight half as bright”, or to “beep at XXX Hz”, etc. all need to be given to LVGL users in a way that whatever they write works just fine, even when the board it runs on is changed to a new one with not a single pin, LCD or other similarity to the first.
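A rough sketch of how “make the backlight half as bright” could stay board independent. All the names here (Backlight, esp32_duty, BACKLIGHT_PIN) are hypothetical illustrations, not a real API - the idea is just that the board package supplies a callback and user code never mentions pins or PWM:

```python
class Backlight:
    """Portable backlight control; the board-specific callback hides PWM details."""

    def __init__(self, set_duty):
        self._set_duty = set_duty
        self.level = 1.0

    def set_brightness(self, fraction):
        # Clamp to [0, 1] and convert to a 10-bit duty value.
        self.level = max(0.0, min(1.0, fraction))
        self._set_duty(int(self.level * 1023))


# The board package (shipped by the manufacturer) wires up the hardware:
def esp32_duty(value):
    # real MicroPython code might do:
    #   machine.PWM(machine.Pin(BACKLIGHT_PIN)).duty(value)
    print("PWM duty set to", value)


backlight = Backlight(esp32_duty)
backlight.set_brightness(0.5)   # "make the backlight half as bright"
```

Swapping boards then only means shipping a different callback; the user's `set_brightness(0.5)` line is unchanged.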

And finally, ship a single lvgl.mpy binary module. This runs on any stock-standard MicroPython (and probably also CircuitPython?) board, and uses the above to access the hardware.

Oh - and one last step - plenty of working examples and clear instructions, starting from the very beginning (even as far back as CH340 drivers and installing Python on your workstation, not assuming anyone knows anything), about how to install and use all of it.

I’m capable of doing almost all of the above, but I’ll need a lot of help with compiling the lvgl parts, and while I’m highly experienced at porting, IoT, MCUs and so on, I’m still new to micropython itself, and even newer to lvgl, so there’s a lot of stuff about your stuff that I know I do not know :slight_smile:

This might be possible if we can also build the portions of the MCU SDK that are needed for LVGL to run; an example is SPI from the ESP-IDF. The reason this needs to be done is that MicroPython does not expose all of the different features/options of the various MCUs to the interpreter. For the most part it only exposes the parts they have in common, to keep the API consistent across all of the MCUs that MicroPython supports. With the ESP32, you are not able to use DMA memory transfers if you use the MicroPython SPI, and that is a pretty significant performance hit.

The gen_mpy script should produce the code needed to get it done. It would have to be tweaked to add some specific bits that are needed for the mpy compilation.

I personally do like this better than building it into the MicroPython firmware. We need to look into memory use and any performance losses to see if it would be the way to go. What is nice is that it would remove the crazy integration needed in the MicroPython build system, and it simplifies compiling for different MCUs. It would also allow binary releases to be made.

To be sure we are on the same page could you sketch the rough architecture of all these? I mean mainly:

  • what is in C and MicroPython
  • what can be replaced by the user
  • what do we ship, what is for the user
  • where are the hardware dependent parts

cc @andrewleech @matt.trentini

My experience is with micropython, I’ve only looked at lvgl fleetingly so far, and not at all in pure-c usages.

What I’ve worked on previously is just getting lvgl to build as a user-c module for micropython so it can work with unmodified official micropython sources; this method does require a user to compile the whole lot in one go.

The dynamic C modules are powerful in that they can be built separately from the micropython firmware and re-used by multiple people; however, they do require separate .mpy files for each architecture (similar to binary Python wheels in that sense).
One catch with the dynamic C .mpy is that it cannot be frozen into a firmware image, so users will always need to copy it on manually after the micropython firmware is flashed.

The dynamic C module API is slightly limited too; it only has access to a subset of internal micropython C functions, whereas user-C ones, being compiled in, can use any header file in the codebase.

Ideally… long term maybe it’d be best to be able to compile as either user-c or dynamic, which is feasible… however it’s easiest to get user-c modules working first in my experience.

As to the other questions, I’m not sure what overall architecture is feasible, but my preference would be for LVGL to basically just manage drawing into the standard micropython framebuf.FrameBuffer; then “standard” display drivers can be written to handle drawing the FrameBuffer to the screen. FrameBuffer doesn’t currently have any features for partial-screen updates, but things like this can and should be added. I’d like to see FrameBuffer extended with whatever API hooks are needed for LVGL (or any other GUI toolkit) to be able to use it in a standard way across all micropython ports.

If FrameBuffer can be the API used to separate/bind LVGL to micropython, then the LVGL project doesn’t really need to worry about anything device specific; screens and their drivers become a user-project thing, and any Python or C display drivers can be used - they just need to talk to FrameBuffer.
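A minimal sketch of that split, in plain Python so it is self-contained. FakeDisplay, blit_buffer and flush are hypothetical names for illustration, not an existing framebuf or LVGL API (in real MicroPython the buffer would be wrapped as framebuf.FrameBuffer(buf, w, h, framebuf.RGB565)):

```python
WIDTH, HEIGHT = 320, 240
buf = bytearray(WIDTH * HEIGHT * 2)   # RGB565: 2 bytes per pixel

class FakeDisplay:
    """Stands in for an ili9341/st7789 driver that only knows buffers."""

    def blit_buffer(self, buffer, x, y, w, h):
        # A real driver would push this region out over SPI/I8080/etc.
        print("flushed %dx%d region at (%d,%d)" % (w, h, x, y))

def flush(display, buffer, x, y, w, h):
    # The GUI toolkit calls this; it never touches SPI, pins or DMA.
    display.blit_buffer(buffer, x, y, w, h)

flush(FakeDisplay(), buf, 0, 0, WIDTH, HEIGHT)
```

The GUI side only ever sees `flush`; everything below that line is the display driver's problem, which is exactly the separation being argued for here.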

Input drivers I also think should be written as Python (or micropython C) drivers which just know how to communicate with a published API needed by lvgl; these drivers can then be registered with lvgl. From memory, this is already how it works.


@kdschlosser - Any language, including inline native assembly, can appear in the .mpy file, along with use/manipulation of any device hardware feature. There’s a reasonably good chance that, with some careful memory management, hardware DMA features are still possible even when not exposed through the interpreter - again, it would all be up to the skill of the author of the display driver .mpy component.

My thinking:-

In “MicroPython” is the user’s code; everything else is “C” - LVGL itself, hardware drivers (low-level access to audio, keyboard, mouse, joystick, buttons/other inputs, LEDs/other outputs, graphics hardware output, touchscreen input), and any intermediate mapping layers needed for feature consolidation/discovery etc. Looking at how SDL does it would probably reveal almost all the things we need to know.

In a perfect world, board manufacturers ship all the .mpy files themselves - so the user need not supply anything other than their own code, which runs unchanged on whatever board it’s put on (within reason). Users can replace any of these things (e.g. load a newer lvgl.mpy driver, or load new/different/extra device .mpy files etc) or load them themselves if they get/make a board without them included.

lvgl ships one file: “lvgl.mpy” - which comes in a release flavor for each different MCU processor. I would recommend also shipping optional popular display/touch/LED/etc. driver files, so the community understands how it works, and can find the source and instructions from then on for building their own drivers (and hopefully sharing back). Optional examples would also ship of course, all pure Python and machine independent (literally the same Python works on any board; each example would demonstrate “best practice” for writing code: device discovery, flexible programming (e.g. handling different-sized screens, etc.)).

Hardware-dependent parts (e.g. display.mpy, touch.mpy and led.mpy) would be included on the board when it ships from the manufacturer. If a part is not included, users get the right one from “wherever” (mfg website, github, your samples if it’s a popular screen, …) and put it on their board.

Also, it is a no-brainer to flash the VFS image at the same time as flashing the micropython firmware in the same step (or even just join the file onto the end and ship it as-is), so if someone does still want to ship a single “ready to use” firmware.bin file, it’s a simple process of loading the correct .mpy files into the VFS and concatenating that onto the official micropython firmware release image.

Probably infeasible? My current favorite toys* only have 48k of RAM available; a smallish 320x240 display would need 153 KB for a frame buffer…

*=no affiliation
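For reference, the arithmetic behind that 153 KB estimate, assuming 16 bits per pixel (RGB565):

```python
# Full-frame buffer cost for a 320x240 panel at 2 bytes per pixel (RGB565):
width, height, bytes_per_pixel = 320, 240, 2
full_frame = width * height * bytes_per_pixel
print(full_frame)           # 153600 bytes
print(full_frame // 1024)   # 150 KB - more than 3x the 48k of free RAM above
```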

I just tested allocating a framebuffer on my esp32cam build (has spiram - works OK), so a framebuffer might be feasible for those boards. It’s probably not a good idea to make spiram a requirement for lvgl though; but it’s probably not too hard to support the framebuf idea alongside direct-write-to-display-driver if it’s all being written at the same time, should there ever be a use case for that.

I’d suggest it’s likely that if lvgl worked well in micropython, they’d remove the framebuf support entirely (no point wasting space on something useless and redundant).

With SPI:-

MicroPython v1.22.0-preview.118.g841422817 on 2023-11-09; ESP32 module with Camera with ESP32
Welcome to MicroPython!                                         
WebREPL connected                                               
>>> import sh                                                   
http://mpy-esp32cam-05.local/ init ok                           
machine:/ upy$ free                                             
stack: 3280 out of 15360                                        
GC: total: 128000, used: 43632, free: 84368, max new split: 3997696                                                             
 No. of 1-blocks: 725, 2-blocks: 163, max blk sz: 142, max free sz: 2641

vs (no SPI):-

MicroPython v1.22.0-preview.118.g841422817 on 2023-11-04; Generic ESP32 module with ESP32
Type "help()" for more information.
>>> import sh
http://mpy-esp32-lcd02.local init ok
machine:/ upy$ free
stack: 3264 out of 15360
GC: total: 128000, used: 43376, free: 84624, max new split: 15872
 No. of 1-blocks: 692, 2-blocks: 154, max blk sz: 142, max free sz: 1066
machine:/ upy$ 

Oh, I had no idea that lvgl could work without a full framebuffer; I thought it always had one internally at the moment.

If it’s possible to use on devices without enough ram for the full frame attached screen already then I certainly agree that making Framebuffer a requirement would be a step backwards.

It is actually best to set the framebuffer size to ((width * height) / 10) * sizeof(lv_color_t) for most applications. If DMA memory is available and you have enough memory to use double buffering, then that is what you should do: while the MCU is dumping one frame buffer, LVGL is filling the second buffer with data. Doing that is a pretty sizeable boost in performance.
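Working through that rule of thumb for the same 320x240 RGB565 panel (sizeof(lv_color_t) == 2 at 16 bpp):

```python
# Partial render buffer: 1/10th of the screen, per the formula above.
width, height, color_size = 320, 240, 2
partial_buf = (width * height) // 10 * color_size
print(partial_buf)       # 15360 bytes per buffer
print(partial_buf * 2)   # 30720 bytes with double buffering
```

So even doubled for DMA ping-pong, the render buffers fit comfortably in the 48k boards mentioned earlier, which is exactly why LVGL can run without a full framebuffer.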


Thanks for that clarification @kdschlosser, that certainly changes my thinking.

It’s certainly a strong argument for tighter integration between lvgl renderer and display driver, considering lvgl needs to know where and how to render the partial buffers and/or manage the double buffer.

I certainly understand the benefits of a well configured dma based double buffer system, I’ve used this sort of thing on a number of embedded video capture and audio generation systems over the years. This in particular sticks out in my mind as being tricky to code & deploy in a semi-device-agnostic way though.
I’m most familiar with STM32 these days, and on that family alone there’s a ridiculous tapestry of different dma engine types and configuration options.
On top of that, micropython already uses a number of dma channels / configurations on many stm chips behind the scenes, but the config of these is rather custom to each chip and the peripherals used.
It’s not the most elegant by any means… but the config of all stm port dma setups is contained in this one file:

It’s certainly possible for a C extension to configure and use dma separately to that “built in but not exposed to the user” dma driver; I’ve done so just recently in a user-c-module, but I had to customise it to just my one chip (stm32wb55).
As such any effort to make an stm dma driver will need to have custom configs for each range of stm chip and dodge the channels already used in micropython. Not impossible, but somewhat messy.

Separately, it’s standard in micropython to have things like the pins assigned to a particular peripheral like spi1 to be defined in python board config. It’d be great if these can be reused cleanly by lvgl display drivers, not sure how well this might work though.

So lvgl.mpy is one of the “dynamic C modules” that @andrewleech mentioned, and it’s something pre-compiled that needs to be copied to the MCU somehow, right? If so, what is the best way of doing that in the case of e.g. ESP32 or STM32? Can we dynamically load it from an SD card? If so, will we need a lot of RAM for that?

Regarding the drivers: to avoid having various drivers in many places, I suggest adding all of them to LVGL directly with an API similar to what I described here. This way the user can easily use any supported display controller (the display drivers are in C, but a MicroPython API is generated automatically), and write their own custom callbacks for their display wiring. Of course we can provide some ready-to-use Python modules for development boards where we know how the display is connected to the MCU.

The .mpy gets compiled as a separate thing from MicroPython.

OK, so in CPython you have the source files, and those are .py files. But you also have .pyd files. These files are basically a shared C library that uses the CPython API to behave like a .py file but has all of the benefits of being a C shared library - namely speed. The .mpy file is to MicroPython as the .pyd is to CPython.

The .pyd file is not compiled at runtime and neither is a .mpy file.

The .mpy file has to be compiled using a compiler suitable for the architecture the file is going to run on. Ideally it wouldn’t be board specific, and this would hold true if LVGL was compiled separately from any drivers - it’s the drivers that make it board specific. You would only have to compile LVGL specific to the architecture. There are only 4 or 5 different architectures, which makes it easier to release binaries so the user would not have to compile anything.

I just spent some time and looked at the SPI driver that is written for MicroPython for the ESP32. It is so close to being what is needed: a couple of small tweaks and there would be no need to have the drivers access the ESP-IDF directly.

MicroPython has provided a way to access specific blocks of memory from inside of Python, so the DMA buffers can be made, and those buffers can be used and passed to machine.SPI. What needs to be changed in machine.SPI is that it stalls the program when writing the buffer to the display.

The esp-lcd component in the ESP-IDF has all of the communications built into a single entry point to handle the displays. IDK if STM32 has something similar or not - maybe @andrewleech can answer that question. If it does, then we can find some kind of common API between them and come up with a way to make display drivers easier to write.

If that is not possible, then we can roll our own base driver that would handle all of the low-level bit shifting of the data and turning pins on and off. This can be done using the viper code emitter, so the code would be in Python, or it can be written in C and compiled as an .mpy file.

Adding the drivers to LVGL directly is going to make it far more complex to compile for any platform, because low-level access differs across architectures. Building the high-level portion of the driver into LVGL is fine, because that does not deal with the hardware directly: the high level is interaction with the display driver IC, but the low level is the bus the IC communicates on. It’s the bus that causes the complexity; each platform has a different way of handling low-level access to the bus.

for example.

I8080 is the same across all platforms, and SPI is the same as well - these are standalone specifications for how the communications work. But if you use SPI on an STM32 and then on an ESP32, what differs is how to access the pins for those two MCUs. If it is being run on Linux, once again you have yet another way of accessing the underlying GPIO pins.
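One way to capture that split is a common pin interface with per-platform backends: the bus logic is written once against the interface, and only the backend changes per platform. None of the class names below (BasePin, MockPin, spi_bitbang_byte) are a real library - this is a hypothetical sketch, with a mock backend standing in for an Esp32Pin / Stm32Pin / LinuxGpioPin:

```python
class BasePin:
    """Common pin interface every platform backend implements."""
    def high(self): raise NotImplementedError
    def low(self): raise NotImplementedError

class MockPin(BasePin):
    """Stand-in backend; a real one would poke MCU GPIO registers."""
    def __init__(self):
        self.state = 0
    def high(self): self.state = 1
    def low(self): self.state = 0

def spi_bitbang_byte(clk, mosi, byte):
    # The SPI bit pattern (MSB first) is identical on every platform;
    # only the pin backend behind clk/mosi differs.
    for i in range(7, -1, -1):
        mosi.high() if (byte >> i) & 1 else mosi.low()
        clk.high()
        clk.low()

clk, mosi = MockPin(), MockPin()
spi_bitbang_byte(clk, mosi, 0xA5)
```

A real implementation would of course prefer hardware SPI/DMA where available; the bit-bang loop is only there to show that the protocol code can stay platform independent.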

To my knowledge there is nothing available in terms of a library that would do for GPIO pins what SDL2 does for accessing media on Linux, Windows, OSX… etc.

If there was something available it would make life a whole lot easier I will say that.

Here is the start of a cross platform I2C and SPI driver

I have not been able to locate a cross platform i8080 or RGB driver.

Yes. Pick one of:-

  • Buy a module that ships with it
  • Copy the file over onto the device
    ampy --port /dev/ttyS20 put lvgl.mpy
    or use something like Thonny
  • Flash a firmware distributable that includes the VFS partition that already has the file on it
  • Flash just the VFS partition with the file in it

Typically, those last 2 options would include whatever application you’re interested in as well - a webcam, a remote sensor, robot dog controller, etc…

This makes no sense, and there would never be enough room on an MCU to hold all the drivers for all the hardware that the MCU is never going to use because it’s not attached to it.

No - instead - when you upload “lvgl.mpy”, you also find the “display.mpy” file that goes with whatever display you’re using, and upload that as well. In future you will never need this step, because manufacturers of hardware with a display in it will include the file with the board when it ships (kind of like they already do for non-micropython “legacy” builds - you know how it looks like it works when you first turn it on? That.)

If you’ve got 2 displays at once, you could upload ili9341.mpy and st7789.mpy, then:-

import ili9341 as display
import st7789 as display2

… but you would probably want to rename your “main” display to “display.mpy” so that any other micropython software you download “just works” without you having to change it first.