Console widget for running simple user scripts within lvgl

I am building a small toy controller for the kids. The idea is to let the kids write simple scripts on a PC, store them on an SD card, and then select one from the UI of the ESP32/LVGL-based target device to run it in a window.

This is a little more complex than it sounds, as a typical script like that would use print() for output and would potentially run for a few seconds (e.g. containing things like “robot, drive forward for ten seconds” using time.sleep()). This would normally block the GUI for as long as the script runs. So I wrote the “Console” widget. It runs the given script in the background in a _thread and, while doing so, monitors the script's output via uos.dupterm(). An LVGL task polls for data caught via dupterm and appends it to the console widget's text view.

I am pretty surprised that this seems to work :slight_smile: During my experiments I got all sorts of recursion exceptions and random crashes, but now it seems to work. So I am sharing it here for your entertainment.

This is the widget:

import lvgl as lv

import uos
import _thread
from uio import IOBase

class Console(lv.label):
    class Wrapper(IOBase):
        def __init__(self):
            self.buffer = ""

        def write(self, data):
            # called by dupterm with the script's output as bytes
            self.buffer += data.decode('ascii').replace('\r', '')
            return len(data)

        def get_buffer(self):
            retval = self.buffer
            self.buffer = ""
            return retval
    def watcher(self, data):
        d = self.wrapper.get_buffer()
        if d != "":
            self.ins_text(lv.LABEL_POS.LAST, d)
        if not self.running:
            # the script has finished: detach dupterm and stop this task
            uos.dupterm(None)
            lv.task_del(self.task)

    def execute(self, code):
        try:
            exec(code, {})
        finally:
            # stop the watcher even if the script raised an exception
            self.running = False
    def __init__(self, *args, **kwds):
        super().__init__(*args, **kwds)

    def run(self, fname):
        # read script
        f = open(fname)
        code = f.read()
        f.close()

        # start wrapper to catch script output
        self.wrapper = self.Wrapper()
        uos.dupterm(self.wrapper)

        # run script in background
        self.running = True
        _thread.start_new_thread(self.execute, (code,))

        # start task to read text from wrapper and display it in label
        self.task = lv.task_create(self.watcher, 100, lv.TASK_PRIO.MID, None)

It is used like so:

from ili9XXX import ili9341
from xpt2046 import xpt2046
import lvgl as lv

from console import Console

# startup lvgl
disp = ili9341(miso=19, mosi=23, clk=18, cs=5, dc=32, rst=27, spihost=1, power=-1, backlight=33, backlight_on=1, mhz=80, factor=4, hybrid= True)

touch = xpt2046(cs=26, spihost=1, mhz=5, max_cmds=16, cal_x0 = 3783, cal_y0 = 3948, cal_x1 = 242, cal_y1 = 423, transpose = True, samples = 3)

# Create the main screen and load it.
scr = lv.obj()
lv.scr_load(scr)

# Just a button to prove that the lvgl is working
# while the script runs
def on_btn(obj, event):
    if event == lv.EVENT.CLICKED:
        print("button clicked")  # placeholder action

btn = lv.btn(scr)
btn.set_event_cb(on_btn)
lv.label(btn).set_text("Click me!")
btn.align(scr, lv.ALIGN.IN_TOP_LEFT, 10, 10)

# setup console
console = Console(scr)
console.align(scr, lv.ALIGN.IN_TOP_LEFT, 10, 60)


# execute a script in the console
console.run("test.py")  # filename is a placeholder, use the path of your script

import time
while True:
    # nothing to do here; lvesp32 drives the LVGL event loop in the background
    time.sleep(1)
And it is able to run scripts like this:

import time

print("Hello world!")
for i in range(10):
    print("I:", i)
    time.sleep(1)

That’s great!
Thank you for sharing this!

Using threads is one option, but it has some disadvantages (heavyweight, unsafe, etc.).

Another option is using uasyncio.
uasyncio lets you use the “async/await” concurrent programming scheme, which is very popular today. Anyone who has ever written JavaScript code will be very familiar with it. Python 3 also supports it.

On MicroPython it’s relatively new, but I’m using it and it works well with LVGL. In my case I’m doing async network operations without blocking the GUI, but the same approach works for IO operations.

The idea is that instead of threads you create “tasks”, which are actually non-preemptive co-routines. Under the hood there is only a single thread, and the tasks are scheduled on that thread.

To use LVGL with uasyncio you need to import and create an lv_async object. lv_async creates a uasyncio task that calls the LVGL event loop (lv.tick_inc and lv.task_handler).

Do not import lvesp32 on ESP32 or lvstm32 on STM32 when using lv_async, since lvesp32/lvstm32 schedule the LVGL event loop using preemptive threads. lv_async replaces them.

On the unix port, set auto_refresh to False on the SDL driver and set refresh_func like this: lv_async(refresh_func=SDL.refresh), since by default the SDL driver does both the display handling and the LVGL event loop.

Why not use WiFi instead of SD card?

It’s very easy to use FTP or Telnet. Both are already part of lv_micropython:

    import uftpd
    import utelnetserver

I’m also using mDNS so I don’t need to keep track of the IP address:

    import espidf
    if espidf.mdns_init() == 0 and espidf.mdns_hostname_set('esp32') == 0:
        print("mdns: esp32.local")

Asyncio is nice. There are even versions of my demo “user” scripts that are entirely written with lvgl, use lv_tasks for timing, and come with their own custom UI to control the robot. This is the kind of script I write for them to play with.

But a typical “Python for kids” book teaches the kids to use print() and time.sleep(), and my goal is to allow them to use exactly this on the ESP as well. Later on they can write their own very specialized scripts, no question. But these simple scripts are what they are taught in every single Python beginner's book.

And why SD card? Because that is simple and even works when they take their toys with them to school. There actually is the telnet and ftp on my device as well and it can also be used to upload user scripts. And ampy can of course also be used via USB. The SD card is just one example.

So …

Why SD card and not telnet and FTP? This question actually got me thinking. But FTP and telnet are so ’90s. So I wrote a small uwebserver-like thing instead. The idea was to have a simple file upload form.

In the end I grabbed a copy of Blockly and made the webserver on the ESP serve that. Together with Blockly’s ability to generate Python code it becomes a pretty cool setup.

Small, dark and shaky video here.


Haha! You are right! It’s a bit old-fashioned.
But if you want something more 2020’s, HTTP/2 is also supported!
So go ahead and create a QUIC or GRPC service!

I’m not familiar with blockly, it looks cool for kids! Is it really useful for creating something practical? For me it’s easier to look at Python code than wrap my mind around these crazy colorful blocks.

Wow very nice!!

You can try sending this (with some commentary) to Hackaday, I would say there is a good chance they would add it in their Blog. Just mention you are using LVGL+Micropython :wink: (we had an old item there already)

And if you ever put all of this into a GitHub repo, please send us a link!

Yes, kind of. It doesn’t make too much sense to try to write bigger programs that way. But as you said, it’s great for kids. It’s actually always great whenever you want an inexperienced user to add functionality. This could e.g. also be used to do some user customization in a home automation project where you’d like to enable the user to configure things like “whenever it gets dark outside switch the lights on for two hours”.

Since this project will probably (hopefully) grow further, and it may then become difficult to split the LVGL+µP+Blockly part out of it, I have done exactly that.

Find the code at

This is the bare minimum. In particular, the Blockly part has only been enabled for a minimal setup. With very little effort many more blocks can be enabled, and it's also possible to add your own custom blocks, e.g. to control the motors of a robot toy or to read an ambient light sensor.

Thanks for sharing this!

I am still working on this. I now also have CodeMirror (a code editor written in HTML5) in this setup, plus some small project management (both not online yet). The Blockly-generated Python code can even be loaded into CodeMirror afterwards. So you can use Blockly to get started and then continue directly with the generated Python.

The idea still is to make the entire setup self-contained with no references to outside world so you can even run the ESP32 as an AP with the phone connected only to this.

Currently I am trying to implement some means of remote view. Hence the parallel thread about reading the frame buffer.

In the end the kids should be able to program simple python scripts and apps using a WiFi connected smartphone as an editor tool and then be able to “play” by running the apps directly on the target robot without needing the phone.


I gave this a try. If I import neither lvesp32 nor async_utils, the user interface still works. That's unexpected. After your explanation I expected LVGL to stop working, and only after creating an instance of lv_async would it work again. Is that not the case?

I believe that’s how it should work. Are you importing those in another startup script (like by chance? Or is that the script you’re working with?

You are right, and that happens because ili9XXX imports lvesp32, unfortunately.
To work around this you can call lvesp32.deinit() after importing ili9XXX and before creating an instance of lv_async. That’s what I’m doing currently.
In the future it might be a good idea to stop importing lvesp32 from ili9XXX and require users to import lvesp32 explicitly.