LVGL Editor v0.1 is released!

And… to answer my 1st question myself: just me being a newbie. Of course lv_xml_create() accepts a parent; I should have read the code before asking! Anyway, I created some bindings for the XML loader. Works great! Thanks!

Let me answer multiple questions here:

  • Yes, there will be a VSCode plugin; it is planned for v0.2, so in a few months.
  • You can choose to export C code and forget the XMLs. These are pure LVGL UI code.
  • We are thinking hard about how to get rid of Docker and provide a simpler getting-started flow.
  • The expat XML parser is only a few files, and you can build it together with LVGL.
  • We have selected XML and will stick with it for a while for sure. There is no plan to add YAML support in the near future.
  • With lv_xml_create(parent, "widget_name", attributes_array) you can create widgets on any parent, replacing an “innerHTML”-style workflow. Widgets created by lv_xml_create() are identical to those created by lv_..._create(), so you can delete, move, and adjust them as you wish.
  • I’ll add lv_obj_set_name/lv_obj_get_name functions very soon. The related discussion is here. There should also be an lv_obj_get_by_name(parent, "name") function which returns the first child with that name (not necessarily a direct child; it may be a grandchild).
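As a minimal sketch of the lv_xml_create() usage described above, assuming LVGL's XML module is enabled and a component named "my_button" has already been registered with the XML loader (the component name and attribute values here are illustrative):

```c
#include "lvgl.h"

void create_widget_from_xml(lv_obj_t * parent)
{
    /* NULL-terminated name/value attribute pairs; the accepted
     * attribute names depend on the registered component. */
    const char * attrs[] = {
        "width", "100",
        "height", "50",
        NULL, NULL,
    };

    /* Create the widget on any parent. "my_button" is a
     * hypothetical component name registered earlier. */
    lv_obj_t * btn = lv_xml_create(parent, "my_button", attrs);

    /* The result is a normal LVGL widget, so it can be aligned,
     * restyled, moved, or deleted like any lv_..._create() object. */
    lv_obj_align(btn, LV_ALIGN_CENTER, 0, 0);
}
```

Since the returned object is an ordinary widget, cleanup is the usual lv_obj_delete(btn); there is no separate "XML object" lifetime to manage.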

Does v0.1 already support registration of events?
If not, will this be available in v0.2?

Yes, events are planned for v0.2 (end of March).

LVGL Editor's XML is very cool: it lets our MCU dynamically load the GUI components of our design system on demand. This is not available in SquareLine or NXP's tools.

A design system supports different ports that run independently: for example App, Web, Desktop, and even the MCU side. From a unified perspective, bringing the tool stack of Web development into embedded GUI development is the general direction of modern UI design and development.

XML can be used in a Figma plugin, and then I can connect Figma to other teams. In this way, a more modern UI design and development workflow can be achieved.

UI Component Design System tools: Figma and Sketch; the world’s top designers all use Figma.

UI Component Design System development across multiple platforms: Builder.io, Mitosis.

UI Component Design System preview and documentation tools: Storybook, Histoire.

UI Content Management System (CMS): Adobe AEM, Drupal, WordPress…

Here, I hope that LVGL will have its own modern UI tool stack in the future: one that integrates the design, development, preview, and documentation roles of the design-system tools above, combined with a CMS-like page builder. Ordinary editors could then simply drag and drop to produce the GUI and its interactions.

Regarding dynamic display of the GUI, one day in the future we also expect LVGL to support JSON or Protocol Buffers. Why do I bring this up? MCU resources can be very limited; at most, the MCU can render the displayed content. But deciding what to display, how to update it dynamically, and running the highly complex application logic behind the GUI requires more CPU power elsewhere.

In Apple’s latest CarPlay UI architecture, the phone was originally responsible for rendering and passed the content to the car’s head unit as a video stream. In the new architecture, the GUI is rendered locally, and the iPhone only needs to send a very small amount of data to drive fast GUI display and interaction. This interaction model is similar to a serial LCD display module.

Regarding the positioning of LVGL and XML, I think it can work like this:
Parallel-interface screen development: 8080/RGB > LVGL engine, just like Chrome on the Web side.
Serial-interface screen development: XML > JSON/Protobuf > AT/API, just like HTML+CSS+JS on the Web side.

This can reduce the difficulty of GUI rendering, organization, and communication on the MCU side. Even a Web development engineer could take charge of GUI development on the MCU.