Recently I got an invitation to OpenAI Codex, an AI tool that learns from all the open-source code on GitHub and can generate new code… I couldn't stop testing it!
One of the things I tried was generating UIs directly from a description, and it worked! It works for Python wxWidgets, PyQt, and even for LVGL. Not always, of course, but simple examples work. I don't know why, but it generates the v7 API…
So given the prompt:
```python
# This example shows a window with 5 elements:
# - A dropdown list, named "com_port", with different COM ports.
# - A dropdown list, named "test", with different tests.
# - A button, named "exec_one" (with same text), to execute only one test.
# - A button, named "exec_all" (with same text), to execute all tests.
# - A label, named "result" (with same text) with the test/s result.
import lvgl as lv
scr = lv.obj()
```
Generates:
(In the generated code, I added the button labels myself, because it tries to set the text with `button.set_text`.)
The LVGL code can be read on the screenshot, but the wx and Qt examples are a bit longer. I think people will also find it curious to read them, so I attached them here. These examples aren't modified by me (except that in the wx example it generated Python 2 print statements, so I added parentheses). UI Codex.zip (2.5 KB)
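For readers without the screenshot, here is a rough sketch of what v7-style LVGL code for that prompt could look like. This is my reconstruction, not the actual Codex output: the widget list mirrors the prompt, and `build()` is a hypothetical helper I wrote so the structure can be followed without a display. Note the button fix mentioned above: v7 buttons (`lv.btn`) have no `set_text()`, so the text goes on a child label.

```python
# Each entry: (widget kind, object name from the prompt, text/options).
# This table is an assumption reconstructed from the prompt, not Codex output.
WIDGETS = [
    ("dropdown", "com_port", "COM1\nCOM2\nCOM3"),   # v7 dropdowns take newline-separated options
    ("dropdown", "test", "Test A\nTest B"),
    ("button", "exec_one", "exec_one"),
    ("button", "exec_all", "exec_all"),
    ("label", "result", "result"),
]

def build(scr, lv):
    """Map the widget table onto v7-style lvgl calls (hypothetical helper)."""
    objs = {}
    for kind, name, text in WIDGETS:
        if kind == "dropdown":
            w = lv.dropdown(scr)      # v7 renamed ddlist to dropdown
            w.set_options(text)
        elif kind == "button":
            w = lv.btn(scr)           # v7 buttons have no set_text();
            lbl = lv.label(w)         # the text lives on a child label,
            lbl.set_text(text)        # which is the fix the post mentions
        else:
            w = lv.label(scr)
            w.set_text(text)
        objs[name] = w
    return objs
```

With the lvgl Micropython binding loaded, `build(lv.scr_act(), lv)` would create all five widgets on the active screen; the table-driven shape also makes it easy to check the structure without any display driver.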
Interesting! I personally suspect it uses the v7 API because it is learning from code examples on GitHub, and there are probably more of those for v7 than for v8.
That’s awesome! It’s great to see how AI tools like OpenAI Codex can be so useful in quickly generating code from simple descriptions. The fact that it can generate UI components for different frameworks like wxWidgets, PyQt, and LVGL directly from descriptions is impressive. It’s fascinating to see how the AI adapts to various libraries and generates code that can run without much manual intervention.
I’ve also had some success with AI tools for coding, and while they are great for generating simple UI layouts, I do notice that the more complex the request, the more tweaks you need to make. For example, sometimes the generated code uses older versions of APIs, like you mentioned with the v7 API in LVGL. That’s something to watch out for, especially when libraries evolve quickly.
It’s also a fun experiment to see how it generates code for different frameworks just by switching imports. I imagine the PyQt5 example you got was particularly interesting since that’s one of the more feature-rich UI frameworks.
It’s exciting to think about the future possibilities for developers who can use these tools to speed up repetitive tasks, like setting up UI elements, while still having the flexibility to tweak the generated code for more complex features.
Thanks for sharing this, it definitely gives me some inspiration to experiment more with AI-assisted coding!