Keyboard read callback: sending UTF-8 chars


LVGL supports UTF-8, so in the read callback (read_cb) for LV_INDEV_TYPE_KEYPAD, how can I send a UTF-8 key?

Do you have any code/explanation of what you are trying to do?

Your post doesn't comply with the standard request form, and it is difficult to understand what you are trying to achieve…



This is how you hook a physical keyboard to LVGL: keypad-or-keyboard

In read_keyboard you specify the input key in data->key. data->key is a uint32_t, but a UTF-8 key is a char *.

What do you want to achieve?

So how can you notify LVGL that the keyboard is sending characters like ‘ò’, ‘à’ and so on?

Do you get normal character output?
If you do, but ‘ò’, ‘à’ are missing, they might be missing from the font. If that's the case you will need to create your own font with all of the characters you need, including A-Z and ‘ò’, ‘à’, etc.

The problem is that if ‘ò’ is pressed on the keyboard, I cannot tell LVGL that such a key was pressed, because data->key wants an ASCII code (an integer), but ‘ò’ in UTF-8 is a string: “\xc3\xb2”.

Any UTF-8 “character” can be stored within 4 bytes, be it as an array of chars or as a uint32_t. I don't know what library you are using to handle UTF-8, but it must have a function that returns a uint32_t, or at least a function to convert to one.

At first, I tried a large UTF-32 number and it was simply ignored (that's why this post). Now if I try 0xF2, it's either ignored or blank, and if I continue to edit (a text area), I sometimes get a crash.

0xF2 on its own is not valid UTF-8; a single-byte UTF-8 sequence only goes up to 0x7F.

You can use websites like this Convert Hex to UTF8 - Online Hex Tools to test. For example if you input 0xc3b2 you will get your “ò”.

Also, have you checked the lv_drivers repo? evdev.c and xkb.c do exactly what you want and are quite straightforward to follow.

Since it's clear that data->key doesn't accept a string, I thought that, being uint32_t, it accepted a UTF-32 code point; that's why I tried 0xF2.

Anyway, xkb.c at line 199 seems to suggest that the UTF-8 bytes must be packed into a uint32_t. But assigning 0xc3b2 (or swapping the endianness) to data->key doesn't work either.

This works:

	/* Pack the UTF-8 bytes little-endian into the uint32_t.
	 * (uint8_t(ch[0]) is C++ cast syntax; in C it is (uint8_t)ch[0],
	 * and the extra (uint32_t) cast keeps the << 24 shift well-defined.) */
	uint32_t key = 0;
	key |= (uint32_t)(uint8_t)ch[0] << 0;
	key |= (uint32_t)(uint8_t)ch[1] << 8;
	key |= (uint32_t)(uint8_t)ch[2] << 16;
	key |= (uint32_t)(uint8_t)ch[3] << 24;

In short, on a little-endian machine (and assuming ch is suitably aligned):

key = *(uint32_t *)ch;