Coordinate systems

This is my first project using LVGL. I use a 4-inch TFT display with the NT35510 controller and the XPT2046 touch controller.

I have noticed in the LVGL documentation that LVGL assumes the origin (0,0) of the TFT (display) coordinate system is in the upper left corner, but I have not been able to find a definition of the axes.

Before any rotations, does LVGL assume the display x-axis is horizontal, pointing to the right?

Does LVGL assume the display y-axis is vertical, pointing down?

Is the parameter "horizontal resolution" the resolution along the x-axis?

With the NT35510 it is possible to reposition the (0,0) point and reorient the x- and y-axes. I want to make sure I get it right from the outset.

For the touch function a driver is needed. What should the output of that driver be? I would have thought it should be the (x, y) coordinates of the touch point, but in a topic on this forum I noticed a range given as 200 to 3800, which looks like the raw 12-bit output of the XPT2046, i.e. without any calibration. What should the output be?

The coordinate system of the XPT2046 is not necessarily the same as that of the display, but if the output of the touch driver is supposed to be the (x, y) coordinates of the touch point, it is my task to ensure the coordinate systems match. Hence I need some information on what is required/assumed of the touch driver.
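To make the question concrete: by "calibration" I mean something like the sketch below, which maps a raw 12-bit reading onto display pixels. The limits 200 and 3800 are just the values quoted in that forum topic, not measured ones; a real driver would determine them per panel.

```c
#include <stdint.h>

/* Map a raw XPT2046 reading (assumed usable range raw_min..raw_max,
 * e.g. ~200..3800) onto display pixels 0..pixels-1.
 * Readings outside the range are clamped to the edges. */
static uint16_t raw_to_pixel(uint16_t raw, uint16_t raw_min,
                             uint16_t raw_max, uint16_t pixels) {
    if (raw < raw_min) raw = raw_min;
    if (raw > raw_max) raw = raw_max;
    return (uint16_t)(((uint32_t)(raw - raw_min) * (pixels - 1))
                      / (raw_max - raw_min));
}
```

With those assumed limits, a raw reading of 200 would land on pixel 0 and a reading of 3800 on pixel 479 of a 480-pixel-wide display.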

Is there a paragraph in the documentation on coordinate systems with details on axes and origins? (I have searched but found only that the origin shall be in the upper left corner before any rotations.)



The origin is based on how you set the alignment. By default the alignment is set to the upper left corner, so the positive x-axis points to the right and the positive y-axis points down. The coordinate system is similar to the Cartesian system except it is flipped vertically, so positive y is down. This is the standard for dealing with 2D graphics on computers.

Now, I said it is based on alignment. If you align an object to the center, the object's origin is at the center of its parent and also at the center of the object itself. So if you have a parent object that is 200 x 200 and a child that is 100 x 100, and the child's coordinates are set to 50 x 50, the bottom right corner of the child will be at the bottom right corner of the parent. And if you set the child's position to -50 x -50, the top left corner of the child will align with the top left corner of the parent.
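The arithmetic behind that center-alignment example can be spelled out in plain C (the function and type names here are mine, not LVGL API; in LVGL itself you would simply call `lv_obj_align(child, LV_ALIGN_CENTER, x_ofs, y_ofs)`):

```c
#include <stdint.h>

typedef struct { int16_t x, y; } point_t;

/* Top-left corner of a child aligned to the CENTER of its parent,
 * offset by (x_ofs, y_ofs), in the parent's coordinate system.
 * Illustration of the rule described above, not LVGL code. */
static point_t center_aligned_top_left(int16_t parent_w, int16_t parent_h,
                                       int16_t child_w, int16_t child_h,
                                       int16_t x_ofs, int16_t y_ofs) {
    point_t p;
    p.x = (int16_t)((parent_w - child_w) / 2 + x_ofs);
    p.y = (int16_t)((parent_h - child_h) / 2 + y_ofs);
    return p;
}
```

For the 200 x 200 parent and 100 x 100 child above, an offset of (50, 50) puts the child's top-left corner at (100, 100), so its bottom-right corner lands at (200, 200), the parent's bottom-right corner; an offset of (-50, -50) puts the child's top-left corner at (0, 0).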

Thanks a lot. That clarifies the issue. In fact, the coordinate system is a complicated topic in LVGL, since one has to keep track of parent-child relations, but your explanation helps a lot and significantly reduces the amount of trial and error.

On the touch response: is it correct that (x, y) shall be in the display coordinate system, i.e. (0,0) at the top left corner, x-axis to the right and y-axis down?


The touch coordinates use the same system, with (0, 0) always being the upper left corner. If you rotate the screen, the touch driver needs to have its coordinates mapped so they are reported that way to LVGL. Remember, the digitizer doesn't know that the display has been rotated, and it doesn't care either. So no matter which way the display is oriented, it will always return the same coordinates. You need to alter those coordinates in the read callback function so they are in alignment with what LVGL expects.

I have written a simple “remap” function that takes care of doing this easily and it works with all touch displays.

uint16_t remap(uint16_t value, uint16_t old_min, uint16_t old_max, uint16_t new_min, uint16_t new_max) {
    return (uint16_t)((((value - old_min) * (new_max - new_min)) / (old_max - old_min)) + new_min);
}

The function's use is simple. Say the digitizer's resolution is 480 x 320 in landscape orientation. Here is an example of how I go about getting the proper coordinates.

This is pseudo-code I just keyed out, so it may have typos and errors in it. It is for example purposes only.

#define MAX_WIDTH  480
#define MAX_HEIGHT 320

enum {
    ROT_0 = 0,
    ROT_90,
    ROT_180,
    ROT_270
};

typedef uint8_t touch_rot_t;

uint16_t remap(uint16_t value, uint16_t old_min, uint16_t old_max, uint16_t new_min, uint16_t new_max) {
    /* A reversed output range (new_max < new_min) mirrors the axis;
     * this works because the uint16_t operands are promoted to int. */
    return (uint16_t)((((value - old_min) * (new_max - new_min)) / (old_max - old_min)) + new_min);
}

void get_touch_point(uint16_t in_x, uint16_t in_y, lv_point_t * p, touch_rot_t rot) {
    switch (rot) {
        case ROT_0:
            p->x = (lv_coord_t) remap(in_x, 0, MAX_WIDTH, 0, MAX_WIDTH);
            p->y = (lv_coord_t) remap(in_y, 0, MAX_HEIGHT, 0, MAX_HEIGHT);
            break;
        case ROT_90:   /* axes swapped, one axis mirrored */
            p->x = (lv_coord_t) remap(in_y, 0, MAX_HEIGHT, 0, MAX_HEIGHT);
            p->y = (lv_coord_t) remap(in_x, 0, MAX_WIDTH, MAX_WIDTH, 0);
            break;
        case ROT_180:  /* both axes mirrored */
            p->x = (lv_coord_t) remap(in_x, 0, MAX_WIDTH, MAX_WIDTH, 0);
            p->y = (lv_coord_t) remap(in_y, 0, MAX_HEIGHT, MAX_HEIGHT, 0);
            break;
        case ROT_270:  /* axes swapped, the other axis mirrored; if touches
                          come out mirrored on your panel, swap which axis
                          gets the reversed range in the 90/270 cases */
            p->x = (lv_coord_t) remap(in_y, 0, MAX_HEIGHT, MAX_HEIGHT, 0);
            p->y = (lv_coord_t) remap(in_x, 0, MAX_WIDTH, 0, MAX_WIDTH);
            break;
    }
}

It’s actually a pretty simple utility function to do the coordinate mapping for the touch display.
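As a quick worked example of the mirroring trick: for the 180° case, both axes are flipped simply by passing a reversed output range to remap. A minimal self-contained check (lv_coord_t and lv_point_t are stubbed here so the sketch compiles without lvgl.h; in a real driver they come from LVGL):

```c
#include <stdint.h>

/* Stand-ins for LVGL's types so this compiles standalone. */
typedef int16_t lv_coord_t;
typedef struct { lv_coord_t x; lv_coord_t y; } lv_point_t;

#define MAX_WIDTH  480
#define MAX_HEIGHT 320

static uint16_t remap(uint16_t value, uint16_t old_min, uint16_t old_max,
                      uint16_t new_min, uint16_t new_max) {
    /* Reversed ranges work because the operands are promoted to int. */
    return (uint16_t)((((value - old_min) * (new_max - new_min))
                       / (old_max - old_min)) + new_min);
}

/* A raw touch at (100, 50) with the display rotated 180 degrees:
 * both axes get mirrored via a reversed output range. */
static lv_point_t example_180(void) {
    lv_point_t p;
    p.x = (lv_coord_t) remap(100, 0, MAX_WIDTH, MAX_WIDTH, 0);  /* 480 - 100 = 380 */
    p.y = (lv_coord_t) remap(50, 0, MAX_HEIGHT, MAX_HEIGHT, 0); /* 320 - 50  = 270 */
    return p;
}
```

So the raw point (100, 50) is reported to LVGL as (380, 270), which is what the user actually touched on the rotated screen.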

Dear Kevin,

Thank you for taking the time to clarify. It helps a lot.


No worries m8, I just hope it helps you out.