ST7701 with LVGL 9 (Waveshare 2.8” 480x640 ESP32-S3)

It’s not the ADC that’s the problem, it’s the reference voltage. Do you have a ground shared between whatever is inputting the voltage and the display board? You need those on a common ground. You also need to look at the minimum and maximum voltages that are going to be seen on that GPIO; you want to make sure you don’t go above 3 volts. You might need to adjust the attenuation of the GPIO and also do a calibration of the ADC to get the voltages dialed in properly.


And just to let you know: the schematic you are looking at, where those pin definitions came from, is not the right schematic.

This is the correct one…

It is, in fact, the right one. I have both the 2.8 and the 2.8B here. Sorry, I’ve been trying out too many options and maybe not explaining things very well.

EBD2 - the “2.8” - 2.8” 240x320 res
EBD4 - the “2.8B” - 2.8” 480x640 res

EBD4 is the one where I was having trouble getting the screen refresh rate fast enough.

EBD2: I have the screen working now, and the internal ADC is working on GPIO10 (ADC1_CH9), but the ADC isn’t as stable as it was on the CYD.

Any time you deal with an ADC, it is always wise to use a smoothing algorithm to get a normalized reading.

Here is an example.

#include <stdint.h>
#include <string.h>

uint16_t smoothing_buffer[30];
uint8_t sample_index = 0;

uint16_t second_smoothing_buffer[10];
uint8_t second_sample_index = 0;

uint16_t get_smoothed_value(uint16_t *buf, int buf_len)
{
    uint16_t min_value = 65535;
    int min_index = -1;
    uint16_t max_value = 0;
    int max_index = -1;

    uint32_t sample_sum = 0;

    // we pitch the highest and the lowest values, treating them as outliers.
    for (int i = 0; i < buf_len; i++) {
        if (min_value > buf[i]) {
            min_value = buf[i];
            min_index = i;
        }
        if (max_value < buf[i]) {
            max_value = buf[i];
            max_index = i;
        }
    }

    for (int i = 0; i < buf_len; i++) {
        if (i == min_index || i == max_index) {
            continue;
        }

        sample_sum += (uint32_t)buf[i];
    }

    uint16_t res = (uint16_t)(sample_sum / ((uint32_t)buf_len - 2));

    return res;
}



while (1) {
    uint16_t adc_sample = get_sample();  // your ADC read function

    if (sample_index < 30) {
        smoothing_buffer[sample_index] = adc_sample;
        sample_index++;
    } else {
        // shift the buffer down one slot; memmove because the regions overlap
        memmove(smoothing_buffer, smoothing_buffer + 1, 29 * sizeof(uint16_t));
        smoothing_buffer[29] = adc_sample;

        uint16_t smoothed_value = get_smoothed_value(smoothing_buffer, 30);

        if (second_sample_index < 10) {
            second_smoothing_buffer[second_sample_index] = smoothed_value;
            second_sample_index++;
        } else {
            memmove(second_smoothing_buffer, second_smoothing_buffer + 1, 9 * sizeof(uint16_t));
            second_smoothing_buffer[9] = smoothed_value;

            // this is the value you use. It is a normalized ADC value which
            // removes any jitter from the signal.
            smoothed_value = get_smoothed_value(second_smoothing_buffer, 10);

            // optionally reset the smoothing so it starts from scratch again
            second_sample_index = 0;
            sample_index = 0;
        }
    }
}

The sizes of the buffers you would tweak to what you need, and you might be OK using only a single smoothing buffer; it just depends on the amount of jitter you are seeing. You can also alter how many values are treated as bad due to being too high or too low.

The link you provided in your initial post is to the 2.8B so that is what I was going by.


Here’s my process right now…

  1. Initialize and start ADC for continuous reading
  2. When the samples come in, they come in as values between 0 and … maybe 2700 (integers)
  3. I then look for the minimum and maximum values in the incoming buffer and normalize them into values that are between -1 and 1
  4. I then put those values through a signal conditioner (provided by cycfi/q, a C++ library for audio digital signal processing, on GitHub)
  5. Then I pass those values into that same library’s zero-crossing pitch detection algorithm
  6. It spits out a frequency and I then pass that frequency through some additional averaging/smoothing filters (1EU and Exponential)
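For reference, step 3 looks roughly like this (a simplified sketch, not the exact code; `normalize_samples` is just an illustrative name):

```c
#include <stdint.h>
#include <stddef.h>

// Map a buffer of raw ADC counts onto [-1.0, 1.0] using the buffer's own
// min/max. Sketch only -- real code would handle a constant (flat) buffer
// and probably track min/max over a longer window.
void normalize_samples(const uint16_t *raw, float *out, size_t len)
{
    uint16_t min_v = 65535, max_v = 0;
    for (size_t i = 0; i < len; i++) {
        if (raw[i] < min_v) min_v = raw[i];
        if (raw[i] > max_v) max_v = raw[i];
    }

    float span = (float)(max_v - min_v);
    if (span == 0.0f) span = 1.0f;  // avoid divide-by-zero on a flat buffer

    for (size_t i = 0; i < len; i++) {
        // scale to 0..1, then shift to -1..1
        out[i] = ((float)(raw[i] - min_v) / span) * 2.0f - 1.0f;
    }
}
```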

All of that was working much better on the CYD. But, I’ll keep trying and see if I can improve the grounding situation and other things. It could be that it’s a bit more jittery because I’ve got parts of it connected to a breadboard and who knows what noise that may be introducing.

Yeah, sorry - my bad … jumping all over the place trying to quickly find the best solution for the end product. I was hoping I was going to have all of this solved a couple weeks back. We were “this close” to ordering new PCBs and enclosures to start the first round of evaluation kits. I should have known better. :wink:

So I threw a DMM on the output of my PCB onto the pin that goes to ADC1_CH9 of the EBD2. At rest, it’s measuring 3.3V. When I strum as loud as I can, it drops to about 2 VDC.

I’m wondering if I should put a voltage divider on the output so that the at-idle voltage sits in the middle of 3.3V, letting the AC signal swing above and below that?

On my PCB, the output of the op amp goes through a 470nF coupling capacitor before it connects to EBD2, so nothing from the PCB should be contributing to the DC voltage that I’m seeing. I don’t recall doing this measurement with the CYD, so I don’t know if this is the same behavior or not.

But, based on the incoming ADC values on the CYD that I remember, I think I saw values between 0 and 4095, if I’m remembering right. It could be behaving this way on EBD2 because this is normally supposed to be an SCL GPIO that I’ve hijacked.

After initializing ADC I tried using this but it still stays at 3.3V:

    gpio_config_t adc_gpio_conf = {
        .pin_bit_mask = (1ULL << GPIO_NUM_10),
        .mode = GPIO_MODE_DISABLE,
        .pull_up_en = GPIO_PULLUP_DISABLE,
        .pull_down_en = GPIO_PULLDOWN_DISABLE,
        .intr_type = GPIO_INTR_DISABLE,
    };
    gpio_config(&adc_gpio_conf);

That makes me think that SCL is hard-wired in this dev board to 3.3V.

If I add a 10K pulldown resistor from GPIO10 to GND, it puts the DC voltage around 1.7V, pretty close to the middle, and my readings are a bit nicer. Still not 100% smooth, but I’ll fiddle around with some additional smoothing techniques in the circuit and in software.

Oh shit, I know what is going on. You are using the I2C lines, and there are devices attached to those lines. There is also going to be a pull-up resistor attached to those pins. You cannot use that GPIO for what you are wanting to do, because of the pull-up resistor, and also because the ICs reading that same pin could interpret it as data, which would cause an IC to then send data on that same wire.

This can be easily rectified by cutting the traces at the ICs and the pull-up, so only the guitar is connected at that point.

How about this little bad larry…

https://www.amazon.com/LILYGO-ESP32-S3-Display-Bluetooth-Development/dp/B0CKVSSQHS

4 ADC GPIOs are broken out. It has an 8-bit I8080 display interface, so it will perform better with the display FPS. It is not as high a resolution, which might be the deal breaker, because it is only 320 x 240.

That, unfortunately has a screen that looks like we’d have the same mounting issues as we did with CYD.

Good news with EBD2 though. I tried this out and it WORKS! I’m using ADC2_CH4 (GPIO15).

CONFIG_ADC_CONTINUOUS_FORCE_USE_ADC2_ON_C3_S3

As a bonus I have the screen for EBD2 working and it’s fast. I’ve got brightness levels working and also screen rotation via esp_lvgl_port.

So, I think we’ve found our candidate.

I can set you up with faster rotation code if you want.

Getting that sampling off of GPIO10 solved the issue with the erratic readings, I am guessing. I was right about the ICs and the pull-up resistor giving it a hard time.

Just to let you know, if you want to smooth out the signal even more, you can add a 100nF ceramic cap to the GPIO pin, and that will help eliminate noise.


Rotation is more or less a one-time thing that people will do. It seems like it’s running just as fast rotated vs when it wasn’t. I think the esp_lvgl_port is probably doing most of the work there?

Yep, I’m sure GPIO10 is hard-wired to 3.3V. I’m glad ADC2_CH4 seems to work well.

Regarding the 100nF cap, I’ve tried it and don’t notice any difference really. I think I have it working just as well as it ever did on the CYD.

One thing I also added is adc_new_continuous_iir_filter(), although I don’t really see much of a difference. It’s mentioned on this page, and I’ve tried all the different coefficient values with seemingly no effect: the Analog to Digital Converter (ADC) Continuous Mode Driver page in the ESP-IDF Programming Guide v5.2.5 (ESP32-S3).

Have you tried doing a calibration of the ADC? I do suggest doing that.
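On the S3 it is basically a one-time setup at boot: the factory calibration constants are burned into eFuse, and the driver reads them when you create a calibration scheme handle, then you run each raw sample through it. A minimal sketch, assuming ESP-IDF 5.x and the ADC2 / 12 dB setup from this thread (function names here other than the esp_adc calls are just illustrative):

```c
#include "esp_adc/adc_cali.h"
#include "esp_adc/adc_cali_scheme.h"
#include "esp_err.h"

// Create once at boot. The ESP32-S3 uses the curve-fitting scheme;
// the constants come from eFuse, so there is no per-device procedure.
static adc_cali_handle_t s_cali = NULL;

void my_adc_cali_init(void)
{
    adc_cali_curve_fitting_config_t cfg = {
        .unit_id = ADC_UNIT_2,       // ADC2 (GPIO15 = ADC2_CH4)
        .atten = ADC_ATTEN_DB_12,
        .bitwidth = ADC_BITWIDTH_12,
    };
    ESP_ERROR_CHECK(adc_cali_create_scheme_curve_fitting(&cfg, &s_cali));
}

// Convert a raw sample (0-4095) into calibrated millivolts.
int my_raw_to_mv(int raw)
{
    int mv = 0;
    ESP_ERROR_CHECK(adc_cali_raw_to_voltage(s_cali, raw, &mv));
    return mv;
}
```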

What is the maximum voltage seen from the guitar feeding the GPIO? What is the maximum ADC value that you are seeing?

The wire from the guitar: what is the state of the pin when the guitar is not being played? Is it floating? If it is floating, I would guess that the line only sees voltage when in use. Would there be any issue with using a pull-down resistor?

I am asking these questions because there is some fine tuning that can be done to the ADC, things like lowering the ADC input voltage range. This would give you higher precision on the samples.
As an example: if the wire from the guitar is < 2.2V, then we can change the attenuation to 6 dB, which would remap the input voltage range of 0V - 2.2V to 0 - 4095. Right now it is set to a range of 0 - ? to 0 - 4095.

The reason why the max input voltage is a question mark is because it is going to use the voltage seen on the VDD_A pin as the maximum input voltage. That is not an ideal reference, depending on how well the power supply on the board is made. Due to the dynamic current use of the ESP32, more load on the power supply can cause the voltage to drop, and that would alter what your ADC readings are, whereas at a lower attenuation you would not see that happen. If the voltage is higher than 2.2V, then using a voltage divider to get the max voltage down to 2.2V, or slightly lower like 2.15V, would give you a much larger range of ADC values. Better would be to use a level shifter, but that adds complexity and cost. Using a voltage divider should not be an issue.
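To put numbers on the divider idea (the resistor values here are hypothetical, just to show the math):

```c
// Output of a resistive divider: Vout = Vin * R2 / (R1 + R2)
double divider_vout(double vin, double r1_ohms, double r2_ohms)
{
    return vin * r2_ohms / (r1_ohms + r2_ohms);
}

// e.g. 10k over 18k: divider_vout(3.3, 10e3, 18e3) is about 2.12 V,
// just under the 2.15 V target mentioned above.
```

One caveat: the divider’s output impedance also feeds the ADC input, so very large resistor values can skew the samples; keeping the pair in the low tens of k is a reasonable starting point.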

This is from the IDF SDK header files.

 * Due to ADC characteristics, most accurate results are obtained within the following approximate voltage ranges:
 *
 * - 0dB attenuation (ADC_ATTEN_DB_0) between 100 and 950mV
 * - 2.5dB attenuation (ADC_ATTEN_DB_2_5) between 100 and 1250mV
 * - 6dB attenuation (ADC_ATTEN_DB_6) between 150 to 1750mV
 * - 12dB attenuation (ADC_ATTEN_DB_12) between 150 to 2450mV

So it is telling you that, to get the most accurate readings at the current attenuation, which is 12 dB, you don’t want the voltage to be above 2.45V. But with the inconsistent voltage seen on VDD_A at that attenuation, there could be some variance in the ADC samples.

It would be better to use the 6 dB attenuation and get the voltage down to the 1.75V mark, which is the suggested maximum for the most accurate readings.

I’ve seen that mentioned, but when I looked into it before, I was thoroughly confused. Is that something you do once per device? Is it something you do at every boot?

Max voltage I’ve seen so far on the ADC2_CH4 pin (GPIO15) is 1.2VDC when measuring with my DMM. The actual values coming in are 0 to 4095 using ADC_ATTEN_DB_12.