The analog response time is the time from when an analog input changes to when the responding analog output changes. This total response time can be broken down into several parts as an analog input goes into the Quarto and makes its way to an analog output:
- DAC Update Latency
- Analog Output Filter
- Analog Input Filter
- ADC Measurement Latency
- Interrupt Latency
- MCU Processing Time
DAC Update Latency
The DAC Update Latency is 1 µs.
Analog Output Filter
The output filter between the DAC and the Quarto's Analog Output has a bandwidth of 180 kHz. This filter isn't a simple delay circuit, as the delay varies for different frequency components, but for frequencies at or below 100 kHz the delay is 1.2 µs.
Analog Input Filter
The input filter in front of the Analog Input has a bandwidth of 400 kHz. As with the output filter, the delay varies for different frequency components, but for frequencies at or below 100 kHz the delay is 550 ns.
ADC Measurement Latency
The ADC Measurement Latency is 800 ns.
Interrupt Latency
Please see the Measuring Interrupt Latency App Note for details, but the worst-case interrupt latency is 210 ns.
MCU Processing Time
Clearly, this depends on the function being executed. In the Measuring Interrupt Latency App Note, the processing time was only 160 ns. More complex functions will take longer to run. If you are unsure how long something takes to execute, you can, as done in the app note, use a trigger line to measure the execution time with an oscilloscope. The code would be:
triggerWrite(1,HIGH); // Set Trigger 1 high at start of function
// ... function code to be timed goes here ...
triggerWrite(1,LOW);  // Set Trigger 1 low after function completes
But for straightforward arithmetic functions (no loops), 500 ns is a safe estimate.
Entire Loop Delay
In the case of a servo where the Quarto's analog output is connected to its analog input, we can measure the delay through the entire closed loop. If we follow a signal at the Quarto's analog input starting at t = 0, that signal is measured by the ADC at t = 800 ns. The interrupt latency adds 210 ns, so the MCU gets the data at t = 1.01 µs. Assuming the MCU takes 290 ns to calculate the DAC update value, the MCU updates the DAC at t = 1.3 µs. That update reaches the DAC output at t = 2.3 µs, gets through the output filter at t = 3.5 µs, and back through the analog input filter at t = 4.05 µs. This means the closed-loop signal delay is 4.05 µs.

However, this ignores the sampling time of the ADC. If the ADC samples every 1 µs, then some information waits nearly 1 µs before the ADC samples it, while other information is barely delayed at all. A reasonable approximation is to assume that all signals are delayed by the average delay, which is half the sampling period. When sampling every 1 µs, this adds an additional delay of 500 ns. If sampling every 2 µs, the extra delay would be 1 µs.
Putting this all together, for a servo loop sampling every 1 µs, we have a total delay of 4.05 µs + 500 ns = 4.55 µs. We would expect this setup to oscillate at 110 kHz, which matches the measured oscillation frequency of 110 kHz.
In the case of a two-channel servo where the sampling rate is every 2 µs, the total delay is 5.05 µs, which should oscillate at 99 kHz, which again is what we measure.