Dear Support,
I experienced a behavior of my T7-Pro that I cannot explain.
I was running the T7 in command-response mode at a resolution index of 8, which corresponds to an effective resolution of 19.1 bits at an input voltage range of ±10 V (gain 1).
With this configuration I obtained a series of measurements at 100 Hz (please see the attached picture).
At a resolution of 19.1 bits I expect an LSB of 20 V / 2^19.1 ≈ 3.5592 x 10^-5 V (the ±10 V range spans 20 V).
Instead I obtained an LSB of 7.9 x 10^-5 V, which is more than one bit coarser than expected.
Could you please explain why this happens?
Sincerely
Jaques
When using the 16-bit high-speed ADC (ResolutionIndex = 1-8), the raw resolution is truncated to 18 bits. The oversampling and noise-reduction math is done in 32-bit variables in the processor, but the final values are truncated to 18 bits, because due to the INL and DNL limits of the ADC any resolution beyond 18 bits would not be very meaningful. An 18-bit step over the 20 V span is 20 V / 2^18 ≈ 7.63 x 10^-5 V, which matches the step size you observed.
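For reference, here is a minimal Python sketch of the arithmetic; the 20 V span and the two exponents come directly from the numbers above:

```python
import math

SPAN_V = 20.0  # the +-10 V input range spans 20 V

# Step size if the full 19.1-bit effective resolution applied directly
expected_lsb = SPAN_V / 2 ** 19.1   # ~3.56e-05 V

# Step size after the firmware truncates raw values to 18 bits
truncated_lsb = SPAN_V / 2 ** 18    # ~7.63e-05 V

print(f"19.1-bit LSB: {expected_lsb:.4e} V")
print(f"18-bit LSB:   {truncated_lsb:.4e} V")
print(f"Observed step is ~{math.log2(truncated_lsb / expected_lsb):.2f} bits coarser")
```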
So in Table A.3.1.1, the five numbers that show >18.0 for ResolutionIndex = 1-8 are statistical values telling you that you will get very low-noise readings. They are effective resolutions, as defined in the text above the table:
https://labjack.com/support/datasheets/t-series/appendix-a-3-1
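If you want to verify the effective-resolution figures yourself, one rough approach is to collect repeated command-response readings of a quiet DC source and compute a noise-based estimate. The sketch below uses the LJM Python bindings; the log2(span / RMS noise) formula is a common definition of effective resolution and is an assumption here, not an official LabJack procedure:

```python
import math
from labjack import ljm  # assumes the labjack-ljm Python package is installed

handle = ljm.openS("T7", "ANY", "ANY")  # open the first T7 found

# Configure AIN0: +-10 V range (gain 1), ResolutionIndex = 8
ljm.eWriteName(handle, "AIN0_RANGE", 10.0)
ljm.eWriteName(handle, "AIN0_RESOLUTION_INDEX", 8)

# Take repeated command-response readings of a stable DC input
readings = [ljm.eReadName(handle, "AIN0") for _ in range(1000)]
ljm.close(handle)

mean = sum(readings) / len(readings)
rms_noise = math.sqrt(sum((v - mean) ** 2 for v in readings) / len(readings))

# Effective resolution estimate: log2(span / RMS noise)
SPAN_V = 20.0
if rms_noise > 0:
    print(f"RMS noise: {rms_noise:.3e} V")
    print(f"Effective resolution: ~{math.log2(SPAN_V / rms_noise):.1f} bits")
else:
    print("No measurable noise in this sample; try more readings.")
```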