Hi,
I'm measuring 0 - 10 volts using a U3-HV at AIN0 with respect to AIN1 using the following program. The converted voltage is close but not quite what I'd expect. It's almost spot on compared with a DVM reading for 0 - 5 VDC, but it's off by 0.1 - 0.2 VDC when measuring 5 - 10 VDC. If I recalculate the coefficient for a known 10 volts, the error in measurement shifts to the 0 - 5 VDC range. If my math is correct, it should have 20.6 / 2^12 = 5.03 mV of resolution. I looked at other similar topics such as "U3-HV Differential inputs" and re-read sections 2.6.2 and 5.4 for help, but it still doesn't make sense. Any suggestions?
Thanks!
int handle = 0;
double bits = 0;
string serialNumber = "320077941";
double volts = 0;
const double coefficient = 0.0006245;

// Open the U3 over USB by serial number.
// (false -> open the device matching serialNumber; true would open the first U3 found and ignore it)
LJUD.OpenLabJack(LJUD.DEVICE.U3, LJUD.CONNECTION.USB, serialNumber, false, ref handle);

// Differential reading: AIN0 is the positive channel, AIN1 is the negative channel (passed as x1).
LJUD.eGet(handle, LJUD.IO.GET_AIN_DIFF, 0, ref bits, 1);

// Convert the raw reading to volts with my nominal slope and offset.
volts = coefficient * bits - 20.6;

Console.WriteLine("Bits    : {0}", bits.ToString("f0"));
Console.WriteLine("Voltage : {0}\r\n", volts.ToString("f2"));
Thread.Sleep(1000);
Where do these signals come from? Actual differential signals at high voltages like this are fairly unusual:
https://labjack.com/support/app-notes/differential-analog-inputs
So you have a DVM connected to AIN0 & AIN1 and are looking at the DVM readings and U3 readings at the same time? Normally you would take a couple of DVM readings versus U3 readings and come up with a slope and offset that fits those points, rather than just doing math on nominal values to derive the slope and offset.
Is the voltage changing on both AIN0 and AIN1, versus GND? Since each has its own scaling circuitry, if both are changing you can get a non-linear response such that a simple slope/offset equation is not sufficient.
Try taking single-ended readings of AIN0 and AIN1, and subtracting in software to get the difference.
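For example, something like the following rough sketch (not tested; it assumes the same handle as your snippet and the UD driver's default behavior of returning calibrated volts for GET_AIN):
double ain0 = 0;
double ain1 = 0;
// Single-ended reading of AIN0 versus GND.
LJUD.eGet(handle, LJUD.IO.GET_AIN, 0, ref ain0, 0);
// Single-ended reading of AIN1 versus GND.
LJUD.eGet(handle, LJUD.IO.GET_AIN, 1, ref ain1, 0);
// Compute the difference in software.
double diff = ain0 - ain1;
Console.WriteLine("AIN0 - AIN1 : {0} V", diff.ToString("f3"));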
I need to measure and record the 0 - 10 VDC drop across a 500 Ohm resistor that's connected to the 0 - 20 mA output of a temperature to current transducer. The outputs of the transducer are isolated from earth ground so using GND or SGND as a reference will induce noise into the measurement.
At this point, I'm just testing my application with the U3-HV at the bench using a linear DC power supply to simulate the 0 - 10 volts i.e. (+) terminal to AIN0 and (-) terminal to AIN1.
If I modify this to take 2 single-ended measurements and calculate the difference in my program, I'm not sure what I'd use as a ground reference to the LabJack other than earth GND, per the application note on differential readings.
The process of converting the 2^15 bits read from a unipolar 10 volt measurement into a DC voltage is still a mystery to me; could you please clarify?
Thanks!
The process of converting the 2^15 bits read from a unipolar 10 volt measurement into a DC voltage is still a mystery to me; could you please clarify?
Start with coming up with a linear equation, and use this to convert raw counts from the U3 to a voltage:
y = mx + b
Where m is the slope and b is the offset. In your case y is volts and x is raw counts. The equation written out with units for your case is:
y [volts] = m [volts/count] * x [counts] + b [volts]
For calibration, you need some sort of reference. Sounds like you are using your DVM as the reference, so you assume it gives the correct voltage and use those voltages to calibrate.
You want to gather some x,y data pairs, or points, that you are going to use for your calibration. You need at least 2 points. With 2 points the slope-intercept calculation will give an exact line fit that goes through those 2 points. With more than 2 points you will do a least-squares line fit.
When I search "slope intercept calculator" the first hit looks good for 2 points:
https://www.miniwebtool.com/slope-intercept-form-calculator/
When I search "line fit calculator" the 2nd hit seems useful for 2 or more points:
http://www.endmemo.com/statistics/lr.php
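As a rough illustration in code (the counts and DVM voltages below are made-up placeholder numbers; substitute your own two calibration points):
// Two calibration points: a raw count from the U3 paired with the DVM voltage
// read at the same time. These values are placeholders, not real data.
double x1 = 8000, y1 = 2.50;    // point 1: counts, volts
double x2 = 40000, y2 = 9.80;   // point 2: counts, volts

// Slope [volts/count] and offset [volts] from the two points.
double m = (y2 - y1) / (x2 - x1);
double b = y1 - m * x1;

// Convert any new raw reading to volts with y = m*x + b.
double counts = 25000;          // placeholder raw reading
double volts = m * counts + b;
Console.WriteLine("m = {0}, b = {1}, volts = {2}", m, b, volts.ToString("f3"));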
I need to measure and record the 0 - 10 VDC drop across a 500 Ohm resistor that's connected to the 0 - 20 mA output of a temperature to current transducer. The outputs of the transducer are isolated from earth ground so using GND or SGND as a reference will induce noise into the measurement. At this point, I'm just testing my application with the U3-HV at the bench using a linear DC power supply to simulate the 0 - 10 volts i.e. (+) terminal to AIN0 and (-) terminal to AIN1.
It sounds like you have 2 floating voltages connected to the U3. The difference between + and - might be 5 volts, but the voltage of + and - versus U3-GND is not defined so they could be 5V and 0V or they could be 100V and 95V.
This is likely your main issue here. You can't have totally floating signals; this is a common mistake when using differential connections. See "Differential inputs must have a reference" on the Differential App Note:
https://labjack.com/support/app-notes/differential-analog-inputs
You could try adding a 10k resistor from AIN1 to GND. This will keep 10 kohms of isolation between U3-GND and the low side of your shunt resistor, but will hold AIN1 to a voltage near 0 as defined versus U3-GND.
Another option is to isolate the U3 upstream on the USB connection, and then just connect the low side of your shunt right to U3-GND and do a single-ended measurement:
http://microcontrollershop.com/product_info.php?currency=USD&products_id...