Etruscian wrote: Thanks for the informative post.
I'm thinking of getting a hall sensor for the current, that would be the safest.
May well need an op-amp scaling stage, as a resistor divider or other resistive means will have problems with the existing onboard resistor dividers.
For the voltage, wouldn't it suffice to use less accurate resistors and build a mapping that cancels out the differences? That is, see what the ADC reads, check it with a multimeter (I know they aren't 100% accurate) on the bare open-circuit clamp of the solar panels, and adjust accordingly.
1% resistors are cheap these days and usually have better temperature coefficients (ppm/deg C) than 5% parts. Mapping could be done, but you would need an accurate, preferably calibrated multimeter and a stable source of at least 75V as a reference; you had also better do it at several ambient temperatures, as the readings will vary. Each resistor drifts with temperature slightly differently, and the wider the initial tolerance, the more marked the effects.
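As a rough illustration of why both the initial tolerance and the tempco matter, here is a small worst-case sketch. The resistor values, ppm figures, and temperature rise are illustrative datasheet-style numbers, not measurements from any real board:

```python
# Worst-case drift of a resistor divider over temperature.
# All values here are illustrative, not from a real schematic.

def resistance(nominal, tol, ppm, delta_t):
    """Worst-case resistance: initial tolerance plus tempco drift."""
    return nominal * (1 + tol) * (1 + ppm * 1e-6 * delta_t)

def divider_ratio(r_top, r_bottom):
    return r_bottom / (r_top + r_bottom)

nominal = divider_ratio(300e3, 6.2e3)

# 1% / 100 ppm parts, top drifting high while bottom drifts low (worst case)
worst_1pct = divider_ratio(resistance(300e3, 0.01, 100, 40),
                           resistance(6.2e3, -0.01, -100, 40))

# 5% / 200 ppm parts, same worst-case direction
worst_5pct = divider_ratio(resistance(300e3, 0.05, 200, 40),
                           resistance(6.2e3, -0.05, -200, 40))

print(f"nominal ratio:      {nominal:.5f}")
print(f"1% parts, 40C rise: {(worst_1pct / nominal - 1) * 100:+.2f}% error")
print(f"5% parts, 40C rise: {(worst_5pct / nominal - 1) * 100:+.2f}% error")
```

The 5% divider ends up several times further from its nominal ratio than the 1% one, which is the point about wide tolerance making the temperature effects more marked.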
Do the single-ended differential inputs mean that it will measure the voltage with respect to an internally wired ground, or to a reference voltage? Wouldn't measuring the voltage on the DC side be easier then, since there would be no negative component and thus no need to raise the average voltage to, say, 2V3?
The negative sides of the ADC inputs are tied to GND, the same GND as the Pi, so all measurements at the ADC connector are in a +/-5V range relative to that GND. Most Hall-effect sensors with a 5V o/p are offset anyway, so positive current reads from 2V5 to 5V, with zero current at 2V5; you would need an op-amp stage with a voltage reference to change that.
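To make that mid-rail offset concrete, converting such a sensor's output back to a signed current looks roughly like this. The 2V5 zero point follows from the description above, but the sensitivity figure is a hypothetical value for a typical 5V bidirectional part, not from any specific datasheet:

```python
# Convert a bidirectional Hall-effect sensor output voltage to current.
# Assumes a 5V part with a 2.5V zero-current output; the sensitivity
# (volts per amp) is illustrative -- check the actual datasheet.

ZERO_CURRENT_V = 2.5          # output at 0 A (mid-rail)
SENSITIVITY_V_PER_A = 0.100   # 100 mV/A, hypothetical figure

def sensor_volts_to_amps(v_out):
    """Positive current maps to 2.5..5V, negative current to 0..2.5V."""
    return (v_out - ZERO_CURRENT_V) / SENSITIVITY_V_PER_A

print(sensor_volts_to_amps(2.5))   # zero current at mid-rail
print(sensor_volts_to_amps(5.0))   # positive full scale
print(sensor_volts_to_amps(0.0))   # negative full scale
```

So the software sees 0 A as a mid-scale ADC code, and any DC offset error in the 2V5 zero point turns directly into a current-reading error, which is why the zero point matters as much as the gain.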
You will still need to drop the roughly 250V down to 5V by adding external resistors, and you do not know what tolerance they have used in the ADC front end, so hope it is 1% or better. Rough calculations show you would need at least a 308k5 resistor in series to scale the analog signal from 250V to the 5V level. Using standard values like 300k or 330k changes the scaling noticeably: 300k would drop the maximum input voltage to approx 236V, while going up to 330k extends the range to approx 260V.
To map the input ranges to ADC ranges you need to do an 18-bit conversion (several, probably averaged, to eliminate noise) for best resolution, with at least 75 to 150V across the external resistor so you get the maximum number of bits active. This would have to be from a stable source; I have PSUs capable of 150V DC, do you?
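The averaging step could be as simple as the sketch below. The sample count is a guess, and `read_adc_raw` is a placeholder name standing in for whatever driver call the board actually provides; the demo uses a fake noisy source in its place:

```python
import random
import statistics

def read_adc_raw():
    """Placeholder for the board's actual 18-bit ADC read call."""
    raise NotImplementedError

def averaged_reading(samples=64, read=read_adc_raw):
    """Average several conversions to knock down random noise.

    For uncorrelated noise, averaging N samples improves SNR by
    roughly sqrt(N), so 64 samples buys about 3 extra effective bits.
    """
    return statistics.fmean(read() for _ in range(samples))

# Demo with a fake noisy source standing in for the real ADC:
random.seed(1)
fake = lambda: 131072 + random.gauss(0, 50)  # mid-scale 18-bit code + noise
print(f"averaged code: {averaged_reading(64, fake):.1f}")
```

The averaged result sits much closer to the true mid-scale code (131072) than any single noisy sample would, which is the point of averaging before you build the calibration map.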
Just another techie on the net - For GPIO boards see http://www.facebook.com/pcservicesreading