measuring efficiency of a solar thermal panel
Posted 11 November 2005 - 11:49 AM
Posted 11 November 2005 - 01:29 PM
Posted 11 November 2005 - 04:43 PM
The part you pointed out could be made to work, but it is not particularly accurate, and it has a current output that you would have to convert to a voltage. A better choice is the LM34 from National Semiconductor (national.com). The LM34CAZ is very accurate and easy to use, and National will send you a couple of free samples from their web site. The LM34 is a better fit than the LM35 because you get more output voltage per degree (10 mV/deg F versus the LM35's 10 mV/deg C), and also because with a single +5 volt supply you get a range of +5 to +300 deg F (-15 to +149 deg C).
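Since the LM34 outputs 10 mV per degree Fahrenheit, converting a measured voltage to temperature is a one-line calculation. A minimal Python sketch (the voltage value below is just an illustrative reading, not from the thread):

```python
def lm34_to_fahrenheit(volts):
    """LM34 output scale is 10 mV per degree F."""
    return volts / 0.010

def fahrenheit_to_celsius(deg_f):
    return (deg_f - 32.0) * 5.0 / 9.0

v = 0.775  # hypothetical measured output: 0.775 V -> 77.5 deg F
t_f = lm34_to_fahrenheit(v)
t_c = fahrenheit_to_celsius(t_f)
print(t_f, round(t_c, 2))  # 77.5 25.28
```

The same two lines work for the LM35 if you skip the Fahrenheit-to-Celsius step and divide by 0.010 directly to get degrees C.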
Another good option is the EI-1022 sold by us. It works from -40 to +100 degrees C and comes as a handy probe assembly. Its absolute accuracy is not as good, but it can be calibrated to provide good results.
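A simple way to calibrate a probe like that is a two-point linear fit, for example against an ice bath (0 deg C) and boiling water (100 deg C). A sketch of the idea; the raw readings below are made-up placeholders, not real EI-1022 values:

```python
def make_calibration(raw_lo, ref_lo, raw_hi, ref_hi):
    """Return a function mapping raw sensor readings to calibrated
    temperature via a straight line through two reference points."""
    slope = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    def calibrated(raw):
        return ref_lo + slope * (raw - raw_lo)
    return calibrated

# Hypothetical raw readings at 0 C (ice bath) and 100 C (boiling water):
cal = make_calibration(raw_lo=0.02, ref_lo=0.0, raw_hi=1.05, ref_hi=100.0)
print(round(cal(0.535), 1))  # 50.0
```

Two points remove offset and gain error, which is usually the bulk of the absolute-accuracy spec; any remaining error is nonlinearity.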
Posted 15 November 2005 - 12:14 PM
Posted 04 March 2013 - 08:14 AM
Since the analog input bias currents on the U12 go negative with decreasing voltage, I did some testing with the LM34CAZ connected to the U12 at low temperatures. With nothing added, the LM34CAZ could not measure below 56 deg F. With a 9k resistor from signal to ground (AI0 to GND), I could measure down to 9 deg F. With a 4.7k resistor I could get below the minimum rated temperature of +5 deg F.
Did the same testing with a UE9, which has much lower analog input bias current, and the LM34CAZ did not need any load resistance to reach 5 degrees F.
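One way to think about why the pull-down resistor helps: running from a single supply, the LM34's output stage can only source current, so any negative input bias current from the U12 has to be sunk by an external resistor to ground, and the resistor can only do that while the output voltage is still above zero. A rough sizing sketch (the bias-current figure is an assumption for illustration, not a U12 spec):

```python
def max_pulldown_ohms(min_temp_f, bias_current_a):
    """Largest signal-to-ground pull-down that can still sink the
    (negative) input bias current at the smallest LM34 output
    voltage, i.e. at the lowest temperature of interest.
    LM34 scale: 10 mV per deg F."""
    v_min = min_temp_f * 0.010
    return v_min / bias_current_a

# Hypothetical 10 uA bias current; to read down to +5 deg F:
print(round(max_pulldown_ohms(5.0, 10e-6)))  # 5000 ohms
```

That is consistent with the observation above that 4.7k reached +5 deg F while 9k bottomed out around 9 deg F: a lower-temperature target means a smaller output voltage, which needs a smaller resistor to sink the same current.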
I ran a test with the U12 and a 10k resistor; it got down to 5 deg F, but it was reading 49 deg F at room temperature!
Guys, is there any way to get the full range on the U12?
Posted 04 March 2013 - 05:45 PM