Yes, I did connect the 4K7 resistor between the 5V and the 3.3V. If I interpret your answer correctly, I'm lucky my Pi is not broken? I did something like in the picture I added (the website google links to no longer exists). The resistor was placed closer to the Pi than to the sensor.
No voltage divider or level shifter required, because the pull-up is still to 3.3V.
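With the pull-up to 3.3V, the kernel's w1-therm driver exposes the DS18B20 under `/sys/bus/w1/devices/` once 1-Wire is enabled (`dtoverlay=w1-gpio` in config.txt). A minimal sketch of reading it — the parsing of the `w1_slave` file format is standard, but the exact device ID on your bus will differ:

```python
# Sketch: read a DS18B20 via the Linux w1-therm sysfs interface.
# Assumes dtoverlay=w1-gpio is enabled; device IDs start with family
# code "28" for the DS18B20.
import glob

def parse_w1_slave(text):
    """Return temperature in deg C, or None if the CRC check failed."""
    lines = text.strip().splitlines()
    if len(lines) < 2 or not lines[0].endswith("YES"):
        return None  # CRC failure: re-read rather than trust the value
    _, _, millideg = lines[1].partition("t=")
    return int(millideg) / 1000.0

def read_ds18b20():
    """Read the first DS18B20 found on the 1-Wire bus, if any."""
    for path in glob.glob("/sys/bus/w1/devices/28-*/w1_slave"):
        with open(path) as f:
            return parse_w1_slave(f.read())
    return None
```

The CRC check matters: the driver re-reports the raw scratchpad bytes, and a failed CRC ("NO" on the first line) means the reading should be discarded and retried.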
Thanks for the correction. I edited the offending post to strike through the misinformation.
Scrubbing the datasheet for the sensor, there is nothing (tables, graphs, etc.) to indicate that the sensor's temperature measurement would be affected by the supply voltage used. I would think that if this were the case, the datasheet would clearly say so.

pfletch101 wrote: ↑Wed May 15, 2019 7:16 pm
Could it be something as simple as heating due to increased power dissipation in the sensor at the higher supply voltage?
Not necessarily! I was not suggesting that the relationship between the actual temperature of the sensor and the returned result was affected by the supply voltage - that would certainly need to be documented in the datasheet. What I was suggesting (though I didn't and don't a priori think it terribly likely) was that the almost certainly increased 'wasted' current flow through the circuitry around the sensor at the higher supply voltage was heating it up, so that it was correctly reading a higher temperature.

ptimlin wrote: ↑Sat May 18, 2019 6:18 pm
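For scale, a back-of-the-envelope self-heating estimate. The ~1.5 mA figure is the DS18B20's active (conversion) current from the datasheet; the TO-92 junction-to-ambient thermal resistance is an assumed typical value of ~160 °C/W, not something the datasheet specifies:

```python
# Rough self-heating estimate: delta_T = P * theta_JA = V * I * theta_JA.
# theta_JA for TO-92 is an assumed typical value, not a datasheet spec.
THETA_JA = 160.0    # deg C per watt, assumed junction-to-ambient (TO-92)
I_ACTIVE = 1.5e-3   # amps, active conversion current (datasheet max)

def self_heating(vdd):
    """Estimated temperature rise (deg C) while converting at supply vdd."""
    return vdd * I_ACTIVE * THETA_JA

extra_rise = self_heating(5.0) - self_heating(3.3)
# With these assumptions: roughly 0.4 deg C more rise at 5V than at 3.3V,
# and only while the sensor is actually converting.
```

So under these (generous) assumptions the effect is sub-degree and transient — consistent with the view that it is unlikely, but not physically impossible.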
I still don't think so. These sensors have been around for decades, long before Raspberry Pis existed, and they were also very popular with earlier hobbyist microcontrollers and boards (Microchip PIC, Atmel AVRs, and a slew of others), so they have seen heavy use in the field for a longgggg time. I would think that if running them at different voltages within the supply specification caused an actual temperature change in the sensor, it would be well documented. But when this was first posted here, I searched and could not find any reference to the temperature reading being affected by different supply voltages.
I didn't say it was likely, and it is probably ruled out by the speed of the response to the voltage change. I also don't want to beat a dead horse, but people mostly use a single supply voltage rather than switching voltages, and the observed difference was small enough that it might be written off to device variation if you didn't actually record the measurements at two different supply voltages.

ptimlin wrote: ↑Mon May 20, 2019 7:08 pm
omegaman477 wrote: ↑Wed Jul 03, 2019 2:15 pm
All temperature sensing projects, I MEAN ALL, should be calibrated against a reliable standard or reference.
Direct-read digital sensors have led the world into a false sense of accuracy. A DS18xxx sensor can be up to +/- 2°C incorrect on OFFSET.
Regardless of what the sensor tells you, calibrate it at 0°C and 100°C, and use this calibration offset for all data points.
Most sensors are relatively linear (that is, if they are 4°C out at the lower limit, they are 4°C out at the top). So you only need to correct OFFSET, not SPAN.
Calibrate people, calibrate. Don't be so lazy.
Back in the old days (sigh you remark) you HAD to calibrate. Nothing has changed.
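The offset-only correction described above can be sketched in a few lines. The function names and the example ice-bath reading are mine, purely for illustration:

```python
# Sketch of offset-only calibration: measure the sensor in a stirred ice
# bath (true 0 deg C), take the reading as the offset, and subtract it
# from every subsequent data point. The 100 deg C point, if taken, just
# confirms the span (linearity) is OK.
def calibration_offset(reading_at_0c):
    """Offset = sensor reading in the ice bath minus true 0 deg C."""
    return reading_at_0c - 0.0

def corrected(raw, offset):
    """Apply the same fixed offset to every data point."""
    return raw - offset

offset = calibration_offset(1.3)   # example: sensor read 1.3 deg C in ice bath
room = corrected(22.8, offset)     # raw 22.8 deg C -> about 21.5 deg C
```

Note the 100°C point needs a pressure/altitude correction for the true boiling point if you want it as a second reference, but per the post it only serves to confirm the span.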
Seems like you soldered the pull-up straight onto the DS18B20 pins, so you have a small heater right next to the temperature sensor.
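How much of a heater? A quick worst-case number for the 4.7 kΩ pull-up mentioned earlier in the thread (worst case means the sensor is holding the data line at roughly 0 V):

```python
# Worst-case dissipation in the pull-up while the DS18B20 pulls the data
# line low: P = V^2 / R. Values from the thread (4.7 kohm, 3.3V vs 5V).
R_PULLUP = 4700.0  # ohms

def pullup_power_mw(v):
    """Worst-case pull-up dissipation in milliwatts at pull-up voltage v."""
    return v * v / R_PULLUP * 1000.0

# About 2.3 mW at 3.3V and 5.3 mW at 5V -- tiny in absolute terms, but
# if the resistor body is touching the sensor package, that heat has
# nowhere else to go.
```

This also fits the earlier observation that readings tracked the supply voltage: more than double the dissipation at 5V, right against the sensor.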