Temperature and pressure gauge sensor characteristics
I’m working on custom car gauges for measuring coolant temperature and oil temperature and pressure. To save some cash I decided to try cheap automotive sensors from China, specifically sensors from Banggood.com. The issue with these sensors is the lack of datasheets or specified characteristics, so I decided to measure the characteristics of the sensors myself.
Temperature gauge sensor
There was basically zero information about the temperature sensor. I expected it to be an NTC thermistor, and the only question was its resistance at 25°C. Sadly my lab is not equipped with any precise heat source or thermometer, so I ended up using a kettle and a Fluke 17B+ multimeter with a K-type thermocouple. I found that the resistance of the temperature sensor is 100 kOhm @ 25°C, and my measurements also confirmed the NTC characteristic.
Temperature gauge sensor measuring setup and results
My setup is really non-scientific; it guarantees only poor precision and is limited to temperatures below 100°C.
My main goal was just to find the room-temperature resistance and confirm the NTC characteristic, which I did. The reference temperature was measured with the Fluke 17B+ and a K-type thermocouple. The thermocouple was fixed to the gauge sensor with Kapton tape. Sensor resistance was measured with a Rigol DM3058E.
| Temperature [°C] | Resistance [Ohm] |
|---|---|
| 13.8 | 165760 |
| 20.0 | 125435 |
| 25.1 | 98401 |
| 28.8 | 83451 |
| 35.0 | 64030 |
| 40.3 | 51386 |
| 44.9 | 42719 |
| 50.5 | 34460 |
| 55.1 | 28950 |
| 60.8 | 23514 |
| 65.6 | 19746 |
| 70.6 | 16613 |
| 75.4 | 14187 |
| 80.5 | 12020 |
| 85.2 | 10346 |
| 90.4 | 8790 |
| 95.2 | 7490 |
| 99.6 | 6640 |
Exponential regression shows increased error below 30°C, most likely caused by the imprecision of my homemade calibration setup. The sensor behaves as a generic 100 kOhm NTC.
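For anyone who wants to turn these readings into temperature in their own gauge firmware, a simple beta-model fit over the measured points works well. This is just a sketch of my approach, not anything from a datasheet; the fitted values depend on my imprecise setup:

```python
import math

# Measured (temperature °C, resistance Ohm) pairs from the table above
data = [
    (13.8, 165760), (20.0, 125435), (25.1, 98401), (28.8, 83451),
    (35.0, 64030), (40.3, 51386), (44.9, 42719), (50.5, 34460),
    (55.1, 28950), (60.8, 23514), (65.6, 19746), (70.6, 16613),
    (75.4, 14187), (80.5, 12020), (85.2, 10346), (90.4, 8790),
    (95.2, 7490), (99.6, 6640),
]

# Beta model: R(T) = R25 * exp(B * (1/T - 1/T25)), T in kelvin.
# ln(R) is linear in 1/T, so B falls out of a least-squares line fit.
xs = [1.0 / (t + 273.15) for t, _ in data]
ys = [math.log(r) for _, r in data]
n = len(data)
mx = sum(xs) / n
my = sum(ys) / n
beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
    sum((x - mx) ** 2 for x in xs)
r25 = math.exp(my + beta * (1.0 / 298.15 - mx))
print(f"B = {beta:.0f} K, R25 = {r25:.0f} Ohm")

def ntc_temperature(r_ohm: float) -> float:
    """Temperature in °C from sensor resistance, using the fitted beta model."""
    t_inv = 1.0 / 298.15 + math.log(r_ohm / r25) / beta
    return 1.0 / t_inv - 273.15
```

The inverted model (`ntc_temperature`) is what a microcontroller would actually run after measuring the sensor resistance, e.g. via a voltage divider.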
Pressure gauge sensor
There is a bit more information available for the pressure sensor, but I wanted to check its precision, linearity, and hysteresis anyway.
Pressure gauge sensor measuring setup and results
For the pressure sensor test, I made a simple jig with an analog gauge and a valve, connected to my small compressor through a pressure regulator. The compressor in my lab can only reach 5.2 bar, so I wasn’t able to test the whole range of the sensor.
| Pressure [bar] | Resistance (G) [Ohm] | Low-Pressure Alarm (WK) [Ohm] |
|---|---|---|
| 0 | 7.5 | 0 |
| 0.5 | 24.016 | 0 |
| 0.75 | 27.311 | 0 |
| 1 | 30.648 | 0 |
| 1.2 | 34.001 | 0 |
| 1.4 | 37.386 | 0 |
| 1.6 | 40.632 | Open |
| 1.8 | 43.935 | Open |
| 2 | 47.269 | Open |
| 2.2 | 53.818 | Open |
| 2.4 | 57.128 | Open |
| 2.6 | 60.48 | Open |
| 2.8 | 67.038 | Open |
| 3 | 67.36 | Open |
| 3.1 | 70.353 | Open |
| 3.2 | 70.399 | Open |
| 3.4 | 73.944 | Open |
| 3.6 | 77.015 | Open |
| 3.8 | 80.328 | Open |
| 4 | 86.896 | Open |
| 4.2 | 83.672 | Open |
| 4.4 | 93.565 | Open |
| 4.6 | 96.861 | Open |
| 4.8 | 100.193 | Open |
| 5 | 103.545 | Open |
| 5.2 | 106.981 | Open |
Resistance values are really close to the specifications. The low-pressure alarm threshold was observed at 1.5 bar instead of the specified 0.8 bar. Resistance depends linearly on pressure, but it changes in discrete steps of roughly 3 Ohm. This is most likely because the sensor is implemented as a pressure-sensitive element driving a wire-wound potentiometer. I observed an outlier at 4.2 bar; I’m not sure if it’s a sensor design issue or if the sensor is somewhat damaged. The resistance also shows noticeable hysteresis of about 0.5 bar.
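Since the response is linear, a least-squares line through the measurements gives a usable resistance-to-pressure conversion for the gauge. A sketch, assuming the 0 bar rest reading and the 4.2 bar outlier are excluded from the fit:

```python
# Measured (pressure bar, resistance Ohm) pairs from the table above;
# the 0 bar rest reading and the outlier at 4.2 bar are left out.
data = [
    (0.5, 24.016), (0.75, 27.311), (1.0, 30.648), (1.2, 34.001),
    (1.4, 37.386), (1.6, 40.632), (1.8, 43.935), (2.0, 47.269),
    (2.2, 53.818), (2.4, 57.128), (2.6, 60.48), (2.8, 67.038),
    (3.0, 67.36), (3.1, 70.353), (3.2, 70.399), (3.4, 73.944),
    (3.6, 77.015), (3.8, 80.328), (4.0, 86.896), (4.4, 93.565),
    (4.6, 96.861), (4.8, 100.193), (5.0, 103.545), (5.2, 106.981),
]

# Least-squares fit of R = slope * P + offset
n = len(data)
mp = sum(p for p, _ in data) / n
mr = sum(r for _, r in data) / n
slope = sum((p - mp) * (r - mr) for p, r in data) / \
    sum((p - mp) ** 2 for p, _ in data)
offset = mr - slope * mp
print(f"R = {slope:.2f} Ohm/bar * P + {offset:.2f} Ohm")

def pressure_from_resistance(r_ohm: float) -> float:
    """Pressure in bar estimated from sensor resistance via the linear fit."""
    return (r_ohm - offset) / slope
```

Given the ~3 Ohm potentiometer steps and the ~0.5 bar hysteresis, the conversion is only good to a few tenths of a bar, which is fine for a dashboard gauge.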