The Raspberry Pi reads this data using a Python script and stores it in a MySQL database.
There are PHP pages that extract and display the data.
The table that stores the sensor data is pretty simple: it just contains a primary-key ID, a SensorID, a DateTime and a Value (float).
Code:
CREATE TABLE IF NOT EXISTS `SensorData` (
`ID` int(10) unsigned NOT NULL AUTO_INCREMENT,
`SensorID` tinyint(3) unsigned NOT NULL COMMENT 'SensorID',
`DateTime` datetime NOT NULL,
`Value` float NOT NULL,
PRIMARY KEY (`ID`),
KEY `DateTime` (`DateTime`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=1835196;

At the moment I have filled the table with some data (1,000,000 records) to see how it would perform in real life; 1,000,000 records is roughly one year of logging. But this is getting terribly slow.
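For reference, the insert path of my script boils down to something like the sketch below (shown here with Python's built-in sqlite3 as a stand-in for MySQL, so it runs anywhere; `log_reading` and the index name are just illustrative, but the table and column names match the schema above):

```python
import sqlite3

# In-memory SQLite database as a stand-in for the MySQL server.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE SensorData (
        ID       INTEGER PRIMARY KEY AUTOINCREMENT,
        SensorID INTEGER NOT NULL,
        DateTime TEXT    NOT NULL,
        Value    REAL    NOT NULL
    )
""")
# Mirrors the KEY `DateTime` index from the MySQL schema.
conn.execute("CREATE INDEX idx_datetime ON SensorData (DateTime)")

def log_reading(sensor_id, timestamp, value):
    # Parameterised insert, one row per reading.
    conn.execute(
        "INSERT INTO SensorData (SensorID, DateTime, Value) VALUES (?, ?, ?)",
        (sensor_id, timestamp, value),
    )
    conn.commit()

log_reading(1, "2013-05-01 12:00:00", 21.5)
```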
My Python script does about 5 SELECTs and 1 INSERT. When the table is nearly empty this runs very fast (under 100 ms), but it gets slower as data is added, up to the point where the script takes over 60 seconds.
Am I asking too much of the Raspberry Pi, or might there be a solution?