I use a Butterworth digital low-pass filter to separate gravity from linear acceleration, because the accelerometer's gravity reading drifts with temperature and time (i.e. I can't just sample gravity at the start of the run and use that value throughout).
Sampling rate is 50 Hz, cutoff is 0.05 Hz (a 20 s period), 6th order.
Since the filter is an IIR, it needs to be 'charged' (its internal state allowed to settle) before its output is a close approximation to the 'correct' value.
There must be a way to calculate how many iterations it takes to charge the filter to a given % accuracy, but Google isn't being as helpful as it usually is.
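One way this can be estimated analytically (a sketch, not necessarily how your filter is implemented; it assumes the filter is designed with `scipy.signal.butter`): the filter's transient decays like the slowest pole raised to the sample count, so the magnitude of the discrete pole closest to the unit circle gives an upper bound on how many samples are needed to decay below a given tolerance.

```python
import numpy as np
from scipy import signal

order, fc, fs = 6, 0.05, 50.0   # parameters from the post
tol = 0.01                      # target accuracy: 1%

# Design the filter in zero-pole-gain form so we can inspect the poles.
z, p, k = signal.butter(order, fc, btype='low', fs=fs, output='zpk')

# The discrete pole with magnitude closest to 1 decays slowest; the
# transient envelope shrinks roughly as |p_slowest|**n per sample.
slowest = np.max(np.abs(p))

# Solve |p_slowest|**n <= tol for n.
n_samples = int(np.ceil(np.log(tol) / np.log(slowest)))
print(f"~{n_samples} samples (~{n_samples / fs:.0f} s) to decay to {tol:.0%}")
```

This is a worst-case envelope bound (it ignores the residues and the phase of the oscillatory transient), but it turns the arbitrary warm-up into a number derived from the filter's own poles, and for these parameters it comes out well above 20 s.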
I'm currently using a completely arbitrary 20 s warm-up; I'd rather determine it less empirically.
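For a purely numerical check (again a sketch assuming a `scipy.signal.butter` design, not your actual implementation), you can feed the filter a unit step and measure directly when the output last leaves a tolerance band around its final value:

```python
import numpy as np
from scipy import signal

order, fc, fs = 6, 0.05, 50.0   # parameters from the post
tolerance = 0.01                # settle to within 1%

# Second-order sections are numerically safer for a 6th-order filter
# with a cutoff this far below Nyquist.
sos = signal.butter(order, fc, btype='low', fs=fs, output='sos')

# Step response: a unity-DC-gain low-pass should converge to 1.0.
n = int(fs * 600)               # simulate 10 minutes, ample margin
response = signal.sosfilt(sos, np.ones(n))

# Settling time: one sample past the last time the output is outside
# the tolerance band.
outside = np.abs(response - 1.0) > tolerance
settle_idx = int(np.nonzero(outside)[0][-1]) + 1 if outside.any() else 0
settle_time = settle_idx / fs
print(f"settles to within {tolerance:.0%} after "
      f"{settle_time:.1f} s ({settle_idx} samples)")
```

Running this for different tolerances gives you the warm-up time as a function of the accuracy you need, instead of a fixed guess.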
Suggestions?
TIA