careful now, i'm a bit thick.

i'm using a machine/software at work to collect data, and analyse it.

there are filters i can apply to reduce the noise/spikes in the data.

(if one data point says 0.05, but the other 500 say 0.01ish, then i don't care about the 0.05)

the filter has a setting - a standard deviation value.

(default setting = 3)

now, my less-than-confident understanding of 'standard deviation' is that the value is in the same units as the data (in this case, millimeters).

and standard-deviation is the square-root of the average-squared-deviation.
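so in symbols, i think that's (with x̄ being the mean of N readings, all in mm):

```latex
\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2}
```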

and it's given the name 'sigma'.

so, if that setting is in millimeters, a default of 3 will be letting all my data through - right?

and if i want to remove the spikes, i should set the filter to 0.02ish?

but i've got a conflicting picture; the bell-curve distribution diagram that keeps popping up on google says that +/- 1 sigma covers 68% of the data, and +/- 2 sigma covers 95%.
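that 68%/95% picture can actually be checked numerically - here's a quick simulation of normally-distributed readings around 0.01 mm (the spread of 0.002 mm is a made-up number, just for illustration), counting how many land within 1 and 2 sigma of the mean:

```python
import random
import statistics

random.seed(0)

# simulate 100,000 readings centred on 0.01 mm with 0.002 mm of scatter
xs = [random.gauss(0.01, 0.002) for _ in range(100_000)]

mu = statistics.mean(xs)
sigma = statistics.pstdev(xs)  # population standard deviation, in mm

within1 = sum(abs(x - mu) <= 1 * sigma for x in xs) / len(xs)
within2 = sum(abs(x - mu) <= 2 * sigma for x in xs) / len(xs)

print(f"within 1 sigma: {within1:.1%}")  # roughly 68%
print(f"within 2 sigma: {within2:.1%}")  # roughly 95%
```

so the percentages on the bell curve aren't a property of my particular data's units at all - they come from counting multiples of sigma, whatever sigma happens to be.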

so, what's the difference between my standard deviation (sigma) value of 0.01ish and the standard deviation curve sigma?

and what do i set my filter to?
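for what it's worth, here's my guess at what the filter might be doing, sketched in python - this is pure guesswork about the machine's software, treating the 'setting' as a multiplier of sigma rather than a value in mm:

```python
import statistics

# my data: ~500 points around 0.01 mm, plus one 0.05 mm spike
data = [0.01] * 500 + [0.05]

mu = statistics.mean(data)
sigma = statistics.pstdev(data)  # standard deviation, in mm

# guess: the filter setting (default 3) is a MULTIPLIER of sigma,
# not a value in mm - keep points within setting * sigma of the mean
setting = 3
kept = [x for x in data if abs(x - mu) <= setting * sigma]

print(f"sigma = {sigma:.5f} mm")
print(f"kept {len(kept)} of {len(data)} points")
```

if that guess is right, the default of 3 would already throw out the 0.05 spike (it sits dozens of sigmas from the mean), and i shouldn't be typing 0.02 into the box at all.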

if i get these numbers wrong, we won't be able to calculate our path through hyper-space, and we'll overshoot the earth by 12 light years...