I’m being thick, and could do with a hand from someone “in the know”.
I’ve never been good at maths, and thankfully most of the time my job doesn’t require me to do anything more complex than count up to 20, which I can do if I take my shoes and socks off.
I’m currently doing something that’s well outside my comfort zone, and I could really do with checking that I’m on the right lines before I make a complete arse of myself at work tomorrow!
I’m measuring some samples (F) by getting their mass (m) and measuring a particular property (DI), which I want to plot per unit mass (DI/m vs F).
So, I’ve done the measurements, and I’m aware that the weighing-out bit has an inherent error (I’m going to assume that the machine measuring DI doesn’t have an error).
So I’ve calculated the standard error of my masses and have a value (2.5×10^-5, for what it’s worth). But I’m confused about how to turn this error into an error bar for Excel to stick on my plot.
If m is actually m +/- 2.5×10^-5, do I work out what DI/(m + err) and DI/(m - err) are, calculate the standard deviation and then the standard error of the difference between the two, and that becomes the standard error of DI/m?
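In case it helps to see the numbers, here’s a quick Python sketch of what I mean (Python rather than Excel, just to sanity-check myself). The values are made up for illustration, not my real data; I’m comparing my “plug in m plus and minus the error” idea against the textbook propagation-of-error formula for a quotient, which (assuming DI is error-free) says the relative error of DI/m equals the relative error of m:

```python
# Made-up example values -- not my real data
m = 0.0125    # measured mass, in whatever units the balance gives
sem = 2.5e-5  # standard error of the mass
DI = 3.7      # measured property, assumed to have no error of its own

# My proposed route: evaluate DI/m at m - err and m + err,
# and take half the spread as the error bar on DI/m
hi = DI / (m - sem)  # dividing by the smaller mass gives the bigger value
lo = DI / (m + sem)
err_spread = (hi - lo) / 2

# Standard propagation-of-error route for a quotient with exact DI:
# err(DI/m) = (DI/m) * (sem/m)
err_prop = (DI / m) * (sem / m)

print(err_spread, err_prop)  # should agree closely when sem << m
```

For a small error like this (sem is tiny compared to m) the two routes give essentially the same number, so the error bar is just (DI/m) × (sem/m) for each sample.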
I appreciate I’m a bit slow, so please be gentle 🙂