Notes on Uncertainty, Error Propagation, and So Forth

Suppose you make a measurement of some quantity associated with an object (e.g. the mass of an object, its speed, its metabolic rate, etc.), and you just plot your data in a simple-minded way.
Figure 1. Northern fulmars, and the measured metabolic rates of a small sample of males and females.
There are of course important statistical measures of the measurements, such as mean, standard deviation, and standard error of the mean. These are discussed in some detail in the appendices.
Figure 2. Principal statistical measures of some quantity '$x$'.
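
To make these measures concrete, here is a minimal sketch in Python (the numbers are invented for illustration and are not the data of Figure 1):

```python
import math

# Hypothetical repeated measurements of some quantity x (arbitrary units).
x = [9.8, 10.1, 10.4, 9.9, 10.2]

n = len(x)
mean = sum(x) / n                                   # best value, x-bar
var = sum((xi - mean) ** 2 for xi in x) / (n - 1)   # sample variance
std = math.sqrt(var)                                # standard deviation
sem = std / math.sqrt(n)                            # standard error of the mean

print(f"mean = {mean:.2f}, std = {std:.2f}, SEM = {sem:.2f}")
```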

Four important questions:

  1. From the graph in Figure 1, what is $\overline{M}$, the mean metabolic rate for these two populations?
  2. From the calculated standard deviation, how many digits can we keep in the 'best value' of the measured quantity; that is, how many are 'significant'? Discuss.
  3. Knowing that the standard deviation itself is an estimate, how many digits of it should we keep (i.e., how many are significant)? Discuss.
  4. Express the best value for each population in the form $M = M_{best} \pm \Delta M$; is there a significant difference between the two populations? Is the discrepancy between the two measurements greater than, equal to, or less than the uncertainty in those two quantities? (A sketch addressing this follows the list.)
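
As a worked sketch of question 4, here are two invented samples standing in for the populations of Figure 1 (the values below are hypothetical, chosen only to show the comparison):

```python
import math

def summarize(sample):
    """Return (mean, standard error of the mean) for a list of values."""
    n = len(sample)
    mean = sum(sample) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in sample) / (n - 1))
    return mean, std / math.sqrt(n)

# Invented metabolic rates (arbitrary units) for two hypothetical samples.
males = [1.65, 1.72, 1.58, 1.80, 1.69]
females = [1.40, 1.35, 1.50, 1.44, 1.38]

m_best, dM_m = summarize(males)
f_best, dM_f = summarize(females)
print(f"M_males   = {m_best:.2f} +/- {dM_m:.2f}")
print(f"M_females = {f_best:.2f} +/- {dM_f:.2f}")

# Is the discrepancy bigger or smaller than the combined uncertainty?
discrepancy = abs(m_best - f_best)
combined = math.sqrt(dM_m**2 + dM_f**2)
print(f"discrepancy = {discrepancy:.2f} vs combined uncertainty = {combined:.2f}")
```

If the discrepancy comfortably exceeds the combined uncertainty, the difference is plausibly significant; if not, the two measurements are consistent with one another.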

Then there are circumstances that arise in the course of an experiment in which we need an estimate of uncertainty for a quantity calculated from measured quantities that are themselves uncertain. Maybe it's as simple as a speed multiplying a time to give us a distance (and we've measured somehow the speed and its uncertainty, and separately the time and its uncertainty, and so on). We ask generally: what is the uncertainty of a quantity that is a product of quantities, each with an uncertainty? Say, \begin{equation} c = (a \pm \Delta a)(b \pm \Delta b), \end{equation} what is $\Delta c$? If we may take the errors to be independent of one another, standard usage holds that the square of the relative error in $c$ is the sum of the squares of the relative errors in $a$ and $b$ (true for division too), that is, \begin{equation} \frac{\Delta c}{c} = \sqrt{ \left(\frac{\Delta a}{a}\right)^2 + \left(\frac{\Delta b}{b}\right)^2 }, \quad \text{or} \quad \Delta c = c \sqrt{ \left(\frac{\Delta a}{a}\right)^2 + \left(\frac{\Delta b}{b}\right)^2 }. \end{equation}

It gets a little (modestly) trickier if the function relating the variables is nonlinear. What if $c$ is a function of two somethings, I don't know, say $\rho$ and $\sigma$, \begin{equation} c = k\sqrt{\frac{\rho}{\sigma^3}}, \end{equation} where now one of the variables, $\sigma$, enters the relation with a negative exponent (in this case, $-3/2$)? If indeed the uncertainty estimates are independent, then we would estimate that the relative uncertainty in $c$ is \begin{equation} \frac{\Delta c}{c} = \sqrt{ \left(\frac{1}{2}\frac{\Delta \rho}{\rho}\right)^2 + \left(\frac{3}{2}\frac{\Delta \sigma}{\sigma}\right)^2 }, \end{equation} where care has been taken to use coefficients equal to the exponents represented as magnitudes, without their algebraic signs, lest we think that one uncertainty can subtract from another. And as before, you can simply multiply through by the 'best value' for $c$ to get the estimate of its uncertainty. The care to keep the coefficients positive may seem unnecessary given that they enter in as the square; however, the analysis that supports this estimate of the uncertainty requires it. It may be understood as the result of a first-order Taylor expansion.

To simplify the matter, suppose that $c = k/\sqrt{\sigma}$, where $k$ is some constant, and that we estimate the uncertainty with a Taylor expansion, truncating all but the 'linear term' in the expansion (which, by the way, is sometimes called 'linearizing' the equation, making a verb of a noun, which we do all the time in the English language): \begin{equation} c \pm \Delta c = \frac{k}{(\sigma \pm \Delta \sigma)^{1/2}} = \frac{k}{\sqrt{\sigma}\,\left(1 \pm \Delta \sigma/\sigma\right)^{1/2}} \approx c \left( 1 \mp \frac{1}{2} \frac{\Delta \sigma}{\sigma} + \cdots \right). \end{equation} We have dropped the higher-order terms in the Taylor expansion, assuming that $\Delta \sigma/\sigma \ll 1$ (and in any case we take this only as an estimate). Multiplying through by $c$ on the right-hand side, $c$ is common to both sides and can be subtracted, leaving, for the magnitude of the uncertainty, \begin{equation} \Delta c = \frac{1}{2}\, c\, \frac{\Delta \sigma}{\sigma}. \end{equation}
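
To see the linearization at work numerically, here is a small sketch (the values of $k$, $\sigma$, and $\Delta \sigma$ are arbitrary) comparing the first-order estimate of $\Delta c$ with a direct evaluation at $\sigma - \Delta \sigma$:

```python
import math

k, sigma, dsigma = 2.0, 4.0, 0.2     # arbitrary illustrative values

c = k / math.sqrt(sigma)
dc_linear = 0.5 * c * dsigma / sigma            # first-order Taylor estimate

# Direct evaluation: shift sigma by dsigma and see how far c moves.
dc_direct = abs(k / math.sqrt(sigma - dsigma) - c)

print(f"c = {c:.4f}")
print(f"linearized dc = {dc_linear:.4f}")
print(f"direct     dc = {dc_direct:.4f}")      # close, since dsigma/sigma << 1
```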
Whenever there are multiple variables, they all contribute in a sum; if the errors are independent, the contributions may be thought of as components of an abstract vector, the sum of whose squares is the squared magnitude of $\Delta c/c$, and hence the result shown in equation 4.
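
The vector picture suggests a general recipe for any product of powers, $c = k\,\prod_i x_i^{n_i}$: add the contributions $n_i\,\Delta x_i/x_i$ in quadrature, as if they were orthogonal components. A minimal sketch, with a function name and interface of my own invention:

```python
import math

def relative_uncertainty(terms):
    """Quadrature sum for c = k * prod(x_i ** n_i).

    terms: iterable of (exponent, value, uncertainty) triples.
    Each contribution n_i * dx_i / x_i is treated as one component
    of an abstract vector; the result is Delta c / c.
    """
    return math.sqrt(sum((n * dx / x) ** 2 for n, x, dx in terms))

# Example: c = k * sqrt(rho / sigma**3), exponent magnitudes 1/2 and 3/2.
rho, drho = 1000.0, 20.0       # arbitrary illustrative values
sigma, dsigma = 0.07, 0.002

rel = relative_uncertainty([(0.5, rho, drho), (1.5, sigma, dsigma)])
print(f"Delta c / c = {rel:.4f}")  # multiply by the best value of c for Delta c
```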