Calculating absolute and percentage error in temperature measurement?

Not sure how to do this, anyone got any ideas?

3 Answers

  • Bert K
    Lv 7
    10 years ago
    Favorite Answer

    Absolute Error is when you subtract the accepted value from your measured value…

    Absolute Error = Measured Value - Accepted Value

    * A positive answer means you are over the accepted value.

    * A negative answer means you are under the accepted value.

    Percentage Error is the most common way of expressing an error, and often the easiest to understand.

    Percentage Error = (Absolute Error / Accepted Value) × 100%

    So, if you measured a pencil to be 102mm long, and an independent lab with high tech equipment measured it as 104mm, the percentage error is…

    Percentage Error = (102mm - 104mm) / (104mm) × 100% ≈ -2%

    So you got about a -2% error. The minus sign just means that you were under the accepted value.

    In high school labs, don’t be surprised if you obtain errors of 25%. The important part is whether you can explain your errors!
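
    If it helps to see the arithmetic spelled out, here is a minimal Python sketch of the two formulas above, applied to the pencil example (the function and variable names are just for illustration):

    # Minimal sketch of the formulas above (names are illustrative, not standard)
    def absolute_error(measured, accepted):
        return measured - accepted

    def percentage_error(measured, accepted):
        # signed error as a percent of the accepted value
        return absolute_error(measured, accepted) / accepted * 100

    measured_mm = 102   # your pencil measurement
    accepted_mm = 104   # the lab's accepted value

    print(absolute_error(measured_mm, accepted_mm))    # -2 (mm), i.e. under the accepted value
    print(percentage_error(measured_mm, accepted_mm))  # about -1.9 (%)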


  • 3 years ago

    Percent Error Calculation

  • 10 years ago

    Bert is correct except for one point:

    Temperature in °F or °C has an arbitrary zero point, so a percentage error is meaningless. If you really want a percent, you have to convert the temperatures to a scale with a true zero, such as kelvin (K) or degrees Rankine (°R).

    One example will show this. If your thermometer has a 1° error, then for a 100° measurement, that is a 1% error. For a measurement at 1°, that changes to a 100% error, and at 0°, an infinite percent error.
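
    As a rough Python sketch of that point (the function name is just for illustration), converting Celsius readings to kelvin before dividing gives a percent error measured from a true zero:

    # Sketch: put readings on an absolute scale (kelvin) before taking the percent error
    def percent_error_from_celsius(measured_c, accepted_c):
        measured_k = measured_c + 273.15   # convert °C to K
        accepted_k = accepted_c + 273.15
        return (measured_k - accepted_k) / accepted_k * 100

    # A 1° error at an accepted value of 0 °C would be an "infinite" percent error in °C,
    # but on the kelvin scale it is a small, meaningful percentage.
    print(percent_error_from_celsius(1.0, 0.0))   # about 0.37 (%)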

