Tuesday, May 19, 2020

What Is the Difference Between Accuracy and Precision?

Accuracy and precision are two important factors to consider when taking data measurements. Both reflect how close a measurement is to an actual value, but accuracy describes how close a measurement is to a known or accepted value, while precision describes how reproducible measurements are, even if they are far from the accepted value.

Key Takeaways: Accuracy Versus Precision

- Accuracy is how close a value is to its true value. An example is how close an arrow gets to the bulls-eye center.
- Precision is how repeatable a measurement is. An example is how close a second arrow is to the first one (regardless of whether either is near the mark).
- Percent error is used to assess whether a measurement is sufficiently accurate and precise.

You can think of accuracy and precision in terms of hitting a bulls-eye. Accurately hitting the target means you are close to the center of the target, even if all the marks are on different sides of the center. Precisely hitting a target means all the hits are closely spaced, even if they are very far from the center of the target. Measurements that are both precise and accurate are repeatable and very near the true value.

Accuracy

There are two common definitions of accuracy. In math, science, and engineering, accuracy refers to how close a measurement is to the true value. The ISO (International Organization for Standardization) applies a more rigid definition, in which accuracy refers to a measurement with both true and consistent results. The ISO definition means an accurate measurement has no systematic error and no random error. Essentially, the ISO advises that the term accurate be used only when a measurement is both accurate and precise.

Precision

Precision is how consistent results are when measurements are repeated. Precise values differ from each other because of random error, which is a form of observational error.

Examples

You can think of accuracy and precision in terms of a basketball player. If the player always makes a basket, even though he strikes different portions of the rim, he has a high degree of accuracy. If he doesn't make many baskets but always strikes the same portion of the rim, he has a high degree of precision. A player whose free throws always make the basket the exact same way has a high degree of both accuracy and precision.

Take experimental measurements for another example of precision and accuracy. If you take measurements of the mass of a 50.0-gram standard sample and get values of 47.5, 47.6, 47.5, and 47.7 grams, your scale is precise but not very accurate. If your scale gives you values of 49.8, 50.5, 51.0, and 49.6 grams, it is more accurate than the first balance but not as precise. The more precise scale would be better to use in the lab, provided you made an adjustment for its error (a short code sketch below works through these numbers).

Mnemonic to Remember the Difference

An easy way to remember the difference between accuracy and precision is:

- ACcurate is Correct (or Close to the real value)
- PRecise is Repeating (or Repeatable)

Accuracy, Precision, and Calibration

Do you think it's better to use an instrument that records accurate measurements or one that records precise measurements? If you weigh yourself on a scale three times and each time the number is different, yet it's close to your true weight, the scale is accurate. Yet it might be better to use a scale that is precise, even if it is not accurate. In this case, all the measurements would be very close to each other and off from the true value by about the same amount.
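To put numbers on the scale example above, here is a minimal Python sketch. The readings come from the example; the describe helper, and its use of the mean offset and sample standard deviation as rough stand-ins for accuracy and precision, are illustrative choices rather than a standard method.

```python
from statistics import mean, stdev

# Hypothetical repeated readings of a 50.0 g standard,
# taken from the scale example above.
TRUE_MASS = 50.0
precise_scale = [47.5, 47.6, 47.5, 47.7]   # tight spread, but consistently low
accurate_scale = [49.8, 50.5, 51.0, 49.6]  # centered near 50.0, wider spread

def describe(readings, true_value):
    """Return (offset, spread): the mean's offset from the true value
    as a stand-in for accuracy, and the sample standard deviation
    as a stand-in for precision."""
    offset = mean(readings) - true_value
    spread = stdev(readings)
    return offset, spread

for name, readings in [("precise scale", precise_scale),
                       ("accurate scale", accurate_scale)]:
    offset, spread = describe(readings, TRUE_MASS)
    print(f"{name}: mean offset = {offset:+.2f} g, spread = {spread:.2f} g")

# Because the precise scale is off by roughly the same amount every time,
# subtracting its known offset makes its readings accurate as well.
offset, _ = describe(precise_scale, TRUE_MASS)
corrected = [round(r - offset, 2) for r in precise_scale]
print("corrected precise-scale readings:", corrected)
```

The precise scale is low by roughly the same amount on every reading, which is exactly the kind of consistent, correctable error discussed next.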
Being consistently off by a fixed amount is a common issue with scales, which often have a tare button to zero them. While scales and balances might allow you to tare or make an adjustment so measurements are both accurate and precise, many instruments require calibration. A good example is a thermometer. Thermometers often read more reliably within a certain range and give increasingly inaccurate (but not necessarily imprecise) values outside that range. To calibrate an instrument, record how far off its measurements are from known or true values. Keep a record of the calibration to ensure proper readings. Many pieces of equipment require periodic calibration to ensure accurate and precise readings.

Learn More

Accuracy and precision are only two of the important concepts used in scientific measurements. Two other important skills to master are significant figures and scientific notation. Scientists use percent error as one method of describing how accurate and precise a value is. It's a simple and useful calculation.
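As a closing example, here is the usual percent error formula in a short Python sketch (the function name percent_error is just illustrative), applied to one reading of the 50.0-gram standard from earlier:

```python
def percent_error(measured, accepted):
    """Percent error = |measured - accepted| / |accepted| * 100."""
    return abs(measured - accepted) / abs(accepted) * 100

# One 47.5 g reading of the 50.0 g standard is off by 2.5 g, i.e. 5%.
print(f"{percent_error(47.5, 50.0):.1f}%")  # prints 5.0%
```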
