What Is the Difference Between Accuracy and Precision?
Accuracy is close to a known value; precision measures repeatability

By Anne Marie Helmenstine, Ph.D. | Updated November 02, 2020

Accuracy and precision are two important factors to consider when taking data measurements. Both reflect how close a measurement is to an actual value, but accuracy describes how close a measurement is to a known or accepted value, while precision describes how reproducible measurements are, even if they are far from the accepted value.

Key Takeaways: Accuracy Versus Precision
- Accuracy is how close a value is to its true value. An example is how close an arrow gets to the bull's-eye center.
- Precision is how repeatable a measurement is. An example is how close a second arrow is to the first one (regardless of whether either is near the mark).
- Percent error is used to assess how accurate a measurement is.

You can think of accuracy and precision in terms of hitting a bull's-eye. Accurately hitting the target means you are close to the center of the target, even if the marks are on different sides of the center. Precisely hitting a target means all the hits are closely spaced, even if they are very far from the center of the target. Measurements that are both precise and accurate are repeatable and very near the true value.

Accuracy
There are two common definitions of accuracy. In math, science, and engineering, accuracy refers to how close a measurement is to the true value. The ISO (International Organization for Standardization) applies a stricter definition, in which accuracy refers to a measurement with both true and consistent results. Under the ISO definition, an accurate measurement has no systematic error and no random error. Essentially, the ISO advises that the term accurate be used only when a measurement is both accurate and precise.

Precision
Precision is how consistent results are when measurements are repeated. Precise values differ from each other only because of random error, which is a form of observational error.

Examples
You can think of accuracy and precision in terms of a basketball player. If the player always makes a basket, even though he strikes different portions of the rim, he has a high degree of accuracy. If he doesn't make many baskets but always strikes the same portion of the rim, he has a high degree of precision. A player whose free throws always make the basket the exact same way has a high degree of both accuracy and precision.

Experimental measurements offer another example of precision and accuracy. You can tell how close a set of measurements is to a true value by averaging them.
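To make this concrete, here is a minimal Python sketch (the code and names are illustrative additions, not part of the original article) that averages the two sets of balance readings from the worked example that follows and reports their spread: a mean near the true value indicates accuracy, while a small range indicates precision.

```python
from statistics import mean

TRUE_MASS = 50.0  # grams: the known mass of the standard sample

# The two sets of balance readings from the worked example below (grams)
precise_scale = [47.5, 47.6, 47.5, 47.7]
accurate_scale = [49.8, 50.5, 51.0, 49.6]

def describe(name, readings):
    """Report the mean (reflects accuracy) and range (reflects precision)."""
    avg = mean(readings)
    spread = max(readings) - min(readings)
    print(f"{name}: mean = {avg:.1f} g (true value {TRUE_MASS} g), "
          f"range = {spread:.1f} g")

describe("Scale A (precise, inaccurate)", precise_scale)
describe("Scale B (accurate, imprecise)", accurate_scale)
```

Scale A's readings cluster tightly (range 0.2 g) around the wrong value; Scale B's mean lands near 50.0 g but the readings scatter more widely (range 1.4 g).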
If you measure the mass of a 50.0-gram standard sample and get values of 47.5, 47.6, 47.5, and 47.7 grams, your scale is precise, but not very accurate. The average of your measurements is 47.6 grams, which is lower than the true value, yet your measurements were consistent. If your scale gives you values of 49.8, 50.5, 51.0, and 49.6 grams, it is more accurate than the first scale but not as precise: the average of the measurements is 50.2 grams, but there is a much larger range between them. The more precise scale would be the better one to use in the lab, provided you adjust for its error. In other words, it's better to calibrate a precise instrument than to use an imprecise, yet accurate, one.

Mnemonic to Remember the Difference
An easy way to remember the difference between accuracy and precision:
- ACcurate is Correct (or Close to the real value)
- PRecise is Repeating (or Repeatable)

Accuracy, Precision, and Calibration
Do you think it's better to use an instrument that records accurate measurements or one that records precise measurements? If you weigh yourself on a scale three times and each time the number is different yet close to your true weight, the scale is accurate. Even so, it might be better to use a scale that is precise, even if it is not accurate: all the measurements would be very close to each other and "off" from the true value by about the same amount, so you could correct for the offset. This is a common issue with scales, which often have a "tare" button to zero them.

While scales and balances might allow you to tare or make an adjustment so that measurements are both accurate and precise, many instruments require calibration. A good example is a thermometer. Thermometers often read reliably within a certain range and give increasingly inaccurate (but not necessarily imprecise) values outside that range. To calibrate an instrument, record how far off its measurements are from known or true values, and keep a record of the calibration to ensure proper readings. Many pieces of equipment require periodic calibration to ensure accurate and precise readings.

Learn More
Accuracy and precision are only two of the important concepts used in scientific measurements. Two other important skills to master are significant figures and scientific notation. Scientists use percent error as one method of describing how accurate a value is. It's a simple and useful calculation.
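As a sketch of that calculation (the function name is my own, not from the article), percent error is the distance between a measured value and the accepted value, expressed as a percentage of the accepted value:

```python
def percent_error(measured: float, accepted: float) -> float:
    """Percent error: |measured - accepted| as a percentage of the accepted value."""
    return abs(measured - accepted) / abs(accepted) * 100

# Using the average reading from the precise-but-inaccurate scale above:
print(f"{percent_error(47.6, 50.0):.1f}%")  # prints 4.8%
```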