
Accuracy and Precision in Measurements

Author: Ryan Backman

Determine accuracy and precision in measurements.



Video Transcription

Hi. This tutorial covers accuracy and precision in measurement. We'll start with the classic example of accuracy versus precision. Say you have two archers, each shooting arrows and aiming for the bullseye on a target. Assume the first archer takes a few shots, and all of them land in one small area; the blue marks represent the first archer's shots.

Now let's look at the second archer, whose shots are the red dots. The first archer has high precision but low accuracy. He's very precise, because all of his shots ended up in the same small area, but his accuracy is low because none of his shots were close to the bullseye.

The second archer, on the other hand, has fairly high accuracy, since most of his shots landed close to the bullseye, but low precision. His shots are scattered and not very predictable: some landed in the gray region, some in the white region. Compare that back to the first archer, who was very precise.

Now for formal definitions of those two terms. Accuracy is the degree of closeness of a measurement to its true value. If we treat the bullseye as the true value, the first archer has low accuracy because none of his shots are close to that true value, while the second archer has higher accuracy because most of his shots are close to the target value.

Precision is the degree to which repeated measurements show the same result. The first archer's shots were all clustered in the same area, so his repeated measurements showed the same result. The second archer's precision was lower because his shots were less predictable; his repeated measurements did not show the same result.
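The archer example can be sketched numerically. Below is a minimal illustration using made-up repeated measurements of a quantity whose true value is assumed to be 10 (the "bullseye"): the offset of the mean from the true value reflects accuracy, and the standard deviation of the repeated measurements reflects precision.

```python
import statistics

true_value = 10.0  # the "bullseye": the quantity being measured

# Hypothetical repeated measurements, analogous to the two archers' shots
archer1 = [12.1, 12.2, 12.0, 12.1, 12.2]  # tightly clustered, but off-target
archer2 = [9.2, 10.9, 10.1, 9.6, 10.4]    # centered on target, but scattered

for name, shots in [("archer 1", archer1), ("archer 2", archer2)]:
    offset = statistics.mean(shots) - true_value  # accuracy: closeness of the center to the true value
    spread = statistics.stdev(shots)              # precision: agreement among repeated measurements
    print(f"{name}: mean offset = {offset:+.2f}, spread = {spread:.2f}")
```

Archer 1 comes out with a large mean offset but a small spread (precise, not accurate), while archer 2 has a near-zero mean offset but a larger spread (accurate, not precise).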

Now think about this in statistical terms. When taking measurements in statistics, high precision and high accuracy are ideal. If we had both high accuracy and high precision, all of the dots would be right in the middle of the target and very close to each other. We want the same thing when we take measurements for a statistical study.

Low accuracy in measurement introduces possible bias to the data: when accuracy is low, the measurements systematically miss the true value, so you can't rely on them as much.

Low precision in measurement introduces variability to the data, which may also make the data unreliable. So low accuracy causes bias, and low precision causes variability. Both of those things are generally bad when we're doing a statistical study. Ideally, when we're taking measurements, we want to be highly precise and highly accurate. So that is the tutorial on accuracy and precision in measurement. Thanks for watching.

Terms to Know

Accuracy
The extent to which the values, when considered all together, center around the correct value for a variable.

Precision
The extent to which the values are very close to each other, even if they are not near the correct value.