Hi everybody!
This might seem like a terribly easy question, but I can't seem to figure it out for the life of me. I've been given a set of values for time and velocity:
t (s)    v (m/s)
 0         0
 3         7
 7        16
12        33
18        48
23        53
27        67
34        86
I've also been told that the errors, Δt = 0.2 s and Δv = 3.0 m/s, are the same for all experimental points.
How would I go about determining the error for each value individually (since I'll have to graph the points)? The question asks for the acceleration and its error, and the method I've been given seems to involve plotting each point with its associated error bars.
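Just to make my attempted method concrete, here's a rough sketch of how I'd plot the points with their error bars and pull a slope out of them, using NumPy and Matplotlib (this is only my own illustration, not anything given in the assignment):

```python
import numpy as np
import matplotlib.pyplot as plt

# Experimental data from the table above
t = np.array([0, 3, 7, 12, 18, 23, 27, 34], dtype=float)   # s
v = np.array([0, 7, 16, 33, 48, 53, 67, 86], dtype=float)  # m/s

# The same uncertainty applies to every point
dt = 0.2  # s
dv = 3.0  # m/s

# Straight-line least-squares fit v = a*t + v0; the slope a is the acceleration
(a, v0), cov = np.polyfit(t, v, 1, cov=True)
da = np.sqrt(cov[0, 0])  # standard error of the slope, from the fit's covariance

print(f"a = {a:.2f} +/- {da:.2f} m/s^2")

# Plot the data with its error bars and overlay the fitted line
plt.errorbar(t, v, xerr=dt, yerr=dv, fmt="o", capsize=3, label="data")
plt.plot(t, a * t + v0, "-", label="least-squares fit")
plt.xlabel("t (s)")
plt.ylabel("v (m/s)")
plt.legend()
plt.show()
```

As I understand it, the fit's slope is the acceleration, and the spread of slopes still consistent with the error bars gives its error, but that's exactly the part I'm unsure about.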
But couldn't I use the slope of my graph (plotting the points without the error bars) to determine the acceleration, take the acceleration calculated from 3.0/0.2 as my expected value, and apply the relative-error equation, |my value − standard value| / standard value?
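In numbers, the comparison I have in mind would be something like this (the fitted value here is only illustrative):

```python
a_fit = 2.5             # slope from a straight-line fit like the one above (m/s^2); illustrative
a_expected = 3.0 / 0.2  # = 15 m/s^2, the ratio I was tempted to treat as the "standard" value

# The relative-error comparison I describe, applied to the two numbers above
relative_difference = abs(a_fit - a_expected) / a_expected
print(f"relative difference = {relative_difference:.1%}")
```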
And yet, I've been explicitly told not to consider this difference an error.
Please help! I'm completely confused!