If you drive one mile at 120 mph and one mile at 80 mph, what is your average speed?
120 + 80 = 200
200/2 = 100
The average in this case is 100.
Basically, add all the numbers together, then divide by how many there are.
20, 20, 30, 40, 40: 150/5 = 30 :D
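That recipe is the ordinary arithmetic mean. If you want to check it with a couple of lines of Python (just a rough sketch using the numbers from the example above):

    speeds = [20, 20, 30, 40, 40]
    mean = sum(speeds) / len(speeds)   # 150 / 5
    print(mean)                        # 30.0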
It's not as simple as averaging the two speeds together. What if you traveled the first mile at 120 mph, then the second mile at 0.001 mph? If you average those two numbers together, you'd get something just barely over 60 mph. However, it should be pretty obvious that it's going to take you a long, long time to drive that second mile. In fact, it will take 1000 hours, which is nearly 6 weeks! If it takes you 6 weeks to travel 2 miles, you're clearly not averaging 60 mph!
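A quick sanity check on that figure (a rough sketch in Python):

    hours = 1 / 0.001          # time = distance / speed = 1000 hours
    weeks = hours / (24 * 7)   # about 5.95 weeks, i.e. nearly 6 weeks
    print(hours, weeks)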
You have to calculate the speed using d=rt (distance = rate * time).
How long does it take to travel a mile at 120 mph? Time = distance/rate, so 1 mile/(120 mph) = 1/120 of an hour (i.e. 30 seconds).
How about the time to travel a mile at 80 mph? 1/80 of an hour (i.e. 45 seconds).
So what's the total time? 1/120 + 1/80 = 1/48 hr (i.e. 75 seconds).
So now what's the average speed? Average speed = (Total distance traveled) / (total time) = (2 miles) / (1/48 hr) = 96 mph.
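Putting the whole calculation into a short Python sketch (the function name average_speed is just for illustration):

    def average_speed(d1, s1, d2, s2):
        """Average speed over two legs = total distance / total time."""
        total_distance = d1 + d2
        total_time = d1 / s1 + d2 / s2    # time for each leg is distance / rate
        return total_distance / total_time

    print(average_speed(1, 120, 1, 80))      # about 96 mph, as worked out above
    print(average_speed(1, 120, 1, 0.001))   # roughly 0.002 mph, nowhere near 60

With equal distances this works out to the harmonic mean of the two speeds, not the simple average, which is why the answer (96) comes out lower than 100.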