A pilot plans to make a flight lasting 2 hours and 30 minutes. How far can he fly from the airport at the rate of 600 mph and return over the same route at the rate of 400 mph?
Does anyone have any ideas?
Do you mean the plane has to go both ways within 2 hours and 30 minutes?
Yes
I would guess it to be 1250 miles
The easy way would be averaging:
The average of 400 and 600 is 500
Then you could substitute.
The pilot flies for 2 1/2 hours at a speed of 500 mph.
500 × 2 1/2 = 1250 miles
Then to find out how far each way, you use ratios
The ratio in the original question was 600/400 or 6/4
This ratio easily goes into 1250
6 + 4 = 10, so 1250/10 = 125; then 4 × 125 = 500 and 6 × 125 = 750
So, he went 750 miles going one way and 500 miles going the other way
Now, you check:
750 miles at 600 mph takes 1 1/4 hours
500 miles at 400 mph takes 1 1/4 hours
1 1/4 + 1 1/4 hours equals 2 1/2 hours
By checking you've also found out the duration of going and coming
Ask about something if I went too fast for you.
No that's perfect. Thank you
Out rate = 600 mph, in rate = 400 mph, total time = 2 hours 30 min = 2 1/2 hours = 5/2 hours. If he flies outbound for t hours, his outbound distance is 600t and his inbound distance is 400(5/2 - t). So 600t = 400(5/2 - t) = 1000 - 400t, so 1000t = 1000, t = 1 hour. And distance in and out is 600 * 1 = 600 miles.
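Elisha's algebra can be verified with a few lines of Python (my own sketch; the variable names are mine, not from the thread):

```python
# Solve 600*t = 400*(5/2 - t) for the outbound time t, then compute
# the one-way distance. Rearranging: 1000t = 1000, so t = 1.
OUT_SPEED = 600     # mph, outbound
IN_SPEED = 400      # mph, inbound
TOTAL_TIME = 5 / 2  # hours

t = IN_SPEED * TOTAL_TIME / (OUT_SPEED + IN_SPEED)  # outbound hours
distance = OUT_SPEED * t                            # one-way miles

print(t)         # 1.0 hour outbound
print(distance)  # 600.0 miles each way
```

Note that the inbound leg then takes 600/400 = 1.5 hours, and 1 + 1.5 = 2.5 hours, matching the total flight time.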
Hi Elisha, again you seem to have gone down the wrong path here. You can easily see that 600 miles is wrong.
If you go 600 mph (as the question states), then that will only take 1 hour. However, the question states that he is flying for 2 1/2 hours.
Could you refrain from reanswering questions that are already well answered, unless you have something to add?
Thank you!
I'm only answering the questions that have been incorrectly answered before. In this question, outward bound the pilot flies 600 mph for 1 hour and travels 600 * 1 = 600 miles; inward bound he travels 400 mph for one hour and thirty minutes and travels 400 * 3/2 = 600 miles.
Elisha's answer is correct... you can't average the speeds for this, because you are travelling over the same amount of ground at the two different rates. You can only average speeds if the time spent is the same for each speed.
600 miles is the correct answer, as the question asks how far from the airport the pilot will get, which is at the end of the 600 mph leg.
Worthbeads' answer is clearly incorrect because the distances out and back must be equal to return to the starting point.
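The equal-distance requirement can be checked directly. A minimal Python sketch (the helper `leg_times` is my own illustration, not from the thread):

```python
def leg_times(d_out, d_back, v_out=600, v_back=400):
    """Return (outbound hours, inbound hours) for the given leg distances."""
    return d_out / v_out, d_back / v_back

# Worthbeads' split: the times do add up to 2 1/2 hours, but the legs
# are unequal, so the pilot would not end up back at the airport.
t1, t2 = leg_times(750, 500)
assert t1 + t2 == 2.5
assert 750 != 500  # unequal legs: not a round trip over the same route

# Elisha's answer: equal 600-mile legs, and the times still total 2 1/2 hours.
t1, t2 = leg_times(600, 600)
assert t1 == 1.0 and t2 == 1.5
assert t1 + t2 == 2.5
```

Both splits satisfy the time budget; only the equal-leg split also gets the pilot home, which is why the out-and-back distances must match.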
Ah okay, in that case you are right, you just didn't explain it very well :)
And distance in and out is 600 * 1 = 600 miles.
I took this to mean that the total distance travelled is 600 miles.
I think that you have to multiply 600 by the number of hours. For example, the flight was for 2 hours and 30 minutes, so 600 × 2 = 1200. Then you have to divide 600 by 2 because of the 30 minutes left over from the flight, so you do 600/2 = 300. Then you add 1200 + 300 = 1500. He would travel 1500 miles.
Quote: Originally Posted by mathgrl_3905
Again, this is what I think on how you should solve this problem.