A driver completed the first 20 miles of a 40-mile trip at an average speed of 50 miles per hour. At what average speed must the driver complete the remaining 20 miles to achieve an average speed of 60 miles per hour for the entire 40-mile trip? (Assume that the driver did not make any stops during the 40-mile trip.)
A. 65 mph
B. 68 mph
C. 70 mph
D. 75 mph
E. 80 mph


Answer:

Step-by-step explanation:

D. 75 mph

I am not sure if I did this the easiest way, but here's how I went about it.

40 miles / 60 mph = total trip time required for a 60 mph average

20 miles / 50 mph = time it took to complete the first 20 miles

20 miles / X mph = time it took to complete the second 20 miles

therefore

40 miles / 60 mph = 20 miles / 50 mph + 20 miles / X mph

Solve for X and you get 75 mph, so the answer is D.
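
Carrying that algebra through explicitly (all times in hours):

[tex]\frac{40\,mi}{60\,\frac{mi}{h}} = \frac{20\,mi}{50\,\frac{mi}{h}} + \frac{20\,mi}{X}[/tex]

[tex]\frac{20\,mi}{X} = \frac{2}{3}\,h - \frac{2}{5}\,h = \frac{4}{15}\,h[/tex]

[tex]X = \frac{20\,mi}{\frac{4}{15}\,h} = 75\,\frac{mi}{h}[/tex]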

The driver must average 75 miles per hour over the last 20 miles of the entire 40-mile trip. (Answer: D)

To determine the resulting Average Speed for the entire trip, we need to apply the definition of average speed, that is, total distance divided by total time. (With equal leg distances this is the harmonic mean of the two speeds, not their distance-weighted arithmetic mean.) The expression for this case is presented below:

[tex]\bar v_{r} = \frac{x_{1}+x_{2}}{\frac{x_{1}}{\bar v_{1}}+\frac{x_{2}}{\bar v_{2}}}[/tex] (1)

Where:

[tex]\bar v_{r}[/tex] - Average speed for the entire trip, in miles per hour.

[tex]\bar v_{1}[/tex] - Average speed for the first 20 miles, in miles per hour.

[tex]\bar v_{2}[/tex] - Average speed for the last 20 miles, in miles per hour.

[tex]x_{1}, x_{2}[/tex] - Distances of the first and last legs of the trip, in miles.

If we know that [tex]\bar v_{r} = 60\,\frac{mi}{h}[/tex], [tex]\bar v_{1} = 50\,\frac{mi}{h}[/tex] and [tex]x_{1} = x_{2} = 20\,mi[/tex], then we find the average speed for the last 20 miles of the 40-mile trip:

[tex]60\,\frac{mi}{h} = \frac{40\,mi}{\frac{20\,mi}{50\,\frac{mi}{h}}+\frac{20\,mi}{\bar v_{2}}}[/tex]

[tex]\bar v_{2} = 75\,\frac{mi}{h}[/tex]
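
As a check, these speeds do give the required 60 mph overall, since the total travel time works out to the 2/3 hour (40 minutes) that a 60 mph average allows:

[tex]t = \frac{20\,mi}{50\,\frac{mi}{h}} + \frac{20\,mi}{75\,\frac{mi}{h}} = \frac{2}{5}\,h + \frac{4}{15}\,h = \frac{2}{3}\,h[/tex]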

Therefore, the driver must maintain an average speed of 75 miles per hour over the last 20 miles of the 40-mile trip, which corresponds to choice D.

For further detail, please see the following link to a related question: https://brainly.com/question/18554478