If you rent a car for one day and drive it 100 miles, the cost is $40.00; if you drive it 220 miles, the cost is $46.00. Use a linear function to find out how much you will pay to rent the car for one day if you drive it 300 miles.

Answer:

The cost rises by $6.00 (from $40.00 to $46.00) over 120 miles (from 100 to 220 miles). If you divide $6.00 by 120, you come up with 5 cents per mile. So the slope of the line (m) is 0.05, or 1/20.

We also know that it costs $40.00 at 100 miles, so when x = 100, y = 40. If you multiply 100 by 0.05 you get 5, which means the cost goes up $5.00 over those 100 miles. So if you take your original y = 40 ($40.00 for 100 miles) and subtract 5, you get y = 35 at 0 miles, and that gives you your y-intercept (b).

So your equation would be y = (1/20)x + 35.
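To see the same derivation in code, here is a minimal Python sketch that recovers the slope and intercept from the two given points (the function name is just illustrative, not part of the original answer):

```python
# A minimal sketch verifying the slope and intercept derived above.
# The two points (100, 40) and (220, 46) come from the problem statement.

def linear_from_points(x1, y1, x2, y2):
    """Return (slope, intercept) of the line through two points."""
    m = (y2 - y1) / (x2 - x1)  # slope: rise over run
    b = y1 - m * x1            # intercept: solve y = m*x + b for b
    return m, b

m, b = linear_from_points(100, 40.00, 220, 46.00)
print(m, b)  # 0.05 35.0, i.e. y = (1/20)x + 35
```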

Now, if x = 300 miles, then

y = (1/20)(300) + 35

y = 15 + 35

y = 50, so it would cost $50.00 to drive the rental car 300 miles.
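As a quick sanity check, evaluating the same equation in Python confirms the answer:

```python
# Evaluate y = (1/20)x + 35 at x = 300 miles (values from the answer above).
m, b = 1 / 20, 35
print(f"${m * 300 + b:.2f}")  # $50.00
```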
It will cost $50.00. If you need me to explain, I can.