Say the distance traveled is 7.2 meters and the time it took was 0.28 seconds. To find the average speed, you divide the distance (7.2 meters) by the time (0.28 seconds), which gives about 25.7 meters per second. How can it be 25.7 meters per second if the total distance traveled was only 7.2 meters? The number of meters in the average speed is greater than the distance itself. Shouldn't the meters per second be much smaller than the actual distance? Why is it greater? Could someone please help me understand this?
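Just to show the division I mean, here is the calculation in Python (using the numbers from my example; the variable names are my own):

```python
# Average speed = distance / time, with the values from the example
distance_m = 7.2   # meters traveled
time_s = 0.28      # seconds elapsed

avg_speed = distance_m / time_s  # meters per second
print(round(avg_speed, 1))       # → 25.7
```

So the math itself checks out; it's the result being bigger than the distance that confuses me.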