To figure this out we simply divide distance by time. The runner covered 11 miles in 2 hours, so we take the miles run (11) and divide by the time it took. In minutes, 2 hours is written as 120 (since there are 60 minutes in an hour!), so 11 ÷ 120 ≈ 0.09.
So, the average speed of the runner is about 0.09 miles per minute (in hours, that's 11 ÷ 2 = 5.5 miles per hour).
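If you want to double-check the division yourself, here's a quick Python sketch of the same calculation (the variable names are just for illustration):

```python
# Average speed = total distance / total time
distance_miles = 11       # miles run
time_minutes = 2 * 60     # 2 hours converted to minutes

speed_per_minute = distance_miles / time_minutes
print(speed_per_minute)   # ~0.0917 miles per minute, rounds to 0.09

# Same calculation kept in hours gives the more familiar figure:
speed_mph = distance_miles / 2
print(speed_mph)          # 5.5 miles per hour
```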
Disclaimer: this does NOT mean the runner held a constant speed of 0.09 miles per minute the whole time; it's just the average!
Hope this helps you out!