Since we want to convert 1/5 to a decimal, we have to change the denominator to 10 (the tenths place). To do that, we multiply both the numerator and the denominator by 2.
(1*2) / (5*2) = 2/10 = 0.2
Since the 2 is over 10, the 2 goes in the tenths place.
Your answer is 0.2
(If it were 2 over 100, then the 2 would be in the hundredths place: 0.02.)
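If you want to check this kind of conversion yourself, here is a quick Python sketch of the same idea (scaling the fraction so the denominator is a power of ten, then reading off the decimal). The variable names are just for illustration.

```python
from fractions import Fraction

frac = Fraction(1, 5)
scale = 2  # multiply top and bottom by 2 so the denominator becomes 10

numerator = frac.numerator * scale      # 1 * 2 = 2
denominator = frac.denominator * scale  # 5 * 2 = 10

decimal = numerator / denominator
print(decimal)  # 0.2
```

Scaling the numerator and denominator by the same number doesn't change the fraction's value, which is why 2/10 is still equal to 1/5.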