A company claims that the mean weight per apple they ship is 120 grams with a standard deviation of 12 grams. Data generated from a sample of 49 apples randomly selected from a shipment indicated a mean weight of 122.5 grams per apple. Calculate and interpret a 95% confidence interval for the mean weight per apple.

Answer:

The 95% confidence interval for the average apple weight is (119.14, 125.86).

The company claims a mean weight of 120 grams per apple with a standard deviation of 12 grams. A random sample of 49 apples from a shipment gave a sample mean weight of 122.5 grams.

Let n = 49

x̄ = 122.5

σ = 12

To calculate a 95% confidence interval for the mean weight per apple, use the formula:

⇒ (x̄ ± 1.96·σ/√n)

⇒ (122.5 ± 1.96*12/√49)

⇒ (119.14, 125.86)

Consequently, the 95% confidence interval for the mean weight per apple is (119.14, 125.86): we are 95% confident that the true mean weight per apple in the shipment lies between 119.14 grams and 125.86 grams.
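The calculation above can be sketched in Python; the variable names mirror the symbols in the formula, and 1.96 is the standard z critical value for 95% confidence.

```python
import math

# Given values from the problem
n = 49          # sample size
x_bar = 122.5   # sample mean (grams)
sigma = 12.0    # population standard deviation (grams)
z = 1.96        # z critical value for 95% confidence

# Margin of error: z * sigma / sqrt(n)
margin = z * sigma / math.sqrt(n)
lower, upper = x_bar - margin, x_bar + margin

print(f"95% CI: ({lower:.2f}, {upper:.2f})")  # → 95% CI: (119.14, 125.86)
```

Since √49 = 7, the margin of error works out to 1.96 × 12 / 7 = 3.36 grams.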
