This dataset contains information on the 401K (retirement) contributions of employees. In this exercise, we want to establish the relationship between participation in a 401K pension plan and the employer's contribution percentage to the plan. Each record in the dataset refers to one individual company.

- "prate": the percentage of eligible workers in each company who have an active 401K account.
- "mrate": the "match rate" — the percentage of an employee's contribution that is matched by the employer. For instance, if a company has a 75% match rate, then when an employee contributes $100 to their 401K account, their employer contributes another $75 to the 401K.

In linear regression, we have the y values (dependent variable), the X values (regressors), and the b values (the coefficients of the X values):

$$
y = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}, \qquad
X = \begin{bmatrix} 1.0 & x_1 \\ 1.0 & x_2 \\ \vdots & \vdots \\ 1.0 & x_n \end{bmatrix}, \qquad
b = \begin{bmatrix} b_0 \\ b_1 \end{bmatrix}
$$

In the above equations, the y-values are the numbers for "prate": (y1, y2, y3, ...) = (21.6, 100, 97.59, ...). The x-values are the numbers for "mrate": (x1, x2, x3, ...) = (0.209, 1.42, 0.91, ...). Note that in the computation, the first column of the X matrix is a column of 1's, corresponding to the intercept b0.

Compute the mean of the prate column in the given dataset.
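As a minimal sketch of how these pieces fit together, the snippet below loads the data, builds the design matrix X with an intercept column of 1's as described above, and computes the mean of the prate column. The file name `401k.csv` and the use of pandas/numpy are assumptions about how the dataset is provided; adjust the loading step to match your environment.

```python
import numpy as np
import pandas as pd

# Assumption: the dataset is a CSV file with columns "prate" and "mrate".
df = pd.read_csv("401k.csv")

# Dependent variable y: the participation rate for each company.
y = df["prate"].to_numpy()

# Design matrix X: a column of 1's (intercept) alongside the "mrate" values.
X = np.column_stack([np.ones(len(df)), df["mrate"].to_numpy()])

# Least-squares solution of y = X b, illustrating the regression setup
# above; b[0] is the intercept b0 and b[1] is the slope b1.
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# The quantity the exercise asks for: the mean of the prate column.
print(df["prate"].mean())
```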