Answer:
Recall that for a random variable [tex]X[/tex] following a discrete distribution, the expectation of [tex]X[/tex] is given by
[tex]\mathbb E[X]=\displaystyle\sum_x x\,f_X(x)[/tex]
where [tex]f_X(x)=\mathbb P(X=x)[/tex] is the PMF of [tex]X[/tex]. We have
[tex]f_X(x)=\begin{cases}\dfrac1k&\text{for }x\in\{x_1,\ldots,x_k\}\\\\0&\text{otherwise}\end{cases}[/tex]
So the expectation (mean) of the given uniformly distributed [tex]X[/tex] is
[tex]\mathbb E[X]=\displaystyle\sum_x x\,f_X(x)=\sum_{i=1}^k x_i\,f_X(x_i)=\frac1k\sum_{i=1}^k x_i[/tex]
Without any more specific information, this is all we can say about the mean.
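As a purely illustrative example (these specific values are an assumption, not part of the question), suppose [tex]X[/tex] is the outcome of a fair six-sided die, so [tex]k=6[/tex] and [tex]x_i=i[/tex]. Then
[tex]\mathbb E[X]=\displaystyle\frac16(1+2+3+4+5+6)=\frac{21}6=3.5[/tex]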
The variance of [tex]X[/tex] is defined by
[tex]\mathbb V[X]=\mathbb E[(X-\mathbb E[X])^2]=\mathbb E[X^2]-\mathbb E[X]^2[/tex]
In computing the second moment [tex](\mathbb E[X^2])[/tex] we run into the same issue as before: without knowing the [tex]x_i[/tex] we cannot reduce it to a single number, but we do get
[tex]\mathbb E[X^2]=\displaystyle\sum_x x^2\,f_X(x)=\frac1k\sum_{i=1}^k{x_i}^2[/tex]
and so
[tex]\mathbb V[X]=\displaystyle\frac1k\sum_{i=1}^k{x_i}^2-\left(\frac1k\sum_{i=1}^kx_i\right)^2[/tex]
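Continuing the illustrative die example ([tex]k=6[/tex], [tex]x_i=i[/tex], again an assumption made only for concreteness), the second moment is
[tex]\mathbb E[X^2]=\displaystyle\frac16(1+4+9+16+25+36)=\frac{91}6[/tex]
so
[tex]\mathbb V[X]=\displaystyle\frac{91}6-\left(\frac{21}6\right)^2=\frac{91}6-\frac{49}4=\frac{35}{12}\approx2.92[/tex]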
We can expand [tex]\mathbb E[X]^2[/tex] a bit. Denoting the set [tex]K=\{1,2,\ldots,k\}[/tex], we can write
[tex]\displaystyle\left(\frac1k\sum_{i=1}^kx_i\right)^2=\frac1{k^2}\left(\sum_{i=1}^k{x_i}^2+\sum_{i,j\in K,\,i\neq j}x_ix_j\right)[/tex]
and so
[tex]\mathbb V[X]=\displaystyle\left(\frac1k-\frac1{k^2}\right)\sum_{i=1}^k{x_i}^2-\frac1{k^2}\sum_{i,j\in K,\,i\neq j}x_ix_j[/tex]
[tex]\mathbb V[X]=\displaystyle\frac{k-1}{k^2}\sum_{i=1}^k{x_i}^2-\frac1{k^2}\sum_{i,j\in K,\,i\neq j}x_ix_j[/tex]
but again, that's as much as we can say without any more specific information.
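As a sanity check of this expanded form on the same assumed die example: [tex]\displaystyle\sum_{i\neq j}x_ix_j=\left(\sum_{i=1}^k x_i\right)^2-\sum_{i=1}^k{x_i}^2=441-91=350[/tex], so
[tex]\mathbb V[X]=\displaystyle\frac{6-1}{36}\cdot91-\frac1{36}\cdot350=\frac{455-350}{36}=\frac{35}{12}[/tex]
which matches the value obtained directly from the two moments.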