Expectation values

In probability theory, the expected value of a random variable $X$, denoted $\operatorname{E}(X)$ or $\operatorname{E}[X]$, is a generalization of the weighted average, and is intuitively the arithmetic mean of a large number of independent realizations of $X$. The expected value is also known as the expectation, mathematical expectation, mean, average, or first moment. Expected value is a key concept in economics, finance, and many other subjects.
By definition, the expected value of a constant random variable $X = c$ is $c$. The expected value of a random variable $X$ with equiprobable outcomes $\{c_1, \ldots, c_n\}$ is defined as the arithmetic mean of the terms $c_i$. If some of the probabilities $\Pr(X = c_i)$ of the individual outcomes $c_i$ are unequal, then the expected value is defined to be the probability-weighted average of the $c_i$, that is, the sum of the $n$ products $c_i \cdot \Pr(X = c_i)$. The expected value of a general random variable involves integration in the sense of Lebesgue.
