Friday 30 January 2015

Lecture 7

You might like to revisit the blog for Lecture 4. The example I do there about tossing coins for which $\sum_k p_k=\infty$ makes use of both continuity of $P$ (proved today) and independence of events (Lecture 5).
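In case it helps, here is a sketch of how those two ingredients combine in an example of that sort (the notation may differ slightly from the Lecture 4 post): suppose toss $k$ shows a head with probability $p_k$, the tosses are independent, and $\sum_k p_k=\infty$. Then for any fixed $n$,

$$P(\text{no head in tosses }n,\dots,m)=\prod_{k=n}^{m}(1-p_k)\leq \exp\Big(-\sum_{k=n}^{m}p_k\Big)\to 0\quad\text{as }m\to\infty,$$

using independence for the product and $1-x\leq e^{-x}$. By continuity of $P$ applied to the decreasing events $\{\text{no head in tosses }n,\dots,m\}$, the event $\{\text{no head from toss }n\text{ onwards}\}$ has probability $0$; since this holds for every $n$, infinitely many heads occur with probability $1$.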

We were careful today in defining the notion of a random variable. It is a function mapping elements of the sample space $\Omega$ to some other set $\Omega_X$ (which is typically some set of real numbers). So $X$ is a function and $X(\omega)$ is a value in its range. Most of the time we leave out the $\omega$ and speak of an event like $X\leq 1.45$. This really means the event $\{\omega : X(\omega)\leq 1.45\}$.
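To make the "random variable is a function" point concrete, here is a small Python sketch (not from the lecture; the two-dice sample space and the particular $X$ are just illustrative assumptions):

```python
from fractions import Fraction

# Sample space: the 36 equally likely outcomes of rolling two dice.
Omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

# A random variable is just a function on Omega.
def X(omega):
    i, j = omega
    return i + j

# The "event" X <= 4 really means the subset {omega : X(omega) <= 4}.
event = {omega for omega in Omega if X(omega) <= 4}

# With equally likely outcomes, P(X <= 4) = |event| / |Omega|.
print(Fraction(len(event), len(Omega)))   # 6/36 = 1/6
```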

I remarked that the terminology ‘random variable’ is somewhat inaccurate, since a random variable is neither random nor a variable. However, a random variable has an associated probability distribution, and one can informally think of a random variable as "a variable which takes its value according to a probability distribution".

The reason we like to define a random variable $X$ as a function on $\Omega$ is that it ties $X$ back to an underlying probability space, for which we have postulated axioms and know their consequences. If we try to define a random variable $X$ only in terms of its probability distribution, we would not have such a strong axiomatic basis from which to derive further properties of $X$.

Fine print. In Lecture 4 I gave you the definition of a probability space $(\Omega,\mathscr{F},P)$. For $T\subset\Omega_X$, we can calculate $P(X\in T)=P(\{\omega:X(\omega)\in T\})$ only if $\{\omega:X(\omega)\in T\}=X^{-1}(T)\in\mathscr{F}$. This may put restrictions on allowable $T$ and $X$. But this will not bother us in Probability IA, since we will only look at situations in which $\mathscr{F}$ consists of all subsets of $\Omega$, or $T$ is something nice like a subinterval of the reals.
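A toy illustration of this fine print (my own example, not from the lecture): take $\Omega=\{1,2,3,4,5,6\}$ with $\mathscr{F}=\{\emptyset,\{1,2\},\{3,4,5,6\},\Omega\}$, and let $X(\omega)=\omega$. For $T=\{1,2\}$ we have $X^{-1}(T)=\{1,2\}\in\mathscr{F}$, so $P(X\in T)$ is defined; but for $T=\{1\}$ we have $X^{-1}(T)=\{1\}\notin\mathscr{F}$, so $P(X=1)$ is simply not defined in this probability space.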