Monday, 3 March 2014

Lecture 20

Today we saw how to transform random variables. By this means we can discover further interesting relationships between distributions.

I mentioned in this lecture that if $X,Y$ are independent $N(0,1)$ then $W=X^2+Y^2$ and $T=\tan^{-1}(Y/X)$ are independent $E(1/2)$ and $U[-\pi/2,\pi/2]$. We can use this result in the reverse direction to generate normal random variables. Make two independent samples from $U[0,1]$, say $U$ and $V$. Let $R^2=-2\log U$ (which is $E(1/2)$) and let $\Theta=(V-\tfrac{1}{2})\pi$. Then $X=R\cos\Theta$ and $Y=R\sin\Theta$ are independent $N(0,1)$.
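If you want to try this on a computer, here is a minimal sketch in Python/NumPy (the function name is mine, not from the notes). It uses the standard Box–Muller form of the idea, taking $\Theta=2\pi V$ uniform over the whole circle so that $(X,Y)$ can land in any quadrant:

```python
import numpy as np

def box_muller(n, rng):
    """Turn 2n uniform samples into n pairs of independent N(0,1) samples."""
    u = rng.uniform(size=n)
    v = rng.uniform(size=n)
    r = np.sqrt(-2.0 * np.log(1.0 - u))  # R^2 = -2 log U is E(1/2); 1-u avoids log(0)
    theta = 2.0 * np.pi * v              # angle uniform on [0, 2*pi)
    return r * np.cos(theta), r * np.sin(theta)

rng = np.random.default_rng(20)
x, y = box_muller(100_000, rng)
print(x.mean(), x.std(), y.mean(), y.std())  # approximately 0, 1, 0, 1
```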

There is a particularly strange fact in Examples sheet 4, #19.

By the way, if (like me) you have ever been puzzled why it is that the absolute value of the Jacobian, $|J|$, gives the volume change of an $n$-dimensional parallelepiped under a linear transformation in $\mathbb{R}^n$, you might enjoy reading this essay: A short thing about determinants, or “an attempt to explain volume forms, determinants and adjugates by staying in 3-d”, by Gareth Taylor.

I have also now added to the notes an Appendix C, in which I explain how, under a linear transformation, a sphere is mapped into an ellipsoid whose volume is greater by a factor of $|J|$. I think that in previous years lecturers have said at this point "you already know this from Vector Calculus". But I am not sure that's the case, and so I decided it would be nice to try to remove some of the mystery.
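Here is a quick numerical illustration of the same fact (my own sketch, not part of the notes): a matrix $M$ maps the unit sphere to an ellipsoid whose semi-axes are the singular values of $M$, and the product of those semi-axes is $|\det M|=|J|$.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(3, 3))                # a generic linear map on R^3
s = np.linalg.svd(M, compute_uv=False)     # semi-axes of the ellipsoid M(unit ball)
ellipsoid_vol = (4/3) * np.pi * np.prod(s) # ellipsoid volume = (4/3)pi * product of semi-axes
ball_vol = (4/3) * np.pi                   # volume of the unit ball
print(np.isclose(ellipsoid_vol / ball_vol, abs(np.linalg.det(M))))  # True
```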

The convolution integral we need to add two Cauchy random variables is tricky. On page 81, the partial fractions method of computing this integral is to write

$$\frac{1}{(1+x^2)(1+(z-x)^2)}=\frac{A+Bx}{1+x^2}+\frac{C+D(x-z)}{1+(z-x)^2}$$

for $(A,B,C,D)=\frac{1}{4+z^2}\bigl(1,\,2/z,\,1,\,-2/z\bigr)$. Upon integrating w.r.t. $x$, the terms with $B$ and $D$ give $0$ (by symmetry either side of $0$ and of $z$, respectively), and the terms with $A$ and $C$ provide the claimed result of $\frac{2\pi}{4+z^2}$.
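If you want to check this algebra, here is a quick SymPy verification (my own sanity check, not part of the notes):

```python
import sympy as sp

x, z = sp.symbols('x z', real=True)

A = C = 1 / (4 + z**2)
B = 2 / (z * (4 + z**2))
D = -B

lhs = 1 / ((1 + x**2) * (1 + (z - x)**2))
rhs = (A + B*x) / (1 + x**2) + (C + D*(x - z)) / (1 + (z - x)**2)

print(sp.simplify(lhs - rhs))  # 0, so the decomposition is correct

# The A term integrates to pi/(4 + z^2); the C term gives the same by symmetry.
print(sp.integrate(A / (1 + x**2), (x, -sp.oo, sp.oo)))  # pi/(z**2 + 4)
```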

Comment (1)

I thought of another way of proving that the magnitude of the determinant of an n×n matrix M is the volume of the shape resulting from transforming the unit n-cube under the map represented by M. In other words, n! times the volume of the n-dimensional tetrahedron with one vertex at the origin and the other n vertices at the column vectors of M. I'll describe it in 3D, but the same argument works in nD. This is a proof by induction on n, so what follows describes how the 2D version implies the 3D version. Let O,A,B,C form a tetrahedron, where O is the origin, and let a,b,c be the position vectors of points A,B,C.
i) The volume of the tetrahedron is (1/3)*(area of base)*height = |a.(vector area of OBC)|/3.
ii) It can be verified using suffix notation that the normal of OBC (and therefore also the vector area of OBC) has the same direction as the vector d with components d_i = epsilon_ijk b_j c_k.
iii) Assuming that the magnitude of the determinant of a 2×2 matrix M' (defined using the Levi-Civita symbol) is the area of a parallelogram with one vertex at the origin and two vertices at the column vectors of M', we have:
d.d = (b_2 c_3 - b_3 c_2)^2 + (b_3 c_1 - b_1 c_3)^2 + (b_1 c_2 - b_2 c_1)^2
= (2 * area of the projection of OBC onto the y,z-plane)^2 + (2 * area of the projection onto the z,x-plane)^2 + (2 * area of the projection onto the x,y-plane)^2
iv) There is an n-dimensional generalisation of Pythagoras' theorem which, for 3 dimensions, states: for any orthogonal tetrahedron, the sum of the squares of the areas of the three faces meeting at the orthogonal corner (i.e. the 3 smaller faces) equals the square of the area of the face opposite the orthogonal corner. Using this it can be shown that d.d is indeed equal to (2 * area of OBC)^2, so that the volume of the tetrahedron is (1/3!) |epsilon_ijk a_i b_j c_k|.
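Here is a quick NumPy check of steps ii)-iv) (just a sanity check of mine; the variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(3)
a, b, c = rng.normal(size=(3, 3))    # position vectors of A, B, C

d = np.cross(b, c)                   # d_i = epsilon_ijk b_j c_k (step ii)
area_OBC = 0.5 * np.linalg.norm(d)

# Each component of d is a 2x2 determinant, i.e. twice a projected area (step iii);
# their squares sum to d.d = (2 * area of OBC)^2 (step iv):
print(np.isclose(np.sum(d**2), (2 * area_OBC)**2))  # True

vol = abs(np.dot(a, d)) / 6.0        # (1/3!) |epsilon_ijk a_i b_j c_k|
det = abs(np.linalg.det(np.column_stack([a, b, c])))
print(np.isclose(det, 6.0 * vol))    # True: |det M| = 3! * volume
```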

The generalisation of Pythagoras' theorem used here can be proved in 3D as follows (the same argument works in n dimensions). Let OABC be an orthogonal tetrahedron with O the orthogonal vertex, and let A,B,C lie on the x,y,z axes at distances a, b, c from O respectively. Calculate h, the shortest distance from O to the plane ABC; it satisfies 1/h^2 = 1/a^2 + 1/b^2 + 1/c^2. The volume is then h*Area(ABC)/3, and it is also equal to abc/6. Equating the two and substituting for h expresses Area(ABC)^2 in terms of ab/2, bc/2 and ca/2, which gives the result.
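A numerical check of that generalisation (it is known as de Gua's theorem), again just a sketch:

```python
import numpy as np

a, b, c = 2.0, 3.0, 5.0              # legs of an orthogonal tetrahedron at O
A = np.array([a, 0, 0])
B = np.array([0, b, 0])
C = np.array([0, 0, c])
area_ABC = 0.5 * np.linalg.norm(np.cross(B - A, C - A))
faces = (a*b/2)**2 + (b*c/2)**2 + (c*a/2)**2   # squared areas of the 3 small faces
print(np.isclose(area_ABC**2, faces))           # True
```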
