The point of the probabilistic proof of the Weierstrass approximation theorem was to show you how probability theory can be used to prove an important theorem in a different field. We did something similar in Example 4.5 (Erdos probabilistic method in combinatorics) and in Examples Sheet 1, #8. The Wikipedia page on Bernstein polynomials has a list of Bernstein polynomials and an animation showing how they are used to approximate a continuous function on $[0,1]$.
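As a small sketch (not part of the lecture), the degree-$n$ Bernstein polynomial of $f$ is $B_n(f)(x)=\sum_{k=0}^n f(k/n)\binom{n}{k}x^k(1-x)^{n-k}$, i.e. $E\,f(X/n)$ for $X\sim\text{Binomial}(n,x)$. The function names below are my own:

```python
from math import comb

def bernstein(f, n, x):
    """Evaluate the degree-n Bernstein polynomial of f at x in [0,1].

    This equals E f(X/n) for X ~ Binomial(n, x).
    """
    return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

# The approximation improves as n grows; e.g. for f(t) = t^2 at x = 0.5,
# bernstein(f, n, 0.5) tends to 0.25 as n increases.
f = lambda t: t * t
approx = bernstein(f, 200, 0.5)
```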
I cheated today by using a result from the schedules for Analysis II ("a continuous function on a closed bounded interval is uniformly continuous"). I had thought about giving an easier proof by not appealing to Analysis II, but I decided it would be nicer to show you the proper full result.
The proof used Chebyshev's inequality and illustrated a nice idea that is frequently used in probability calculations. We partitioned the sample space into $A$ and $A^c$ and reasoned differently in each. For $A$ we had $P(A)$ small and $|f(X/n)-f(x)|\leq 2$, whereas for $A^c$ we had $P(A^c)\leq 1$ and $|f(X/n)-f(x)|$ small.
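The Chebyshev step can be checked numerically. With $X\sim\text{Binomial}(n,x)$ we have $\mathrm{Var}(X/n)=x(1-x)/n$, so $P(|X/n-x|\geq\delta)\leq x(1-x)/(n\delta^2)$. A quick sanity check (my own illustration, with arbitrarily chosen parameters):

```python
from math import comb

def binom_tail(n, p, delta):
    """Exact P(|X/n - p| >= delta) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1) if abs(k / n - p) >= delta)

n, p, delta = 100, 0.3, 0.1
tail = binom_tail(n, p, delta)
bound = p * (1 - p) / (n * delta**2)  # Chebyshev: Var(X/n) / delta^2
# The exact tail probability is (much) smaller than the Chebyshev bound.
assert tail <= bound
```

Chebyshev is far from tight here, but all the proof needs is that the bound tends to $0$ as $n\to\infty$, uniformly in $x$.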
Benford's law is surprising when first encountered. The fact that the leading digit should be "1" with a frequency of 0.301 (rather than $1/9$) has been used as a test to check for fraud in financial records. The idea is that if a person is making up numbers, he is unlikely to arrange that his distribution of first digits agrees with Benford's law. You can read more about that, and many other examples of Benford's law, in the Wikipedia article referenced above.
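The Benford frequencies come from $P(\text{leading digit}=d)=\log_{10}(1+1/d)$, which gives $\log_{10}2\approx 0.301$ for $d=1$. A one-line tabulation:

```python
from math import log10

# Benford's law: P(leading digit = d) = log10(1 + 1/d), d = 1, ..., 9
benford = {d: log10(1 + 1 / d) for d in range(1, 10)}
# benford[1] = log10(2), about 0.301, versus the naive guess 1/9 ~ 0.111;
# the nine probabilities sum to 1, since the product (1+1/1)...(1+1/9) = 10.
```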
In discussing why Benford's Law applies to leading digits of Fibonacci numbers I used the result that if $\beta=\log_{10}\alpha$ is irrational, then the sequence of fractional parts $\{n\beta\}_{n=1}^\infty$ is uniformly distributed on $[0,1]$. This result is certainly very plausible, but a proper proof is beyond our scope. It involves an analysis via the discrete Fourier transform of the sequence of length $N$, showing that this approaches the Fourier transform of a sequence of numbers that are uniformly distributed on $[0,1]$.
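The claim is easy to test empirically: tallying the leading digits of the first thousand Fibonacci numbers gives frequencies very close to Benford's law. A quick check (my own sketch; the sample size 1000 is arbitrary):

```python
from math import log10
from collections import Counter

def leading_digit(n):
    """First decimal digit of a positive integer."""
    return int(str(n)[0])

# Tally leading digits of the first N Fibonacci numbers.
N = 1000
counts = Counter()
a, b = 1, 1
for _ in range(N):
    counts[leading_digit(a)] += 1
    a, b = b, a + b

freq1 = counts[1] / N  # empirically close to log10(2), about 0.301
```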