In Example 8.3 we came to the conclusion that the expected number of couples sitting together is $EN=2$. However, as someone pointed out to me afterwards, this is wrong for $n=1$: at a table with two places we would double-count, since Mr Smith has Mrs Smith on both his left and his right. I have changed the notes so that this example specifies $n\geq 2$.

Cycling home I was thinking how to prove that as $n\to\infty$ the distribution of $N$ tends to the Poisson distribution with mean 2. One way to do this would be to calculate $EN^r$ for all $r=1,2,\dotsc$, just as we have already done for $r=1,2$. That will be difficult, but not impossible. Then we show that as $n\to\infty$ these moments tend to the moments of a Poisson random variable with mean 2. As we shall see in Lecture 21, there is a 1-1 relationship between distributions and their moments.
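One can also check the Poisson limit empirically. Here is a quick Monte Carlo sketch (my own illustration, not part of the notes), assuming the Example 8.3 arrangement is the usual one: husbands fixed in alternate seats around the table, and the $n$ wives assigned to the seats between them uniformly at random.

```python
import math
import random
from collections import Counter

def count_adjacent_couples(n, rng):
    """Husbands sit in fixed alternating seats; the n wives are assigned
    uniformly at random to the seats between them. Count the couples that
    end up sitting together (assumes n >= 2, as the notes now specify)."""
    wives = list(range(n))
    rng.shuffle(wives)  # wives[j] sits between husband j and husband (j+1) mod n
    return sum(1 for j in range(n) if wives[j] in (j, (j + 1) % n))

def empirical_pmf(n, trials, seed=0):
    rng = random.Random(seed)
    freq = Counter(count_adjacent_couples(n, rng) for _ in range(trials))
    return {k: v / trials for k, v in freq.items()}

def poisson_pmf(k, mean=2.0):
    return math.exp(-mean) * mean**k / math.factorial(k)

# Compare the empirical distribution of N (n = 100) with Poisson(2).
pmf = empirical_pmf(n=100, trials=20000)
for k in range(6):
    print(k, round(pmf.get(k, 0.0), 3), round(poisson_pmf(k), 3))
```

Already at $n=100$ the empirical probabilities sit close to the Poisson(2) values, and the sample mean is very near 2.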

**Zipf's law**. Simply as a fun fact, I told you today about Zipf's law. This is a probability mass function such that $P(X=i)\propto 1/i^s$, $i=1,2,\dotsc,m$, for some $s > 0$, typically $s=1$. It is very mysterious why this distribution should be a good fit to the frequencies with which words are used in a language.

According to the Wikipedia article linked above, "Zipf himself proposed that neither speakers nor hearers using a given language want to work any harder than necessary to reach understanding, and the process that results in approximately equal distribution of effort leads to the observed Zipf distribution". My guess is that he is thinking that to retrieve the $i$th most popular word from your memory has a cost proportional to $i$ (as if you were digging down a list), and that the average workload that each word puts on the brain should be constant, so $ip(i)$ should be constant. But this is clearly hand-waving. Maybe you can think of a better explanation.
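The hand-waving argument can at least be checked numerically: if the pmf is Zipf with $s=1$, then the "workload" $i\,p(i)$ really is the same for every word. A small sketch of my own:

```python
def zipf_pmf(m, s=1.0):
    """Zipf pmf on {1, ..., m}: P(X = i) proportional to 1 / i**s."""
    weights = [1.0 / i**s for i in range(1, m + 1)]
    total = sum(weights)
    return [w / total for w in weights]

p = zipf_pmf(m=1000, s=1.0)

# Workload of word i is (retrieval cost i) x (frequency p(i)).
workloads = [(i + 1) * p[i] for i in range(len(p))]

# With s = 1 every word puts exactly the same load on the memory.
print(min(workloads), max(workloads))
```

Conversely, demanding $ip(i)=\text{constant}$ forces $p(i)\propto 1/i$, which is Zipf's law with $s=1$.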

In queueing theory and other theories of congestion, delays and long waiting times can be caused by variability in service times. Suppose customers enter a queue at a constant rate of 1 every 5 minutes. They are served one at a time; service times are independent, taking $1,2,\dotsc$ minutes with probabilities $p_1,p_2,\dotsc$, respectively.

An interesting fact (researched extensively during the past 30 years) is that there are large qualitative differences between how such queues behave when $p_i$ decays geometrically or exponentially (as do the Poisson and geometric distributions; the geometric has $p_i= pq^{i-1}$), and when it decays according to a power law, like $p_i\propto 1/i$ or $p_i\propto 1/i^{0.9}$.
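This difference is easy to see in simulation. Below is a sketch of my own (the particular parameters are illustrative choices, not from the notes): waiting times follow the Lindley recursion $W_{k+1}=\max(0,\,W_k+S_k-5)$, and I compare a geometric service distribution with a truncated power law whose exponent gives a finite mean but a variance that blows up with the truncation point.

```python
import bisect
import random

def lindley_waits(service_sampler, n_customers, interarrival=5, seed=1):
    """Waiting times in a single-server queue with one arrival every
    `interarrival` minutes, via W_{k+1} = max(0, W_k + S_k - interarrival)."""
    rng = random.Random(seed)
    w, waits = 0.0, []
    for _ in range(n_customers):
        waits.append(w)
        w = max(0.0, w + service_sampler(rng) - interarrival)
    return waits

def geometric_service(rng, p=0.25):
    # P(S = i) = p(1-p)^{i-1}, i = 1, 2, ...; mean 1/p = 4 minutes.
    s = 1
    while rng.random() > p:
        s += 1
    return s

def make_power_law_service(m=100_000, s_exp=2.5):
    # Truncated power law P(S = i) proportional to 1/i**s_exp, i = 1, ..., m.
    # With 2 < s_exp < 3 the mean is finite but the variance grows
    # without bound as m increases -- the heavy-tailed regime.
    weights = [1.0 / i**s_exp for i in range(1, m + 1)]
    total = sum(weights)
    cdf, c = [], 0.0
    for wt in weights:
        c += wt / total
        cdf.append(c)
    def sampler(rng):
        return min(bisect.bisect_left(cdf, rng.random()), m - 1) + 1
    return sampler

heavy = make_power_law_service()
for name, svc in [("geometric", geometric_service), ("power law", heavy)]:
    waits = lindley_waits(svc, 100_000)
    print(name, "mean wait:", sum(waits) / len(waits), "max wait:", max(waits))
```

The two means here are not matched, so this is only a qualitative picture, but it shows the mechanism: an occasional enormous power-law service time stalls every customer queued behind it, in a way that geometrically decaying service times essentially never do.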
