\Question{Solve the Rainbow}
Your roommate was having Skittles for lunch and offered you some. A bag of Skittles contains five different colors: red, orange, yellow, green, and purple, with 20 of each color. You know your roommate is a huge fan of the green Skittles. With probability $1/3$ they ate all of the green ones, with probability $1/4$ they ate half of them, with probability $1/4$ they ate only 5 green ones, and with the remaining probability $1/6$ they ate none of them.
\begin{Parts}
\Part If you take a Skittle from the bag, what is the probability that it is green?
\Part If you take two Skittles from the bag, what is the probability that at least one is green?
\Part If you take three Skittles from the bag, what is the probability that they are all green?
\Part If all three Skittles you took from the bag are green, what are the probabilities that your roommate had all of the green ones, half of the green ones, or only 5 green ones?
\Part If you take three Skittles from the bag, what is the probability that they are all the same color?
% \Part If you take three Skittles from the bag, what is the probability that they are all different colors?
\end{Parts}
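You can sanity-check part (a) with a short Python sketch. It applies total probability over the four scenarios; note that the scenario with probability $1/6$ (the roommate ate no green Skittles) is an assumption that fills in the remaining probability mass.

```python
from fractions import Fraction as F

# Scenarios: (probability, green left, total left). The bag starts with
# 100 Skittles, 20 of them green. Assumption: with the remaining
# probability 1/6 the roommate ate no green Skittles.
scenarios = [
    (F(1, 3), 0, 80),    # ate all 20 green
    (F(1, 4), 10, 90),   # ate half (10) of the green
    (F(1, 4), 15, 95),   # ate only 5 green
    (F(1, 6), 20, 100),  # ate none (assumed)
]

# Part (a) by total probability:
# P(green) = sum over scenarios of P(scenario) * (green left / total left).
p_green = sum(pr * F(g, t) for pr, g, t in scenarios)
print(p_green)  # 86/855, roughly 0.1006
```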
\Question{Probability Potpourri}
Provide a brief justification for each part.
\begin{Parts}
\Part For two events $A$ and $B$ in any probability space, show that $\Pr(A \setminus B) \geq \Pr(A) - \Pr(B)$.
\Part If $|\Omega| = n$, how many distinct events does the probability space have?
\Part Find some probability space $\Omega$ and three events $A, B$, and $C \subseteq \Omega$ such that $\Pr(A) > \Pr(B)$ and $\Pr(A \mid C) < \Pr(B \mid C)$.
\Part If two events $C$ and $D$ are disjoint and $\Pr(C) > 0$ and $\Pr(D) > 0$, can $C$ and $D$ be independent? If so, provide an example. If not, why not?
\Part Suppose $\Pr(D \mid C) = \Pr(D \mid \overline{C})$, where $\overline{C}$ is the complement of $C$. Prove that $D$ is independent of $C$.
\Part Two six-sided dice are rolled. Find three events that are pairwise independent but not mutually independent.
\end{Parts}
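For the last part, a brute-force check over all $36$ outcomes is a useful way to test a candidate answer. The events below (parities of the two dice and of their sum) are one possible construction, shown here only to illustrate the check:

```python
from fractions import Fraction as F
from itertools import product

# Sample space: all 36 equally likely rolls of two six-sided dice.
omega = list(product(range(1, 7), repeat=2))

# One classical choice of events (a possible answer, not the only one):
A = {(d1, d2) for d1, d2 in omega if d1 % 2 == 0}         # first die even
B = {(d1, d2) for d1, d2 in omega if d2 % 2 == 0}         # second die even
C = {(d1, d2) for d1, d2 in omega if (d1 + d2) % 2 == 0}  # sum even

def pr(event):
    return F(len(event), len(omega))

# Pairwise independent ...
assert pr(A & B) == pr(A) * pr(B)
assert pr(A & C) == pr(A) * pr(C)
assert pr(B & C) == pr(B) * pr(C)
# ... but not mutually independent: any two of them determine the third.
assert pr(A & B & C) != pr(A) * pr(B) * pr(C)
print(pr(A & B & C), pr(A) * pr(B) * pr(C))  # 1/4 vs 1/8
```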
\Question{Identity Theft}
A group of $n$ friends go to the gym together, and while they are playing basketball, they leave their bags against the nearby wall.
An evildoer comes, takes the student ID cards from the bags, randomly rearranges them, and places them back in the bags, one ID card per bag.
What is the probability that no one receives his or her own ID card back?
[\textit{Hint}: Use the generalized inclusion-exclusion principle.]
Then, find an approximation for the probability as $n \to \infty$.
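The inclusion-exclusion sum can be evaluated and cross-checked against brute-force enumeration; a sketch (the summation formula is the one the hint leads to):

```python
from fractions import Fraction
from itertools import permutations
from math import exp, factorial

def p_no_fixed_point(n):
    # Inclusion-exclusion over the events "friend i gets their own card":
    # P(no fixed point) = sum_{k=0}^{n} (-1)^k / k!
    return sum(Fraction((-1) ** k, factorial(k)) for k in range(n + 1))

# Cross-check against direct enumeration of all permutations for n = 5.
n = 5
count = sum(1 for perm in permutations(range(n))
            if all(perm[i] != i for i in range(n)))
assert Fraction(count, factorial(n)) == p_no_fixed_point(n)

# The partial sums are the Taylor series of e^{-1}, so the probability
# tends to 1/e (about 0.3679) as n grows.
for n in (3, 5, 10):
    print(n, float(p_no_fixed_point(n)))
```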
\Question{Cookie Jars}
You have two jars of cookies, jar $1$ and jar $2$.
Each jar initially contains $n$ cookies.
Every day, when you come home, you pick one of the two jars randomly (each jar is chosen with probability $1/2$).
One day, you come home and reach inside one of the jars of cookies, but you find that it is empty!
Let $X$ denote the number of remaining cookies in the two jars.
What is the distribution of $X$?
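A short exact computation of the distribution of $X$, recursing on the state (cookies left in each jar), is a useful check on any closed form you derive; a sketch:

```python
from fractions import Fraction
from functools import lru_cache

def cookie_distribution(n):
    """Exact distribution of X, the total cookies left when you first
    reach into an empty jar, by recursion on the state (a, b) =
    cookies currently in the two jars."""
    @lru_cache(maxsize=None)
    def f(a, b):
        dist = {}
        for chosen, other in ((a, b), (b, a)):  # each jar picked w.p. 1/2
            if chosen == 0:
                # Discovered an empty jar: X = cookies in the other jar.
                dist[other] = dist.get(other, Fraction(0)) + Fraction(1, 2)
            else:
                for x, p in f(chosen - 1, other).items():
                    dist[x] = dist.get(x, Fraction(0)) + p / 2
        return dist
    return f(n, n)

print(cookie_distribution(1))  # {1: Fraction(1, 2), 0: Fraction(1, 2)}
```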
\Question{Exploring the Geometric Distribution}
In this question, we will further investigate the geometric distribution.
Let $X$, $Y$ be i.i.d.\ geometric random variables with parameter $p$.
Let $U = \min\{X, Y\}$ and $V = \max\{X, Y\} - \min\{X, Y\}$.
Compute the joint distribution of $(U, V)$ and prove that $U$ and $V$ are independent.
[\textit{Hint}: If $X \sim \operatorname{Geometric}(p)$ and $Y \sim \operatorname{Geometric}(q)$ are independent, then $\min\{X, Y\} \sim \operatorname{Geometric}(p + q - pq)$.]
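The claimed independence can be checked numerically before proving it. The sketch below assumes the convention $\Pr(X = k) = (1-p)^{k-1} p$ for $k = 1, 2, \dotsc$, tabulates the joint law of $(U, V)$ on a truncated grid, and compares it with the product of the marginals (the truncation error is of order $(1-p)^N$):

```python
p, N = 0.3, 80          # parameter and truncation level (assumed values)
q = 1 - p
geo = [q ** (k - 1) * p for k in range(1, N + 1)]  # P(X = k), k = 1..N

# Joint law of U = min(X, Y), V = max(X, Y) - min(X, Y) on the grid.
joint = {}
for x in range(1, N + 1):
    for y in range(1, N + 1):
        u, v = min(x, y), abs(x - y)
        joint[u, v] = joint.get((u, v), 0.0) + geo[x - 1] * geo[y - 1]

# Marginals of U and V.
pu, pv = {}, {}
for (u, v), pr in joint.items():
    pu[u] = pu.get(u, 0.0) + pr
    pv[v] = pv.get(v, 0.0) + pr

# Independence: joint matches the product of marginals
# (up to truncation and floating-point error).
err = max(abs(pr - pu[u] * pv[v]) for (u, v), pr in joint.items())
assert err < 1e-9
```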
\Question{Poisson Coupling}
Consider the following discrete joint distribution for $p \in [0, 1]$.
\begin{align*}
\Pr(X=0, Y=0) &= 1-p, & \\
\Pr(X=1, Y=y) &= \frac{e^{-p} p^y}{y!}, &\qquad y = 1, 2, \dotsc, \\
\Pr(X=1, Y=0) &= e^{-p} - (1-p), & \\
\Pr(X=x, Y=y) &= 0, & \qquad \text{otherwise}.
\end{align*}
\begin{Parts}
\Part Recall that all valid distributions satisfy two important properties. Argue that this distribution is a valid joint distribution.
\Part Show that $X$ has the Bernoulli distribution with parameter $p$.
\Part Show that $Y$ has the Poisson distribution with parameter $\lambda = p$.
\Part Show that $\Pr(X \neq Y) \leq p^2$.
\end{Parts}
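Parts (a) through (d) can each be verified numerically for a particular choice of $p$; the sketch below uses $p = 0.3$ and truncates $Y$ at a large value (both values assumed for illustration):

```python
from math import exp, factorial

p, N = 0.3, 60  # coupling parameter and truncation point (assumed)

# The joint table Pr(X = x, Y = y) from the problem statement.
def joint(x, y):
    if x == 0 and y == 0:
        return 1 - p
    if x == 1 and y == 0:
        return exp(-p) - (1 - p)
    if x == 1 and y >= 1:
        return exp(-p) * p ** y / factorial(y)
    return 0.0

probs = [(x, y, joint(x, y)) for x in (0, 1) for y in range(N + 1)]

# (a) valid distribution: non-negative entries summing to 1.
assert all(pr >= 0 for _, _, pr in probs)
assert abs(sum(pr for _, _, pr in probs) - 1) < 1e-12

# (b) X is Bernoulli(p): Pr(X = 1) = p.
assert abs(sum(pr for x, _, pr in probs if x == 1) - p) < 1e-12

# (c) Y is Poisson(p): spot-check Pr(Y = 0) = e^{-p}.
assert abs(sum(pr for _, y, pr in probs if y == 0) - exp(-p)) < 1e-12

# (d) Pr(X != Y) <= p^2.
p_neq = sum(pr for x, y, pr in probs if x != y)
assert p_neq <= p * p
```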
Now, let $X_i$, $i = 1, 2, \dotsc$ be a sequence of Bernoulli random variables with probabilities $p_i$, $i = 1, 2, \dotsc$. Similarly, let $Y_i$ be a Poisson random variable with parameter $\lambda = p_i$, $i=1, 2, \dotsc$. The $X_i$ and $Y_i$ are coupled, so that they have the joint distribution described above (with $p = p_i$), but for $i \neq j$, $(X_i, Y_i)$ and $(X_j, Y_j)$ are independent.
We will now introduce a coupling argument which shows that the distribution of $\sum_{i=1}^n X_i$ approaches a Poisson distribution with parameter $\lambda = p_1 + \cdots + p_n$.
\begin{Parts}
\setcounter{enumi}{4}
\Part A common way to measure the ``distance'' between two probability distributions is the total variation distance, given by
\begin{align*}
d(X, Y) &= \frac{1}{2} \sum_{k=0}^\infty |\Pr(X = k) - \Pr(Y = k)|.
\end{align*}
Show that $d(X, Y) \leq \Pr(X \neq Y)$. [\textit{Hint}: Use the Law of Total Probability to split up the events according to $\{X = Y\}$ and $\{X \neq Y\}$.]
\Part Show that $\Pr(\sum_{i=1}^n X_i \neq \sum_{i=1}^n Y_i) \leq \sum_{i=1}^n \Pr(X_i \neq Y_i)$. [\textit{Hint}: Maybe try the Union Bound.]
\Part Finally, for the $X_i$ and $Y_i$ defined above, show that $d(\sum_{i=1}^n X_i, \sum_{i=1}^n Y_i) \leq \sum_{i=1}^n p_i^2$.
\end{Parts}
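The chain of bounds in the last three parts is a form of Le Cam's theorem, and it can be checked exactly on a small example: compute the pmf of $\sum_i X_i$ (a Poisson binomial) by convolving one Bernoulli at a time, then compare its total variation distance to the Poisson against $\sum_i p_i^2$ (the parameter values below are assumed for illustration):

```python
from math import exp, factorial

ps = [0.1, 0.2, 0.05, 0.3]  # example Bernoulli parameters (assumed)
lam = sum(ps)

# pmf of sum X_i (Poisson binomial), convolving one Bernoulli at a time.
pb = [1.0]
for p in ps:
    pb = [(pb[k] if k < len(pb) else 0.0) * (1 - p)
          + (pb[k - 1] * p if k >= 1 else 0.0)
          for k in range(len(pb) + 1)]

# Total variation distance to Poisson(lam); the Poisson tail beyond
# n = len(ps) covers values the Bernoulli sum cannot take.
pois = [exp(-lam) * lam ** k / factorial(k) for k in range(len(pb))]
tv = 0.5 * (sum(abs(a - b) for a, b in zip(pb, pois)) + 1 - sum(pois))
assert tv <= sum(p * p for p in ps)   # Le Cam: d <= sum of p_i^2
print(tv, sum(p * p for p in ps))
```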
\Question{Joint Distributions}
\begin{Parts}
%\Part Give examples of joint distribution over discrete random variables $X$ and $Y$ such that $\mathbb{E}[XY] \neq \mathbb{E}[X] \mathbb{E}[Y]$.
%\Part Give examples of joint distribution over discrete random variables $X$ and $Y$ such that $\mathbb{E}[XY] = 0$, $\mathbb{E}[X] = 0$, and $\mathbb{E}[Y] = 0$, but $X$ and $Y$ are not independent.
\Part Suppose that $X_i$, $i = 1, \dotsc, n$ are binary-valued random variables. How many parameters are required to parameterize the joint distribution $\Pr(X_1 = x_1, \ldots, X_n = x_n)$?
\Part Continuing from the previous part, suppose that all $X_i$s are independent. How many parameters are required to parameterize the joint distribution?
\end{Parts}
\Question{Indicator Variables}
\begin{Parts}
\Part
After throwing $n$ balls into $m$ bins at random, what is the expected number of bins that contain exactly $k$ balls?
\Part
Alice and Bob each draw $k$ cards out of a deck of 52 distinct cards with replacement.
Find $k$ such that the expected number of common cards that both Alice and Bob draw is at least $1$.
\Part
How many people do you need in a room before you expect a shared birthday that falls on a Monday of the year (assume $52$ Mondays in a year and $365$ days in a year)?
\end{Parts}
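For the balls-and-bins part, an indicator-variable answer can be checked by exhaustive enumeration on a tiny instance; the closed form below is the one the indicator argument should produce, shown only as a verification target:

```python
from fractions import Fraction
from itertools import product
from math import comb

n, m, k = 4, 3, 2  # tiny instance: 4 balls, 3 bins, "exactly 2 balls"

# Exact E[# bins with exactly k balls] by enumerating all m^n throws.
total = Fraction(0)
for throw in product(range(m), repeat=n):
    total += sum(throw.count(b) == k for b in range(m))
exact = total / m ** n

# Indicator/linearity answer: m * P(a fixed bin gets exactly k balls).
formula = (m * Fraction(comb(n, k))
           * Fraction(1, m) ** k * Fraction(m - 1, m) ** (n - k))
assert exact == formula
print(exact)
```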