I found the talk extremely educational and would recommend using (a version of) it in any course on randomized computation.
It includes a good introduction to unbiased estimators, highlighting the fact that the analysis focuses on upper-bounding the relative variance of the estimator (i.e., the ratio of the variance to the square of the expectation). It then presents a composition theorem for the case in which $X$ is an unbiased estimator of $v$, and $Y_x$ is an unbiased estimator of $x$ for each possible sample $x$ of $X$ (so that the composed estimator is unbiased for $v$). (For example, one can estimate the unreliability of a graph $G$ w.r.t. edge failure probability $p$ by estimating the unreliability of a random graph $H$ w.r.t. a larger failure probability $q>p$, where $H$ is generated by contracting each edge of $G$ independently with probability $1-(p/q)$.) Another gem is comparing the unreliability of two graphs by defining a coupled process that samples both graphs under the same edge failure probability.
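To make the composition concrete, here is a minimal Python sketch (my own illustration, not the talk's algorithm) on a toy 4-cycle: the direct indicator estimator at failure probability $p$ is compared against the two-stage estimator that first contracts each edge with probability $1-(p/q)$ and then fails the remaining edges with probability $q$. Marginally each edge still fails with probability $(p/q)\cdot q = p$, so both estimators are unbiased for the same unreliability. All helper names and the example graph are assumptions for illustration.

```python
import random

def connected(n, edges):
    """Check connectivity of vertices 0..n-1 via union-find."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        parent[find(u)] = find(v)
    root = find(0)
    return all(find(v) == root for v in range(n))

def direct_estimate(n, edges, p, rng):
    """Indicator estimator: 1 iff G disconnects when each edge
    fails independently with probability p (unbiased for the unreliability)."""
    surviving = [e for e in edges if rng.random() >= p]
    return 0.0 if connected(n, surviving) else 1.0

def composed_estimate(n, edges, p, q, rng):
    """Two-stage estimator: contract each edge with probability 1 - p/q
    (such an edge surely works), then fail the kept edges with probability q.
    A contracted edge is simulated as an edge that always survives."""
    contracted, kept = [], []
    for e in edges:
        (contracted if rng.random() < 1 - p / q else kept).append(e)
    surviving = contracted + [e for e in kept if rng.random() >= q]
    return 0.0 if connected(n, surviving) else 1.0

n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # a 4-cycle (toy example)
p, q, N = 0.3, 0.6, 20000
rng = random.Random(1)
est_direct = sum(direct_estimate(n, edges, p, rng) for _ in range(N)) / N
est_composed = sum(composed_estimate(n, edges, p, q, rng) for _ in range(N)) / N
# A cycle stays connected iff at most one edge fails:
exact = 1 - ((1 - p)**4 + 4 * p * (1 - p)**3)
print(round(exact, 4), round(est_direct, 4), round(est_composed, 4))
```

Both empirical averages should agree with the exact value up to Monte Carlo error, illustrating that the composed estimator targets the same quantity.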
The all-terminal network reliability problem asks to compute, for a given graph $G$, the probability that the graph becomes disconnected when each edge fails independently with some given probability. This fundamental problem in graph theory and reliability has been studied for decades; the first fully polynomial randomized approximation scheme (FPRAS) for it was given in 1995.
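The quantity being approximated can be pinned down by a brute-force computation; the following sketch (illustrative only, exponential in the number of edges) enumerates all $2^m$ failure patterns and sums the probabilities of the disconnecting ones. The helper names and the 4-cycle example are my own.

```python
import itertools

def connected(n, edges):
    """Check connectivity of vertices 0..n-1 via union-find."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        parent[find(u)] = find(v)
    root = find(0)
    return all(find(v) == root for v in range(n))

def all_terminal_unreliability(n, edges, p):
    """Exact unreliability by enumerating every failure pattern
    (1 = edge fails). Exponential in len(edges); for illustration only."""
    total = 0.0
    for pattern in itertools.product([0, 1], repeat=len(edges)):
        surviving = [e for e, f in zip(edges, pattern) if not f]
        prob = 1.0
        for f in pattern:
            prob *= p if f else (1 - p)
        if not connected(n, surviving):
            total += prob
    return total

# A 4-cycle disconnects iff at least two of its edges fail.
u = all_terminal_unreliability(4, [(0, 1), (1, 2), (2, 3), (3, 0)], 0.3)
print(round(u, 4))
```

The exponential blow-up of this enumeration is exactly why an FPRAS is the object of interest.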
In this work, I will describe a particularly simple polynomial-time algorithm for this problem, with an equally simple proof of correctness. It relies on generating a low-variance unbiased estimator using "partially sampled" graphs, then estimating their reliability recursively. The resulting algorithm is far simpler and much faster than the first one. The analysis introduces techniques that may be useful for a broader class of reliability problems. I will then sketch more sophisticated techniques that can be used to improve the running time of this approximation scheme. In particular, I will show that as the edge failure probability decreases, there is a rapid "phase transition" from a regime where the graph is likely disconnected to one in which all cuts can be treated as failing independently, which dramatically simplifies the problem. The analysis relies on a new study of the Contraction Algorithm---a stochastic process that contracts randomly chosen graph edges until the graph has fully collapsed.
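For readers unfamiliar with the Contraction Algorithm, the following is a sketch of its classic form, which stops when two super-vertices remain and reports the induced cut (the analysis referred to above runs the process all the way down to a single vertex). The example graph and all names are my own; this assumes a connected input graph.

```python
import random

def contract(n, edges, rng):
    """One run of the Contraction Algorithm: repeatedly pick a uniformly
    random remaining edge and merge its endpoints, until two super-vertices
    remain. Returns the number of edges crossing the resulting cut.
    Assumes the input graph on vertices 0..n-1 is connected."""
    label = list(range(n))   # label[v] = current super-vertex containing v
    comps = n
    edges = list(edges)      # work on a copy; self-loops removed lazily
    while comps > 2 and edges:
        i = rng.randrange(len(edges))
        u, v = edges[i]
        lu, lv = label[u], label[v]
        if lu == lv:
            edges.pop(i)     # self-loop created by earlier contractions
            continue
        # contract: fold super-vertex lv into lu
        label = [lu if l == lv else l for l in label]
        comps -= 1
    return sum(1 for u, v in edges if label[u] != label[v])

# Two triangles joined by a single bridge edge (2, 3); min cut has size 1.
two_triangles = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
rng = random.Random(0)
cuts = [contract(6, two_triangles, rng) for _ in range(200)]
print(min(cuts))  # with enough independent runs, the size-1 bridge cut appears
```

A single run outputs any particular minimum cut with probability at least $2/\binom{n}{2}$, which is why the algorithm is repeated many times.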
Available from Karger's webpage.