One of the problems of going to a conference called “MCMSki” is that it is hard to persuade people that you are working rather than extending your Christmas break. These pictures will probably not help.
To be fair, there was a 4-hour break in the middle of the day for outdoor activities, but the conference filled up the rest of the day, right up to the poster session starting at 9:00pm. I can’t give a detailed summary of the meeting like Christian Robert, but I can share my overall impressions.
MCMC is not the only game in town any more. I attended several sessions on Approximate Bayesian Computation and sequential Monte Carlo. These methods are receiving increasing attention as they can scale up more easily than MCMC, especially in an increasingly parallel computing environment. Despite its mathematical elegance, MCMC remains a sequential algorithm, and I think it will soon be eclipsed by other methods in applications. I said as much during the round table on software.
I also attended the sessions on convergence of MCMC, hoping to find out if any practical progress had been made on convergence diagnostics. I am aware that the coda package (which I still maintain) is getting a bit old and tired. These sessions were a bit frustrating, as the mathematical machinery required to follow the discussion of convergence is not familiar to me. Even so, I finally understood where the optimal 0.234 acceptance rate for the random-walk Metropolis-Hastings algorithm comes from, thanks to an illuminating talk by Gareth Roberts.
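For readers who haven’t met the 0.234 result: it is the limiting acceptance rate of an optimally scaled random-walk Metropolis sampler as the dimension of the target grows (from the work of Roberts, Gelman and Gilks). Here is a minimal Python sketch, not anything from coda or JAGS, that runs a random-walk sampler on a 50-dimensional standard normal target with the textbook proposal scale 2.38/√d and reports the empirical acceptance rate; the function name `rw_metropolis` is my own invention for illustration.

```python
import numpy as np

def rw_metropolis(log_target, x0, step, n_iter, rng):
    """Random-walk Metropolis; returns the empirical acceptance rate."""
    x = np.array(x0, dtype=float)
    lp = log_target(x)
    accepted = 0
    for _ in range(n_iter):
        prop = x + step * rng.standard_normal(x.shape)  # symmetric proposal
        lp_prop = log_target(prop)
        # Metropolis accept/reject on the log scale
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
            accepted += 1
    return accepted / n_iter

d = 50
step = 2.38 / np.sqrt(d)   # optimal scaling for an i.i.d. Gaussian target
rng = np.random.default_rng(1)
# log-density of a d-dimensional standard normal, up to a constant
acc = rw_metropolis(lambda x: -0.5 * x @ x, np.zeros(d), step, 20_000, rng)
print(f"acceptance rate: {acc:.3f}")
```

In moderate dimensions the observed rate hovers a little above the asymptotic 0.234; tuning the step size up or down moves it away from the optimum in the direction you would expect.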
As usual, one of the most important parts of the conference was meeting people. I was able to put faces to some famous (and not so famous) names. I met Bob Carpenter and Daniel Lee from the Stan team for the first time. Andrew Gelman gave one of the plenary talks. He began by telling us that he wasn’t interested in statistical computing, which I thought was a brave thing to do at a conference dedicated to statistical computing.
Finally, of most relevance to JAGS, I learned about the Polya-Gamma data augmentation scheme of Polson, Scott and Windle for binomial logistic regression models, which should fit into the framework of the “glm” module.
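The appeal of the Polya-Gamma scheme is that, after augmenting each observation with a latent ω ~ PG(1, xᵢ'β), the regression coefficients have a conditionally Gaussian update, so the whole model can be Gibbs-sampled. The sketch below is my own rough Python illustration (not the JAGS glm module, and not the exact sampler of Polson, Scott and Windle, who use a more efficient exact method): it draws the PG variables approximately via a truncated version of their infinite-sum-of-gammas representation, with helper names `sample_pg1` and `gibbs_logistic` invented here.

```python
import numpy as np

def sample_pg1(c, rng, K=200):
    """Approximate draw from PG(1, c), elementwise over c, by truncating
    the infinite sum of scaled Gamma(1, 1) variables at K terms."""
    c = np.atleast_1d(c)
    k = np.arange(1, K + 1)
    g = rng.exponential(size=(c.shape[0], K))          # Gamma(1, 1) draws
    denom = (k - 0.5) ** 2 + (c[:, None] / (2 * np.pi)) ** 2
    return (g / denom).sum(axis=1) / (2 * np.pi ** 2)

def gibbs_logistic(X, y, n_iter, rng, tau2=100.0):
    """Polya-Gamma Gibbs sampler for logistic regression,
    with a N(0, tau2 * I) prior on beta."""
    n, p = X.shape
    beta = np.zeros(p)
    kappa = y - 0.5
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        omega = sample_pg1(X @ beta, rng)              # latent PG variables
        # Conditionally Gaussian update for beta given omega
        V = np.linalg.inv((X.T * omega) @ X + np.eye(p) / tau2)
        m = V @ (X.T @ kappa)
        beta = rng.multivariate_normal(m, V)
        draws[t] = beta
    return draws

# Simulated check: recover known coefficients from Bernoulli data
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 2))
true_beta = np.array([1.0, -2.0])
y = (rng.uniform(size=400) < 1 / (1 + np.exp(-(X @ true_beta)))).astype(float)
draws = gibbs_logistic(X, y, 500, rng)
post_mean = draws[200:].mean(axis=0)   # discard burn-in
print("posterior mean:", post_mean)
```

On simulated data like this the posterior means land near the true coefficients, which is the basic sanity check one would want before wiring such a sampler into a larger framework.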