Objective is not so objective

Model selection is a difficult process, particularly in high-dimensional settings, with dependent observations, or in sparse data regimes. In this post, I will discuss a common misconception about selecting models based on the objective function values that optimization algorithms report in sparse data settings. TL;DR: Don’t do it.

Read More

Optimizing, Sampling, and Choosing Priors

Do you really believe your variance parameter can be anywhere from zero to infinity?

In the past, I’ve often not included priors in my models. I felt daunted by having to pick sensible priors for my parameters, and I usually fell into the common trap of thinking that no priors, or uniform priors, are somehow the most objective choice because they “let the data do all the talking.” Recent experiences have completely changed my thinking on this, though.
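To make the contrast concrete, here is a minimal Stan sketch (a hypothetical illustration, not code from the post). Declaring a constrained scale parameter and never mentioning it again gives it an improper flat prior; one extra line makes it weakly informative.

```stan
data {
  int<lower=0> N;
  vector[N] y;
}
parameters {
  real mu;
  real<lower=0> sigma;
}
model {
  // Without the next line, sigma gets an improper flat prior on [0, infinity).
  // With the lower bound, normal(0, 5) acts as a half-normal: most mass on
  // modest scales, tails heavy enough to let the data dominate if they disagree.
  sigma ~ normal(0, 5);
  y ~ normal(mu, sigma);
}
```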

Read More

Deconstructing Stan Manual Part 2: QR Decomposition

On March 15, we held our second meetup of 2018, covering QR decomposition and simulating correlation matrices using the LKJ distribution, and ending with some general advice about priors.
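For a taste of the first topic, here is the QR reparameterization of linear regression in the style of the Stan manual (a sketch using the qr_thin_Q/qr_thin_R builtins available in recent Stan releases, not necessarily the exact code from the slides):

```stan
data {
  int<lower=0> N;
  int<lower=0> K;
  matrix[N, K] x;
  vector[N] y;
}
transformed data {
  // Thin QR decomposition of the design matrix, rescaled as in the Stan manual
  matrix[N, K] Q_ast = qr_thin_Q(x) * sqrt(N - 1);
  matrix[K, K] R_ast = qr_thin_R(x) / sqrt(N - 1);
  matrix[K, K] R_ast_inverse = inverse(R_ast);
}
parameters {
  real alpha;
  vector[K] theta;  // coefficients on the orthogonalized Q_ast scale
  real<lower=0> sigma;
}
model {
  y ~ normal(Q_ast * theta + alpha, sigma);
}
generated quantities {
  // Map the coefficients back to the original scale of x
  vector[K] beta = R_ast_inverse * theta;
}
```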

The slides from the meetup are now available on RPubs. If you have any questions or suggestions, please let us know in the comments.

Read More

Correlation or no correlation, that is the question

A friend asked me how he should update his beliefs about a correlation after seeing some data. In his words:

If I have two variables and I want to express that my prior is that the correlation could be anything between -1 and +1 how would I update this prior based on the observed correlation?
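One direct way to encode exactly that prior, assuming the two variables can be modeled as bivariate normal (a sketch of one possible answer, not necessarily the one in the post): give the correlation a uniform(-1, 1) prior and let its posterior draws be the updated belief.

```stan
data {
  int<lower=0> N;
  vector[2] y[N];  // paired observations of the two variables
}
parameters {
  vector[2] mu;
  vector<lower=0>[2] sigma;
  real<lower=-1, upper=1> rho;  // the bounds alone imply a uniform(-1, 1) prior
}
model {
  // Assemble the 2x2 covariance matrix from the two scales and the correlation
  matrix[2, 2] Sigma;
  Sigma[1, 1] = square(sigma[1]);
  Sigma[2, 2] = square(sigma[2]);
  Sigma[1, 2] = rho * sigma[1] * sigma[2];
  Sigma[2, 1] = Sigma[1, 2];
  y ~ multi_normal(mu, Sigma);
}
```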

Read More

Deconstructing Stan Manual Part 1: Linear Regression

On February 15, we held our first meetup of 2018, starting a new series called Deconstructing the Stan Manual. During the meetup, we coded a linear regression model in Stan and fit it to the Wine Quality dataset from the UCI Machine Learning Repository.
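The heart of that model is the textbook linear regression from the Stan manual, roughly along these lines (the Wine Quality predictor and outcome names in the comments are only indicative):

```stan
data {
  int<lower=0> N;      // number of wines
  int<lower=0> K;      // number of predictors (e.g. acidity, sulphates, alcohol)
  matrix[N, K] x;      // predictor matrix
  vector[N] y;         // outcome, e.g. the quality score
}
parameters {
  real alpha;          // intercept
  vector[K] beta;      // regression coefficients
  real<lower=0> sigma; // residual scale
}
model {
  y ~ normal(x * beta + alpha, sigma);
}
```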

The slides from the meetup are now available on RPubs. If you have any questions or suggestions, please let us know in the comments.

Read More