What is the Kullback-Leibler divergence of two Student's T distributions that have been shifted and scaled? That is, $\textrm{D}_{\textrm{KL}}(k_a A + t_a \,\|\, k_b B + t_b)$ where $A$ and $B$ are Student's T distributions.
If it makes things easier, $A$ could be a Gaussian. (That is, it could have infinite degrees of freedom.)
The motivation behind this question is that the shifted and scaled (location-scale) Student's T distribution is the posterior predictive distribution for normally distributed data with unknown mean and variance. Thus, I would like to compare the true distribution $k_a A + t_a$ with the estimate $k_b B + t_b$.
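As far as I know there is no closed form for this divergence in general (unlike the Gaussian-Gaussian case), but it can be estimated numerically. Below is a minimal Python sketch, assuming SciPy's parameterization, in which a shifted and scaled Student's T is exactly `stats.t(df, loc, scale)`; the function name `kl_location_scale_t` and its argument names are my own for illustration.

```python
import numpy as np
from scipy import integrate, stats

def kl_location_scale_t(df_a, t_a, k_a, df_b, t_b, k_b):
    """Estimate D_KL(k_a*A + t_a || k_b*B + t_b) by numerical quadrature,
    where A and B are standard Student's T variables with df_a and df_b
    degrees of freedom."""
    p = stats.t(df_a, loc=t_a, scale=k_a)  # true distribution
    q = stats.t(df_b, loc=t_b, scale=k_b)  # estimate

    def integrand(x):
        px = p.pdf(x)
        # Guard against 0 * (-inf) = nan where p's density underflows.
        return px * (p.logpdf(x) - q.logpdf(x)) if px > 0 else 0.0

    val, _err = integrate.quad(integrand, -np.inf, np.inf)
    return val
```

As a sanity check, `kl_location_scale_t(df, 0, 1, df, 0, 1)` should return 0 up to quadrature error, and taking both degrees of freedom large should approach the closed-form Gaussian result. For the Gaussian special case of $A$ mentioned above, `p` can simply be replaced with `stats.norm(loc=t_a, scale=k_a)`.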