What is the Kullback-Leibler divergence between two Student's t distributions that have been shifted and scaled? That is, $\mathrm{D}_{\mathrm{KL}}(k_a A + t_a \parallel k_b B + t_b)$, where $A$ and $B$ are Student's t distributions.
If it makes things easier, A could be a Gaussian. (That is, it could have infinite degrees of freedom.)
The motivation behind this question is that the scaled non-central Student's t distribution is the posterior predictive distribution of normally distributed data with unknown mean and variance. Thus, I would like to compare the true distribution $k_a A + t_a$ with the estimate $k_b B + t_b$.
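While a closed form for this KL divergence is exactly what the question asks for, the quantity can at least be checked numerically. Below is a minimal Monte Carlo sketch (my own illustration, with arbitrary hypothetical parameters, not from the question): since $kA + t$ for a Student's t variable $A$ is again a Student's t distribution with location $t$ and scale $k$, scipy's `loc`/`scale` arguments represent the shift and scale directly, and $\mathrm{D}_{\mathrm{KL}}(P \parallel Q) = \mathbb{E}_{x \sim P}[\log p(x) - \log q(x)]$ can be estimated by sampling from $P$.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical example parameters: (degrees of freedom, shift t, scale k).
df_a, t_a, k_a = 5.0, 0.0, 1.0
df_b, t_b, k_b = 10.0, 0.5, 1.5

# k*A + t is a Student's t with loc=t and scale=k.
p = stats.t(df=df_a, loc=t_a, scale=k_a)
q = stats.t(df=df_b, loc=t_b, scale=k_b)

# Monte Carlo estimate: average log-density ratio under samples from p.
x = p.rvs(size=200_000, random_state=rng)
kl_estimate = np.mean(p.logpdf(x) - q.logpdf(x))
print(f"Monte Carlo KL estimate: {kl_estimate:.4f}")
```

This gives a reference value against which any proposed closed-form (or approximate) expression can be validated.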