This is part five of our ongoing series on implementing differentiable sparse linear algebra in JAX. In some sense this is the last boring post before we get to the derivatives. Was this post going to include the derivatives? It sure was, but then I realised that a different choice was to go to bed so I could get up nice and early in the morning and vote in our election. It goes without saying that before I split the posts, it was more than twice as long and I was nowhere near finished. So probably...| Un garçon pas comme les autres (Bayes)
Just some harmless notes. Like the ones Judi Dench took in that movie.| Un garçon pas comme les autres (Bayes)
This is part three of an ongoing exercise in hubris. Part one is here. Part two is here. The overall aim of this series of posts is to look at how sparse Cholesky factorisations work, how JAX works, and how to marry the two, with the ultimate aim of putting a bit of sparse matrix support into PyMC. That should allow for faster inference in linear mixed models and Gaussian spatial models. And hopefully, if anyone ever gets around to putting the Laplace approximation in, all sorts of GLMMs and non-...| Un garçon pas comme les autres (Bayes)
Come for the details, stay for the shitty Python, leave with disappointment. Not unlike the experience of dating me.| Un garçon pas comme les autres (Bayes)