Just in time for Christmas, I received the first hard copies of my book two days ago! It comes with a mix of relief and pride after three years of work. As most book writers will probably acknowledge, it took much longer than I expected when I started, but overall it was an enriching experience for several reasons: (1) I learned a lot of new math to write it; (2) trying to squeeze out the simplest arguments for the analysis of algorithms led to many new research questions; (3) I was able to get t...
In last month's blog post, I presented the von Neumann entropy. It is defined as a spectral function on positive semi-definite (PSD) matrices and leads to a Bregman divergence called the von Neumann relative entropy (or matrix Kullback-Leibler divergence), with interesting convexity properties and applications in optimization (mirror descent, or smoothing) and probability (concentration inequalities for matrices).
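To make these definitions concrete, here is a minimal numerical sketch in Python. The function names and the particular trace-form conventions below are my own choices for illustration, not taken from the post: the von Neumann entropy is computed as a spectral function of the eigenvalues of a PSD matrix, and the relative entropy D(A || B) = tr(A log A - A log B - A + B) is the associated Bregman divergence of the convex function tr(A log A), assuming strictly positive definite inputs so that matrix logarithms are well defined.

```python
import numpy as np
from scipy.linalg import logm

def von_neumann_entropy(A, tol=1e-12):
    """-tr(A log A), computed as a spectral function of the eigenvalues of PSD A."""
    eigvals = np.linalg.eigvalsh(A)
    eigvals = eigvals[eigvals > tol]  # 0 * log(0) = 0 by convention
    return -float(np.sum(eigvals * np.log(eigvals)))

def von_neumann_relative_entropy(A, B):
    """Bregman divergence of tr(A log A): D(A || B) = tr(A log A - A log B - A + B).

    Assumes A and B are strictly positive definite so logm is well defined.
    """
    return float(np.trace(A @ logm(A) - A @ logm(B) - A + B).real)

# Small usage example with positive definite matrices.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))
A = X @ X.T / 4 + 0.1 * np.eye(4)    # positive definite by construction
B = np.eye(4)
print(von_neumann_entropy(A))              # entropy of A
print(von_neumann_relative_entropy(A, B))  # >= 0, with equality iff A == B
```

As with the usual Kullback-Leibler divergence, the relative entropy above is nonnegative and vanishes only when the two matrices coincide, which is what makes it usable as a divergence in mirror-descent-style algorithms.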