Origin story. Quantization, in a general sense, is the process of mapping a continuous or large set of values to a smaller, discrete set. This concept has roots in signal processing and information theory; see Vector Quantization (VQ), which emerged in the late 1970s and early 1980s. Think things like the Linde-Buzo-Gray (LBG) algorithm (Linde, Buzo, and Gray 1980). VQ represents vectors from a continuous space using a finite set of prototype vectors from a "codebook," often...
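To make the codebook idea concrete, here is a minimal sketch of VQ codebook training in the k-means / LBG spirit, in plain NumPy. The names `lbg_codebook` and `quantize` are my own illustrative ones, and I use random initialisation rather than the classical LBG codebook-splitting procedure, so treat it as a sketch of the idea rather than the canonical algorithm.

```python
import numpy as np

def lbg_codebook(X, k, n_iter=50, seed=0):
    """Train a k-entry codebook that approximately minimises mean squared
    quantization error over the data X (k-means-style Lloyd iterations)."""
    rng = np.random.default_rng(seed)
    codebook = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each vector to its nearest prototype (Euclidean distance).
        d = np.linalg.norm(X[:, None, :] - codebook[None, :, :], axis=-1)
        assign = d.argmin(axis=1)
        # Move each prototype to the centroid of the vectors assigned to it.
        for j in range(k):
            members = X[assign == j]
            if len(members):
                codebook[j] = members.mean(axis=0)
    return codebook

def quantize(X, codebook):
    """Map each vector to the index of its nearest codebook entry."""
    d = np.linalg.norm(X[:, None, :] - codebook[None, :, :], axis=-1)
    return d.argmin(axis=1)

# Usage: quantize 2-D Gaussian data with an 8-entry codebook.
X = np.random.default_rng(1).normal(size=(1000, 2))
cb = lbg_codebook(X, k=8)
codes = quantize(X, cb)   # discrete indices, the "quantized" representation
X_hat = cb[codes]         # reconstruction from the codebook
```

The point of the exercise is that the continuous vectors are replaced by a small set of discrete indices plus a lookup table, which is the same move modern neural-network quantization schemes make, just with fancier codebooks.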
In classical statistics there are families of model complexity estimates, loosely and collectively referred to as the "degrees of freedom" of a model. Neither computationally nor practically do they scale up to overparameterized NNs, and there are other tools. Exception: Shoham, Mor-Yosef, and Avron (2025) argue for a connection to the Takeuchi Information Criterion. These end up being popular in developmental interpretability. Learning coefficient: the major output of singul...
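As a reminder to myself, the standard textbook forms (my notation, hedged from memory, and not necessarily the decomposition used by Shoham, Mor-Yosef, and Avron): the Takeuchi penalty replaces AIC's raw parameter count with a sandwich trace, and the learning coefficient is the coefficient of $\log n$ in Watanabe's free-energy expansion.

$$
\mathrm{TIC} = -2\,\ell(\hat\theta) + 2\,\operatorname{tr}\bigl(\hat J^{-1}\hat I\bigr),
\qquad
\hat I = \tfrac{1}{n}\sum_i \nabla_\theta \ell_i(\hat\theta)\,\nabla_\theta \ell_i(\hat\theta)^\top,
\quad
\hat J = -\tfrac{1}{n}\sum_i \nabla_\theta^2 \ell_i(\hat\theta),
$$

$$
F_n = n L_n + \lambda \log n - (m-1)\log\log n + O_p(1),
$$

where $\ell$ is the log-likelihood, $F_n$ the Bayes free energy and $L_n$ the empirical loss at the optimal parameter. For a correctly specified regular model $\hat I \approx \hat J$, so the trace collapses to the parameter count $d$ (AIC's degrees of freedom), and the learning coefficient is $\lambda = d/2$; the interesting cases are the singular ones where neither simplification holds.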
I’m not sure what this truly means, or that anyone is, but I think it wants to mean something like quantifying architectures that make it “easier” to learn about the phenomena of interest. This is a practical engineering discipline in NNs, but maybe also interesting to think about in humans.