Neural Discrete Representation Learning
Generating Diverse High-Fidelity Images with VQ-VAE-2
Wikipedia: https://en.wikipedia.org/wiki/Vector_quantization
Vector quantization (VQ) is a classical quantization technique from signal processing that allows the modeling of probability density functions by the distribution of prototype vectors. It was originally used for data compression. It works by dividing a large set of points (vectors) into groups having approximately the same number of points closest to them. Each group is represented by its centroid point, as in k-means and some other clustering algorithms.
http://www.mqasem.net/vectorquantization/vq.html
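As a minimal sketch of the nearest-centroid assignment described above (the names, shapes, and data here are illustrative, not taken from the linked sources):

```python
import numpy as np

def quantize(vectors, codebook):
    """Map each vector to the index of its nearest codebook centroid."""
    # Squared Euclidean distance between every vector and every centroid.
    dists = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)

rng = np.random.default_rng(0)
codebook = rng.normal(size=(4, 2))   # 4 prototype vectors in 2-D
vectors = rng.normal(size=(10, 2))   # 10 points to quantize
codes = quantize(vectors, codebook)  # one integer code per point
reconstruction = codebook[codes]     # each point replaced by its centroid
```

Compression comes from storing only the integer codes plus the small codebook, instead of the full-precision vectors.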

VQ has also been used to quantize a feature representation layer in the discriminator of GANs. This feature quantization (FQ) technique performs implicit feature matching.
It stabilizes GAN training and improves performance on a variety of popular GAN models: BigGAN for image generation, StyleGAN for face synthesis, and U-GAT-IT for unsupervised image-to-image translation.

VQ-VAE is the first discrete-latent VAE model to achieve performance comparable to its continuous counterparts, while offering the flexibility of discrete distributions.
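The VQ-VAE bottleneck applies the same nearest-centroid lookup to the encoder's outputs and trains with a codebook loss plus a commitment loss. The sketch below, in NumPy, shows only the forward quantization and the loss values; the stop-gradient placement and straight-through estimator described in the paper require an autodiff framework and are noted in comments (all names and the `beta` value are illustrative):

```python
import numpy as np

def vq_vae_bottleneck(z_e, codebook, beta=0.25):
    """Snap encoder outputs z_e to their nearest codebook embeddings z_q
    and compute the two VQ-VAE auxiliary loss terms."""
    dists = ((z_e[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    codes = dists.argmin(axis=1)
    z_q = codebook[codes]
    # Codebook loss: pulls the selected embeddings toward the encoder
    # outputs (in training, z_e is under a stop-gradient here).
    codebook_loss = ((z_q - z_e) ** 2).mean()
    # Commitment loss: pulls the encoder toward the embeddings it uses
    # (in training, z_q is under a stop-gradient here).
    commitment_loss = beta * ((z_e - z_q) ** 2).mean()
    # Straight-through trick: the decoder's gradient w.r.t. z_q is copied
    # to z_e unchanged, i.e. z_q = z_e + stop_gradient(z_q - z_e).
    return z_q, codes, codebook_loss + commitment_loss

z_e = np.array([[0.0, 0.0], [1.0, 1.0]])       # toy encoder outputs
codebook = np.array([[0.1, 0.1], [0.9, 0.9]])  # 2 embeddings
z_q, codes, loss = vq_vae_bottleneck(z_e, codebook)
```

The discrete `codes` are what a downstream prior (a PixelCNN in VQ-VAE, a hierarchy of autoregressive priors in VQ-VAE-2) is trained to model.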
