Abstract
Normalizing flows provide a principled approach to modeling complex probability distributions by mapping a simple base distribution through a sequence of invertible and differentiable transformations. This work reviews the change-of-variables formulation, core architectures, and the central challenge of balancing expressive power against computational efficiency. We further discuss density estimation via maximum likelihood, extensions such as coupling layers, and applications in machine learning, including data generation and probabilistic inference, noting throughout the inherent trade-off between model expressiveness and computational cost.
