GitHub Repository

The complete project materials are publicly available, including the full LaTeX source, bibliography, and supporting references.

Explore the repository on GitHub

Denoising Diffusion Probabilistic Models (DDPM)

Thesis Profile

A Rigorous Theoretical Analysis and State-of-the-Art Review

Bachelor’s Thesis in Computer and Telecommunications Engineering
University of Cassino and Southern Lazio

Author: Giuseppe Alfieri
Supervisor: Prof. Alessandro Bria
Academic Year: 2022/2023

Language Notice

The full dissertation is written in Italian.


Overview

This page presents the theoretical research developed for my Bachelor’s thesis on Denoising Diffusion Probabilistic Models (DDPMs).

The objective of the work is not to introduce a novel architecture, but to provide a rigorous mathematical analysis of diffusion models, moving beyond the simplified “black-box” narrative often associated with modern Text-to-Image systems.

The discussion is grounded at the intersection of non-equilibrium thermodynamics, stochastic processes, and variational inference.

Full Thesis

Owing to its formal structure, extensive mathematical derivations, and figures, the dissertation is hosted in PDF format on the GitHub repository.

Read the thesis online (PDF)
Download the PDF (21 MB)


Key Topics and Theoretical Contributions

The thesis deconstructs the theoretical framework of diffusion models, analyzing how they learn an implicit representation of the data distribution by reversing a gradual and structured stochastic process.

  1. Conceptual Framing
    A rigorous discussion of contemporary Text-to-Image systems such as Midjourney, DALL·E 2, and Stable Diffusion, used to motivate the rise of diffusion-based generative modeling.

  2. Probabilistic Foundations
    A formal treatment of DDPMs as coupled Markov chains. In particular, the Gaussian transition of the forward process is defined as:

    q(x_t \mid x_{t-1}) = \mathcal{N}\!\left(x_t;\ \sqrt{1-\beta_t}\,x_{t-1},\ \beta_t \mathbf{I}\right)

  3. Variational Derivation (ELBO)
    A complete derivation of the training objective, starting from the intractable negative log-likelihood and leading to the simplified loss commonly used in practice for noise prediction:

    \mathcal{L}_{\text{simple}}(\theta) = \mathbb{E}_{t,\,x_0,\,\epsilon}\left[\bigl\|\epsilon - \epsilon_\theta\bigl(\sqrt{\bar{\alpha}_t}\,x_0 + \sqrt{1-\bar{\alpha}_t}\,\epsilon,\ t\bigr)\bigr\|^2\right], \qquad \bar{\alpha}_t = \prod_{s=1}^{t}(1-\beta_s)

  4. Didactic Exposition
    Extensive appendices make the work self-contained, covering the mathematical prerequisites required to follow the derivations, including probability theory, variational inference, and the role of U-Net architectures in diffusion models.
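To make the forward process of item 2 concrete, here is a minimal NumPy sketch of the forward (noising) chain. The names `linear_beta_schedule` and `q_sample` are illustrative only and do not come from the thesis; the closed-form sampling of x_t given x_0 follows directly from composing the per-step Gaussian transitions.

```python
import numpy as np

def linear_beta_schedule(T=1000, beta_start=1e-4, beta_end=0.02):
    # Linear variance schedule beta_1, ..., beta_T (a common choice).
    return np.linspace(beta_start, beta_end, T)

def q_sample(x0, t, alpha_bar, rng):
    # Sample x_t ~ q(x_t | x_0) in closed form:
    #   x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps,
    # where alpha_bar_t = prod_{s<=t} (1 - beta_s).
    eps = rng.standard_normal(x0.shape)
    x_t = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return x_t, eps

betas = linear_beta_schedule()
alpha_bar = np.cumprod(1.0 - betas)  # cumulative products \bar{alpha}_t
```

For large t, alpha_bar[t] approaches zero, so x_t becomes (approximately) pure Gaussian noise, which is exactly the property the reverse process learns to invert.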

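The simplified noise-prediction objective of item 3 can likewise be sketched in a few lines. This is a minimal NumPy illustration, not code from the thesis; `eps_model` stands in for the U-Net noise predictor, here replaced by a dummy function for demonstration.

```python
import numpy as np

def simplified_loss(eps_model, x0, t, alpha_bar, rng):
    # Single-sample Monte Carlo estimate of the simplified DDPM loss:
    #   E[ || eps - eps_theta( sqrt(abar_t) x0 + sqrt(1-abar_t) eps, t ) ||^2 ]
    eps = rng.standard_normal(x0.shape)
    x_t = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    pred = eps_model(x_t, t)  # the network predicts the injected noise
    return np.mean((eps - pred) ** 2)

# Demo with a dummy "network" that always predicts zero noise.
betas = np.linspace(1e-4, 0.02, 1000)
alpha_bar = np.cumprod(1.0 - betas)
rng = np.random.default_rng(0)
loss = simplified_loss(lambda x_t, t: np.zeros_like(x_t),
                       np.ones((8, 8)), 500, alpha_bar, rng)
```

Training consists of drawing a random timestep t, noising a data sample x_0, and minimizing this mean-squared error between the injected noise and the network's prediction.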

Scope of the Work

This thesis is intended as a rigorous theoretical study and state-of-the-art review of DDPMs, with the goal of clarifying the mathematical principles that underlie one of the most influential generative modeling paradigms in modern deep learning.