This class presents mathematical tools encountered at various stages of the image synthesis pipeline. These methods are widely used in computer graphics research.

- Optimal transport (Nicolas)
- Monge and Kantorovich formulations
- Differential geometry of optimal transport
- Numerical methods (Sinkhorn algorithm, Power Diagrams, ...)
- Applications to images and rendering
- Lecture notes: PDF version, PPTX version
- Readings and project
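To give a flavor of the numerical methods in this module, here is a minimal Sinkhorn sketch in plain Python (illustrative code, not course material; all names and parameter values are made up): it computes an entropy-regularized transport plan between two histograms by alternating diagonal scalings.

```python
import math

def sinkhorn(a, b, C, eps=0.1, iters=500):
    """Entropy-regularized optimal transport between histograms a and b.
    a, b: nonnegative weights summing to 1; C: cost matrix C[i][j];
    eps: regularization strength.  Returns the transport plan P."""
    n, m = len(a), len(b)
    K = [[math.exp(-C[i][j] / eps) for j in range(m)] for i in range(n)]
    u, v = [1.0] * n, [1.0] * m
    for _ in range(iters):
        # Alternate diagonal scalings so the plan's marginals approach a and b.
        u = [a[i] / sum(K[i][j] * v[j] for j in range(m)) for i in range(n)]
        v = [b[j] / sum(K[i][j] * u[i] for i in range(n)) for j in range(m)]
    return [[u[i] * K[i][j] * v[j] for j in range(m)] for i in range(n)]

# Move mass between two 3-bin histograms on a line (quadratic ground cost).
a = [0.5, 0.5, 0.0]
b = [0.0, 0.5, 0.5]
C = [[float((i - j) ** 2) for j in range(3)] for i in range(3)]
P = sinkhorn(a, b, C)
# Most mass shifts one bin to the right (bin 0 -> 1 and bin 1 -> 2).
```

For real problems one works with log-domain updates and vectorized kernels; this loop only shows the structure of the iteration.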
- Monte-Carlo methods for 3D rendering (Nicolas, part I)
- The rendering equation: a Fredholm equation of the second kind
- Lecture notes: PDF version, PPTX version
- Readings and project
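The estimators at the heart of this module can be illustrated with a short, self-contained sketch (hypothetical helper names, not course code): integrating f(x) = 3x² on [0, 1] with uniform sampling versus importance sampling, the same mechanism used to sample the rendering equation.

```python
import math, random

def mc_estimate(f, sample, pdf, n=20000, seed=0):
    """Monte-Carlo estimator of an integral: average f(x)/p(x) over x ~ p."""
    rng = random.Random(seed)
    return sum(f(x) / pdf(x) for x in (sample(rng) for _ in range(n))) / n

# Integrate f(x) = 3x^2 on [0, 1]; the exact value is 1.
f = lambda x: 3.0 * x * x

# Uniform sampling: p(x) = 1 on [0, 1].
uniform = mc_estimate(f, lambda r: r.random(), lambda x: 1.0)

# Importance sampling with p(x) = 2x (drawn by inverting its CDF x^2):
# the density roughly follows f, so the estimator's variance drops.
importance = mc_estimate(f, lambda r: math.sqrt(r.random()), lambda x: 2.0 * x)
# Both estimates approach 1 at the usual O(1/sqrt(n)) Monte-Carlo rate.
```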
- Monte-Carlo methods for 3D rendering and geometry (David, part II)
- Importance sampling, Monte-Carlo convergence rate
- Quasi-Monte Carlo
- Lecture notes: Web
- Monte Carlo for Geometry Processing
- Lecture notes (continued): PDF
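A minimal walk-on-spheres sketch, the grid-free Monte-Carlo method behind much of the Monte Carlo geometry processing material (illustrative code assuming a disk domain where the distance to the boundary is known in closed form):

```python
import math, random

def walk_on_spheres(p, g, dist, eps=1e-3, walks=2000, seed=1):
    """Estimate at point p the harmonic function equal to g on the boundary,
    by averaging g at the exit points of walk-on-spheres random walks: each
    step jumps to a uniform point on the largest circle centered at the
    current position that fits inside the domain."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(walks):
        x, y = p
        while True:
            d = dist(x, y)
            if d < eps:  # close enough to the boundary: read off g there
                total += g(x, y)
                break
            t = rng.uniform(0.0, 2.0 * math.pi)
            x += d * math.cos(t)
            y += d * math.sin(t)
    return total / walks

# Unit disk with boundary data g(x, y) = x; the harmonic extension is u = x,
# so the estimate at (0.3, 0.2) should be close to 0.3.
u = walk_on_spheres((0.3, 0.2), lambda x, y: x,
                    lambda x, y: 1.0 - math.hypot(x, y))
```

No mesh or grid is ever built: the only geometric query is a distance to the boundary, which is why these methods scale to very complex domains.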
- Image and video processing -- mostly with Poisson equation! (Nicolas)
- Variational formulation / PDE
- Numerical methods
- Applications (image stitching, content-aware copy/paste, video filtering, gradient-based rendering, fluid simulation, ...)
- Other cool non-Poisson image and video processing techniques
- Lecture notes: PDF version, PPTX version
- Readings and project
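A one-dimensional illustration of the gradient-domain editing idea above (illustrative code, not course material): the gradients of a source signal are pasted into a target, and the resulting discrete Poisson equation is solved with plain Jacobi iterations.

```python
def poisson_blend_1d(target, source, lo, hi, iters=5000):
    """Gradient-domain copy/paste in 1D: inside [lo, hi) the result follows the
    gradients of `source` while matching `target` outside the region, i.e. it
    solves the discrete Poisson equation u'' = source'' with Dirichlet
    boundary conditions, here by plain Jacobi iterations."""
    u = list(target)
    # Guidance: discrete Laplacian of the source (the pasted gradient field).
    lap = [0.0] + [source[i - 1] - 2.0 * source[i] + source[i + 1]
                   for i in range(1, len(source) - 1)] + [0.0]
    for _ in range(iters):
        u = [(u[i - 1] + u[i + 1] - lap[i]) / 2.0 if lo <= i < hi else u[i]
             for i in range(len(u))]
    return u

# Paste a bump from `source` into a linear ramp; the seam stays continuous
# because only gradients are copied, not absolute values.
target = [float(i) for i in range(10)]
source = [5.0, 5.0, 6.0, 8.0, 6.0, 5.0, 5.0, 5.0, 5.0, 5.0]
blended = poisson_blend_1d(target, source, 2, 8)
```

On images the same system is solved over a 2D region, and Jacobi is replaced by faster solvers (Gauss-Seidel, multigrid, or sparse Cholesky), but the structure is identical.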
- Spectral mesh processing (Julie)
- A Fourier basis on surface meshes, definition of the Laplace-Beltrami operator
- Application to mesh processing: denoising, compression
- Shape correspondence linearization through functional maps
- Lecture notes: PDF version
- Readings and project
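The spectral point of view above can be previewed on the simplest "mesh" there is, a closed polyline, where the graph Laplacian's eigenbasis is a discrete Fourier basis (a minimal sketch with made-up parameter values, not course code):

```python
import math, random

def laplacian_smooth(pts, step=0.5, iters=50):
    """Spectral low-pass filter on a closed polyline: each iteration moves
    every vertex toward the midpoint of its two neighbors.  In the eigenbasis
    of the cycle-graph Laplacian (a discrete Fourier basis), the component of
    eigenvalue lam is damped by (1 - step * lam) per iteration, so high
    frequencies (noise) vanish much faster than the overall shape."""
    for _ in range(iters):
        pts = [(p[0] + step * ((q[0] + r[0]) / 2.0 - p[0]),
                p[1] + step * ((q[1] + r[1]) / 2.0 - p[1]))
               for q, p, r in zip(pts[-1:] + pts[:-1], pts,
                                  pts[1:] + pts[:1])]
    return pts

# Denoise a circle corrupted by random radial perturbations.
rng = random.Random(0)
noisy = [((1 + 0.1 * rng.uniform(-1, 1)) * math.cos(2 * math.pi * k / 64),
          (1 + 0.1 * rng.uniform(-1, 1)) * math.sin(2 * math.pi * k / 64))
         for k in range(64)]
smooth = laplacian_smooth(noisy)
```

On a surface mesh the uniform neighbor average is replaced by the cotangent-weighted Laplace-Beltrami operator, but the filtering interpretation carries over unchanged.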
- Around Laplace-Beltrami (David)
- Intrinsic Vector Field Design
- Robustness in geometry processing and Laplace-Beltrami on non-standard meshes
- Lecture notes: Directory (pdf/pptx/key)
- Readings
- Markov random fields (Julie)
- Definition, graphical model formulation
- Maximum a posteriori inference: belief propagation and graph cuts
- Application to segmentation and texture synthesis
- Lecture notes: PDF version
- Readings and project
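MAP inference is easiest to see on a chain, where belief propagation is exact and reduces to min-sum dynamic programming. A minimal sketch (illustrative code with made-up costs, not course material):

```python
def map_chain(unary, pairwise):
    """Exact MAP labeling of a chain MRF by min-sum dynamic programming
    (belief propagation is exact on trees).  unary[i][l] is the data cost of
    label l at node i; pairwise(k, l) the smoothness cost between neighbors."""
    n, L = len(unary), len(unary[0])
    cost, back = list(unary[0]), []
    for i in range(1, n):
        prev, cost, ptr = cost, [], []
        for l in range(L):
            k = min(range(L), key=lambda m: prev[m] + pairwise(m, l))
            cost.append(prev[k] + pairwise(k, l) + unary[i][l])
            ptr.append(k)
        back.append(ptr)
    # Backtrack the optimal labeling from the cheapest final state.
    labels = [min(range(L), key=lambda l: cost[l])]
    for ptr in reversed(back):
        labels.append(ptr[labels[-1]])
    return labels[::-1]

# Binary denoising of a 1D signal with a Potts smoothness prior: isolated
# flips are smoothed away because fixing them is cheaper than two seams.
obs = [0, 0, 1, 0, 0, 1, 1, 1, 0, 1]
unary = [[abs(o - l) for l in (0, 1)] for o in obs]
labels = map_chain(unary, lambda a, b: 0.6 * (a != b))
```

On 2D image grids the graph has loops, so exact DP no longer applies; that is where loopy belief propagation and graph cuts enter.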
- Machine Learning for Graphics and Vision (Julie)
- Statistical tools for machine learning: maximum likelihood and maximum a posteriori estimators, supervised and unsupervised learning
- K-means and Expectation-Maximization for Gaussian mixture models
- Support vector machines, the kernel trick, random forests and boosting
- Neural networks for image synthesis (autoregressive neural networks, variational auto-encoders, generative adversarial networks)
- Readings and project
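As a preview of the clustering part of this module, here is a minimal Lloyd's k-means sketch on 1-D data (illustrative code, not course material); it is the hard-assignment limit of the EM algorithm for a Gaussian mixture covered in the lectures:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's k-means on 1-D data: alternate assigning each point to its
    nearest center and recomputing each center as its cluster mean.  This is
    the hard-assignment limit of EM for an isotropic Gaussian mixture."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in points:
            clusters[min(range(k), key=lambda j: (x - centers[j]) ** 2)].append(x)
        # An empty cluster keeps its previous center.
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)

# Two well-separated 1-D clusters, around 0 and around 10.
data_rng = random.Random(1)
data = ([data_rng.gauss(0.0, 0.5) for _ in range(50)]
        + [data_rng.gauss(10.0, 0.5) for _ in range(50)])
centers = kmeans(data, 2)
```

EM replaces the hard nearest-center assignment with soft posterior responsibilities, which also lets each component learn its own variance and weight.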

Nicolas Bonneel and Julie Digne are junior CNRS researchers at the LIRIS lab; David Coeurjolly is a senior CNRS researcher at LIRIS.