TSKS18 Generative AI for Data Compression and Transmission
Generative artificial intelligence (AI) tools have exploded in recent years, from generative adversarial networks and diffusion models producing lifelike images to large language models crafting compelling narratives, poetry, and code. Generative AI relies on probabilistic generative modeling, i.e., learning an underlying probability distribution from observed data. Generative modeling has broad applicability: not only can it generate new data, it also serves as a powerful design tool for smart systems involving data compression and transmission, enabling the learning of compact latent representations for efficient coding and robust reconstruction.
This course provides an introduction to the principles of generative models, their applications in data compression and transmission, and their ethical implications. Through a combination of conceptual lectures and hands-on assignments, students will acquire the skills to design, implement, and critically evaluate generative modeling solutions for data compression and transmission in an effective and responsible manner.

Course content
- Introduction to probabilistic generative modeling;
- Generative models (including variational autoencoders, generative adversarial networks, diffusion models, flow-based models, energy-based models, transformers);
- Frameworks and techniques for generative-AI-based data compression and transmission, including neural compression, rate-distortion-perception tradeoff, latent coding, joint source-channel coding;
- Privacy, security, and ethical considerations of generative AI.
Course literature
- [Tomczak’24] Tomczak, Jakub M. (2024), Deep Generative Modeling, 2nd ed. Cham: Springer. ISBN: 9783031640872
- [YMT’23] Yang, Yibo, Mandt, Stephan, and Theis, Lucas (2023), An Introduction to Neural Data Compression, Foundations and Trends in Computer Graphics and Vision, vol. 15, no. 2, pp. 113-200
- Articles:
- Dai, Jincheng, Qin, Xiaoqi, Wang, Sixian, Xu, Lexi, Niu, Kai, and Zhang, Ping, “Deep Generative Modeling Reshapes Compression and Transmission: From Efficiency to Resiliency,” IEEE Wireless Communications, vol. 31, no. 4, pp. 48-56, August 2024
- Blau, Yochai and Michaeli, Tomer, “Rethinking Lossy Compression: The Rate-Distortion-Perception Tradeoff,” ICML, pp. 675-685, 2019
- Bourtsoulatze, E., Burth Kurka, D., and Gunduz, D., “Deep Joint Source-Channel Coding for Wireless Image Transmission,” IEEE Transactions on Cognitive Communications and Networking, vol. 5, no. 3, pp. 567-579, Sept. 2019
Intended learning outcomes
After completing the course, the student should be able to:
- explain the theoretical foundations of deep generative modeling, including methods for modeling a density function, network architectures, loss functions, and training routines;
- implement and train generative AI models to generate new data, and apply these models in a data compression and transmission system;
- analyze and evaluate the effectiveness of a solution based on generative modeling for data compression and transmission;
- reflect on implications and ethical considerations of using generative AI tools.
Instructors
- Course director and lecturer: Khac-Hoang Ngo
- Teaching assistant: [TBD]
Prerequisites
- Calculus (e.g., differentiation and integration), linear algebra (e.g., matrix operations), and probability (e.g., random variables, central limit theorem)
- A first course in machine learning
- Python programming skills, e.g., to train a machine learning model
Information for enrolled students
For detailed lecture, tutorial, and lab plans, see the course room at «https://lisam.liu.se/».