TSKS38 Distributed Information Processing and Machine Learning
Introduction
TSKS38 provides a broad introduction to the fundamental theories and methods of distributed information processing and machine learning in networked systems. The course focuses on distributed optimization techniques and consensus protocols, which form the foundation of modern distributed and collaborative learning frameworks such as federated learning. Students will gain a solid understanding of both the theoretical principles and practical implementation aspects of distributed optimization and learning algorithms, as well as how communication resource constraints affect overall system performance.

The course is given in English and the detailed course information is therefore in English.
Course Literature
Books (available as e-books through the LiU library):
- Gauri Joshi, Optimization Algorithms for Distributed Machine Learning, 2023.
- Osvaldo Simeone, Machine Learning for Engineers, 2023.
Articles (available in the Lisam course room as “suggested reading”):
- E. G. Larsson and N. Michelusi, “Unified analysis of decentralized gradient descent: A contraction mapping framework”, IEEE Open Journal of Signal Processing, 2025.
- R. Olfati-Saber, J. A. Fax, and R. M. Murray, “Consensus and cooperation in networked multi-agent systems”, Proceedings of the IEEE, 2007.
- L. Bottou, F. E. Curtis, and J. Nocedal, “Optimization methods for large-scale machine learning”, SIAM Review, 2018.
Lecture Content
- Basics of graphs and spectral graph theory
- Basics of communication networks
- Introduction to machine learning
- Distributed consensus over time-invariant and time-varying networks (see the consensus sketch below this list)
- Distributed consensus with noisy communication and link failures
- Optimization for machine learning: gradient descent (GD) and stochastic gradient descent (SGD)
- Distributed synchronous SGD and local-update SGD
- Federated learning with heterogeneous data, communication-efficient methods, and variance-reduction techniques
- Contraction mapping framework; decentralized GD with perfect and noisy communication (see the DGD sketch below this list)
- Privacy and security issues in distributed learning
- Modern topics in machine learning, ethics and sustainability in AI
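
As a small taste of the consensus topics above, the following Python sketch runs distributed average consensus over a fixed, connected, undirected graph. It is illustrative only, not course material; the example graph, the Metropolis weights, and all variable names are assumptions.

    # Minimal sketch of distributed average consensus (illustrative only;
    # the graph, weights, and variable names are assumptions, not course code).
    import numpy as np

    # Undirected, connected example graph on 4 nodes (adjacency list).
    neighbors = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
    n = len(neighbors)

    # Metropolis weights give a symmetric, doubly stochastic mixing matrix,
    # which guarantees convergence to the average on a connected graph.
    W = np.zeros((n, n))
    for i, nbrs in neighbors.items():
        for j in nbrs:
            W[i, j] = 1.0 / (1 + max(len(nbrs), len(neighbors[j])))
        W[i, i] = 1.0 - W[i].sum()

    x = np.array([1.0, 2.0, 3.0, 4.0])  # one initial value per node
    for _ in range(100):
        x = W @ x  # one synchronous step: each node averages with neighbors

    print(x)         # every entry approaches the network average 2.5
    print(x.mean())  # double stochasticity preserves the average throughout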
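
Building on the same mixing idea, the next sketch combines neighbor averaging with local gradient steps into decentralized gradient descent (DGD) on toy scalar quadratics. Again a hedged illustration: the ring graph, step size, and local objectives are assumptions, not course code.

    # Minimal sketch of decentralized gradient descent (DGD), illustrative only.
    import numpy as np

    # Node i holds the local objective f_i(x) = 0.5 * (x - b[i])**2;
    # the network jointly minimizes their sum, minimized at b.mean() = 2.5.
    b = np.array([1.0, 2.0, 3.0, 4.0])

    # Symmetric, doubly stochastic mixing matrix for a 4-node ring:
    # each node averages equally with itself and its two neighbors.
    W = np.array([
        [1/3, 1/3, 0.0, 1/3],
        [1/3, 1/3, 1/3, 0.0],
        [0.0, 1/3, 1/3, 1/3],
        [1/3, 0.0, 1/3, 1/3],
    ])

    alpha = 0.1      # constant step size
    x = np.zeros(4)  # each node's local iterate
    for _ in range(500):
        grad = x - b              # local gradient of f_i at the local iterate
        x = W @ x - alpha * grad  # DGD step: mix with neighbors, then descend

    print(x)  # all iterates land near 2.5; a constant step size leaves a
              # small residual consensus error around the optimum

With a constant step size, DGD converges only to a neighborhood of the optimum; quantifying this bias, also under noisy communication, is part of the contraction mapping framework covered in the lectures.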
Instructors
- Course director and lecturer: Zheng Chen
Prerequisites
- Linear algebra, probability theory, and general mathematical maturity.
- Programming skills.
Information for enrolled students
For detailed lecture, tutorial, and lab plans, see the course room in Lisam.