Leibniz University
Overview
- Semester: Winter (except WS 24/25)
- ECTS: 5
- Level: Master
Description
Language
The course will be held in English.
Students acquire both the theoretical and practical foundations of interpretable machine learning (iML). To this end, they internalize the mathematical basics and learn to implement, run, and evaluate iML approaches. In a final project, students independently apply the learned concepts to a new problem.
Recommended prerequisites
- Artificial Intelligence
- Machine Learning
- Deep Learning
Lecturer
Prof. Dr. rer. nat. Marius Lindauer
Address
Welfengarten 1
30167 Hannover
Topics
- GAMs and Rule-based Approaches
- Feature Effects
- Local Explanations
- Shapley Values for Explainability
- Instance-wise Feature Selection
- Gradient-based Feature Attribution
- Actionable Explanation and Resources
- Evaluating Interpretability and Utility
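To give a flavor of the topics above, the following sketch computes exact Shapley values for a toy model by averaging each feature's marginal contribution over all coalitions. The model, input, and baseline are hypothetical illustrations, not course material; the course covers the method in full mathematical detail.

```python
from itertools import combinations
from math import factorial

# Hypothetical toy model: a simple additive function of three features.
def model(x):
    return 2.0 * x[0] + 1.0 * x[1] - 0.5 * x[2]

def shapley_values(model, x, baseline):
    """Exact Shapley values: each feature's weighted average marginal
    contribution over all coalitions; absent features take the baseline."""
    n = len(x)
    features = list(range(n))
    values = [0.0] * n

    def v(coalition):
        # Features in the coalition keep their actual value, the rest the baseline.
        z = [x[i] if i in coalition else baseline[i] for i in features]
        return model(z)

    for i in features:
        others = [j for j in features if j != i]
        for size in range(n):
            for S in combinations(others, size):
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                values[i] += weight * (v(set(S) | {i}) - v(set(S)))
    return values

phi = shapley_values(model, x=[1.0, 2.0, 3.0], baseline=[0.0, 0.0, 0.0])
# For an additive model, each Shapley value equals that feature's own term,
# and the values sum to model(x) - model(baseline) (efficiency property).
```

The exact computation enumerates all 2^n coalitions, which is why the lecture also treats sampling-based approximations for realistic feature counts.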
Literature
The full list of literature can be found on the Stud.IP course page. Recommended books to accompany the lecture are:
- Molnar, C. Interpretable Machine Learning: A Guide for Making Black Box Models Explainable.
- Samek, W., Montavon, G., Vedaldi, A., Hansen, L. K., & Müller, K. R. (Eds.). (2019). Explainable AI: Interpreting, Explaining and Visualizing Deep Learning (Vol. 11700). Springer Nature.