Course details


Deep Representation Learning

WS 2023, Ulf Krumnack and N.N. (to be announced), OFFLINE
B.Sc. modules:
CS-BWP-NI - Neuroinformatics
KOGW-WPM-NI - Neuroinformatics
M.Sc. modules:
CC-MWP-NI - Neuroinformatics
CS-MWP-NI - Neuroinformatics

CS-BW - Bachelor elective course
CS-MW - Master elective course
Tue: 10-12

The remarkable success of deep neural networks, as well as some of their failure cases, is often attributed to the hierarchical and versatile representations they acquire during training. Despite practical demonstrations of their effectiveness, questions of how to analyze, evaluate, and improve such representations are still the object of active research. In this seminar, we approach such questions, including the comparison of representations across different networks, methods for enforcing more "well-behaved," interpretable, and reusable representations, and learning task-specific versus universal representations. The seminar will address these topics on both a theoretical and a practical level.

To fully benefit from this course, the ideal participant will have some background in deep learning, e.g., from the "ANNs with Tensorflow" course, and some experience with at least one of the current deep learning frameworks (e.g., TensorFlow or PyTorch). Additionally, familiarity with basic concepts from linear algebra, statistics, and information theory may be useful. By combining theoretical and practical approaches, this seminar aims to provide participants with a comprehensive understanding of the analysis, evaluation, and improvement of deep neural network representations.
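As a small taste of one seminar topic, comparing representations across networks: a common tool for this is linear Centered Kernel Alignment (CKA). The sketch below is illustrative only (the data and function name are made up for the example) and uses NumPy:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between two representation
    matrices (rows = examples, columns = features/units)."""
    # Center each feature dimension over the examples.
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # CKA = ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    num = np.linalg.norm(Y.T @ X, ord='fro') ** 2
    den = np.linalg.norm(X.T @ X, ord='fro') * np.linalg.norm(Y.T @ Y, ord='fro')
    return num / den

# Toy "activations": CKA is 1.0 for identical representations,
# high for linearly related ones, and low for unrelated ones.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 64))      # activations of one layer
Y = X @ rng.normal(size=(64, 32))   # a linear transform of X
Z = rng.normal(size=(100, 32))      # unrelated activations

print(linear_cka(X, X))  # -> 1.0
print(linear_cka(X, Y))  # high similarity
print(linear_cka(X, Z))  # low similarity
```

CKA is invariant to orthogonal transformations and isotropic scaling of the representations, which makes it suitable for comparing layers of different widths across networks.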