March 28-29, 2023 | Alte Aula, Tübingen, Germany
Deep learning has enabled major advances in machine learning. However, deploying deep learning models in settings that are safety-critical or that affect society requires their decision-making to be explainable; this is fundamental for building trustworthy, user-oriented machine learning models. The aim of this workshop is to raise awareness of explainability in machine learning, a topic of growing interest. Furthermore, we aim to encourage interdisciplinary interaction and collaboration between researchers from the University of Tübingen and other international institutions who work on different aspects of explainability, in particular in the context of computer vision.
Our workshop will consider research on both approaches to explainability and applications of explainability. Topics of interest include (but are not limited to):
The workshop will feature keynote talks by well-known researchers in the field, as well as invited talks and spotlight presentations on recent advances in explainability.
| Time | March 28th | Time | March 29th |
|---|---|---|---|
| 9:00 - 9:20 | Opening remarks | 9:00 - 9:50 | Invited talk: Bernt Schiele, "Interpretability for Deep Learning in Computer Vision" |
| 9:20 - 10:10 | Invited talk: Wojciech Samek, "Concept-Level Explainable AI" | 9:50 - 10:15 | Spotlight: Moritz Böhle |
| 10:10 - 10:40 | Coffee break | 10:15 - 10:40 | Coffee break |
| 10:40 - 11:05 | Spotlight: Diego Marcos | 10:40 - 11:05 | Spotlight: Leon Sixt |
| 11:05 - 11:30 | Spotlight: Maximilian Augustin | 11:05 - 11:30 | Spotlight: Seong Joon Oh |
| 11:30 - 12:20 | Invited talk: Ulrike von Luxburg and Sebastian Bordt, "Explanation and Regulation" | 11:30 - 12:20 | Invited talk: Mara Graziani, "Concept discovery and Dataset exploration with Singular Value Decomposition" |
| 12:20 - 13:20 | Lunch break | 12:20 - 13:20 | Lunch break |
| 13:20 - 13:45 | Spotlight: Stephan Alaniz | 13:20 - 13:45 | Spotlight: Sara Blanco |
| 13:45 - 14:35 | Invited talk: Ruth Fong, "Directions in Interpretability" | 13:45 - 14:10 | Spotlight: Julius von Kügelgen |
| 14:35 - 15:05 | Coffee break | 14:10 - 15:00 | Invited talk: Mauricio Reyes, "Interpretability of Deep Learning Medical Image Computing Technologies: Insights, Challenges and Opportunities" |
| 15:05 - 15:30 | Spotlight: Katrin Renz | 15:00 - 15:15 | Closing remarks |
| 15:30 - 15:55 | Spotlight: Roland Zimmermann | | |
| 16:00 - 16:50 | Invited talk (hybrid): Trevor Darrell and Lisa Dunlap, "Moving from Explainable models to Advisable models" | | |
| 17:00 - 18:00 | Panel discussion | | |
The workshop will take place at the Alte Aula in Tübingen on 28-29 March 2023. The Alte Aula is the university's old auditorium, a historic building whose construction dates back to the 16th century.
Participants will also have the chance to explore the old university town of Tübingen. The city combines a medieval atmosphere with the vibrant life of a cosmopolitan student town. You can get lost in the narrow alleys lined with timber-framed houses that lead up to the 500-year-old castle, take a trip on one of the famous "Stocherkahn" punting boats, or enjoy the tasty Swabian cuisine. For more information on the city, visit www.tuebingen.de.
There is no registration fee; however, the venue has limited capacity. If you would like to attend, please register by February 28th using this form.
Sorry, registration is now closed!
The workshop can offer limited childcare grants. Please indicate if you require childcare support when registering your interest to attend.
Do you have any questions? Please send an email to eml-workshop@inf.uni-tuebingen.de.
Workshop funded by the