Two recent interviews about generative and human-centered AI

The Lower Saxony press (Nordwest Zeitung/NWZ) and Oldenburg’s local Mox Journal have published two interviews with Prof. Sonntag on generative and human-centered AI. As artificial intelligence becomes an ever more widely discussed topic, the University of Oldenburg is actively conducting research in this area. In these interviews, Prof. Sonntag elaborates on his research group’s Read more…

Comprehensive Evaluation of Feature Attribution Methods in Explainable AI via Input Perturbation

Explainable AI (XAI) has demonstrated its potential for deciphering the discriminative features behind machine learning (ML) decisions. Specifically, XAI’s feature attribution methods shed light on individual decisions made by ML models. However, despite their visual appeal, these attributions can be unfaithful. To ensure the faithfulness of feature attributions, it is Read more…
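
The faithfulness question can be made concrete with a deletion-style perturbation test: remove the input features an attribution ranks as most important and observe how quickly the model’s prediction degrades. The sketch below only illustrates that general principle; the toy linear model, gradient-times-input attributions, zero-baseline deletion, and the average-score summary are assumptions made for illustration, not the evaluation protocol of the paper.

```python
"""Minimal sketch of a perturbation-based faithfulness check for feature
attributions. The linear toy model, gradient-x-input attributions, and the
deletion curve are illustrative assumptions, not the paper's protocol."""
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": a fixed linear scorer followed by a sigmoid.
weights = rng.normal(size=20)

def model(x: np.ndarray) -> float:
    return 1.0 / (1.0 + np.exp(-weights @ x))

def attribute(x: np.ndarray) -> np.ndarray:
    # Gradient x input; for a linear scorer this is simply w_i * x_i.
    return weights * x

def deletion_curve(x: np.ndarray, steps: int = 20) -> list[float]:
    """Zero out features in decreasing order of attribution and record the
    model's output after each step. A faithful attribution should make the
    score drop quickly."""
    order = np.argsort(-np.abs(attribute(x)))  # most important features first
    perturbed = x.copy()
    scores = [model(perturbed)]
    for i in order[:steps]:
        perturbed[i] = 0.0  # "delete" the feature (zero baseline)
        scores.append(model(perturbed))
    return scores

x = rng.normal(size=20)
curve = deletion_curve(x)
# Average score along the deletion curve: lower suggests a more faithful attribution.
print("mean score under deletion:", float(np.mean(curve)))
```

If the feature ranking were shuffled at random, the deletion curve would typically fall much more slowly; that contrast between informed and random perturbation is exactly what a faithfulness metric is meant to capture.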

Interpretable and Interactive Disease Diagnosis Using Collaborative Learning of Segmentation and Classification

Medical image analysis encompasses two crucial research areas: disease grading and fine-grained lesion segmentation. Although disease grading often relies on fine-grained lesion segmentation, the two tasks are usually studied separately. Disease severity grading can be approached as a classification problem, using image-level annotations to determine the severity of a medical condition. On Read more…
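
The idea that grading and lesion segmentation can inform each other is commonly realised as a multi-task network: a shared encoder feeds both a pixel-wise segmentation head and an image-level grading head, and the two losses are optimised jointly. The sketch below illustrates only that general pattern; the tiny convolutional encoder, the equal loss weighting, and the class counts are assumptions for illustration, not the architecture proposed in the paper.

```python
"""Minimal sketch of joint segmentation and classification with a shared
encoder. Architecture and loss weighting are illustrative assumptions."""
import torch
import torch.nn as nn

class JointSegGradeNet(nn.Module):
    def __init__(self, num_lesion_classes: int = 2, num_grades: int = 5):
        super().__init__()
        # Shared convolutional encoder used by both tasks.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        # Segmentation head: per-pixel lesion labels.
        self.seg_head = nn.Conv2d(32, num_lesion_classes, 1)
        # Classification head: image-level severity grade.
        self.cls_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_grades)
        )

    def forward(self, x):
        features = self.encoder(x)
        return self.seg_head(features), self.cls_head(features)

model = JointSegGradeNet()
images = torch.randn(4, 3, 64, 64)               # batch of RGB images
lesion_masks = torch.randint(0, 2, (4, 64, 64))  # per-pixel annotations
grades = torch.randint(0, 5, (4,))               # image-level annotations

seg_logits, grade_logits = model(images)
# Joint objective; equal weighting of the two losses is an arbitrary choice here.
loss = nn.functional.cross_entropy(seg_logits, lesion_masks) \
     + nn.functional.cross_entropy(grade_logits, grades)
loss.backward()
print(f"joint loss: {loss.item():.3f}")
```

Sharing the encoder lets the pixel-level supervision from lesion masks shape the same features the grading head relies on, which is one way the two tasks can support each other during training.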