DFKI - Interactive Machine Learning Lab


We achieved a new milestone in our EIT project ERICS

Published by Max Biwersi on December 15, 2018

The chatbot Eike, which is based on our multimodal multisensor platform, was launched at handbookgermany.de.



Related Posts

Multimodality

EyeLogin – Calibration-free Authentication Method For Public Displays Using Eye Gaze

The use of interactive public displays has increased, and with it the number of sensitive applications and, hence, the demand for user authentication methods. We implement a calibration-free authentication method for situated displays based on saccadic …

Machine Learning

Visual Search Target Inference for Pro-active User Support

Visual search is a perceptual task in which humans aim to identify a search target object, such as a traffic sign, among other objects. …

Multimodality

Digital Pens in Education

Digital pen signals have been shown to be predictive of cognitive states, cognitive load, and emotion in educational settings. We investigate whether low-level pen-based features can predict the difficulty of tasks in a cognitive test and …
