Inferring Emotional Models from Human-Machine Speech Interactions
Venue & metadata
- Journal/Proceedings: Procedia Computer Science
- Volume: 225
- Pages: 1241–1250
- Note: Cited by 1; Gold Open Access
- Author keywords: Emotional State Inference; Human Digital Twin; Human Machine Interaction; Process Mining; Speech Analysis
Abstract
Human-Machine Interfaces (HMIs) are increasingly important in a hyper-connected society. Traditional HMIs are designed around cognitive features, while emotional ones are often neglected, which can lead to misuse of such interfaces. As part of a long-term research effort aimed at defining an HMI engineering approach, this paper proposes a concrete method to build an emotion-aware explicit model of the user, starting from the behaviour of the human interacting with a virtual agent. The paper also instantiates this model-inference process for voice assistants in an automatic depression detection context, where it can constitute the core phase in realizing a Human Digital Twin of a patient. The case study produced a model composed of Fluid Stochastic Petri Net sub-models, obtained after data analysis with a Support Vector Machine. © 2023 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0).
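As a rough illustration of the analysis step mentioned in the abstract (not the authors' implementation), the sketch below trains a Support Vector Machine on placeholder per-utterance speech features to classify an emotional state; the feature layout, labels, and data are assumptions made purely for the example.

```python
# Hedged sketch (not the paper's code): an SVM classifying per-utterance speech
# features into emotional-state labels, mirroring the kind of analysis the
# abstract describes. Features, labels, and data are illustrative placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder feature matrix: one row per utterance; columns could stand for
# prosodic/spectral statistics (e.g., pitch mean, energy variance, MFCC means).
X = rng.normal(size=(200, 8))
# Placeholder binary labels for the inferred emotional state.
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# RBF-kernel SVM with feature standardization, a common default for this task.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

# In the paper's pipeline, per-utterance predictions like these would feed the
# Fluid Stochastic Petri Net sub-models that compose the user's emotional model.
```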
Keywords
Behavioral research; Emotion Recognition; Man machine systems; Petri nets; Stochastic models; Stochastic systems; Virtual reality; Emotional models; Emotional state; Emotional state inference; Human digital twin; Human machine interaction; Human Machine Interface; Human-machine; Longest run; Process mining; Speech interaction; Support vector machines
Suggested citation
Campanile, L., de Fazio, R., Di Giovanni, M., Marrone, S., Marulli, F., & Verde, L. (2023). Inferring Emotional Models from Human-Machine Speech Interactions [Conference paper]. Procedia Computer Science, 225, 1241–1250. https://doi.org/10.1016/j.procs.2023.10.112