Book Chapter

Sentic Maxine: Multimodal affective fusion and emotional paths

Details

Citation

Hupont I, Cambria E, Cerezo E, Hussain A & Baldassarri S (2012) Sentic Maxine: Multimodal affective fusion and emotional paths. In: Wang J, Yen G & Polycarpou M (eds.) Advances in Neural Networks – ISNN 2012: 9th International Symposium on Neural Networks, Shenyang, China, July 11-14, 2012. Proceedings, Part II. Lecture Notes in Computer Science, 7368. Berlin Heidelberg: Springer, pp. 555-565. http://link.springer.com/chapter/10.1007/978-3-642-31362-2_61

Abstract
The capability of perceiving and expressing emotions through different modalities is a key issue for the enhancement of human-agent interaction. In this paper, an architecture for the development of intelligent multimodal affective interfaces is presented. It is based on the integration of Sentic Computing, a new opinion mining and sentiment analysis paradigm based on AI and Semantic Web techniques, with a facial emotional classifier and Maxine, a powerful multimodal animation engine for managing virtual agents and 3D scenarios. One of the main distinguishing features of the system is that it does not simply perform emotional classification in terms of a set of discrete emotional labels, but operates in a novel continuous 2D emotional space, enabling the output of a continuous emotional path that characterizes the user's affective progress over time. Another key feature is the proposed fusion methodology, which can fuse any number of unimodal categorical modules, with very different time-scales, output labels and recognition success rates, in a simple and scalable way.
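The abstract gives no implementation details, but the general idea it describes, projecting the categorical outputs of heterogeneous unimodal classifiers onto a shared 2D (valence-arousal) emotional space, fusing them by confidence, and accumulating the fused points into an emotional path over time, can be sketched roughly as follows. This is a minimal illustrative sketch under stated assumptions: the label coordinates, confidence weighting scheme, and all names below are hypothetical and are not taken from the paper.

```python
# Illustrative sketch (not the paper's implementation): categorical outputs from
# several unimodal classifiers are projected onto a 2D valence-arousal plane and
# fused by confidence weighting; successive fused points form an "emotional path".
from dataclasses import dataclass

# Hypothetical label-to-coordinate mapping; the values are placeholders, not the
# coordinates used by the authors.
LABEL_TO_VA = {
    "joy":     ( 0.8,  0.5),
    "anger":   (-0.6,  0.7),
    "sadness": (-0.7, -0.4),
    "neutral": ( 0.0,  0.0),
}

@dataclass
class UnimodalOutput:
    label: str         # categorical emotion label from one modality
    confidence: float  # classifier confidence in [0, 1], used as fusion weight

def fuse(outputs: list[UnimodalOutput]) -> tuple[float, float]:
    """Confidence-weighted average of the modalities' valence-arousal points."""
    total = sum(o.confidence for o in outputs) or 1.0
    valence = sum(LABEL_TO_VA[o.label][0] * o.confidence for o in outputs) / total
    arousal = sum(LABEL_TO_VA[o.label][1] * o.confidence for o in outputs) / total
    return valence, arousal

# An emotional path: fused points accumulated over successive time windows.
path = []
for window_outputs in [
    [UnimodalOutput("neutral", 0.9), UnimodalOutput("joy", 0.4)],
    [UnimodalOutput("joy", 0.8), UnimodalOutput("joy", 0.7)],
]:
    path.append(fuse(window_outputs))
print(path)
```

Because each unimodal module contributes only a label and a confidence, a scheme of this kind can in principle combine modules with different time-scales and label sets, which is the scalability property the abstract highlights.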

Keywords
Sentic computing; Facial expression analysis; Sentiment analysis; Multimodal fusion; Embodied agents

Status: Published
Title of series: Lecture Notes in Computer Science
Number in series: 7368
Publication date: 31/12/2012
Publisher: Springer
Publisher URL: http://link.springer.com/…-642-31362-2_61#
Place of publication: Berlin Heidelberg
ISSN of series: 0302-9743
ISBN: 978-3-642-31361-5