Modeling Dyadic and Group Impressions with Intermodal and Interperson Features

S. Okada, L. Nguyen, O. Aran, D. Gatica-Perez

Published 2019 in ACM Trans. Multim. Comput. Commun. Appl.

ABSTRACT

This article proposes a novel feature-extraction framework for inferring impressions of personality traits, emergent leadership skills, communicative competence, and hiring decisions. The framework extracts multimodal features describing each participant's nonverbal activity, capturing intermodal and interperson relationships in the interaction: how the target interactor generates nonverbal behavior while other interactors are also generating nonverbal behavior. Intermodal and interperson patterns are identified as frequently co-occurring events, found by clustering the multimodal sequences. The framework is applied to the SONVB corpus, an audiovisual dataset of dyadic job interviews, and to the ELEA corpus, an audiovisual dataset of group meetings. We evaluate the framework on binary classification tasks covering 15 impression variables from the two corpora. The experimental results show that models trained with the co-occurrence features are more accurate than previous models for 14 of the 15 traits.
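The co-occurrence idea described above can be illustrated with a minimal sketch: each interactor's continuous nonverbal signal is discretized into states (standing in for the paper's clustering step), and features count how often state pairs co-occur frame-by-frame across two interactors. All function names, the equal-width binning, and the example signals are illustrative assumptions, not the authors' actual pipeline.

```python
from itertools import product

def quantize(seq, k=2):
    """Discretize a 1-D signal into k levels by equal-width binning
    (an illustrative stand-in for the paper's clustering step)."""
    lo, hi = min(seq), max(seq)
    if hi == lo:
        return [0] * len(seq)
    width = (hi - lo) / k
    return [min(int((v - lo) / width), k - 1) for v in seq]

def cooccurrence_features(seq_a, seq_b, k=2):
    """Count how often each discrete state pair (i, j) co-occurs
    frame-by-frame between two interactors, normalized by length."""
    qa, qb = quantize(seq_a, k), quantize(seq_b, k)
    counts = {pair: 0 for pair in product(range(k), range(k))}
    for sa, sb in zip(qa, qb):
        counts[(sa, sb)] += 1
    n = len(qa)
    return {pair: c / n for pair, c in counts.items()}

# Hypothetical per-frame speaking-energy tracks for two interactors
interviewer = [0.1, 0.9, 0.8, 0.1, 0.2, 0.9]
candidate   = [0.9, 0.1, 0.2, 0.8, 0.9, 0.1]
feats = cooccurrence_features(interviewer, candidate)
```

Here `feats[(1, 0)]` would capture, for example, the fraction of frames in which the interviewer is in a high-activity state while the candidate is in a low-activity state, i.e., an interperson turn-taking pattern expressed as a fixed-length feature vector for a classifier.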

PUBLICATION RECORD

  • Venue

    ACM Trans. Multim. Comput. Commun. Appl.

  • Publication date

    2019-01-24

  • Fields of study

    Computer Science, Psychology


  • Source metadata

    Semantic Scholar

