Automatic Communicative Engagement Measurement and Conversational States Detection Using Visual Movement Signals

Jingguang Han 1 and Nick Campbell 2
1. Accenture Technology Labs, Dublin, Ireland
2. School of Computer Science and Statistics, Trinity College Dublin, University of Dublin, Dublin, Ireland
Abstract—Communicative dynamics have received increasing attention from researchers in the field of social signal processing. These cognitive dynamics can dramatically enhance the experience and smoothness of both human-to-human and human-to-machine interactions. Communicative engagement [1] plays an important part among these dynamics. Promising detection and measurement results for communicative engagement have been reported with quantitative models using acoustic and linguistic signals. To the best of our knowledge, however, there is no automated system that utilizes visual signals for communicative engagement detection and measurement, nor is there any clear quantitative model. To bridge this gap, this paper presents a novel method that uses multi-dimensional visual signals to automatically detect and measure communicative engagement in multi-party conversations, together with a machine learning approach to automatically predict the conversational states of participants: speaker vs. most engaged listener. We also present a multi-modal audio-video corpus designed and recorded by one of the authors with multiple microphones and one 360-degree video camera, capturing four people in natural, spontaneous social conversations over three days. A face detection and movement measurement system based on the Viola-Jones algorithm and a color differentiation algorithm was developed for quantitative analysis of the visual movement signals. We applied a series of statistical methods to the visual movement dataset to measure communicative engagement in multi-party conversations. The results were validated by repeating the same calculations with randomly tailored pseudo signals: the comparison shows a significantly stronger correlation of visual signals between participants engaged in the communication than between pseudo signals. The results also show a high probability (87.3%) that the pair of participants with the highest engagement coefficient contains the speaker.
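The movement-measurement and correlation steps described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the face region has already been located (in the paper, by a Viola-Jones detector such as OpenCV's CascadeClassifier), uses mean absolute frame difference as the per-participant movement signal, and uses Pearson correlation as the pairwise engagement coefficient; the function names are hypothetical.

```python
import numpy as np

def movement_signal(frames, box):
    """Mean absolute intensity change inside a face region between
    consecutive frames.

    `frames` is a sequence of grayscale frames (H x W uint8 arrays);
    `box` is an (x, y, w, h) face region, assumed here to be
    precomputed (e.g. by a Viola-Jones cascade detector).
    Returns a 1-D movement signal of length len(frames) - 1.
    """
    x, y, w, h = box
    crops = [f[y:y + h, x:x + w].astype(np.float64) for f in frames]
    return np.array([np.abs(b - a).mean() for a, b in zip(crops, crops[1:])])

def engagement_coefficient(sig_a, sig_b):
    """Pearson correlation of two participants' movement signals,
    used as a simple pairwise engagement coefficient."""
    return float(np.corrcoef(sig_a, sig_b)[0, 1])
```

Under this sketch, the validation step would compare `engagement_coefficient` values for real participant pairs against the same statistic computed on randomly shuffled (pseudo) signals.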
Furthermore, a support vector machine was trained on a 5-dimensional movement dataset and applied to predict the conversational states of the participants, distinguishing the most engaged listener from the speaker. Cross-validation shows a promising accuracy of 79.04%.
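The conversational-state classifier can be sketched in the same spirit. The abstract does not specify the kernel, features, or training setup of the original SVM, so the sketch below is purely illustrative: it stands in a minimal linear SVM trained with the Pegasos sub-gradient method on 5-dimensional feature vectors, with labels +1 (speaker) and -1 (most engaged listener).

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Minimal linear SVM trained with the Pegasos sub-gradient method.

    X is an (n, d) feature matrix (d = 5 movement features in this
    sketch); y holds labels +1 (speaker) / -1 (most engaged listener).
    Returns the weight vector and bias of the separating hyperplane.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)          # decaying learning rate
            margin = y[i] * (X[i] @ w + b)
            w *= (1.0 - eta * lam)         # regularization shrinkage
            if margin < 1:                 # hinge-loss sub-gradient step
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

def predict(w, b, X):
    """Classify each row of X by the sign of the decision function."""
    return np.where(X @ w + b >= 0, 1, -1)
```

In the paper's setup the reported 79.04% accuracy comes from cross-validation on the real movement dataset; evaluating this sketch would likewise mean holding out folds of the feature matrix and averaging `predict` accuracy over them.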

Index Terms—social signal processing, communicative engagement, image and video processing, correlation analysis, SVM

Cite: Jingguang Han and Nick Campbell, "Automatic Communicative Engagement Measurement and Conversational States Detection Using Visual Movement Signals," International Journal of Signal Processing Systems, Vol. 5, No. 1, pp. 44-48, March 2017. doi: 10.18178/ijsps.5.1.44-48