An Application Framework for Implicit, Human-Centered Sentiment Tagging Using Attributed Affect
In this paper, a novel framework for implicit sentiment image tagging and retrieval is presented, based on the concept of attributed affect. The user’s affective response is recorded and analyzed to produce an appropriate affective label, while eye gaze is monitored to identify the specific object in the scene that is attributed as the cause of the user’s current state of core affect. This procedure enables both automatic tagging of content and retrieval based on personal preferences. Our experiments show that the framework successfully channels behavioral tags (in the form of affective labels) into the data tagging and retrieval loop, even when applied with a cost-efficient, widely available hardware setup that uses a single low-resolution webcam mounted on a standard modern computer system.
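The core tagging step described above — pairing a gaze-identified object with the inferred affective label — can be sketched minimally as follows. This is an illustrative sketch only, under the assumption that object regions are available as bounding boxes and gaze is reduced to a fixation point; all names (`BoundingBox`, `attribute_affect`) are hypothetical and not from the paper.

```python
from dataclasses import dataclass

@dataclass
class BoundingBox:
    """Axis-aligned object region in image coordinates (hypothetical helper)."""
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        # True if the point (px, py) falls inside this box.
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def attribute_affect(gaze_point, objects, affective_label):
    """Attach the affective label to the object the user was fixating on.

    gaze_point: (x, y) fixation estimated from the webcam gaze tracker.
    objects: mapping of object name -> BoundingBox from scene annotation.
    affective_label: label inferred from the user's affective response.
    """
    px, py = gaze_point
    for name, box in objects.items():
        if box.contains(px, py):
            return {"object": name, "affect": affective_label}
    # No fixated object found: keep the affect but attribute it to nothing.
    return {"object": None, "affect": affective_label}

# Example usage with two annotated scene objects.
objects = {
    "dog": BoundingBox(10, 10, 100, 80),
    "tree": BoundingBox(150, 0, 60, 200),
}
tag = attribute_affect((40, 50), objects, "pleasant")
# tag == {"object": "dog", "affect": "pleasant"}
```

The resulting tag pairs a content item (the attributed object) with a behavioral label, which is the form in which tags enter the retrieval loop.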