In the vision community, recognition and analysis of facial expressions is a long-investigated area of research with potential applications such as human-computer interaction (HCI). Most state-of-the-art techniques focus on recognizing a fixed set of facial expression categories, i.e., the basic emotions. In HCI, however, the classical basic emotions occur only sparsely, so they are inadequate for guiding the dialog with the user. Here we instead suggest a mapping to the so-called circumplex model of affect, known from psychology. This enables us to determine the current affective state of the user, which can then be used to control the course of the interaction. In particular, the output of the proposed machine-vision-based recognition method gives insight into the observed person's arousal and valence along the two axes of the 2-D plane of the underlying circumplex model. In this paper we describe the basic principles of our new method and report first results.
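To make the idea concrete, a mapping from recognized expression categories to the 2-D valence/arousal plane of the circumplex model could be sketched as follows. Note that the coordinates assigned to each emotion and the probability-weighted averaging are illustrative assumptions, not the method proposed in the paper:

```python
# Hypothetical placement of basic emotion categories on the circumplex
# model of affect: valence (x-axis) and arousal (y-axis), each in [-1, 1].
# These coordinates are illustrative assumptions, not values from the paper.
EMOTION_COORDS = {
    "happiness": (0.8, 0.5),
    "surprise":  (0.3, 0.8),
    "anger":     (-0.6, 0.7),
    "fear":      (-0.7, 0.6),
    "sadness":   (-0.7, -0.4),
    "disgust":   (-0.6, 0.2),
    "neutral":   (0.0, 0.0),
}

def affective_state(emotion_probs):
    """Map a distribution over recognized emotion categories to a single
    (valence, arousal) point by probability-weighted averaging."""
    valence = sum(p * EMOTION_COORDS[e][0] for e, p in emotion_probs.items())
    arousal = sum(p * EMOTION_COORDS[e][1] for e, p in emotion_probs.items())
    return valence, arousal

# Example: a classifier output that is mostly "happiness".
v, a = affective_state({"happiness": 0.7, "surprise": 0.2, "neutral": 0.1})
```

The resulting (valence, arousal) point can then be tracked over time and used to adapt the dialog, rather than reacting only to rare, clear-cut basic-emotion events.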