This paper outlines a system for non-intrusive estimation of a user’s affective state in the Circumplex Model by monitoring the user’s pupil diameter and facial expression, obtained from an EyeTech TM3 eye gaze tracker (EGT) and an RGB-D camera (KINECT), respectively. According to previous studies, the pupillary response can be used to recognize “sympathetic activation” and simultaneous “parasympathetic deactivation”, which correspond to affective arousal. Additionally, tracking the user’s facial muscle movements as he or she displays characteristic facial gestures yields indicators to estimate the affective valence. We propose to combine both types of information to map the affective state of the user to a region on the Circumplex Model. This paper outlines our initial implementation of such a combined system.
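The combination described above can be illustrated with a minimal sketch: given a valence estimate (from facial expression) and an arousal estimate (from pupil diameter), each normalized to [-1, 1], the pair selects a quadrant of the Circumplex Model. The function name, normalization, and quadrant labels here are illustrative assumptions, not the paper's actual mapping.

```python
def circumplex_region(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair, each in [-1, 1], to a
    Circumplex Model quadrant (hypothetical label set)."""
    # valence: facial-expression estimate (negative .. positive)
    # arousal: pupil-diameter estimate (deactivated .. activated)
    if arousal >= 0:
        return "excited/alert" if valence >= 0 else "tense/distressed"
    return "calm/content" if valence >= 0 else "bored/depressed"

# Example: positive valence from a smile, high arousal from pupil dilation
print(circumplex_region(0.6, 0.7))  # -> excited/alert
```

In practice the two signals would first be calibrated per user (e.g., baseline pupil diameter under neutral lighting) before such a mapping is applied.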