Microsoft Kinect-Emotion

Below is an extract from the thesis demonstrating the six basic emotional states a person can express. To the right is a set of experiments in which the Kinect was used as a facial node analyser while I mimicked each of the emotions.

The device picks up the different muscle movements of my face: the brow, eyes, nose, mouth and chin. The recorded data, shown below, were then exported to 3D software (Blender) and used to drive armatures attached to a 3D model, in order to analyse how the captured emotions translate into a virtual space.
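The pipeline above can be sketched in Python, Blender's own scripting language. The Kinect's face-tracking SDK exposes facial muscle movements as numeric Animation Units; the sketch below shows one hypothetical way such per-frame values could be converted into bone rotations for an armature. The AU labels, bone names, and scaling factors are illustrative assumptions, not the mapping used in the thesis; in Blender the resulting angles would then be keyframed onto pose bones via the `bpy` API.

```python
import math

# Hypothetical mapping from Kinect Animation Units (each roughly in [-1, 1])
# to a target armature bone and a rotation range in degrees.
# Both the bone names and the scale factors are illustrative assumptions.
AU_TO_BONE = {
    "jaw_lowerer":          ("chin",         30.0),
    "upper_lip_raiser":     ("mouth_upper",  20.0),
    "lip_stretcher":        ("mouth_corner", 25.0),
    "lip_corner_depressor": ("mouth_corner", -25.0),
    "outer_brow_raiser":    ("brow",        -15.0),
    "brow_lowerer":         ("brow",         15.0),
}

def au_frame_to_bone_rotations(au_values):
    """Convert one frame of AU readings into per-bone rotations in radians."""
    rotations = {}
    for au, value in au_values.items():
        bone, scale_deg = AU_TO_BONE[au]
        # Accumulate, so opposing AUs acting on the same bone
        # (e.g. brow raiser vs. brow lowerer) combine into one angle.
        rotations[bone] = rotations.get(bone, 0.0) + math.radians(scale_deg * value)
    return rotations

# Example frame resembling surprise: lowered jaw, raised brows.
frame = {"jaw_lowerer": 0.8, "outer_brow_raiser": 0.9}
print(au_frame_to_bone_rotations(frame))
```

Keeping the mapping as a plain data table like this makes it easy to retune the emotion-to-armature translation without touching the conversion logic.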