This document describes a system for real-time facial emotion recognition using a Kinect depth sensor and for mirroring the recognized emotions on a 3D virtual avatar in Second Life. The system tracks facial feature points with the Kinect to detect emotions such as happiness (smile), surprise, fear, anger, and sadness, and transfers each detected emotion to a Second Life avatar in real time. The goal is to help speech-impaired people communicate emotions through an avatar. The system was implemented in two phases: facial emotion recognition using the Kinect, and display of the corresponding emotions on an avatar in Second Life.
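As a rough illustration of the first phase, the sketch below maps Kinect-style facial animation-unit (AU) coefficients to an emotion label using simple thresholds. The AU names, thresholds, and the rule-based approach are assumptions made for illustration rather than the system's actual method; the resulting label is what would then be forwarded to the Second Life avatar.

```python
# Minimal sketch: rule-based emotion classification from Kinect-style
# animation-unit (AU) coefficients, assumed to lie roughly in [-1, 1].
# AU names and thresholds are illustrative assumptions, not the paper's values.

def classify_emotion(au: dict) -> str:
    """Map animation-unit coefficients to one of the emotion labels."""
    if au.get("jaw_lower", 0) > 0.5 and au.get("outer_brow_raiser", 0) > 0.3:
        return "surprise"
    if au.get("lip_stretcher", 0) > 0.3 and au.get("lip_corner_depressor", 0) < 0.0:
        return "smile"
    if au.get("brow_lowerer", 0) > 0.4:
        return "anger"
    if au.get("lip_corner_depressor", 0) > 0.4:
        return "sadness"
    if au.get("outer_brow_raiser", 0) > 0.5 and au.get("jaw_lower", 0) > 0.2:
        return "fear"
    return "neutral"

if __name__ == "__main__":
    # Example frame: raised outer brows and a lowered jaw suggest surprise.
    sample_frame = {"jaw_lower": 0.6, "outer_brow_raiser": 0.5}
    print(classify_emotion(sample_frame))  # -> "surprise", then sent to the avatar
```

In practice, the per-frame label would be passed to the second phase, which triggers the matching expression or gesture on the Second Life avatar.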