This chapter discusses the use of human emotion intensities to track a student's emotional engagement in class as part of behavioral biometrics. Behavioral biometrics deals with patterns of human activity and encompasses physical and emotional traits that can be used to recognize behavior. In a classroom, student distraction is one such behavioral factor, varying in both magnitude and duration. Student engagement is frequently viewed as a key lever for addressing low performance, apathy, disengagement, and high failure rates. Emotions are intertwined with our facial expressions, which are regulated by 43 facial muscles that are in turn controlled by two facial nerves. The dominant features that indicate an absent-minded face in class are examined using Facial Action Units (FAUs). Tracking covers multiple stages, from the moment a student enters the classroom and the first frame is captured, to the differentiation of the relevant facial structures for as long as he/she remains in class. In our approach, the facial landmarks are first identified by annotating the distracted-face dataset. The model is then trained on the corresponding pixel values and labels so that it learns the features associated with distracted and undistracted faces. After preprocessing the dataset, we concatenate two models to train on the distracted-face data: a Custom Deep Emotion Network based on a Convolutional Neural Network, and a Reversed-NN that preserves symmetry between the two paths. The output layer therefore concatenates the outputs of both models to recognize whether a face is distracted or undistracted. The pipeline supports deploying the model on real-time cameras to trace the behavioral biometrics of students. Experiments performed on the database reinforce that the proposed characteristics can be used for biometric purposes.
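To make the two-branch architecture concrete, the following is a minimal sketch of concatenating a convolutional branch with a mirrored ("reversed") dense branch before a shared output layer, written with TensorFlow/Keras. The input resolution (48x48 grayscale face crops), layer widths, and names such as cnn_branch and reversed_branch are illustrative assumptions, not the exact configuration used in the chapter.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Assumed input: a 48x48 grayscale face crop (hypothetical; not specified above).
inputs = layers.Input(shape=(48, 48, 1), name="face_crop")

# Branch 1: Custom Deep Emotion Network, sketched as a small CNN feature extractor.
x = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
x = layers.MaxPooling2D()(x)
cnn_branch = layers.Dense(64, activation="relu")(layers.Flatten()(x))

# Branch 2: "Reversed-NN", sketched here as a dense stack whose layer widths mirror
# the first branch in reverse order to keep the two paths symmetric.
y = layers.Flatten()(inputs)
y = layers.Dense(64, activation="relu")(y)
y = layers.Dense(32, activation="relu")(y)
reversed_branch = layers.Dense(64, activation="relu")(y)

# Concatenate both branch outputs and classify distracted vs. undistracted.
merged = layers.concatenate([cnn_branch, reversed_branch])
outputs = layers.Dense(1, activation="sigmoid", name="distracted")(merged)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

The sketch only illustrates the concatenation pattern at the output layer; the annotation of facial landmarks, FAU extraction, and real-time camera deployment described above are outside its scope.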