Detecting facial expressions and recognizing emotions is crucial for perceiving the true sentiment of a person in response to a given stimulus. As humans, we perform this function instinctively through our natural empathetic response. To give arbitrary devices this capability, we used deep learning techniques to train an ensemble of neural networks able to detect seven different emotional states: anger, disgust, fear, happiness, sadness, surprise, and neutrality.

// Example API Response
{
  "class": "surprised",
  "weights": {
    "angry": 4.27,
    "disgusted": 0.11,
    "fearful": 6.89,
    "happy": 21.49,
    "neutral": 2.95,
    "sad": 0.02,
    "surprised": 64.28
  }
}
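A client consuming this response would typically read the reported class, or recompute it from the per-emotion weights. A minimal Python sketch, using the sample payload above (the payload values come from the example; how the response is fetched is left out and would depend on the actual endpoint):

```python
# Response mirroring the example payload above; field names are taken
# from the sample, not from any official API specification.
response = {
    "class": "surprised",
    "weights": {
        "angry": 4.27, "disgusted": 0.11, "fearful": 6.89,
        "happy": 21.49, "neutral": 2.95, "sad": 0.02, "surprised": 64.28,
    },
}

# The reported class corresponds to the emotion with the largest weight.
top_emotion = max(response["weights"], key=response["weights"].get)
print(top_emotion)  # "surprised"
```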
We used transfer learning from the FaceNet architecture, which was originally designed for facial identification. Features from an intermediate layer of this network served as embeddings, which were then passed through two convolutional and two fully connected layers ending in a softmax activation. An ensemble of 4 such neural networks was built to produce the final classification. Training and testing were performed on a database of 32,300 images pre-labeled with the identified emotion, using an 80/20 random train-test split.
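The ensemble step can be sketched as follows. This is an illustrative NumPy snippet, not the authors' code: it assumes each of the 4 networks emits a logit vector over the seven classes and that member probabilities are averaged (the source does not specify the combination rule), with hypothetical logit values for one face crop.

```python
import numpy as np

# Class order is an assumption, taken from the sample API payload above.
EMOTIONS = ["angry", "disgusted", "fearful", "happy", "neutral", "sad", "surprised"]

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_predict(member_logits):
    """Average the softmax outputs of the ensemble members.

    member_logits: shape (n_members, n_classes), one logit vector per
    network. Averaging probabilities is one common combination rule.
    """
    probs = softmax(np.asarray(member_logits, dtype=float))
    mean_probs = probs.mean(axis=0)
    return EMOTIONS[int(mean_probs.argmax())], mean_probs

# Hypothetical logits from the 4 ensemble members for a single input.
logits = np.array([
    [0.1, -2.0, 0.5, 1.2, 0.0, -1.5, 3.1],
    [0.3, -1.8, 0.2, 1.0, 0.1, -1.2, 2.8],
    [0.0, -2.2, 0.6, 1.5, 0.2, -1.0, 2.5],
    [0.2, -1.9, 0.4, 1.1, 0.0, -1.3, 3.0],
])
label, probs = ensemble_predict(logits)
```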