Sentiment Analysis With Azure Face API and Vonage
This tutorial combines the Vonage Video API and the Azure Face API to build a multi-party video conference that analyzes each participant's sentiment based on their facial expression. The code connects to an OpenTok session using the provided tokens and subscribes to the participants' streams. JavaScript then captures an image from each stream, sends it to the Azure Face API for emotion detection, and displays the identified emotion as an emoji next to the corresponding video stream. Clicking the "Analyze" button calls the `processImages` method, which clears any existing emojis, collects all HTML video tags in the DOM, and passes each one to the `sendToAzure` method for processing. One limitation of the code is that, when multiple faces are present, it only adds an emoji for the first face the Azure Face API detects.
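The sketch below illustrates the flow the summary describes: a `processImages` handler that clears old emojis and walks every video tag, and a `sendToAzure` helper that captures a frame to a canvas, posts it to the Face API detect endpoint with emotion attributes, and renders an emoji for the first detected face. It is a minimal approximation of the tutorial's code, not the article's exact source; the endpoint, key, and the `addEmoji` helper are hypothetical placeholders.

```javascript
// Placeholders: substitute your own Azure Face resource endpoint and key.
const AZURE_ENDPOINT = 'https://<your-resource>.cognitiveservices.azure.com';
const AZURE_KEY = '<your-face-api-key>';

// Map Azure emotion attribute names to a representative emoji.
const EMOJI_MAP = {
  anger: '😠', contempt: '😒', disgust: '🤢', fear: '😨',
  happiness: '😃', neutral: '😐', sadness: '😢', surprise: '😮'
};

// Called when the "Analyze" button is clicked: remove old emojis,
// then capture and analyze a frame from every video element on the page.
async function processImages() {
  document.querySelectorAll('.emotion-emoji').forEach(el => el.remove());
  const videos = document.getElementsByTagName('video');
  for (const video of videos) {
    await sendToAzure(video);
  }
}

// Draw the current video frame onto a canvas, send the JPEG to the
// Face API detect endpoint, and display an emoji for the first face found.
async function sendToAzure(video) {
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);

  const blob = await new Promise(resolve => canvas.toBlob(resolve, 'image/jpeg'));

  const response = await fetch(
    `${AZURE_ENDPOINT}/face/v1.0/detect?returnFaceAttributes=emotion`,
    {
      method: 'POST',
      headers: {
        'Ocp-Apim-Subscription-Key': AZURE_KEY,
        'Content-Type': 'application/octet-stream'
      },
      body: blob
    }
  );
  const faces = await response.json();
  if (!Array.isArray(faces) || faces.length === 0) return;

  // Only the first detected face is used, mirroring the limitation noted above.
  const emotions = faces[0].faceAttributes.emotion;
  const strongest = Object.keys(emotions)
    .reduce((a, b) => (emotions[a] > emotions[b] ? a : b));
  addEmoji(video, EMOJI_MAP[strongest]);
}

// Hypothetical helper: place the emoji next to the corresponding video stream.
function addEmoji(video, emoji) {
  const span = document.createElement('span');
  span.className = 'emotion-emoji';
  span.textContent = emoji;
  video.parentElement.appendChild(span);
}
```

In a real OpenTok page the video elements are created by the SDK when streams are published and subscribed, so querying the DOM for `video` tags is a simple way to reach every participant without tracking stream objects separately.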
Company
Vonage
Date published
April 27, 2021
Author(s)
Michael Jolley
Word count
2255
Language
English
Hacker News points
None found.