{"id":3664,"date":"2020-01-28T05:22:17","date_gmt":"2020-01-28T05:22:17","guid":{"rendered":"http:\/\/icaninfotech.com\/?p=3664"},"modified":"2024-12-02T09:59:27","modified_gmt":"2024-12-02T09:59:27","slug":"what-is-artificial-emotional-intelligence-and-ai-works-on-emotions","status":"publish","type":"post","link":"https:\/\/icaninfotech.com\/what-is-artificial-emotional-intelligence-and-ai-works-on-emotions\/","title":{"rendered":"What Is Artificial Emotional Intelligence & How Does Emotion AI Work?"},"content":{"rendered":"
Source: Search Engine Journal<\/p>\n
Imagine a world in which machines interpret the emotional state of humans and adapt their behavior to give appropriate responses to those emotions.<\/p>\n
Well, artificial emotional intelligence, which is also known as emotion AI or affective computing, is already being used to develop systems and products that can recognize, interpret, process, and simulate human affects (with an \u201ca,\u201d not an \u201ce\u201d). In psychology, an \u201caffect\u201d is a term used to describe the experience of feeling or emotion.<\/p>\n
If you\u2019ve seen \u201cSolo: A Star Wars Story\u201d, then you\u2019ve seen the poster child for artificial emotional intelligence: L3-37.<\/p>\n
Lando Calrissian\u2019s droid companion and navigator (voiced by Phoebe Waller-Bridge) instigates a slave revolt to escape from Kessel, but is severely damaged during the diversion. Lando (played by Donald Glover) is also injured during the getaway.<\/p>\n
The \u201cwoke robot\u201d demonstrates the ability to simulate empathy by interpreting the emotional state of a human, adapting its behavior to him, and giving an appropriate response to those emotions.<\/p>\n
Now, this example might lead some video marketers and advertisers to think that emotion AI is science fiction. But it is very real.<\/p>\n
A number of companies are already working to give computers the capacity to read our feelings and react in ways that have come to seem startlingly human. These include Affectiva, an emotion measurement technology company that spun out of MIT\u2019s Media Lab in 2009, and Realeyes, an emotion tech company that spun out of Oxford University in 2007.<\/p>\n
So, how do their technologies help brands, agencies, and media companies improve their advertising and marketing messages? Let\u2019s tackle this question by examining how affective computing works.<\/p>\n
Brands know emotions influence consumer behavior and decision making. So, they\u2019re willing to spend money on market research to understand consumer emotional engagement with their brand content.<\/p>\n
Affectiva uses a webcam to track a user\u2019s smirks, smiles, frowns, and furrows, which indicate the user\u2019s levels of surprise, amusement, or confusion.<\/p>\n
It also uses the webcam to measure a person\u2019s heart rate without a wearable sensor, by tracking the subtle color changes in the person\u2019s face that occur each time the heart beats.<\/p>\n
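To make the heart-rate idea concrete, here is a minimal illustrative sketch (not Affectiva\u2019s actual code) of the underlying signal-processing step: the skin\u2019s color pulses faintly with each heartbeat, so averaging a color channel over the face region yields a periodic signal whose dominant frequency is the pulse. The simulated signal, frame rate, and band limits below are assumptions for illustration.<\/p>\n

```python
import numpy as np

np.random.seed(0)

# Simulate the mean green-channel intensity of a face region over time.
# A real system would compute this per video frame from webcam footage.
fps = 30.0                       # assumed webcam frame rate (frames/second)
duration = 20.0                  # seconds of "video"
t = np.arange(0, duration, 1.0 / fps)

true_bpm = 72.0                  # simulated pulse: 72 beats/min = 1.2 Hz
signal = 0.02 * np.sin(2 * np.pi * (true_bpm / 60.0) * t)  # pulse component
signal += 0.005 * np.random.randn(t.size)                  # sensor noise

# Recover the heart rate: find the strongest frequency in the
# physiologically plausible band (0.7-3.0 Hz, i.e. 42-180 BPM).
freqs = np.fft.rfftfreq(t.size, d=1.0 / fps)
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
band = (freqs >= 0.7) & (freqs <= 3.0)
peak_hz = freqs[band][np.argmax(spectrum[band])]
estimated_bpm = peak_hz * 60.0
print(round(estimated_bpm))      # recovers roughly 72 BPM
```

The same peak-in-a-frequency-band idea underlies remote photoplethysmography generally; production systems add face tracking, motion compensation, and noise rejection on top of it.<\/p>\n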
Affectiva has turned this technology into a cloud-based solution that utilizes \u201cfacial coding\u201d and emotion recognition software to provide insight into a consumer\u2019s emotional responses to digital content. All a brand or media company needs is some panelists with standard webcams and internet connectivity.<\/p>\n