
November 27, 2017 // 8:47 AM

6 Companies Making Emotional Intelligence Part Of Our Everyday Lives

Written by Matt Allegretti
Earlier this month, fans lined up everywhere to buy the new iPhone X, which features Face ID, a facial recognition security function that uses your face rather than a passcode or thumbprint to unlock your iPhone. Apple’s new facial recognition software is just one of the new technologies in the ever-growing field of emotional intelligence. For the first time in history, we are living in an age where computers can read emotions, a technological feat that is already changing life as we know it, whether we’re ready or not. In fact, by 2020, the emotion detection and recognition market is projected to be worth a staggering $22.65 billion.

From smartphones and video games, to mood-detecting radios, to AI-powered facial recognition that can predict viewers’ reactions to movies, emotionally intelligent technology is being built into more and more consumer products.

Check out 6 companies that are using the technology to revolutionize their products and industry:


1. Disney/Dolby Laboratories: Facial Recognition & AI


Disney Research and Caltech are using facial recognition and AI to predict how people react to movies. Their new system can track the expressions of hundreds of faces in a dark theater, using a new algorithm they call “factorized variational autoencoders” (FVAEs).
The FVAEs have proven to be extremely accurate at recognizing complex facial expressions. So accurate, in fact, that after only 10 minutes of analyzing someone’s face, the FVAEs can predict that person’s reactions for the remainder of a film.
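The core factorization idea can be illustrated with a deliberately simplified sketch. All numbers below are invented and the real FVAEs are far more sophisticated, but the principle is similar: model each viewer’s reaction at a given moment as a personal factor times a per-scene factor learned from past audiences, fit the personal factor from the first few observed scenes, then predict the rest of the film.

```python
# Toy sketch of the factorization idea (all numbers invented):
# reaction(viewer, scene) ~ viewer_factor * scene_factor[scene].
scene_factor = [0.1, 0.9, 0.3, 0.8, 0.2, 1.0]  # learned from past audiences
observed = [0.05, 0.45, 0.15]                   # new viewer, first 3 scenes

# Fit the viewer's personal factor by least squares on the observed scenes.
num = sum(s * o for s, o in zip(scene_factor, observed))
den = sum(s * s for s in scene_factor[:len(observed)])
viewer_factor = num / den

# Predict the viewer's reactions for the unseen remainder of the film.
predicted = [viewer_factor * s for s in scene_factor[len(observed):]]
print(round(viewer_factor, 2), [round(p, 2) for p in predicted])
```

Here the viewer reacts at exactly half the intensity of the average audience member, so every remaining scene’s prediction is half its scene factor.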


“The FVAEs were able to learn concepts such as smiling and laughing on their own,” said Zhiwei Deng, a lab associate at Disney Research. They were also “able to show how these facial expressions correlated with humorous scenes.”
For this project, Disney used a 400-seat theater with four infrared cameras, filming the audience during 150 showings of nine Disney hits, including The Jungle Book, Big Hero 6, Star Wars: The Force Awakens, and Zootopia. Throughout the screenings, Disney captured over 16 million data points, which were fed into the neural network. Disney’s AI system can currently predict when someone will laugh or smile, and the team expects it to recognize negative emotions as well, such as fear and sadness.


As Disney’s facial recognition and AI technology continues to be developed and implemented, the company will become better and better at understanding and predicting their audience’s emotional reactions.
“Understanding human behavior is fundamental to developing AI systems that exhibit greater behavioral and social intelligence,” said Yisong Yue, a Caltech professor, in a press release. “After all, people don’t always explicitly say that they are unhappy or have some problem.”
If Disney is already using their new facial recognition technology in the movie theater, it’s more than possible that they’re also experimenting with it in their theme parks and other interactive media.

Along with Disney, Dolby Laboratories is also studying people to better understand how they react physically and emotionally to media. Dolby is using EEG caps, heart rate monitors, skin response sensors, and FLIR cameras to track the emotional responses of people as they watch a video. The goal is to better understand what sort of content frightens, excites, arouses, or elicits strong emotional reactions from viewers. For example, Dolby has found an intriguing skin reaction when soccer fans watch a game on TV. “Every time, right before a penalty kick, you can see a spike [in emotion], and it’s entirely predictable,” said Poppy Crum, head scientist at Dolby Labs. “It’s just that human anticipation of the event.”
Both Disney and Dolby are changing the way the movie industry makes movies. In the future, studios may regularly use AI and emotion tracking to produce films that better trigger our emotional reactions.


2. MIT: EQ-Radio


MIT is developing a wireless device that measures human emotion. Known as EQ-Radio, it is being built by MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), and it looks and works much like a Wi-Fi router.
Here’s how it works: EQ-Radio sends out a wireless signal, which bounces off a person’s body and back to the device. By measuring how the reflected signal changes with the body’s vital signs, the device can tell if you are happy, sad, excited, or angry. EQ-Radio’s creators think it could be used in advertising, healthcare, video games, and movie testing.
Dina Katabi, the MIT professor who led the development of EQ-Radio, says her team primarily looked at participants’ heartbeats and breathing rates during testing. “This kind of vital sign, as you would expect, is related to our emotion,” Katabi says. “He’s now angry, he’s now happy, he’s sad. And all of that without touching the person’s body.”
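As an illustration only, and not MIT’s actual algorithm, a minimal classifier over those two vital signs might use a nearest-centroid rule: label a reading with whichever emotion’s typical heart rate and breathing rate it sits closest to. Every centroid value below is invented.

```python
import math

# Illustration only: EQ-Radio's real pipeline extracts individual
# heartbeats from reflected RF signals. Here we assume two toy
# features (heart rate, breathing rate); all centroids are invented.
EMOTION_CENTROIDS = {
    "happy":   (78.0, 17.0),
    "sad":     (62.0, 12.0),
    "excited": (95.0, 20.0),
    "angry":   (88.0, 14.0),
}

def classify_emotion(heart_rate_bpm: float, breaths_per_min: float) -> str:
    """Return the emotion whose centroid is closest to the measured
    vital signs (a simple nearest-centroid rule)."""
    return min(
        EMOTION_CENTROIDS,
        key=lambda e: math.hypot(
            heart_rate_bpm - EMOTION_CENTROIDS[e][0],
            breaths_per_min - EMOTION_CENTROIDS[e][1],
        ),
    )

print(classify_emotion(93, 19))  # closest centroid here is "excited"
```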
Currently, EQ-Radio can correctly identify emotions 87 percent of the time, an impressive figure for such a new kind of emotion-sensing technology.
The major open issue with EQ-Radio is privacy, since the wireless device could in principle monitor people’s emotions without their consent. To address this, the MIT team built in a “consent mechanism” that “will ask the person to do certain motions and movements,” says Katabi. “And if that person doesn’t do them, it knows that that person did not give consent. So it would not give you the information.”


3. Apple: Face ID / Animoji

As mentioned earlier, Apple’s new iPhone X includes Face ID, a state-of-the-art facial recognition security feature that lets users unlock the phone simply by holding it up to their face. Face ID has an extraordinary one-in-1,000,000 false acceptance rate, compared with Apple’s Touch ID, which has a one-in-50,000 false acceptance rate.
Along with unlocking the iPhone X, Apple’s facial recognition also powers another new feature called Animoji, which captures users’ facial movements to bring emojis to life. With the help of the A11 Bionic chip, Animoji can process up to 50 facial movements in real time, allowing the emojis to mirror you when you smile, laugh, frown, or raise your eyebrows. Simply make a facial expression and choose the emoji you would like to send.
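A hypothetical sketch of how tracked facial movements might drive an expression: pipelines like this typically expose dozens of “blendshape” coefficients between 0 and 1, which downstream code can map to a pose. The coefficient names and thresholds below are invented, not Apple’s API.

```python
# Hypothetical sketch: Animoji-style pipelines track dozens of facial
# "blendshape" coefficients in [0, 1]. This toy picker maps a few of
# them to an expression label; names and thresholds are invented.
def pick_expression(blendshapes: dict) -> str:
    if blendshapes.get("mouth_smile", 0.0) > 0.6:
        if blendshapes.get("jaw_open", 0.0) > 0.5:
            return "laugh"
        return "smile"
    if blendshapes.get("brow_down", 0.0) > 0.5:
        return "frown"
    if blendshapes.get("brow_up", 0.0) > 0.5:
        return "surprised"
    return "neutral"

print(pick_expression({"mouth_smile": 0.8, "jaw_open": 0.7}))  # laugh
```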


Animoji also allows you to record voice messages and send that message using an emoji. The emoji will deliver the message - with your voice and facial expressions - and will play as a “looping video within the message window.”
So what does this mean for the future of smartphones?
Anil Jain, a biometrics professor at Michigan State University, says that in addition to face and fingerprint recognition, smartphones will also eventually have iris recognition sensors. Different sensors could be used for different purposes, such as online transactions. “There is no reason why all three can’t be in the mobile phone within the next few years,” Jain says.


Apple’s Face ID and Animoji are just one more step toward mobile emotional intelligence. Future smartphones may be able to detect your mood and sense your emotions. Vincent Spruyt of Argus Labs, an emotion detection company, says, “A smartphone should be more than just a phone, it should be a smart agent. It should proactively try and think what you think, and feel what you feel.”
At the rate our technology is evolving, emotion-detecting phones are not that far away.

4. Toyota: Concept-i 


At CES in Las Vegas earlier this year, Toyota unveiled its “Concept-i” concept car, which it described as “more than a machine. It will become our friend.” Concept-i uses an AI system called Yui.
“Yui learns from us, grows from us, builds a relationship that’s meaningful and emotional,” says Bob Carter, Toyota’s senior vice-president of automotive operations.
Yui will be able to tell if you are happy or sad, respond to your mood, and help out if you seem distracted. If you’re feeling melancholy, for example, the car can lower your speed.


“At Toyota, we recognize that the important question isn’t whether future vehicles will be equipped with automated or connected technologies,” says Carter. “It is the experience of the people who engage with those vehicles. Thanks to Concept-i and the power of artificial intelligence, we think the future is a vehicle that can engage with the driver in return.”
Honda has similar aspirations when it comes to emotionally intelligent cars. At CES they showed off their new AI-powered electric concept car called NeuV, which has what the company calls an “emotion engine” that can “enable machines to artificially generate their own emotions.”
The race to develop the most emotionally intelligent vehicles is on, as the world’s top automakers predict that in the future (probably the far future) all cars will be self-driving. When that happens, car companies believe, consumers will shop based on “which car they can have the best relationship with.”
As for Toyota’s Concept-i, don’t expect to see it on the road anytime soon. Toyota still needs to refine its AI prototype to make sure the cars are as safe as possible in all weather conditions.


5. Microsoft: Mood Shirt / Sensoree: Mood Sweater


In 2016, Microsoft unveiled “Mood Shirt,” a garment designed to monitor the wearer’s emotions and stimulate the nervous system to regulate them. The shirt uses hidden sensors to read your heartbeat and skin temperature and to track your movement, then works to cheer you up or calm you down based on how you are feeling.
Mood Shirt is still in the patent phase, but according to Microsoft, the shirt will “provide (users) with a cost effective, reliable and easy to use way to enhance their physical health and mental well-being and enhance their success in communicating and/or negotiating with people.”
How does Mood Shirt work? Here’s an example: say you’re on a date. The shirt could switch on cooling actuators if you became bashful and flustered, or cheer you up with an upbeat song if you were feeling melancholy.


Tech companies aren’t the only ones experimenting with mood detection clothing. The fashion industry has also been developing clothing that can “interact” with us based on how we’re feeling. In fact, the Sensoree mood sweater is ready for commercial production, and it already has a waiting list. The sweater uses LED lights that change color depending on how the wearer is feeling, like a mood ring. Before it was made available to the general public, the sweater was originally designed for people with sensory processing challenges, including those associated with ADHD and autism.
“It helps them see and connect with what they’re feeling,” says Sensoree founder Kristen Neidlinger, “and it helps other people see what they’re feeling… it gives their body a voice.”

It’s safe to say that the clothes of the future will do much more than keep us warm. Check out this article to learn more about other designers who are currently developing mood sensitive/interactive clothing.

6. BfB Labs: Emotionally Responsive Gaming


British tech startup BfB Labs, creator of the first “emotion response game,” is using video games to teach kids how to regulate their emotions using mindfulness techniques. Its first game, Champions of the Shenga, is a mobile fantasy card-battling game that uses a BfB sensor to measure heart rate and monitor emotions. Players are rewarded for staying calm and focused while playing.
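A minimal sketch of that reward loop, assuming (not BfB’s actual scoring) that points scale with a 0-to-1 “calm” score derived from heart rate relative to the player’s resting baseline; the 20 bpm threshold is invented:

```python
# Sketch of the reward idea only; BfB's actual scoring and these
# thresholds are assumptions. Points scale with a 0..1 "calm" score
# derived from heart rate relative to the player's resting baseline.
def calm_score(resting_hr: float, current_hr: float) -> float:
    """1.0 at or below baseline, falling to 0.0 at 20+ bpm above it."""
    excess = max(0.0, current_hr - resting_hr)
    return max(0.0, 1.0 - excess / 20.0)

def battle_reward(base_points: int, resting_hr: float, current_hr: float) -> int:
    """Scale card-battle points by how calm the player stayed."""
    return round(base_points * calm_score(resting_hr, current_hr))

print(battle_reward(100, 65, 65))  # calm: full points
print(battle_reward(100, 65, 75))  # elevated heart rate: half points
```

Tying the payoff directly to physiology is what makes staying calm a winning strategy rather than just advice.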
Shenga is a fun alternative to mindfulness, especially for kids who aren’t fond of sitting still for long periods of time. Naomi Stoll, the lead researcher on the Shenga project, says, “Meditation, mindfulness and yoga are not the kind of things that teenage kids, especially teenagers from disadvantaged backgrounds, are going to think are relevant to them and practice. Even 20 minutes of breathing every day is just not fun in any way.”


To date, Shenga has produced very positive results. BfB ran a two-year trial with teenagers who played the game and found that 75% reported the game improved their ability to stay calm. Perhaps even better, 25% (33% in an all-girls school) plan to integrate mindfulness techniques into their non-gaming lives. Charlotte Berry, Deputy Head at The Billericay School in Essex, where BfB ran its biggest trial, says the kids “started to use what they learned straightaway - to prepare for tests or deal with issues with their family or friends. It’s certainly been a very interesting experience to be part of.”
Emotionally responsive gaming is still in its infancy, but video game consoles that use this technology are not far away. Imagine a zombie survival game that senses when you are afraid, or a shooter that shakes the screen when you’re stressed. With emotionally responsive gaming, developers will be able to measure the emotional state of players and create more immersive experiences. It’s literally going to be a game changer.


Emotional intelligence is the future. Period.


These are just six examples of how emotional intelligence is being developed and integrated into everyday products. In the future, it will be exciting to see how different industries incorporate this technology into new business processes and solutions. Whether it’s cars or clothing that can adapt to our emotions, movies or video games that give us more immersive experiences, or phones that can seamlessly interact with us, the rise of AI and emotional intelligence will inevitably change our relationship with technology as we know it.




Topics: Emotional Intelligence, technology