Sense and sensibility: Do we want AI to master emotions?

We examine the workings of emotion-recognition technologies, their usefulness, and the privacy concerns they inspire.

Imagine you come home one day in a bad mood, shout at the door for not opening fast enough and at the light bulb because it burned out — and the smart speaker immediately starts playing chill music, and the coffee machine pours you a mocha. Or, as you walk into a store, the robot assistant that was about to approach sees your unhappy face, backs off, and helps another customer instead. Sound like science fiction?

In fact, emotion recognition technologies are already being introduced into many areas of life, and in the near future our mood could well be under the watchful eye of gadgets, household appliances, cars, you name it. In this post, we explore how such technologies work, and how useful — and sometimes dangerous — they might be.

Artificial EQ

Most existing emotion-recognition systems analyze an individual’s facial expression and voice, as well as any words they say or write. For example, if the corners of a person’s mouth are raised, the machine might rule that the person is in a good mood, whereas a wrinkled nose suggests anger or disgust. A high, trembling voice and hurried speech can indicate fear, and if someone shouts the word “cheers!” they are probably happy.
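To make the text channel concrete, here is a minimal sketch using a pretrained emotion classifier from the open-source Hugging Face transformers library. The model named below is just one publicly available example, a stand-in for whatever proprietary models commercial systems actually use.

# Minimal sketch: classify the emotion expressed in short utterances.
# Assumes the "transformers" library is installed; the model is one publicly
# available emotion classifier, used here only as an example.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
)

for utterance in ["Cheers, everyone!", "Why is this light bulb broken again?"]:
    result = classifier(utterance)[0]  # top predicted label and its confidence
    print(utterance, "->", result["label"], round(result["score"], 2))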

More-complex systems also analyze gestures and even take into consideration the surrounding environment along with facial expressions and speech. Such a system recognizes that a person being forced to smile at gunpoint is probably not overjoyed.

Emotion-recognition systems generally learn to determine the link between an emotion and its external manifestation from large arrays of labeled data. The data may include audio or video recordings of TV shows, interviews and experiments involving real people, clips of theatrical performances or movies, and dialogues acted out by professional actors.
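For illustration only, the toy example below mimics that setup at the smallest possible scale: a handful of labeled utterances (invented for this sketch) are used to fit a simple text classifier with scikit-learn. Production systems train on far larger corpora, and on audio and video as well as text.

# Toy illustration of learning emotions from labeled data.
# The six utterances and their labels are invented purely for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "cheers, this is wonderful",
    "leave me alone, I'm furious",
    "I can't stop shaking, it's so dark in here",
    "what a lovely surprise",
    "this smell is disgusting",
    "I'm terrified something will go wrong",
]
labels = ["joy", "anger", "fear", "joy", "disgust", "fear"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, labels)

print(model.predict(["cheers to that!"]))  # expected to lean toward "joy"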

Simpler systems can be trained on photos or text corpora, depending on the purpose. For example, this Microsoft project tries to guess people’s emotions, gender, and approximate age based on photographs.
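The snippet below sketches the same idea with the open-source deepface library, used here only as a stand-in for services like the Microsoft project mentioned above; "photo.jpg" is a placeholder path.

# Sketch of photo-based analysis with the open-source "deepface" library.
# It estimates the dominant emotion, approximate age, and gender of each
# detected face; the exact output structure can vary between library versions.
from deepface import DeepFace

results = DeepFace.analyze(img_path="photo.jpg", actions=["emotion", "age", "gender"])
print(results)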

What’s emotion recognition for?

Gartner predicts that by 2022 one in ten gadgets will be fitted with emotion-recognition technologies. However, some organizations are already using them. For example, when stepping into an office, bank, or restaurant, customers might be greeted by a friendly robot. Here are just a few areas in which such systems might prove to be beneficial.

Security

Emotion recognition can be used to prevent violence — domestic and otherwise. Numerous scientific articles have touched on this issue, and entrepreneurs are already selling such systems to schools and other institutions.

Recruitment

Some companies deploy AI capable of emotion recognition as HR assistants. The system evaluates keywords, intonations, and facial expressions of applicants at the initial — and most time-consuming — stage of the selection process, and compiles a report for the human recruiters on whether the candidate is genuinely interested in the position, honest, and more.
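The sketch below shows only the final aggregation step such a system might perform, combining per-channel scores into a short report. All names, weights, and thresholds are invented for illustration and do not describe any particular vendor's product.

# Illustrative aggregation step only: combine per-channel scores (assumed to
# come from upstream keyword, voice, and facial-expression models) into a
# simple report for human recruiters. All weights and thresholds are invented.
def candidate_report(keyword_score, intonation_score, expression_score):
    interest = 0.4 * keyword_score + 0.3 * intonation_score + 0.3 * expression_score
    return {
        "interest_score": round(interest, 2),
        "flag_for_review": interest < 0.5,  # ask a human to take a closer look
    }

print(candidate_report(keyword_score=0.8, intonation_score=0.6, expression_score=0.7))
# {'interest_score': 0.71, 'flag_for_review': False}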

Customer focus

The Roads and Transport Authority in Dubai launched an interesting system this year at its customer service centers: AI-equipped cameras compare people's emotions when they enter and leave the building to gauge their level of satisfaction. If the calculated score falls below a certain value, the system advises center employees to take measures to improve the quality of service. (For privacy reasons, photos of visitors are not saved.)
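A drastically simplified sketch of that comparison might look like the following; the scores, threshold, and function name are assumptions made for illustration, not the RTA's actual implementation.

# Simplified satisfaction check: compare average "happiness" scores at entry
# and at exit, and flag the center if the change falls below a threshold.
# Scores, threshold, and names are illustrative assumptions.
def needs_attention(entry_scores, exit_scores, threshold=0.0):
    """Return True if the average mood change from entry to exit is below threshold."""
    avg_entry = sum(entry_scores) / len(entry_scores)
    avg_exit = sum(exit_scores) / len(exit_scores)
    return (avg_exit - avg_entry) < threshold

# Example: visitors left slightly less happy than they arrived.
print(needs_attention([0.62, 0.70, 0.55], [0.48, 0.66, 0.51]))  # True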

Socialization of children with special needs

Another project aims to help autistic children interpret the feelings of those around them. The system runs on Google Glass smart glasses: when the child interacts with another person, the glasses use graphics and sound to give clues about that person's emotions. Tests have shown that children socialize faster with this virtual helper.

How effective are emotion detectors?

Emotion-recognition technologies are far from perfect. A case in point is the aggression-detection technology deployed in many US schools. As it turns out, the system considers a cough more alarming than a bloodcurdling scream.

Researchers at the University of Southern California have found that facial-expression analysis is also easy to dupe. The machine automatically associates certain facial expressions with particular emotions, but it fails to distinguish, for example, malicious or gloating smiles from genuine ones.

Emotion-recognition systems that take context into account are therefore more accurate, but they are also more complex, and there are far fewer of them.

What matters is not only what the machine is looking at, but also what it was trained on. For example, a system trained on acted-out emotions might struggle with real-life ones.
...
Continue Reading