Alex Moltzau 莫战

Feb 15, 2020


AI Strategies and Emotional Detection

Emotional Detection Technologies Projected to Be a $25 Billion Market by 2023

Let us talk about emotional detection technology. In one way it is hard to define what it is, yet it is projected to be a $25 billion market by 2023. Influencing our emotions is a large part of most marketing (if not all), and to some extent it can be said to be an integrated aspect of most industries. Yet to date this has largely been done through online ‘clicks’, retail ‘movements/flow’ and people guessing.

Can we detect emotions?

We can detect what looks like certain emotions or expressions.

However, I suppose that depends on your definition of an emotion.

Can applied artificial intelligence detect emotion?

If so: what would that look like?

Many would say hiring technology.

Apparently hiring technology does not work particularly well.

In fact, HireVue is often used as a negative example.

Broadly, it is said that AI emotion recognition cannot be trusted.

Yet more importantly: can we detect emotions in a responsible way?

If we take a step back beyond video- or image-based machine learning, it does feel like we are moving to quantitative basics here.

Are you this, that, more, or even more?

Angry, dissatisfied, content or happy?

Or, in other words: do you like, heart, haha, wow, sad or angry?

Suppose you had real-time data on a basic type of feeling from 2 billion people, and indeed someone does.

Then if you applied analysis through machine learning or advanced techniques within AI, what could you learn?

If you knew someone’s pulse was beating quicker at a given place, a given time, what could you predict?

Google is attempting to buy Fitbit for US$2.1 billion.

We do not have to think about faces, but we could.

If we could think about faces and pulse.

If we could think about faces, pulse and stated reactions, combined perhaps with location and movement.

Then perhaps we could say: emotional detection, to some degree.
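To make the idea of combining signals concrete, here is a minimal sketch. Every feature name, threshold and label below is hypothetical, invented for illustration; it is not any real product’s pipeline, and a crude rule like this is exactly the kind of model whose fairness and reliability we should question.

```python
from dataclasses import dataclass

# A hypothetical reading combining the signals discussed above:
# facial expression, pulse, stated reaction, and movement.
@dataclass
class Reading:
    face_valence: float   # -1.0 (negative expression) to 1.0 (positive); invented scale
    pulse_bpm: int        # heart rate, e.g. from a wearable
    reaction: str         # stated reaction, e.g. "like", "heart", "angry"
    moving: bool          # whether the person is in motion

def guess_emotion(r: Reading) -> str:
    """A crude, illustrative fusion rule -- not a validated model."""
    # Elevated pulse while at rest is treated as arousal (a strong assumption).
    aroused = r.pulse_bpm > 100 and not r.moving
    if r.reaction == "angry" or (aroused and r.face_valence < -0.3):
        return "agitated"
    if r.face_valence > 0.3 and r.reaction in ("like", "heart", "haha", "wow"):
        return "content"
    return "uncertain"

print(guess_emotion(Reading(-0.5, 110, "angry", False)))  # agitated
```

Even this toy shows the pattern: each added data source narrows the space of plausible guesses, which is precisely why the combination is both powerful and worrying.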

Can you fake your pulse? Unlikely in most cases.

What rights do adults, children or marginalised groups have?

What is fairness in the context of these technologies?

Questions surely we should ask ourselves when we shape strategies.

Particularly because decisions are being made.

A typical example of this is China tracking children in education within certain pilot projects.

Another could be understanding emotional behaviour for policing.

I am not sure how it will develop.

Yet what I do suggest is that we discuss these developments closely, and consider the implications.

Who will detect emotions — and for what reasons?

Can we detect emotions? Yes, to some degree.

It looks like we can also do so, to some degree, with advanced machine learning combining a variety of data sources.

Yet again, more importantly: can we detect emotions in a responsible way?

Perhaps AI strategies should be more explicit on this topic.