Emotions and technology

Therapists for robots wanted

Have you seen IBM Watson's latest marketing campaign, presented during the most recent Oscars ceremony? If not, you should! How far are we from a future in which robots will need a therapist?

Intelligent automation and artificial intelligence capabilities already on the market are changing what we expect from technology, exceeding our assumptions about how technology will interact with humans. Cognitive science connects information technology, natural language processing, psychology, and the biology of the brain, and there are now commercial solutions, such as IBM Watson, that can learn on their own. Technology that can speak and understand natural language is here. It is not just about speech – it is about computer-human conversation in natural language. The difference between voice recognition and natural language understanding is huge. There are many scientific projects working to master one of the human senses. Computers can, in a sense, "read" what we write and try to understand it before they respond, and we can read the information intelligent technology gives back. A computer can also "hear" what we are saying and try to understand it.
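To make that distinction concrete, here is a minimal sketch in Python of how a voice-recognition step differs from a natural language understanding step. The function names and the returned fields are hypothetical placeholders for illustration, not the API of Watson or any other product.

# Hypothetical sketch: voice recognition only turns sound into text;
# natural language understanding tries to work out what the text means.

def transcribe_audio(audio_bytes: bytes) -> str:
    # Voice recognition: audio in, plain text out (placeholder result).
    return "please book me a table for two tomorrow evening"

def understand_text(text: str) -> dict:
    # Natural language understanding: text in, meaning out.
    # A real NLU service would return intents and entities it has learned.
    return {
        "intent": "book_table",
        "entities": {"party_size": 2, "time": "tomorrow evening"},
    }

text = transcribe_audio(b"...raw audio...")
meaning = understand_text(text)
print(text)     # what was said
print(meaning)  # what was meant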

Technology can also recognise the content of images, bringing cognitive computation into visual recognition algorithms. Products such as Watson Visual Recognition, and research efforts such as Stanford University's ImageNet project, focus on self-learning solutions specialised in image recognition. Facebook's DeepFace work focuses on face recognition, automatically recognising and tagging people in photos. Current algorithms are as good as, and sometimes even better than, the human brain in this narrow area. But when we talk with a friend, we know that sometimes they do not have to say a word: we get a lot of information just by looking at them, their body posture, and their face. This is nonverbal communication. The power of the eyes working together with the brain goes beyond knowing what an object is; we can tell much more. Seeing the difference between a horse on green grass and a metal horse sculpture is no problem for a three-year-old child. Computers may be better at listening, but a child will know in a second whether their mum is happy or sad.
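As a rough illustration of what such visual recognition looks like in practice, here is a minimal sketch in Python using a pretrained classifier from the open-source torchvision library. It stands in for the kind of tooling mentioned above rather than being Watson or ImageNet project code, and the image path is just a placeholder.

# Classify a photo with an off-the-shelf ImageNet-trained model (torchvision >= 0.13).
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet18_Weights.DEFAULT      # pretrained weights plus their metadata
model = models.resnet18(weights=weights).eval()
preprocess = weights.transforms()              # the matching resize/normalise pipeline

image = Image.open("horse.jpg")                # placeholder path
batch = preprocess(image).unsqueeze(0)         # add a batch dimension

with torch.no_grad():
    logits = model(batch)

label = weights.meta["categories"][logits.argmax().item()]
print(label)  # e.g. "sorrel" (a horse) for a photo of a horse on grass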

A fundamental part of being human

Our everyday activities, cognition, and perception are influenced by emotions. AI technologies bring some human capabilities close to perfection, yet it is not that easy to teach computers to recognise our emotions. Can computers become more emotionally intelligent? Is it possible to measure emotions? The discipline of affective computing combines computer science with neuroscience, sociology, education, psychology, ethics, and more, and it pushes the boundaries of what can be achieved to improve the human affective experience with technology. The Unseen fashion studio introduced a dress that changes its colours to reflect the wearer's aura. It doesn't recognise a brain state or specific emotions, but it is an interesting experiment that shows the potential of this discipline.

Learn about the future from kids

There is a saying: "Do you want to know the future? Ask kids about it." In the movie "Home", the aliens (the Boov) change colour based on their mental and emotional state: they turn red when they are angry, green when they lie, yellow when they are scared, orange when they get excited or happy, and pink when they feel love. It is simple and visible to everybody, and recognising their mood is exactly that easy.
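Written down as code, that colour scheme is nothing more than a tiny lookup table. The Python sketch below simply restates the mapping from the film described above; the default value is an invented placeholder.

# The Boov colour code from "Home" as a simple lookup table.
BOOV_COLOURS = {
    "angry": "red",
    "lying": "green",
    "scared": "yellow",
    "excited": "orange",
    "happy": "orange",
    "love": "pink",
}

def boov_colour(emotion: str) -> str:
    # Return the colour a Boov would turn, or a neutral default.
    return BOOV_COLOURS.get(emotion, "no change")

print(boov_colour("scared"))  # yellow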

We are at the point where we are learning to recognise a person's emotions from many different sources of insight: body movement, the face, the brain, heart rate. This is the first step (a sketch of what it might look like follows below). That knowledge can be used in many industries, such as healthcare, education, and public safety, and applied in many areas of human life. The next step is to implement those findings in computing solutions, and then to teach technology to recognise emotions and their impact on human behaviour. That will take human-computer interaction to the next level of digital experience. Be ready for an affective era in which technology will understand emotions, or maybe even feel something. And then Leia, together with IBM Watson, will have more work than in the commercials.
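As a closing sketch of what that first step could look like in code, here is a hypothetical Python example that fuses several such signals into one rough emotion estimate. The signal names, weights, and scores are invented for illustration and do not describe any real product.

# Hypothetical fusion of several affect signals into one estimate.
# Each recogniser reports scores per emotion; we combine them with weights.
from collections import defaultdict

def fuse_emotion_scores(signal_scores: dict, weights: dict) -> str:
    # signal_scores: e.g. {"face": {"happy": 0.7, "sad": 0.1}, "heart_rate": {...}}
    # weights: relative trust in each signal, e.g. {"face": 0.5, "heart_rate": 0.25}
    combined = defaultdict(float)
    for signal, scores in signal_scores.items():
        for emotion, score in scores.items():
            combined[emotion] += weights.get(signal, 0.0) * score
    return max(combined, key=combined.get)

estimate = fuse_emotion_scores(
    {
        "face": {"happy": 0.7, "sad": 0.1},
        "heart_rate": {"happy": 0.4, "sad": 0.3},
        "body_movement": {"happy": 0.5, "sad": 0.2},
    },
    weights={"face": 0.5, "heart_rate": 0.25, "body_movement": 0.25},
)
print(estimate)  # "happy" with these example numbers

A real system would of course learn such weights rather than hard-code them, but the structure is the point: many weak signals, one combined estimate of how a person feels.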
