Empathic app: using AI to interpret how a non-verbal person feels

The Empathic app and API use artificial intelligence (AI) to analyse short voice recordings. The app is designed for people who use fewer than 20 reliable, functional words; it can indicate whether an individual is expressing one of up to 10 emotions, such as excitement or frustration.

Typically, there is one person who understands each non-verbal person well, but many others may not know them well enough to identify their emotions. For this reason, the app includes a ‘Companion’ feature so that every interaction is supported. Using Empathic, every carer, teacher, family member and friend can understand when the non-verbal person is enjoying an activity and when they are bored or confused.

Link to video: YouTube

Link to website: https://seamlesscare.ie/


I’m a little worried by this modern trend.

To me it smacks of the auto-complete on a phone keyboard that spews out inappropriate messages to mystified recipients. If it were allowed to keep sending messages on my behalf, I could end up in a very awkward predicament.
I easily put my foot in my mouth with my own communications and wouldn’t want to have to cope with a device doing the same thing on my behalf.

I suppose I am, in essence, a Luddite: keep to the old ways. Never much liked computers either, to be honest.

Keep on keepin’ on
:hammer: :grinning: :desktop_computer:

In a hit American TV series that was full of villains, one particular villain used a wheelchair. He communicated successfully using just a bell; his underlings understood him completely and obeyed all the orders he dished out. Unfortunately, he was eventually blown up in his wheelchair by his rival, the owner of a chicken barbecue establishment.
This load of bells came from ‘Breaking Bad’, a story about a high school teacher turned bad, very bad in fact.
