
Emotion Recognition – A Near-Human Quality of Machines


Humans are known for their ability to read and express emotions. "You seem happy today" makes us happier, and "Is everything all right? You look sad" gives us a feeling of support. But as the world changes and technology advances, machines are starting to take over this very human role.

Yes! Researchers today are exploring how machines can act according to the emotions of the humans they work with. Research in ANNs, i.e. artificial neural networks, brings us a step closer to this much-wanted luxury.

What is an Artificial Neural Network?

An artificial neural network is an interconnected network of nodes, similar to the connections between neurons in our brain. It is not a single algorithm but a framework in which many simple processing units work together to turn input data into an output.

A neural network is not explicitly programmed to do something. It is a self-learning mechanism that teaches itself from instances and examples. For example, it learns that a particular shape is called a dog by analysing many images labelled "dog" and "not dog".
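As a minimal sketch in Python (purely illustrative, not any particular library's or researcher's code), here is a tiny network teaching itself a "dog / not dog" label from synthetic, pre-labelled feature vectors; the data, layer sizes and learning rate are all made up for the example:

# A tiny neural network that learns "dog" vs "not dog" from labelled examples.
# Each "image" is reduced to a 4-number feature vector; real systems would use
# raw pixels and far larger networks.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic labelled examples: "dog" features cluster around 1, "not dog" around -1.
X = np.vstack([rng.normal(1.0, 0.5, (50, 4)), rng.normal(-1.0, 0.5, (50, 4))])
y = np.concatenate([np.ones(50), np.zeros(50)]).reshape(-1, 1)

# One hidden layer of 8 nodes; the weights start random and are learned, not programmed.
W1, b1 = rng.normal(0, 0.1, (4, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.1, (8, 1)), np.zeros(1)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(500):
    # Forward pass: input -> hidden layer -> predicted probability of "dog".
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: nudge every weight to reduce the prediction error.
    grad_out = (p - y) / len(X)
    grad_h = (grad_out @ W2.T) * (1 - h ** 2)
    W2 -= 0.5 * h.T @ grad_out
    b2 -= 0.5 * grad_out.sum(axis=0)
    W1 -= 0.5 * X.T @ grad_h
    b1 -= 0.5 * grad_h.sum(axis=0)

# After training, the network labels an unseen example on its own.
test = rng.normal(1.0, 0.5, (1, 4))                     # looks like a "dog" example
print(sigmoid(np.tanh(test @ W1 + b1) @ W2 + b2))       # probability close to 1

Nothing in the loop is ever told what a dog looks like; the weights simply shift until the labelled examples are predicted correctly.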

Neural networks take their inspiration from the human brain and its workings, so they aim to operate in more or less the same way.

How can an ANN recognize emotions?

Remember Chitti? Yes, the robot made by Dr. Vasi in the movie Robot. At the end of the movie, Chitti is asked to dismantle himself and is kept in a museum for future generations to see. Why? "Kyuki mai sochne laga tha" ("because I had started thinking on my own"), he says. It felt like science fiction back then, because robots don't think on their own; they just act as programmed.

But now we are working to make robots that think on their own. Being the lazy animals that we are, we don't even want to program the robots ourselves, and artificial neural networks help us with exactly that.

Facial expressions are often said to carry about 55% of the emotional content of what we communicate. An ANN uses these facial expressions as input, performs a few processing steps and gives an output. It basically works as a series: input image -> cropping -> feature extraction -> output.

When an image is captured, the key features are cropped out. The eyes and the mouth are the main carriers of expression and help in recognising it. The ANN then analyses these features against the data it has learned from previously, draws a conclusion and gives an output describing the person's emotion.
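To make the series above concrete, here is a rough Python sketch of what such a pipeline could look like. It uses OpenCV's stock Haar cascade only for the cropping step; the image path and the classify_emotion function are hypothetical placeholders, since the real classifier would be a network trained on labelled expressions.

# Sketch of the input image -> cropping -> feature extraction -> output pipeline.
import cv2
import numpy as np

# Stock OpenCV face detector, used here only for the "cropping" step.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(features: np.ndarray) -> str:
    # Hypothetical stand-in: a real system would feed these features to an
    # ANN trained on many images labelled with emotions.
    return "happy" if features.mean() > 0.5 else "neutral"

def recognise(image_path: str) -> str:
    img = cv2.imread(image_path)                  # input image
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Cropping: isolate the face, the region that carries the expression.
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return "no face found"
    x, y, w, h = faces[0]
    face = gray[y:y + h, x:x + w]

    # Feature extraction: a fuller system would locate the eyes and mouth;
    # here the cropped face is simply resized and turned into numbers.
    features = cv2.resize(face, (48, 48)).flatten() / 255.0

    # Output: the classifier maps the features to an emotion label.
    return classify_emotion(features)

print(recognise("person.jpg"))  # "person.jpg" is a hypothetical example image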

This is just one way of doing it. Facial expressions, body gestures and voice all provide data about emotions. Research is gradually extending to include all of these aspects, because combining them only makes the output more accurate.
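One simple way to combine the aspects, shown here purely as an illustration with made-up numbers, is "late fusion": each modality produces its own probability for every emotion, and the results are averaged.

# Late fusion of two modalities; the probabilities and weights are invented.
import numpy as np

emotions = ["happy", "sad", "angry", "neutral"]

face_probs = np.array([0.70, 0.10, 0.05, 0.15])   # hypothetical output of a face model
voice_probs = np.array([0.55, 0.20, 0.05, 0.20])  # hypothetical output of a voice model

# Weighted average; in practice the weights would be tuned on validation data.
combined = 0.6 * face_probs + 0.4 * voice_probs
print(emotions[int(np.argmax(combined))])          # -> "happy"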

Applications

Emotion recognition has applications in many fields: medicine, e-learning, monitoring, marketing, entertainment and law, to name a few.

In e-learning, it is really important to see whether the person being taught is willing to learn and whether they are understanding the concept, and emotion recognition can tell us that. In medicine, it helps gauge how a patient feels about a treatment. Entertainment is an emotion-driven industry; playing a happy song at a funeral may be the worst thing that could happen.

Conclusion

Emotions play an important role in every kind of communication, whether between human and human or human and machine. Emotion recognition will help us enhance that experience. The future surely holds robots that will help us not only physically but emotionally too.
