Does AI learn to understand emotions through visual art?

Emotional AI is not far off given the nascent developments in the field.

Artificial intelligence has already made its mark on our lives. The adoption of disruptive technologies has redefined industries and their operations. However, the fear surrounding AI, rooted in the idea that it could take over the human race, has been there from the start. Most of us have been influenced by sci-fi movies and books that portray AI as an evil entity, talking and behaving like humans.

Well, studies show that we have not yet reached the point where AI can fully replicate human intelligence and emotions. Emotional AI is a developing field of study, and researchers are trying to integrate emotional intelligence into AI algorithms so that they can better interpret human behavior.

There have been instances where AI has created visual art. For example, in 2019, a gallery in Chelsea held an exhibition of works created by an AI named AICAN. But what about the interpretation of art? AI-driven art interpretation is a developing field, as it involves certain complexities.

To interpret art, the AI must understand the type and meaning of a particular piece of visual art. Researchers from Zhejiang University of Technology, Hangzhou, China, recently published an article on the classification of art. They tested seven different models on three datasets to compare their art classification performance. The study aimed to understand the ability of these neural network models to identify styles, artists, and genres in particular works of art. According to the article, the convolutional neural network models and computer vision techniques used delivered state-of-the-art results, especially on small datasets.
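The classification task in such studies boils down to mapping an image to probabilities over a fixed label set (styles, artists, or genres). The minimal sketch below shows only the final classification step over pre-extracted features, not a full CNN; the style list, feature size, and random weights are illustrative assumptions, not details from the study.

```python
import numpy as np

STYLES = ["Impressionism", "Cubism", "Baroque", "Pop Art"]  # illustrative label set

def softmax(logits):
    """Turn raw scores into probabilities that sum to 1."""
    z = np.exp(logits - logits.max())
    return z / z.sum()

def classify(features, weights, bias):
    """Map an image's feature vector to a (style, probability) prediction."""
    probs = softmax(weights @ features + bias)
    best = int(np.argmax(probs))
    return STYLES[best], float(probs[best])

# Toy example: random features standing in for a CNN's image embedding.
rng = np.random.default_rng(0)
features = rng.normal(size=512)                     # e.g. a 512-d embedding
weights = rng.normal(size=(len(STYLES), 512)) * 0.01
bias = np.zeros(len(STYLES))

style, confidence = classify(features, weights, bias)
print(style, round(confidence, 3))
```

In practice, the feature extractor and classifier are trained end to end on labeled artworks, which is where the CNN architectures compared in the study come in.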

Computer vision is a game-changing innovation currently affecting many industries, including automotive and construction. Artificial intelligence technologies have developed to the point where they approach the capabilities of the human brain but do not fully replicate it. A group of researchers from Stanford University, École Polytechnique, and King Abdullah University of Science and Technology have published a study titled “ArtEmis: Affective Language for Visual Art”.

This team trained an AI algorithm to interpret the emotions evoked by works of art at scale. They used the WikiArt dataset, consisting of 81,446 artworks by 1,119 artists, and collected more than 400,000 emotional explanations and responses from annotators on Amazon Mechanical Turk (AMT).

This experiment aims to bring AI and emotional intelligence closer together by improving the ability of machine learning algorithms to analyze data based on emotions, metaphors, descriptions, and more. The algorithm they developed could sort works of art into eight emotional categories. As an example, consider Vincent van Gogh’s The Starry Night and the explanation ArtEmis gives for it. According to the study, ArtEmis identifies the emotion in the painting as “awe” and explains it as follows: “The blue and white colors in these paintings make me feel like I’m looking at a dream.” ArtEmis is a notable development because, beyond labeling an artwork with an emotion, it can also generate a written explanation for that emotional response.
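The output of such a system pairs a discrete emotion label with a free-text explanation. The sketch below models that pairing as a small data structure; the eight category names follow the positive/negative emotion split described in the ArtEmis paper, but the `EmotionalResponse` class itself is a hypothetical illustration, not the study's code.

```python
from dataclasses import dataclass

# Eight emotion categories, as reported in the ArtEmis study
# (four positive, four negative).
EMOTIONS = [
    "amusement", "awe", "contentment", "excitement",   # positive
    "anger", "disgust", "fear", "sadness",             # negative
]

@dataclass
class EmotionalResponse:
    """One annotation: an emotion label plus a written explanation."""
    emotion: str
    explanation: str

    def __post_init__(self):
        # Reject labels outside the fixed category set.
        if self.emotion not in EMOTIONS:
            raise ValueError(f"unknown emotion: {self.emotion}")

# The Starry Night example from the study, expressed in this structure.
response = EmotionalResponse(
    emotion="awe",
    explanation="The blue and white colors in these paintings make me "
                "feel like I'm looking at a dream.",
)
print(f"{response.emotion}: {response.explanation}")
```

A captioning model trained on hundreds of thousands of such annotations learns to produce both fields, which is what distinguishes ArtEmis from a plain emotion classifier.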

These nascent developments in the field of AI will enhance its capabilities. Adding emotional intelligence to AI will benefit many business operations, especially customer interaction and engagement. But there will always be ethical concerns around artificial intelligence reproducing these human abilities. Will we be able to overcome these controversies and concerns? Will ethical AI and emotional AI completely replace humans? These are questions to which we must seek answers in the near future.
