Joon Seok Moon: Empathy Wall
Artist(s):
- Joon Seok Moon
Title:
- Empathy Wall
Exhibition:
- SIGGRAPH Asia 2020: Untitled & Untied
Category:
Artist Statement:
Summary
When two audience members each enter a room, press a button, and speak into the microphone about a given topic, brush-stroke images expressing the emotions analyzed by AI algorithms appear on the wall. The images generated from the two participants are mixed and displayed on a single screen.
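The following is a minimal sketch (in Python) of the shared-screen idea described above: each room contributes brush strokes derived from its participant's analyzed emotion, and both streams are composited into one frame. All names and structures here are illustrative assumptions, not the artist's actual implementation.

from dataclasses import dataclass
from itertools import zip_longest
from typing import List, Tuple

@dataclass
class BrushStroke:
    room: str                        # which room/participant produced the stroke
    angle_deg: float                 # orientation derived from the emotion score
    color_rgb: Tuple[int, int, int]  # color derived from the emotion score

def compose_frame(strokes_room_a: List[BrushStroke],
                  strokes_room_b: List[BrushStroke]) -> List[BrushStroke]:
    # Mix both participants' strokes into a single draw list for one screen,
    # interleaving the two streams so neither participant dominates the wall.
    frame: List[BrushStroke] = []
    for a, b in zip_longest(strokes_room_a, strokes_room_b):
        frame.extend(s for s in (a, b) if s is not None)
    return frame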
Abstract
When the mode of human-to-human communication is transferred to technology-to-human communication, can it create the same 'empathy'? Empathy Wall seeks to create empathy by promoting human-to-human communication through technology, and ultimately to extend that empathy and familiarity to the relationship between human beings and technology. In other words, through the piece Empathy Wall we try to extend human-to-human empathy into the formation of human-to-technology empathy. The two audience members who experience the work at the same time cannot see each other because of the screen wall between them, but each can infer and empathize with the other's state by looking at the color images generated from the other person's speech. In this convergence of technology and art, the 'observer' and the 'user' can communicate and empathize in a new way and, beyond that, naturally come to feel trust and familiarity with technology. Empathy Wall is also an interactive work meant to be enjoyed together, not an individual experience of standing alone in front of the work. Through these processes, I believe the positive emotions that the audience carries away after experiencing a work that depicts and expresses empathy as interactive art can also extend toward technology itself.
Empathy is one of the most important feelings for human beings. It usually arises between people, but there are also cases in which we empathize with other kinds of objects. One might argue that such empathy is not a 'real' human emotion, but we wanted to consider empathy directed at subjects other than humans. Empathy matters because it means sensing and understanding emotions from the other's point of view. As the distance between people in modern society grows, and as close contact between people and non-human objects increases in proportion, feelings of empathy can also arise between humans and machines. I would like to consider what process human beings go through when they come to feel empathy toward a machine, and whether that feeling can be called the same empathy that arises between humans.
Technical Information:
Two artificial intelligence technologies are used in the work: speech recognition and natural language processing. When an audience member speaks into the microphone, the speech recognition system converts the spoken words into text, and a sentiment analysis algorithm from natural language processing analyzes that text for positive and negative emotion, returning the result as a number. Brush-line images are then displayed on the screen, their angles and colors matched to the numeric results according to Kandinsky's theory relating form, color, and emotion. The screen shows the combined images of the two participants' emotion analysis results.
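As a rough illustration of the mapping stage, the sketch below (Python) assumes the sentiment analysis returns a score in [-1, 1] and maps it to a brush angle and color. The speech-to-text and sentiment-analysis steps are omitted because the specific services are not named, and the angle/color correspondence here is an assumed stand-in rather than the work's actual Kandinsky-based rules.

import colorsys

def stroke_parameters(score: float):
    # Map a sentiment score in [-1, 1] to a brush angle and an RGB color.
    # The exact correspondence used in the work is not published; this
    # mapping is purely illustrative.
    score = max(-1.0, min(1.0, score))
    # Negative sentiment -> falling angles; positive sentiment -> rising angles.
    angle_deg = score * 45.0
    # Negative sentiment -> cool hues (blue); positive -> warm hues (red).
    hue = 0.6 * (1.0 - (score + 1.0) / 2.0)   # 0.6 (blue) down to 0.0 (red)
    r, g, b = colorsys.hsv_to_rgb(hue, 0.8, 0.9)
    return angle_deg, (int(r * 255), int(g * 255), int(b * 255))

# Example: a strongly positive utterance yields a rising, warm-colored stroke.
print(stroke_parameters(0.8))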