Recognizing our own face is an essential part of self-image and social interaction. In the age of advanced digital technologies, we are faced with fascinating questions about communication and identity. How does changing our facial identity affect our sense of self and our interactions with others?
These are the questions that Dr. Shunichi Kasahara, a researcher at the Cybernetic Humanity Studio at the Okinawa Institute of Science and Technology (OIST), is investigating, using real-time facial image transformation (transforming our face into someone else’s and vice versa). The studio was established in 2023 as a joint research platform between OIST and Sony Computer Science Laboratories, Inc.
Dr. Kasahara and his colleagues studied the dynamics of face recognition using motor-visual synchrony, that is, the coordination between a person’s physical movements and the visual feedback they receive from those movements.
They found that whether or not we control the movements of our self-image, our level of identification with our own face remains constant. In other words, our sense of agency, our subjective feeling of control, has no significant impact on how strongly we identify with our self-image. Their results were published in Scientific Reports.
The effect of agency on perceptions of identity
In psychological experiments using screens and cameras, the scientists sought to determine where the “self-identification boundary” lies and what influences it. Participants were seated and watched screens showing their own faces gradually changing.
At some point, participants noticed a change in facial identity; they were asked to press a button as soon as they felt the image on the screen no longer matched them. The experiment ran in both directions: the image morphed from self to other and from other to self.
“It’s like looking at your face in a mirror as you move it and identify yourself, but your face slowly changes to a certain point and you realize it’s not you anymore,” Dr. Kasahara explained.
The researchers studied the effect of three movement conditions on this identity boundary: synchronous, asynchronous, and static. They hypothesized that if the movements were synchronized, participants would identify more strongly with the images.
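To make the procedure concrete, here is a minimal sketch in Python of how such a morph-boundary trial could be set up. The function names, the linear blend, and the simulated observer are illustrative assumptions, not code from the study.

```python
# Hypothetical sketch of the morph-boundary task described above (names and
# structure assumed for illustration). The on-screen face is a weighted blend
# of the participant's face and another person's face, and we record the blend
# level ("alpha") at which the participant presses the button to report that
# the face no longer matches them.

import numpy as np

def blend_faces(self_face: np.ndarray, other_face: np.ndarray, alpha: float) -> np.ndarray:
    """Linear morph: alpha = 0 shows the participant's face, alpha = 1 the other face."""
    return (1.0 - alpha) * self_face + alpha * other_face

def run_trial(self_face, other_face, pressed_button, condition="synchronous", steps=100):
    """Sweep the morph from self to other and return the reported identity boundary.

    `pressed_button(alpha)` stands in for the participant's judgment; `condition`
    (synchronous / asynchronous / static) labels how the displayed face moved
    relative to the participant and is simply recorded with the result here.
    """
    for alpha in np.linspace(0.0, 1.0, steps):
        frame = blend_faces(self_face, other_face, alpha)  # would be rendered in real time
        if pressed_button(alpha):
            return {"condition": condition, "boundary": alpha}
    return {"condition": condition, "boundary": 1.0}

# Toy usage: 64x64 grayscale stand-ins and a simulated observer whose
# boundary sits near alpha = 0.6.
rng = np.random.default_rng(0)
me, someone_else = rng.random((64, 64)), rng.random((64, 64))
print(run_trial(me, someone_else, pressed_button=lambda a: a > 0.6))
```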
Surprisingly, they found that whether their movements were synchronized or not, the boundaries of their facial identity were similar. Additionally, participants were more likely to identify with static images of themselves than with images of their face in motion.
Interestingly, the direction of the transformation, whether from self to other or from other to self, influenced how participants perceived their own facial boundaries: Participants were more likely to identify with their facial images when those images transformed from self to other rather than from other to self. Overall, the results suggest that the sense of agency over facial movements does not have a significant impact on how we judge our facial identity.
“Take deepfakes, for example, which are essentially a form of asynchronous movement. When I stand still but the visual representation moves, it creates an asynchronous situation. Even in these deepfake scenarios, we can still feel a sense of identity connection with ourselves,” Dr. Kasahara explained.
“This suggests that even when we see a fake or manipulated version of our image, such as someone else using our face, we can still identify with that face. Our findings raise important questions about our sense of self and identity in the digital age.”
How does identity impact perceptions of control?
What about the other way around: how does our sense of identity influence our sense of agency? Dr. Kasahara published a paper in collaboration with Dr. Wen Wen, a professor of psychology at Rikkyo University who specializes in research on the sense of agency. They studied how recognizing our own face affects how we perceive control over our own movements.
During the experiments, participants observed their own face or that of another person on a screen and were able to interact and control facial and head movements. They were asked to observe the screen for about 20 seconds while moving their face and changing their facial expressions.
The on-screen facial movement was driven either by the participant’s own face and head movements alone (full control) or by an average of the participant’s and the experimenter’s movements (partial control). Participants were then asked, “To what extent did you feel that this face looked like you?” and “To what extent did you feel that you were in control of this presented face?”
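As a rough illustration of those two conditions (not the study’s actual implementation; the names and the simple pose vector are assumptions), the partial-control condition can be thought of as mixing two motion streams before they drive the on-screen face:

```python
# Illustrative sketch of the full- and partial-control conditions: the motion
# applied to the displayed face is either the participant's own tracked
# head/face motion, or an equal mix of the participant's and the
# experimenter's motion.

import numpy as np

def displayed_motion(participant_motion: np.ndarray,
                     experimenter_motion: np.ndarray,
                     condition: str) -> np.ndarray:
    """Return the motion parameters applied to the on-screen face.

    The motion vectors here are placeholders for whatever pose/expression
    parameters a real-time face renderer would use.
    """
    if condition == "full":
        return participant_motion
    if condition == "partial":
        # Average the two motion streams, so the participant only partly
        # drives the face they see.
        return 0.5 * (participant_motion + experimenter_motion)
    raise ValueError(f"unknown condition: {condition}")

# Toy usage with 3-DOF head-rotation stand-ins (yaw, pitch, roll in degrees).
me = np.array([10.0, -5.0, 2.0])
experimenter = np.array([-4.0, 8.0, 0.0])
print(displayed_motion(me, experimenter, "partial"))  # -> [3.  1.5 1. ]
```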
Again, the main findings are intriguing: Participants reported a greater sense of agency over the “other face” than over their own face. Moreover, controlling another person’s face elicited a greater variety of facial movements than controlling one’s own.
“We gave participants a different face, but they could control the movements of that face, similar to deepfake technology, where AI can transfer the movement to other objects. This AI technology allows us to go beyond the conventional experience of simply looking in a mirror, allowing us to tease out and study the relationship between facial movements and visual identity,” said Dr. Kasahara.
“Based on previous research, you would expect that if I see my own face, I would feel more in control of it. Conversely, if it’s not my face, I would expect to feel less in control because it’s someone else’s face. That’s the intuitive expectation. However, the results are the opposite: When people see their own face, they report a lower sense of agency.
“Conversely, when they see another person’s face, they are more likely to feel a sense of agency.” These surprising results challenge what we thought we knew about how we perceive ourselves in images.
Dr. Kasahara emphasized that acceptance of technology in society plays a crucial role in technological progress and human evolution.
“The relationship between technology and human evolution is cyclical; we evolve together. But concerns about certain computing technologies can lead to restrictions. My goal is to help foster acceptance within society and update our understanding of the ‘self’ in relation to human-machine integration technology.”
More information:
Shunichi Kasahara et al., Study of the impact of visual motion synchronization on face recognition using real-time morphing, Scientific Reports (2024). DOI: 10.1038/s41598-024-63233-2
Provided by Okinawa Institute of Science and Technology