

3D amplifies emotions evoked by facial expressions

The research findings have implications for emotion research, the entertainment industry, and 3D displays.

Six stereoscopic image pairs used in the experiment. The images can be seen in 3D by 'looking through' the image.

Mediated facial expressions do not elicit emotions as strongly as real-life facial expressions. In particular, 2D photographs of facial expressions fail to evoke emotions as strongly as live faces, possibly due to the low fidelity of the pictorial presentation.

In a new study, researchers at Aalto University and the University of Helsinki found that 3D facial expressions evoke stronger emotions than their 2D counterparts. Owing to the illusion of non-mediation, natural depth levels produce the strongest emotional amplification. In the experiment, depth magnitude was manipulated by varying the distance between the two cameras providing the left and right images for the 3D presentation.
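The manipulation described above relies on a standard property of stereoscopic imaging: for a pinhole stereo rig, horizontal disparity scales linearly with the inter-camera baseline (d = f · B / Z). A minimal sketch of this relation is below; the focal length, object distance, and baseline values are illustrative assumptions, not parameters taken from the study.

```python
def disparity_px(focal_px: float, baseline_m: float, depth_m: float) -> float:
    """Horizontal disparity in pixels for a point at depth_m,
    using the standard pinhole stereo relation d = f * B / Z."""
    return focal_px * baseline_m / depth_m

# Illustrative values: a 1000 px focal length and an object 1 m away.
# A natural human interocular baseline is about 65 mm; halving it
# halves the disparity, yielding a flattened ("narrowed") depth percept.
f_px = 1000.0
depth = 1.0
natural = disparity_px(f_px, 0.065, depth)    # 65.0 px
narrowed = disparity_px(f_px, 0.0325, depth)  # 32.5 px
```

This is why moving the cameras further apart exaggerates perceived depth, while moving them closer together flattens it toward a 2D presentation.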

– Until now, facial expressions have been studied by using 2D photographs and the results have been generalized to the real world. Yet stereoscopic images replicate reality more faithfully and thus are more valid stimuli, states doctoral candidate Jussi Hakala.


– 3D photographs trick the brain into thinking that the face in a 3D photograph is more real than in the 2D photograph, explains Hakala.

Whereas the negative valence and arousal elicited by angry expressions were amplified most at the most natural depth magnitude, the positive valence elicited by happy expressions was amplified in both the narrowed and natural depth conditions. The findings are relevant for virtual and augmented reality 3D displays such as the Oculus Rift, indicating that 3D content should ideally provide a natural depth percept to evoke strong emotional experiences.

– Currently, 3D is mostly used in action films to emphasize the effects, but it could also be employed to enhance the emotions conveyed by the actors, concludes Hakala.

The study was conducted by Jussi Hakala and Jari Kätsyri at the Aalto University Department of Computer Science and Jukka Häkkinen at the Institute of Behavioural Sciences, University of Helsinki. Arousal and valence data were collected from 40 participants.

The study was recently published in i-Perception.

Link to the article

More information:
Jussi Hakala
Tel. +358505444553
jussi.h.hakala@aalto.fi

