
The 1st Challenge and Workshop for 4D Micro-Expression Recognition for Mind Reading (4DMR)

To be held at IJCAI 2025, 16th-22nd August 2025, Montreal, Canada & Guangzhou, China

Welcome to 4DMR Workshop & Challenge 2025

We jointly hold the first workshop and challenge for 4D Micro-Expression Recognition for Mind Reading (4DMR) at IJCAI 2025, 16th-22nd August 2025.

We warmly welcome your contribution and participation!

News

March 23 : The website of the 4DMR workshop & challenge is under construction...

April 8 : The Kaggle website of the 4DMR challenge is available; the training & resource data will be released in a few days.

Overview

The 1st 4DMR Workshop & Challenge explores the application of 4D technologies in facial expression analysis. It will be held at IJCAI 2025, 16th-22nd August 2025, in Montreal, Canada & Guangzhou, China.

Humans display a vast array of emotional and cognitive states. The ability to interpret these states, often referred to as mind reading, is unparalleled in the animal kingdom and is fundamental to human social interaction and communication. A key component of mind reading is facial expression, which by some estimates accounts for as much as 55% of how we understand others' feelings and attitudes, and plays a vital role in conveying essential information about mental states.


Micro-expressions (MEs) are a special form of facial expression that may occur when people try to hide their true feelings. Unlike 2D and 3D methods, 4D analysis (3D mesh + temporal changes) excels at detecting fleeting micro-expressions. 4D information can be leveraged to enhance the accuracy and robustness of facial expression analysis, effectively addressing challenges such as variations in lighting, pose, and noisy environments, making it ideal for real-world applications. Despite its promise, 4D facial expression research faces challenges that limit its progress.

This workshop aims to explore the application of 4D technologies in facial expression analysis. It will feature the inaugural 4D micro-expression recognition challenge to propel the field forward and provide a platform for researchers to benchmark their methodologies. The workshop will delve into cutting-edge techniques for both macro- and micro-expression recognition, discuss the implications of these methodologies for global communication and AI systems, and highlight practical applications in domains such as security, healthcare, and customer service. Interactive sessions with leading experts will foster deeper insights into how 4D facial expression analysis can revolutionize our understanding of human emotions and cognitive states.

Workshop Topics

    This workshop explores advanced methodologies and applications of facial expression analysis for emotion understanding and mind reading across diverse scenarios and challenges. Topics of interest include, but are not limited to:
  • Facial expression (including micro- and macro-expression) analysis for emotion understanding.
  • 4D-based methodologies for facial expression understanding, e.g., classification, detection, online recognition, generation, and transfer.
  • Psychological studies and biologically-driven mechanisms of facial expressions and their links to mind reading.
  • Solutions for the special challenges of in-the-wild facial expression analysis, e.g., highly heterogeneous inter-class samples, face poses, noisy backgrounds, etc.
  • Applications of facial expression recognition, e.g., medical assessment in hospitals (ADHD, depression), health surveillance at home or in other environments, and emotion assessment in scenarios such as education and job interviews.
  • Different modalities developed for mind reading, e.g., facial expressions, attentive gazes, and desensitized voices.
  • New data collected for the purpose of mind reading.
  • Applications of realistic facial expression generation for human-computer interaction or computer-mediated interactions.

References

  • Li, Xiaobai, et al. "4DME: A spontaneous 4D micro-expression dataset with multimodalities." IEEE Transactions on Affective Computing (2022).

Related works

  • TBD

Workshop Details

Introduction



Fig. 1. 4D micro-expressions (3D mesh + temporal changes) examples.

To date, while extensive research has been conducted on facial expressions, the advent of 4D facial expression analysis marks a transformative leap in the field. By capturing the temporal evolution of expressions in three-dimensional space, 4D analysis reveals the intricate dynamics of facial muscle movements over time. Unlike 2D and 3D methods, 4D analysis (3D mesh + temporal changes) excels at detecting fleeting micro-expressions (Figure 1), which are brief, involuntary displays of hidden emotions, by incorporating multiple views and temporal information for richer and more precise data. 4D information can be leveraged to enhance the accuracy and robustness of facial expression analysis, effectively addressing challenges such as variations in lighting, pose, and noisy environments, making it ideal for real-world applications.

Despite its promise, 4D facial expression research faces challenges that limit its progress. The lack of diverse and realistic datasets, particularly for spontaneous micro-expressions, constrains its applicability to practical scenarios. Moreover, the computational demands of processing the complex temporal and spatial data inherent in 4D analysis pose significant technical challenges. Existing methodologies often struggle with capturing rapid and subtle micro-expressions and with adapting to real-world conditions, such as occlusions, pose variations, and noisy backgrounds. Advancing the field requires the development of innovative algorithms, efficient computational techniques, and large-scale datasets to bridge these gaps, enabling applications in healthcare, security, and education.



Submission Guidelines
  • Papers must comply with the CEURART paper style, and paper length should be no more than 7 pages.
  • The CEURART template can be found via this Overleaf link.
  • The review process is double-blind. Please do not include your identity information in your submitted paper.
  • There are two tracks (challenge track and workshop track) for paper submission.
    • For the challenge track, the top-3 teams are encouraged to submit papers (acceptance depends on the quality of the paper, so it is not guaranteed). Other teams are also welcome to submit papers. The review criteria are based on both team ranking and paper quality.
    • For the workshop track, all work within the topic scope mentioned above is encouraged. The review criteria are based on paper quality and relevance to the 4DMR workshop.
  • Accepted papers will be included in a volume of the CEUR Workshop Proceedings (EI-indexed, JUFO1).
  • The submission link is as follows: TBD
  • At least one author of each accepted paper should register for the IJCAI 2025 workshop and be present onsite.
Workshop Important Dates

Paper submission is open.

  • May 31 (23:59, AoE): Paper submission deadline
  • June 6: Notification to authors
  • June 13: Camera-ready deadline

Note: Each paper must be presented on-site by an author/co-author at the conference.

Workshop Program
TBD
Invited Speakers
TBD

Challenge Details

Micro-expressions (MEs) are subtle, rapid, and involuntary facial movements that often occur in high-stakes scenarios or when individuals attempt to gain advantages or conceal their true emotions. Due to their extremely short duration and low intensity, MEs are difficult to detect and demand high-precision facial data. This challenge leverages the power of 4D facial analysis—capturing the temporal evolution of facial expressions in 3D space—to uncover the complex dynamics of facial muscle movements over time. Unlike traditional 2D or static 3D approaches, 4D analysis (3D mesh + temporal sequence) excels at identifying fleeting, involuntary micro-expressions by incorporating both spatial depth and motion cues. This multi-view, temporal information enriches the data and significantly improves recognition accuracy and robustness.
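As a concrete illustration only (not the official challenge data format), a 4D sample can be thought of as a temporal sequence of registered 3D meshes. The following minimal NumPy sketch, assuming a fixed mesh topology with vertex correspondence across frames and purely hypothetical dimensions, shows how per-vertex motion cues could be extracted from such a sequence:

```python
import numpy as np

# Hypothetical sizes: T frames of a registered 3D face mesh with V vertices.
T, V = 8, 5023  # illustrative values only
rng = np.random.default_rng(0)

# A 4D micro-expression sample: (T, V, 3) vertex coordinates over time,
# assuming every frame shares the same mesh topology (vertex correspondence).
sequence = rng.normal(size=(T, V, 3))

# Per-vertex displacement between consecutive frames: shape (T-1, V, 3).
displacements = np.diff(sequence, axis=0)

# Per-vertex motion magnitude, a simple dynamic cue: shape (T-1, V).
motion = np.linalg.norm(displacements, axis=-1)

# A crude per-frame motion summary that could feed a temporal classifier.
frame_motion = motion.mean(axis=1)  # shape (T-1,)
```

Such per-vertex temporal displacements are one simple way to expose the subtle, short-lived motion that makes MEs hard to capture in 2D imagery; actual challenge baselines may use very different representations.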


This challenge will be organized on Kaggle, where instructions and data will be shared and participants' results will be submitted and ranked. The top-3 teams will be awarded certificates, provided they submit papers and present at the workshop.

Challenge Important Dates (may be slightly adjusted later)
    The timeline for the Challenge will be organized as follows:
  • April 8, 2025: Challenge website goes online.
  • April 10, 2025: Training and resource data released.
  • April 30, 2025: Team registration deadline.
  • May 3, 2025: Testing data released.
  • May 17, 2025: Final test submission deadline.
  • May 24, 2025: Challenge results announced.
  • May 31, 2025: Paper submission deadline.
  • June 6, 2025: Notification to authors.
  • August 16-22, 2025: 4DMR IJCAI 2025 Workshop, Montreal & Guangzhou.

Contact us

Contact Information: