Data-driven projects with significant societal impacts include:


Can dynamic robots modulate fear experiences?

In the rapidly evolving field of human-robot interaction, researchers are uncovering novel ways in which robots can influence human emotions. A particularly intriguing question is whether humans can "catch" emotions, such as fear, from robots that mimic human behaviors. This study investigated whether holding a robot that simulates human breathing patterns can influence the holder's emotions, particularly fear.


Can ChatGPT-4 Recognize Emotion Expressions?

Modern large language models (LLMs) excel at text-based tasks, but their performance on image-based tasks, particularly those involving fundamental aspects of human communication, remains uncertain. This investigation evaluates whether state-of-the-art models (OpenAI GPT-5, Google Gemini 2.5 Flash, Claude Sonnet 4, and xAI Grok 2 Vision) can accurately recognize nonverbal emotion expressions. Stimuli were drawn from six validated databases presenting facial expressions, bodily expressions, or combined face-and-body displays, encompassing both posed and ecologically valid naturalistic expressions.


Smiles reveal personality, enabling observers to form accurate personality judgments from mere smiling faces

People ubiquitously smile during brief interactions and first encounters, and when posing for photos used for virtual dating, social networking, and professional profiles. Yet not all smiles are the same: subtle individual differences emerge in how people display this nonverbal facial expression. In this research I demonstrate that differences in the way people pose their smiles in photographs reveal aspects of their personality, enabling observers to form accurate personality judgments. Smiles therefore serve as a powerful tool for communicating who we are.