University at Buffalo, SUNY

Human Behavior Modeling Lab

The Human Behavior Modeling Lab (HBML) focuses on comprehensive understanding, reasoning, prediction, and 3D generation of human behaviors. Our research spans from individual actions to complex multi-person interactions, aiming to bridge perception and generation through multimodal and embodied AI. Our work covers a broad range of applications including, but not limited to:

• 3D sign language generation/production, translation, and dual learning
• Cognitive reasoning for children's education
• Dyadic teacher (parent)–child interaction modeling
• Infant cry analysis and behavior understanding
• Health and medical image understanding

At HBML, we are committed to advancing socially intelligent systems that can interpret, respond to, and generate human-centered behaviors in real-world contexts.



Recent News


  • 2025.07: One paper on a multi-agent LLM framework for social robot interaction has been accepted by IROS 2025.
  • 2025.05: One paper on Fusion of Speech Embeddings for MOS Prediction has been accepted by Interspeech 2025.
  • 2025.03: Prof. Ifeoma Nwogu was invited to give a talk on AI for Human Behavior at Women in Tech Western New York.
  • 2024.12: Lu Dong was awarded the Best AI Project Award at the 2024 CSE Poster Competition.
  • 2024.12: One paper on infant cries has been accepted by ICASSP 2025.
  • 2024.09: One paper on 3D sign language motion generation has been accepted by EMNLP 2024.
  • 2024.08: Two papers, one on facial expression understanding and one on multi-person motion, have been accepted to ECCV 2024.
  • 2024.08: One paper on sign language translation has been accepted by ICPR 2024.
  • 2024.04: Two papers — one on sign language generation and another on baby face generation — were accepted to FG 2024.
  • 2023.02: One paper on two-way sign language translation has been accepted by TPAMI 2023.