Published in npj Artificial Intelligence: 'AI system facilitates people with blindness and low vision in interpreting and experiencing unfamiliar environments'
2024 CHI Extended Abstract: 'FaceVis: Exploring a Robot’s Face for Affective Visualisation Design'
2024 WWW paper: 'More Than Routing: Joint GPS and Route Modelling for Refined Trajectory Representation Learning'
Contributed to pedestrian safety evaluation in a high-fidelity simulation framework (2024)
Background
Second-year Ph.D. candidate at the School of Computing and Information Systems, University of Melbourne
Affiliated with the Human-Computer Interaction Group and Human-Robot Interaction Lab
Current research focuses on leveraging implicit communication to enhance grounding between humans and robots during physical collaboration
Aims to enable collaborative robots to interpret and generate context-enriched, human-like communication using multi-agent LLM systems
Strives to make the exchange of actionable information between humans and robots intuitive, natural, effective, and inclusive in daily activities
Also interested in designing user-centric guide robots for people with visual impairments