Yan Zhang
Scholar

Google Scholar ID: CyIEsPgAAAAJ
University of Melbourne
Human-Robot Collaboration · Communication · Agent · Accessibility
Citations & Impact
All-time
Citations: 122
H-index: 5
i10-index: 4
Publications: 17
Co-authors: 0
Publications
17 items
Resume
Academic Achievements
  • 2025 CHI paper: 'Can you pass that tool?: Implications of Indirect Speech in Physical Human-Robot Collaboration'
  • 2025 HAI paper: 'From Conversation to Orchestration: HCI Challenges and Opportunities in Interactive Multi-Agentic Systems' (co-first author)
  • 2025 HRI paper: 'ROSAnnotator: A Web Application for ROSBag Data Analysis in Human-Robot Interaction'
  • 2025 HRI paper: 'Implicit Communication of Contextual Information in Human-Robot Collaboration'
  • 2025 HRI paper: 'OfficeMate: Pilot Evaluation of an Office Assistant Robot'
  • 2025 AutoUI Adjunct paper: 'Exploring Deictic Interface Referencing Outside-Vehicle Landmarks'
  • Published in npj Artificial Intelligence: 'AI system facilitates people with blindness and low vision in interpreting and experiencing unfamiliar environments'
  • 2024 CHI Extended Abstract: 'FaceVis: Exploring a Robot’s Face for Affective Visualisation Design'
  • 2024 WWW paper: 'More Than Routing: Joint GPS and Route Modeling for Refined Trajectory Representation Learning'
  • Contributed to pedestrian safety evaluation in a high-fidelity simulation framework (2024)
Background
  • Second-year Ph.D. candidate at the School of Computing and Information Systems, University of Melbourne
  • Affiliated with the Human-Computer Interaction Group and Human-Robot Interaction Lab
  • Current research focuses on leveraging implicit communication to enhance grounding between humans and robots during physical collaboration
  • Aims to enable collaborative robots to interpret and generate context-enriched, human-like communication using multi-agentic LLMs
  • Strives to make actionable information exchange between humans and robots intuitive, natural, effective, and inclusive for daily activities
  • Also interested in designing user-centric guiding robots for visually impaired individuals
  • Prior research experience includes work on autonomous driving
Co-authors
0 total (list not available)