Pioneering Safer Streets: Groundbreaking Research on Human Interaction with Autonomous Vehicles
Dr Yee Mun Lee is helping shape the future of transport — one pedestrian step at a time.

As an experimental and cognitive psychologist, Dr Lee explores how humans process information and how perception translates into behaviour. Her research sits at the crossroads of psychology, technology, and transport safety, focusing on how pedestrians interact with autonomous vehicles (AVs) — and how these vehicles can communicate their intentions safely and intuitively.
The start of the route
Dr Lee’s journey into transport psychology began during her PhD, completed in 2016, where she examined how drivers perceive and make judgments at junctions. She was among the first researchers to conduct empirical cross-cultural studies exploring how drivers’ cognitive processes vary across contexts.
Before joining the University of Leeds, Dr Lee was a lecturer — but her passion for research and experimentation led her to pursue a Research Fellow position at Leeds in 2018. “Given my expertise in experimental design and understanding how road users process information, the opportunity was too good to pass up,” she explains.
What drew her most to Leeds was its world-class infrastructure. As a psychology student at the University of Nottingham Malaysia, Dr Lee’s early experiments relied on static images and videos. At Leeds, she found a step change in capability: a suite of state-of-the-art simulators within Virtuocity — including the UK’s most advanced driving simulator, a truck simulator, two static simulators, several VR headsets, and, at the heart of it all, the Highly Immersive Kinematic Experimental Research (HIKER) lab.
“We can recreate virtually any scenario to match real-world environments, all fitted with eye trackers and physiological measures,” says Dr Lee. “What makes Leeds unique is our multidisciplinary approach — engineers, psychologists, and computer scientists all working together. Everyone brings a different perspective, and that’s how we solve real-world problems and push innovation beyond the state-of-the-art.”
Inside the HIKER Lab
The HIKER lab is the largest ‘CAVE-based’ pedestrian simulation environment of its kind worldwide. Using high-fidelity 3D projections and motion tracking, it allows participants to navigate life-sized virtual environments — from busy crossings to futuristic AV interactions.
Working alongside Dr Ruth Madigan, Professor Natasha Merat, and Professor Gustav Markkula, Dr Lee and her team use HIKER to investigate how pedestrians interpret AV movements and signals. Their pioneering Distributed Simulation system even connects HIKER with the university’s driving simulator, enabling real-time interaction between pedestrians and human or automated drivers — a world-first in understanding mutual road user behaviour.
The External Human–Machine Interface
With drivers no longer fully responsible for control in higher levels of vehicle automation, new communication systems are essential. Dr Lee’s research focuses on external Human–Machine Interfaces (eHMIs) — visual displays or light patterns that allow AVs to signal intent, such as “I am yielding.”
In one study, she investigated how comprehension, familiarity, and repeated exposure affect the effectiveness of eHMIs. Using the HIKER lab, her team compared a Slow Pulsing Light Band (SPLB) design with traditional flashing headlights. The results showed that:
- eHMIs are most useful at lower speeds and shorter time gaps, but they cannot solve all pedestrian–AV interactions;
- their effectiveness depends on visibility and cultural familiarity with the design;
- cross-cultural differences play a major role in how explicit versus implicit signals are interpreted.
In another experiment (Kaleefathullah et al., 2020), Dr Lee’s team simulated eHMI failures, where an AV signalled it would yield but continued to move. Alarmingly, 36% of participants stepped into the road, leading to “virtual collisions.” These participants also reported heightened anxiety and reduced trust in the system.
The findings highlight the risks of over-reliance on visual signals and the need for AV developers to better understand potential miscommunication. Dr Lee stresses that public education around AV capabilities will be critical as automation becomes mainstream.
Most recently, her team connected real drivers and pedestrians in a real-time distributed simulation (Yang et al., 2025) to explore how each side adapts. They discovered that both drivers and pedestrians use distinctive strategies — from braking patterns to lateral movement — when negotiating space and intent. These insights could be key to developing human-like behavioural models for future AVs.
Potential Impact
Dr Lee’s research provides vital guidance for policymakers, AV developers, and standard-setting organisations worldwide. Her work contributes directly to international discussions through the International Organization for Standardization (ISO) and the BSI Connected and Automated Mobility Standards Coordination Group.
Her collaborations extend across major EU-funded projects, including Hi-Drive, L3Pilot, SHAPE-IT, and interACT, all aimed at ensuring AVs integrate safely and harmoniously into mixed-traffic environments.
Looking ahead, Dr Lee’s upcoming research — in partnership with Dr Elizabeth Sheppard at the University of Nottingham — takes this mission further. Funded by the Road Safety Trust, the project will explore how autistic children perceive and navigate pedestrian crossings, using HIKER to test how design affects attention and behaviour.
“By understanding how different groups experience and respond to their environments, we can design safer, more inclusive systems for everyone,” Dr Lee says. “Our work aims to ensure that as cities evolve, no one is left behind.”