Autonomous vehicle technology will change how people live and move around in cities. As a mode of transport, the benefits of autonomous vehicles include independence, comfort and safety for riders. Applied to services like parcel delivery and waste collection, autonomous technology also promises to address worker shortages and reduce congestion.
Technological progress and regulatory approval are key enablers of autonomous vehicles and their mass adoption. But one factor will make or break their success: public acceptance.
Public acceptance is not only about whether people feel comfortable riding inside a driverless car; it also extends to how other road users interact with the vehicle. Early trials of driverless cars have been met with public backlash, including attacks on vehicles and threats against operators. Even behaviour born out of curiosity rather than an intent to cause damage, such as stepping in front of a driverless car to see how it reacts, significantly affects the vehicles' operational efficiency. A recent study also found that trust remains a concern among all respondents.
Researchers have linked these behavioural patterns and lack of trust to the absence of non-verbal communication cues between vehicles and other road users. For manually driven vehicles, these non-verbal cues, such as eye contact and gestures, are essential to ensuring safety, understanding and trust.
This is particularly pertinent in unstructured traffic and in situations involving vulnerable road users, such as a pedestrian crossing the road in front of a vehicle or in areas where pedestrians and vehicles share the same space.
To overcome the lack of non-verbal cues, studies have investigated ways to communicate the intention of autonomous vehicles, their awareness – what they ‘see’ – and even their ‘emotions’. For example, a vehicle could express its frustration when being challenged by a pedestrian deliberately stepping into its path. This vehicle-to-pedestrian communication is achieved through designing so-called external human-machine interfaces that may take the form of simple text or visual displays attached to the front of the vehicle, projections onto the road and auditory alerts. In a more distant future, this information could be communicated through augmented reality, as suggested by a study proposing to overlay autonomous vehicles with relevant visual cues.
External human-machine interfaces are a necessary evolution of the traditional blinker, which became widely adopted by car manufacturers in the late 1940s. At their core, fully autonomous vehicles are robots, requiring more advanced communication channels than simple light signals. But unlike the depiction of Johnny Cab’s driverless taxi in the movie Total Recall, they don’t come with a humanoid robot driver. Instead, they are part of an automated infrastructure, where the city itself becomes a distributed robot.
This kind of conceptual shift opens up new perspectives on how road users will interact with autonomous vehicles in future cities, which are expected to come in different shapes and sizes beyond just driverless cars. Pedestrians may be able to assist delivery robots that get stuck in the snow or lost in the woods. Starship Technologies, the company behind the iconic white delivery robot, even encourages passersby to give their robots a helping hand when they get stuck. But this relies on bidirectional communication and the ability of the robot to understand human input, which was clearly not the case for a rogue robot driving right into a crime scene.
It is a long road towards robots roaming the city, and getting the human-machine interaction right for public acceptance is challenging. This is due not only to the cost of fully functional autonomous vehicles but also to the risks associated with real-world studies. To mitigate these risks, many researchers have turned to studying digital replicas or 360-degree recordings of real-world situations in virtual reality environments.
Results from these studies are promising, suggesting that external human-machine interfaces can improve the trust of road users in autonomous vehicles. In a study of crossing scenarios, 81 percent of participants reported they felt safer if an external display communicated the vehicle’s intention. In more complex scenarios, such as those in unregulated mixed-traffic environments, the combination of several factors was found to contribute to study participants expressing trust in the vehicle.
These factors include observing the vehicle's interactions with other pedestrians; implicit cues, such as the vehicle slowing down; and explicit cues, such as external light signals showing intent and awareness.
Historically, car manufacturers have primarily focused on the safety of their passengers. It's a business strategy, as manufacturers can use safety as a selling point. Mercedes-Benz's manager of driver assistance systems even publicly stated that the company would prioritise the safety of passengers over pedestrians in their autonomous vehicle technology.
As we are moving closer to a future in which the city and its infrastructure become automated, this approach requires rethinking. This is key to encouraging active transport and transitioning towards greener and healthier urban living, reflected also in the United Nations’ Sustainable Development Goal 11.
Manufacturers of autonomous vehicles have a unique opportunity to lead this emerging and rapidly growing market by complementing their focus on technology with a strong understanding of the social and human challenges.
The successful adoption of autonomous vehicles in and by cities also requires innovative policies. Regulatory bodies need to emphasise how these vehicles interact with pedestrians and other vulnerable road users not only algorithmically but also through vehicle-to-pedestrian communication channels.
Instead of regulating how cars adopt automation, what might policies look like if we think about cars more as robots that operate in close proximity to people?
Dr Martin Tomitsch is a Professor of Interaction Design and Director of Innovation at the University of Sydney where he leads the Urban Interfaces group and teaches interface design, creative thinking and innovation.
Dr Marius Hoggenmueller is a Postdoctoral Fellow at the University of Sydney’s Design Lab, focusing on design-led approaches for prototyping emerging interfaces such as urban robots. They declare no conflict of interest.
The research on which this article is based was funded in part by the Australian Research Council.
The authors would like to acknowledge the contributions of their colleagues to some of the research cited in the article, Dr Luke Hespanhol, Dr Callum Parker, Tram Thi Minh Tran, Yiyuan Wang and Dr Stewart Worrall.
Originally published under Creative Commons by 360info™.