"The Molly Problem" at a glance
"The Molly Problem" has been conceptualized as an alternate thought expression to the "trolley problem" in the context of autonomous vehicles.
"The Molly Problem" tackles the ethical challenge faced by companies in terms of their autonomous vehicle (AV) systems not being able to assure the elimination of all accidents.
To address this, "The Molly Problem" presents a single scenario and raises critical questions on public expectation in the AD software behaviour in the event of a collision.
A young girl called Molly is crossing the road alone and is hit by an unoccupied self-driving vehicle. There are no
eye-witnesses. What should happen next?
Decide the future course of action!
To chart the future path for autonomous vehicles in this ethical debate,
"The Molly Problem" survey has been developed.
See the preliminary results of the survey (PRELIMINARY RESULTS).
The findings from this survey will support the identification of data and metric requirements that can help shape global regulatory frameworks and safety standards for self-driving software, ensuring that they are designed to meet public expectations.
You may contribute to this global dialogue by taking the survey here!
Two webinars were organized in collaboration with the AI for Good Global Summit to elaborate on the Molly Problem and its possible policy and legal implications. Please find their webpages and recorded sessions below:
"The Molly Problem" evolved from the discussions of the
ITU-T Focus Group on AI for Autonomous and Assisted Driving (FG-AI4AD) as initiated by ADA Innovation Lab Limited and Technical University of Munich.
See the ITU News story on The Molly Problem: "Self-driving cars: Can AI make the 'right' decisions on the road?"