THE POLICE CHIEF of Tempe, Arizona, said Uber is likely not at fault for the fatal collision involving one of its self-driving vehicles.
On Monday, it was reported that a female pedestrian had been struck by the autonomous vehicle and taken to hospital, where she later died from her injuries.
Police have identified the victim as 49-year-old Elaine Herzberg.
The Uber vehicle, which was in "autonomous mode" with a safety driver behind the wheel at the time of the collision, was headed northbound when it struck a woman walking outside of the crosswalk.
Speaking to the San Francisco Chronicle, Chief of Police Sylvia Moir said a preliminary investigation suggests Uber is unlikely to be at fault for the fatal crash.
"It's very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway," Moir told the paper, adding that the incident occurred roughly 100 yards from a crosswalk.
"It is dangerous to cross roadways in the evening hour when well-illuminated managed crosswalks are available," she added.
According to the Chronicle's report, the Uber car was driving at 38 mph in a 35 mph zone and did not attempt to brake. Herzberg is said to have abruptly walked from a centre median into a lane with traffic.
In response to the collision, Uber said in a statement: "Our hearts go out to the victim's family. We are fully cooperating with local authorities in their investigation of this incident."
The company also confirmed that it would be pausing its self-driving car operations in Phoenix, Pittsburgh, San Francisco and Toronto.
Dara Khosrowshahi, Uber's CEO, tweeted: "Some incredibly sad news out of Arizona. We're thinking of the victim's family as we work with local law enforcement to understand what happened."
If the death was caused as a result of an error by Uber's driverless car software, it would be the first such death in the United States.
In July 2016, Tesla's Autopilot software was found to have played a "major role" in the death of Tesla driver Joshua Brown after failing to see the white side of a tractor-trailer that had pulled out across the highway.
Tesla said at the time: "Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied."
However, the US National Transportation Safety Board - which has opened an investigation into Monday's fatal crash - suggested that Brown had "at least 10 seconds" to notice the truck and to apply the brakes, but eyewitnesses have claimed that Brown was watching a DVD at the time of the accident.