
Internal Metrics Show How Often Uber’s Self-Driving Cars Need Human Help


(Photo: Jeff Swensen / Getty Images)

Human drivers were forced to take control of Uber's self-driving cars about once per mile driven in early March during testing in Arizona, according to an internal performance report obtained by BuzzFeed News. The report reveals for the first time how Uber’s self-driving car program is performing, using a key metric for evaluating progress toward fully autonomous vehicles.

Human drivers take manual control of autonomous vehicles during testing for a number of reasons, such as to address a technical issue or to avoid a traffic violation or collision. The self-driving car industry refers to such events as “disengagements,” though Uber uses the term “intervention” in the performance report reviewed by BuzzFeed News. During a series of autonomous tests the week of March 5, Uber saw disengagement rates higher than those publicly reported by some of its rivals in the self-driving car space.

When regulatory issues in December 2016 forced Uber to suspend a self-driving pilot program in San Francisco, the company sent some of its cars to Arizona. Since then, Uber has been testing its autonomous cars along two routes in the state. The first is a multi-lane street called Scottsdale Road — a straight, 24-mile stretch that runs through the city of the same name. According to Uber's performance report on tests for the week of March 5, the company's self-driving cars were able to travel an average of 0.67 miles on Scottsdale Road without human intervention and an average of 2 miles without a “bad experience” — Uber’s classification for incidents in which a car brakes too hard, jerks forcefully or behaves in a way that might startle passengers. Uber described the overall passenger experience for this particular week as “not great,” but noted improvement compared to the prior week's tests, which included one “harmful” incident — an event that might have caused human injury.

Uber has also been testing its autonomous vehicles on a “loop” at Arizona State University. According to the performance report reviewed by BuzzFeed News, self-driving cars used on the ASU loop saw “strong improvement” during the week of March 5, traveling a total of 449 miles in autonomous mode without a “critical” intervention (a case where the system kicked control back to the driver, or the driver regained control to prevent a likely collision). The vehicles were able to drive an average of 7 miles without a “bad experience” that might cause passenger discomfort (a 22% improvement over the week prior) and an average of 1.3 miles without any human intervention (a 15% improvement over the week prior). The cars made 128 trips with passengers, compared to 81 the prior week.
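The week-over-week percentages above imply the prior week's figures directly. Here is a minimal back-of-the-envelope sketch in Python (the variable names are ours; every number comes from the report as described above):

```python
# ASU-loop figures from Uber's report for the week of March 5.
miles_per_bad_experience = 7.0   # average miles without a "bad experience"
miles_per_intervention = 1.3     # average miles without any human intervention

# The report calls these 22% and 15% improvements over the prior week,
# so dividing out the improvement recovers the earlier figures.
prior_bad_experience = miles_per_bad_experience / 1.22   # ~5.7 miles
prior_intervention = miles_per_intervention / 1.15       # ~1.1 miles

print(f"Prior week: ~{prior_bad_experience:.1f} miles per bad experience")
print(f"Prior week: ~{prior_intervention:.1f} miles per intervention")
```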

Uber told BuzzFeed News its disengagements could also include instances when the system kicks control back to the driver, as well as when the car returns control to a human driver toward the end of a trip. The company declined to comment on the internal metrics obtained by BuzzFeed News or on how its disengagement rate compares to competitors'. Uber also declined to say how many miles and hours its vehicles in Arizona drove in total during the week of March 5.

“To take out the safety drivers, you would want far better performance than these numbers suggest.”

Bryant Walker Smith, a University of South Carolina law professor and a member of the US Department of Transportation's Advisory Committee on Automation in Transportation, said it’s difficult to draw conclusions about the progress of Uber’s self-driving car program based on just one week of disengagement metrics, adding that the figures suggest that safety drivers appear to intervene regularly out of caution – even in cases where an accident may not be imminent.

“To take out the safety drivers, you would want far better performance than these numbers suggest, and you’d want that to be consistently better performance,” Walker Smith said. “If these are actual bad experiences for someone inside the vehicle, then that probably doesn’t compare very favorably to human driving. How often do people go 10 miles or 10 minutes and have a viscerally bad experience?”

Uber’s internal metrics are specific to its vehicles in Arizona. The state does not require companies testing there to release data on how their self-driving cars perform. California is the only state that requires companies testing self-driving cars on public roads to submit annual reports detailing how many times their vehicles “disengage” from autonomous mode. Because Uber only returned some self-driving vehicles to San Francisco’s roads this month, after its trials in the state were shut down in December for not obtaining the proper permits, it has not yet submitted a public report. But reports submitted by other companies to the California DMV do offer a point of comparison.

Alphabet’s Waymo said in a Jan. 5 report filed with the CA DMV that during the 636,000 miles its self-driving vehicles drove on public roads in California from December 2015 through November 2016, human drivers were forced to take control 124 times. That’s a rate of 0.2 disengagements per thousand miles, or roughly 0.0002 interventions per mile. Uber's cars, by contrast, averaged just 0.67 miles between interventions on Scottsdale Road and 1.3 miles on the ASU loop, or roughly 1.5 and 0.77 interventions per mile, respectively. But Waymo’s report also notes that its figures don’t include all disengagements: “As part of testing, our cars switch in and out of autonomous mode many times a day. These disengagements number in the many thousands on an annual basis though the vast majority are considered routine and not related to safety.”
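Because Waymo reports disengagements per mile while Uber's internal report tracks miles between interventions, the figures have to be put on a common basis before they can be compared. A rough conversion using only the numbers cited above (a sketch; the caveat that the companies define disengagements differently still applies):

```python
# Waymo, per its Jan. 5 CA DMV filing: 124 disengagements over 636,000 miles.
waymo_rate = 124 / 636_000        # ~0.0002 disengagements per mile

# Uber, per the internal report: average miles driven between interventions.
uber_scottsdale_rate = 1 / 0.67   # ~1.5 interventions per mile
uber_asu_rate = 1 / 1.3           # ~0.77 interventions per mile

for label, rate in [
    ("Waymo (CA public roads)", waymo_rate),
    ("Uber (Scottsdale Road)", uber_scottsdale_rate),
    ("Uber (ASU loop)", uber_asu_rate),
]:
    print(f"{label}: {rate:.4f} interventions per mile")
```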

(The CA DMV reports filed by other companies that tested on public roads in California from December 2015 through November 2016 offer further points of comparison with Uber’s testing the week of March 5 in Arizona.)

Uber CEO Travis Kalanick has called self-driving cars an “existential threat” to his ride-hail business. (If a competitor were to develop autonomous vehicles and run an Uber-like service that did not require giving a cut to drivers, its rides would be cheaper.) In February 2015, Uber poached dozens of top roboticists from Carnegie Mellon University to jump-start a self-driving car program. Eighteen months later, Uber launched a pilot program in Pittsburgh that put passengers in the backseats of cars manned by a safety driver and a “copilot” riding shotgun. “Your self-driving Uber is arriving now,” the company wrote on its website. Headlines called it a “landmark” trial and “the week self-driving cars became real.”

Uber’s self-driving program is quarterbacked by Anthony Levandowski, who helped build Google’s first self-driving car (that project is now called Waymo) before leaving to create his own startup, Otto. Uber acquired Otto in August, about three months after Levandowski launched the company out of stealth mode. The program is now embroiled in a lawsuit from Alphabet over allegations that Levandowski stole a crucial part of Waymo’s self-driving technology before leaving.

At Uber, Levandowski became the self-driving program’s fourth leader in less than two years. Kalanick has described their relationship as “brothers from another mother,” saying the pair shared a desire to move autonomous technology from the research phase to the market. A few weeks after the Pittsburgh pilot launched, Levandowski set a new, ambitious goal for Uber’s engineers, according to an internal planning document viewed by BuzzFeed News: prepare self-driving cars to run with no humans behind the wheel in San Francisco by January 2017.

In the end, in response to concerns raised by engineers who worried the goal was too aggressive, Uber did something far less ambitious. In December 2016, it launched a trial in San Francisco that mirrored its Pittsburgh pilot program: a human safety driver, accompanied by a copilot, would man each self-driving Volvo on the road in San Francisco. On its first day, one of the vehicles was caught running a red light. Uber attributed the traffic violation to human error, but the New York Times reported in February that “the self-driving car was, in fact, driving itself when it barreled through the red light.”

“When they let us know they were doing the test, we kind of had to play catch-up because nobody had ever asked us that question before.”

Meanwhile, Uber’s self-driving truck division, Otto, has been working toward its own goals. In October, Otto made headlines for completing the first publicly known self-driving truck delivery – a 120-mile beer haul along a public highway in Colorado for Anheuser-Busch, with the driver riding in the truck's sleeper berth.

“When they let us know they were doing the test, we kind of had to play catch-up because nobody had ever asked us that question before,” Mark Savage, deputy chief for the Colorado State Patrol, told BuzzFeed News. “We did put together a protocol that we had them walk through in order to determine whether the test was done safely and it was pretty involved.”

For one month ahead of the demo, the company performed trials along that route for 16 hours a day with human safety drivers behind the wheel, according to a Colorado state planning document obtained by BuzzFeed News. A video showed the truck driver crawling into the sleeper berth for the duration of the ride.

After completing five consecutive tests – a total of 625 miles – without needing human intervention, Otto embarked on a fully driverless demo at midnight on Oct. 20, with the state patrol “packaging” the truck with troopers during the event, much like a motorcade, according to the planning document. The truck included two emergency stop buttons: one near the steering wheel and one in the sleeper berth, where the driver sat during the ride, Uber told BuzzFeed News. The company added the second button specifically for the delivery; in all other tests, Otto drivers remain behind the wheel.

Steven Shladover, chair of the federal Transportation Research Board’s vehicle highway automation committee, said Otto’s testing before the demonstration “tells nothing about whether the system is safe.” He said crashes occur when “some other driver happens to do something stupid. You’re not going to run into those circumstances by driving a few hundred hours.”

“Just the fact that they have however many hundred hours of driving doesn’t prove safety,” Shladover told BuzzFeed. “Putting together a show like that is nice for marketing purposes, but it doesn’t prove anything about the readiness of the technology to be put into public use.”
