Tag: autonomous vehicles

  • How to Guarantee the Safety of Autonomous Vehicles

    The original version of this story appeared in Quanta Magazine.

    Driverless cars and planes are no longer the stuff of the future. In the city of San Francisco alone, two taxi companies have collectively logged 8 million miles of autonomous driving through August 2023. And more than 850,000 autonomous aerial vehicles, or drones, are registered in the United States—not counting those owned by the military.

    But there are legitimate concerns about safety. For example, in a 10-month period that ended in May 2022, the National Highway Traffic Safety Administration reported nearly 400 crashes involving automobiles using some form of autonomous control. Six people died as a result of these accidents, and five were seriously injured.

    The usual way of addressing this issue—sometimes called “testing by exhaustion”—involves testing these systems until you’re satisfied they’re safe. But you can never be sure that this process will uncover all potential flaws. “People carry out tests until they’ve exhausted their resources and patience,” said Sayan Mitra, a computer scientist at the University of Illinois, Urbana-Champaign. Testing alone, however, cannot provide guarantees.

    Mitra and his colleagues can. His team has managed to prove the safety of lane-tracking capabilities for cars and landing systems for autonomous aircraft. Their strategy is now being used to help land drones on aircraft carriers, and Boeing plans to test it on an experimental aircraft this year. “Their method of providing end-to-end safety guarantees is very important,” said Corina Pasareanu, a research scientist at Carnegie Mellon University and NASA’s Ames Research Center.

    Their work involves guaranteeing the results of the machine-learning algorithms that are used to inform autonomous vehicles. At a high level, many autonomous vehicles have two components: a perceptual system and a control system. The perception system tells you, for instance, how far your car is from the center of the lane, or what direction a plane is heading in and what its angle is with respect to the horizon. The system operates by feeding raw data from cameras and other sensory tools to machine-learning algorithms based on neural networks, which re-create the environment outside the vehicle.

    These assessments are then sent to a separate system, the control module, which decides what to do. If there’s an upcoming obstacle, for instance, it decides whether to apply the brakes or steer around it. According to Luca Carlone, an associate professor at the Massachusetts Institute of Technology, while the control module relies on well-established technology, “it is making decisions based on the perception results, and there’s no guarantee that those results are correct.”

    To provide a safety guarantee, Mitra’s team worked on ensuring the reliability of the vehicle’s perception system. They first assumed that it’s possible to guarantee safety when a perfect rendering of the outside world is available. They then determined how much error the perception system introduces into its re-creation of the vehicle’s surroundings.

    The key to this strategy is to quantify the uncertainties involved, known as the error band—or the “known unknowns,” as Mitra put it. That calculation comes from what he and his team call a perception contract. In software engineering, a contract is a commitment that, for a given input to a computer program, the output will fall within a specified range. Figuring out this range isn’t easy. How accurate are the car’s sensors? How much fog, rain, or solar glare can a drone tolerate? But if you can keep the vehicle within a specified range of uncertainty, and if the determination of that range is sufficiently accurate, Mitra’s team proved that you can ensure its safety.
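    The contract idea above can be sketched in code. This is a minimal illustrative example in the design-by-contract sense described in the article, not Mitra's actual method; all names and numbers (the lane half-width, the error band of 0.3 meters) are hypothetical.

    ```python
    from dataclasses import dataclass

    @dataclass
    class PerceptionContract:
        """Promise: the perception system's estimated lane offset is
        within +/- error_band_m of the true offset (the 'known unknowns')."""
        error_band_m: float  # maximum perception error, in meters

        def worst_case_offset(self, estimated_offset_m: float) -> float:
            # The control module must stay safe even at the worst case
            # the contract allows, so pad the estimate by the error band.
            return abs(estimated_offset_m) + self.error_band_m

    def is_safe(estimated_offset_m: float,
                contract: PerceptionContract,
                lane_half_width_m: float = 1.8) -> bool:
        # Safe only if even the worst-case true offset keeps the
        # vehicle inside the lane boundary.
        return contract.worst_case_offset(estimated_offset_m) < lane_half_width_m

    contract = PerceptionContract(error_band_m=0.3)
    print(is_safe(0.5, contract))  # 0.5 + 0.3 = 0.8 m, inside the lane -> True
    print(is_safe(1.6, contract))  # 1.6 + 0.3 = 1.9 m, outside -> False
    ```

    The point of the structure is the one the article makes: if the error band is determined accurately and the vehicle is kept within it, a safety proof about the idealized (perfect-perception) system carries over to the real one.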


  • Apple Quadrupled Its Autonomous Driving Testing Miles Last Year

    Apple’s secretive vehicle project doesn’t have much to show for its six years of work, at least publicly. But records submitted by the company to a California agency show that Apple went on an autonomous testing jag last year, almost quadrupling the number of miles it tested on public roads compared to 2022 and logging more than 30 times its 2021 total.

    The data covers December 2022 to November 2023. The majority of the testing miles were in the second half of the reporting period, with miles tested peaking in August at 83,900.

    Apple has a permit to test autonomous vehicle tech on California’s public roads only if the company has a safety driver behind the wheel—a first step that allows autonomous vehicle companies to collect more data on streets and determine how their software handles itself in traffic.

    A handful of other companies, including Alphabet’s Waymo and Amazon’s Zoox, have the state’s permission to test without safety drivers. California allows just two companies—Waymo and autonomous delivery firm Nuro—to deploy commercial self-driving technology in the state.

    Apple’s testing totals are well below those of more advanced autonomous vehicle developers, though the state’s reporting guidelines make them difficult to compare directly. Waymo drove 3.7 million testing miles in California with a safety driver behind the wheel and 1.2 million testing miles with no one behind the wheel. The company drove more than 1.6 million additional miles with passengers in the car, according to separate government documents. (Waymo is also operating a driverless service in Phoenix and is testing in Austin, Texas; its operations in those cities aren’t covered in this data.)

    Even Cruise, General Motors’ troubled autonomous vehicle division, which had its permit to deploy in California suspended in October and halted nationwide testing soon after, drove almost 2.65 million testing miles in the state in 2023—almost 2.2 million more than Apple.
