Of algorithms, automated trucks, and ethics
Dec. 10, 2015

When you start getting down into the weeds of autonomous vehicle technology, some rather strange – and potentially scary – discussions start to take place.

For example, picture this scenario: A truck in autonomous mode traveling a heavily congested roadway suddenly comes upon completely stopped traffic. To the right is a passenger car; dead ahead is a transit bus; and on the left, a school bus.

    What does the vehicle do?

Does it have enough time to slam on the brakes? If not, and it must maneuver to avoid rear-ending a packed transit bus, does it go left or right? Go left, and it hits a school bus loaded with children. Yet since the school bus is a bigger vehicle, it may better absorb the crash and cause only injuries – whereas if the truck swerves right into the passenger car, the chances of a fatality are much higher.

Then again, the children in the school bus don’t have seat belts. Does that elevate the risk of serious injuries or even fatalities?

    Wow. Talk about a difficult ethical problem – one not easily solved by mathematics.

MIT Technology Review delved into this problematic topic back in October under a rather provocative headline: “Why Self-Driving Cars Must Be Programmed to Kill.”

Experts gathered here at the Texas Motor Speedway for the North American Automated Trucking Conference also touched on this tricky subject.

    Stephan Keese, senior partner with Roland Berger Strategy Consultants, posed those kinds of dilemmas, asking if automated trucks need to be programmed with algorithms that automatically choose the larger vehicle in case of a crisis situation. If so, how might laws need to be changed to reflect such crisis decision making?

“Should the driver even be allowed to overrule a vehicle in such a case? This is one of the reasons why the least of the hurdles facing truck automation are on the technology side,” he said. “We’re still early in the development of legal and ethical requirements for automated trucks.”

    However, Bill Kahn – principal engineer and manager of advanced concepts for Peterbilt Motors Co. – stressed that those types of decisions will probably never have to be made, simply because machines can react faster than humans in such “crisis situations” on the highway.

    “We’ll just slam on the brakes and stop,” he explained, noting that current research indicates it takes one to two seconds for a human driver to “hit the brakes,” whereas a machine can do so in 1/100th of a second.
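To see why that gap in reaction time matters, consider the distance a truck covers before the brakes even engage. The sketch below is a back-of-the-envelope illustration only – the 65 mph speed and the 1.5-second human reaction figure are assumptions for the example, not numbers from the article; the 1/100th-of-a-second machine figure is the one Kahn cites.

```python
# Back-of-the-envelope comparison: distance traveled at constant speed
# during the reaction delay, before any braking begins.

MPH_TO_MPS = 0.44704  # exact conversion factor, miles/hour -> meters/second

def reaction_distance(speed_mph: float, reaction_time_s: float) -> float:
    """Meters covered at constant speed before the brakes engage."""
    return speed_mph * MPH_TO_MPS * reaction_time_s

speed = 65.0  # mph -- assumed highway speed for illustration

human = reaction_distance(speed, 1.5)     # midpoint of the 1-2 s human range
machine = reaction_distance(speed, 0.01)  # 1/100th of a second, per Kahn

print(f"Human reaction distance:   {human:.1f} m")    # roughly 44 m
print(f"Machine reaction distance: {machine:.2f} m")  # under a third of a meter
```

In other words, before braking distance is even counted, a human driver at highway speed has already traveled most of half a football field, while the machine has moved less than a meter – which is the substance of Kahn’s argument that the ethical swerve-left-or-right dilemma may rarely arise in practice.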

Kahn also referenced Google’s experience to date with crashes involving its autonomous cars: typically it’s other human-controlled vehicles hitting the self-driving cars, not the other way around.

    “Those [self-driving] cars get run into – they don’t hit other vehicles,” he said. “We expect [autonomous] trucks will operate in a similar mode.”

    About the Author

Sean Kilcarr

    Senior Editor

