Ford is readying its new-generation Fusion Hybrid autonomous development vehicle for its debut at CES in Las Vegas, followed by an appearance at the North American International Auto Show in Detroit a week later.
The car will also debut updated autonomous vehicle technology, including new computing hardware that amps up the processing power and new sensors that gather the same amount of information from fewer units mounted in new locations on the car.
The LiDAR (Light Detection and Ranging) sensors are sleeker and more targeted in their sensing, allowing Ford to collect just as much data with only two sensors.
The new-generation autonomous car also debuts the Virtual Driver system, which targets SAE Level 4 automation: the vehicle handles the entire driving task without human intervention in certain driving modes, but still requires a human driver outside those modes. Level 5 is fully automated driving in all driving modes.
The Virtual Driver system uses mediated and direct perception to handle the driving chores. Mediated perception means knowing the rules of the road and abiding by them (for example, the right-of-way rules at a four-way stop). Direct perception means identifying what is actually happening around the car, such as reading a speed limit sign or recognizing that another driver should have yielded at a four-way stop but didn't. The Fusion Hybrid's sensors can even interpret a traffic officer's hand signals.
The original Fusion Hybrid autonomous development vehicle debuted three years ago.