Whatever the depth of an automated system’s decision making, achieving the desired outcome ultimately comes from a collaboration between the processing computer and the operator or supervisor. This has been examined closely in the field of defense.
As a 2012 U.S. Defense Science Board report, “The Role of Autonomy in DoD Systems,” explained, the value of unmanned systems lies not in replacing humans but in improving the persistence of operations, keeping the operator out of danger, and reducing the “cognitive load” on humans. Like other authorities on autonomous systems, the report’s authors promote a framework that allocates cognitive functions and responsibilities between the human and the computer.
This approach lets the operator focus on what humans do best: making complex recognition and value judgments, while leaving the mechanization of tasks to the autonomous system. In an unmanned combat air system, for example, “stick and throttle” flying and navigation decisions may be left to the system, while the responsible operator retains the authority to make strategic and value judgments.
However automated the system, the human operator ultimately remains in control. The “Person to Purpose” approach, described in work led for the UK Ministry of Defence (MoD) by Dr. Jon Platts, shows how the human supervisor retains absolute authority while being detached from ownership of the platform (or multiple platforms), freeing the operator to concentrate instead on the tasks themselves.
But we needn’t get too wrapped up in definitions. As Professor Mary Beard once wrote, definitions are often false friends: “The smartest and most appealing tend to exclude too much; the most judicious and broadest are so judicious as to be unhelpfully dull.” Our job at NovAtel is to give customers what they need, not to tell them how to define autonomy. But to do that, we have to understand what they’re talking about.
Many of today’s cars already carry sensors that give them some level of perception of their environment, including radar, cameras, LiDAR and ultrasonics, says NovAtel’s Segment Manager for Autonomous Systems, Siamak Akhlaghi. These sensors, in conjunction with high-precision GNSS, provide the localization, reliability and accuracy required for vehicle autonomy. Localization and environmental/situational awareness are fundamental to automating the driving function.
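To make the idea of combining GNSS with other sensors for localization concrete, here is a minimal sketch of inverse-variance weighting of two position estimates: a deliberately simplified stand-in for the Kalman-filter fusion real localization stacks use, not NovAtel's implementation. All positions and noise figures below are hypothetical.

```python
# Illustrative only: fuse a GNSS fix with a dead-reckoned (odometry) position
# by inverse-variance weighting. The less noisy source gets more weight.
# This is a simplified stand-in for Kalman-filter fusion, with made-up numbers.

def fuse(gnss_pos, gnss_var, dr_pos, dr_var):
    """Combine two 2D position estimates, weighting each by 1/variance."""
    w_gnss = 1.0 / gnss_var
    w_dr = 1.0 / dr_var
    total = w_gnss + w_dr
    fused = tuple((w_gnss * g + w_dr * d) / total
                  for g, d in zip(gnss_pos, dr_pos))
    fused_var = 1.0 / total  # combined estimate is tighter than either input
    return fused, fused_var

# Hypothetical example: high-precision GNSS (~2 cm sigma) vs. drifting
# odometry (~50 cm sigma); the fused result sits very close to the GNSS fix.
pos, var = fuse((10.00, 20.00), 0.02**2, (10.40, 19.70), 0.50**2)
```

Because the GNSS variance here is far smaller than the odometry variance, the fused position lands within millimeters of the GNSS fix, which is exactly why high-precision GNSS anchors the localization solution.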
Unlike military systems, commercial vehicles are rated by level of autonomy. Level 5 represents full autonomy (i.e., acceleration, handling and control functions all performed by the automobile with no involvement from a driver), which Akhlaghi predicts is at least five years from becoming a reality.
Level 3 vehicles (i.e., two to three of the acceleration, handling and control functions performed by the vehicle) are roughly where the industry stands today, and they feature some level of autonomy. At this level, the algorithms and sensors are not yet advanced enough to operate unaided, so the driver must be sitting in the car, ready to make decisions when necessary. At Level 4 (i.e., acceleration, handling and control performed by the automobile, with an operator taking over only in an emergency), a driver no longer needs to maintain control at all times.
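The level scheme walked through above is a paraphrase of the SAE J3016 taxonomy, which runs from Level 0 through Level 5. It can be summarized as a small lookup table; the wording of each entry is an illustrative gloss, not the standard's official language.

```python
# Compact summary of the SAE J3016 driving-automation levels discussed above.
# Descriptions are informal glosses, not the standard's official definitions.

AUTONOMY_LEVELS = {
    0: "No automation: the driver performs all driving tasks.",
    1: "Driver assistance: the vehicle handles one function (e.g. adaptive cruise).",
    2: "Partial automation: combined steering and speed; driver supervises constantly.",
    3: "Conditional automation: the vehicle drives, but the driver must stay ready to intervene.",
    4: "High automation: no constant supervision; a human takes over only in defined situations.",
    5: "Full automation: no driver involvement at all.",
}

def driver_must_be_ready(level):
    """At Level 3 and below, a human must be prepared to take control at any time."""
    return level <= 3
```

The `driver_must_be_ready` split mirrors the article's key distinction: Level 3 still requires a driver sitting in the car ready to decide, while Level 4 does not require control to be maintained at all times.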
Although unmanned systems have been around for many years in the military world, autonomous vehicles are still a new concept for civil and commercial operations, and they come with their own set of challenges. Legal restrictions make it difficult to test these vehicles, which must also learn to comply with traffic lights, stop signs and the other rules of the road. They must also learn to avoid pedestrians, street furniture and other roadway obstacles, all of which makes precise positioning vital for autonomous vehicles.