Robot cars

The latest advances in self-driving technology at CES

Artificial intelligence (AI) is transforming cars into friendly robots. The Las Vegas tech fest, held in January, offered tantalizing glimpses into the future of automotive vehicles.

Highly complex algorithms are required for self-driving technology to work

Some say it is overhyped, but self-driving technology has become one of the main draws of CES in just a few years. Organizers of the event claim it is the largest auto show in the world, and this year around 170 exhibitors came together to demonstrate their self-driving know-how, ranging from connected cars to futuristic concept vehicles. Even if fully autonomous cars are far from hitting the roads, self-driving technology has progressed in leaps and bounds over the last year, partly thanks to more complex analytics algorithms.

Getting better all the time

Most cars on the roads today have some form of driving assistance, helping drivers to park, for instance. At CES, advanced driving assistance made the headlines, including pedestrian and road-edge detection and automatic emergency braking. Pre-collision systems, including pedestrian detection, are meant to help drivers by warning them that an obstacle is in the way. These systems combine software with sensors, cameras and, in some cases, radar to detect objects near or in front of the car.
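As a rough illustration of how a pre-collision system might decide when to intervene, here is a minimal time-to-collision sketch. It is not any manufacturer's actual logic; the function names and the 1.5-second threshold are assumptions chosen for the example.

```python
def time_to_collision(distance_m, closing_speed_mps):
    """Seconds until impact if the closing speed stays constant.

    Returns None when the obstacle is not getting closer.
    """
    if closing_speed_mps <= 0:
        return None
    return distance_m / closing_speed_mps


def should_brake(distance_m, closing_speed_mps, threshold_s=1.5):
    """Trigger automatic emergency braking below an assumed TTC threshold."""
    ttc = time_to_collision(distance_m, closing_speed_mps)
    return ttc is not None and ttc < threshold_s
```

Real systems fuse several sensor inputs and model braking distance, driver reaction and road conditions; the threshold here stands in for all of that.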

Even more sophisticated algorithms are required to move to fully autonomous vehicles. Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory have been working on a new lane-change algorithm which allows automated cars to behave like human drivers and make split-second decisions on whether to stay in a lane or not. The researchers tested their algorithm in a simulation with up to 16 autonomous cars driving in an environment with several hundred other vehicles, without collision. The rise of edge computing has made cars more capable of processing and finding patterns in the data provided by sensors. The data is processed in the car itself instead of in a central cloud, making it faster and easier to act on. It is also more difficult to hack. (For more information about edge computing, read the IEC White Paper Edge Intelligence.)
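To make the lane-change decision concrete, here is a greatly simplified gap-acceptance heuristic in the same spirit: the car changes lane only when the gaps ahead of and behind it in the target lane are larger than a speed-dependent safety buffer. This is not the MIT researchers' algorithm; all names, the buffer formula and the one-second reaction time are assumptions for illustration.

```python
def safe_gap(speed_mps, closing_speed_mps, reaction_time_s=1.0):
    """Minimum acceptable gap: grows with speed and with how fast it is closing."""
    return speed_mps * reaction_time_s + max(closing_speed_mps, 0.0) * reaction_time_s


def can_change_lane(own_speed, lead_gap, lead_speed, rear_gap, rear_speed):
    """Accept the lane change only if both gaps in the target lane are safe.

    lead_gap/lead_speed describe the vehicle ahead in the target lane;
    rear_gap/rear_speed describe the vehicle behind. Speeds in m/s, gaps in m.
    """
    front_ok = lead_gap > safe_gap(own_speed, own_speed - lead_speed)
    rear_ok = rear_gap > safe_gap(rear_speed, rear_speed - own_speed)
    return front_ok and rear_ok
```

A real planner would recompute such buffers continuously and weigh them against route goals, which is what makes split-second, human-like decisions hard.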

Brains and brawn

There is still some way to go, however, before autonomous cars can compete with the human brain. According to Tigran Shaverdyan, one of the inventors of a self-driving van launched at CES 2019, “it is still very difficult to create an algorithm that would enable an autonomous car to choose the right option in an unlikely scenario. It is the 'chicken crossing the road' quandary.” Their van, a sort of grocery shop robot, is piloted remotely for now, primarily for safety reasons. “We will be testing increased autonomy next year. But the technology will still involve some form of monitoring from afar. A number of safety issues have to be addressed before we can launch a fully autonomous vehicle but we are confident we can solve these problems in the longer run.”

IEC is preparing the ground for the increasing use of AI technology in our daily lives. The joint technical committee of IEC and ISO on information technology (ISO/IEC JTC 1) and several of its subcommittees (SCs) prepare international standards that support artificial intelligence. For instance, SC 42 was set up to provide standardization in the area of AI as well as guidance to other committees developing AI applications. IEC is also a founding member of the Open Community for Ethics in Autonomous and Intelligent Systems (OCEANIS). This global forum brings together organizations interested in the development and use of standards as a means to address ethical matters in autonomous and intelligent systems.

A series of standards published by IEC TC 47, IEC 62969, specifies the general requirements of power interfaces for automotive vehicle sensors. IEC TC 100 issues several standards relating to multimedia systems in cars. One of its most recent publications is IEC technical specification (TS) 63033. It specifies the model for generating the surrounding visual image of the drive monitoring system, which creates a composite 360° image from external cameras. This enables the correct positioning of a vehicle in relation to its surroundings, using input from a rear-view monitor for parking assistance as well as blind corner and bird’s eye monitors.
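The TS 63033 model itself is not reproduced here, but the compositing idea can be sketched: each external camera's frame is first warped onto a common ground-plane canvas (a calibration step assumed to happen elsewhere), and the warped views are then blended, averaging wherever two cameras overlap. The function name and the simple averaging rule are assumptions for this toy example.

```python
def composite(views, masks):
    """Blend pre-warped camera views into one bird's-eye image.

    views: list of H x W grids (lists of lists of floats), each already
           warped by an assumed external calibration step onto the same
           ground-plane canvas.
    masks: matching grids holding 1 where that camera sees the ground, else 0.
    Overlapping pixels are averaged; pixels no camera covers stay at 0.
    """
    h, w = len(views[0]), len(views[0][0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, cover = 0.0, 0
            for view, mask in zip(views, masks):
                if mask[y][x]:
                    total += view[y][x]
                    cover += 1
            if cover:
                out[y][x] = total / cover
    return out
```

Production surround-view systems additionally correct lens distortion and blend seams smoothly, but the principle of projecting several cameras onto one 360° canvas is the same.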

Connecting the dots

Connected cars were one of the big trends at CES 2019. Improved features and technology were touted on the back of the arrival of 5G networks. This latest generation of mobile communications offers much higher connection speeds and delivers signals more reliably than previous networks. This is very useful for high-quality VR applications, for instance. One of the novelties at the show was content producers teaming up with car manufacturers, chip makers and smartphone companies to offer passengers immersive in-car VR experiences. ISO/IEC JTC 1/SC 24 is preparing standards in the area of augmented and virtual reality.

5G will also help with the implementation of vehicle-to-everything (V2X) communication between self-driving vehicles and other cars, devices and obstacles, such as traffic lights and pedestrians. IEC 62232, issued by IEC TC 106, provides methods for determining radio-frequency field strength near radio base stations. This standard takes into account the frequencies to be used for 5G for the purpose of evaluating human exposure. IEC TC 106 has established a new joint working group with the Institute of Electrical and Electronics Engineers (IEEE) to develop international standards for 5G device testing by 2020.

In the mood for a drive

Several concept cars at CES demonstrated voice and image recognition systems used to gauge drivers’ moods. A well-known voice recognition tool has been integrated into many cars, where it performs a wide variety of tasks, including acting as a safety assistant and warning of potential dangers on the road. A Korean manufacturer’s concept car featured facial recognition technology that uses artificial intelligence to assess the emotional state of the person holding the steering wheel. The software can change the vehicle's interior lighting, for instance, or warn drivers when it detects that they are tired.

Before becoming fully autonomous, cars are developing into friendly robots, happy to help and serve, while drivers still retain a modicum of control. This could be the best of both worlds – reducing the risk of human error while preserving the enjoyment of driving.