The Consumer Electronics Show, one of the world's tentpole technology events, is a flashy vehicle for the engineering underneath. In the world of chipsets, the show was dominated by Nvidia, one of many companies seeking to power the artificial intelligence in next-generation cars and image processors. LG also wowed with its 4K displays, including the "Wallpaper" TV screen. Many component makers pushed their products for use in display screens, whether those screens were headed for the living room or an SUV. There were also plenty of opportunities for component manufacturers to find places in the flashy new cars and fleets that dominated some of the show floors.
Chipmakers and sensor manufacturers are also keeping an eye on what customers want in the automotive space. Just seven states (Nevada, California, Florida, Michigan, Hawaii, Washington, and Tennessee) and the District of Columbia have passed bills related to autonomous driving. Although these laws determine whether an autonomous car can operate in the state, many details, including the capabilities required of the sensors, are only sketched in. Manufacturers in this space have to contend with customers' changing needs while a standard is worked out.
Here are some of the standout companies featured at CES:
Velodyne's LiDAR sensors are considered top of the line when it comes to autonomous cars. Its HDL-32E LiDAR offers an industry-leading 360-degree field of view, while the VLP-16 Puck packs a 360-degree view sensor designed for mass production into a smaller, $8,000 package. Velodyne is working on lowering that price, but the sensor is already being snapped up by automakers: at CES, Ford announced that its Fusion Hybrid concept car will carry two of them on the A-pillars. The Puck offers a range of 100 meters and a ±15° vertical field of view. While it's expensive, its small size, at just 830 grams, makes it attractive.
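To make those specs concrete, here is a minimal sketch (not Velodyne's API) of how a single return from a spinning LiDAR like the Puck maps to a 3D point: each laser reports a range at a given azimuth (anywhere in the 360-degree sweep) and elevation (within the ±15° vertical envelope), and simple spherical-to-Cartesian conversion yields x/y/z coordinates for the point cloud.

```python
import math

def lidar_point(range_m, azimuth_deg, elevation_deg):
    """Convert one range return (spherical coordinates) to x, y, z in meters."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left/right
    z = range_m * math.sin(el)                 # up/down
    return x, y, z

# Illustrative return: a target 50 m straight ahead, seen by the
# uppermost (+15 degree) laser channel.
x, y, z = lidar_point(50.0, 0.0, 15.0)
```

A real driver assembles hundreds of thousands of such points per second into a cloud; the conversion itself is just this trigonometry, repeated per laser channel as the unit rotates.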
Osram is another company getting in on the ground floor with autonomous vehicle sensors. Focused on lamps and lighting products, Osram also makes a four-channel laser for LiDAR systems. It has partnered with laser scanning company Innoluce, an offshoot of Infineon, to demonstrate how very short laser pulses can be used to create a high-resolution map of the car's environment. A short pulse length (less than 5 nanoseconds) means the laser light produces a high-resolution image while also meeting federal standards for eye safety. This is the first time a LiDAR sensor will be based on micro-electromechanical systems (MEMS) instead of mechanical scanning. The controller and MEMS mirror will be packaged together, but the system will not be available from Innoluce until about 2020. Osram showed a sample laser package at CES, as well as using virtual reality to give visitors a tour of its other offerings in the lighting space.
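The ranging principle behind those short pulses is straightforward time-of-flight: the sensor times how long a pulse takes to bounce back and halves the light-travel distance. A back-of-the-envelope sketch (illustrative values, not from either company's datasheets) shows why pulse length matters:

```python
C = 299_792_458.0  # speed of light in m/s

def distance_from_echo(round_trip_s):
    """Distance to a target from the round-trip time of a laser pulse."""
    return C * round_trip_s / 2.0

# A target 100 m away returns its echo after roughly 667 nanoseconds.
round_trip = 2 * 100.0 / C
d = distance_from_echo(round_trip)

# A 5 ns pulse occupies about 1.5 m of space in flight; shorter pulses
# let the receiver separate closely spaced surfaces more finely while
# keeping the energy in any instant low enough for eye safety.
pulse_extent_m = C * 5e-9
```

The sub-microsecond round-trip times are why LiDAR receivers need fast, precise timing electronics alongside the laser itself.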
Nvidia also made a lot of noise in the autonomous car space at CES. Along with demonstrating consumer-oriented products, the graphics processing unit maker placed itself at the core of autonomous vehicles. Nvidia has partnered with Mercedes-Benz and Audi to show its DRIVE PX 2 AI self-driving computer, which uses DriveWorks software running a neural net to negotiate unexpected hazards and unfamiliar territory. The star of the show was "BB8," a Lincoln Town Car that responds to spoken commands and was demonstrated outside the convention center. Inside, Nvidia used an Audi Q7 to demonstrate more AI-piloted driving.
The DRIVE PX 2 computing platform offers deep neural networks, location services and two discrete GPUs, all while consuming just 10 watts of power in certain single-processor configurations. It’s also being used in the Tesla Model S. Automotive technologies companies such as LG, ON Semiconductor, and TomTom all use DRIVE PX 2 in a variety of capacities, including ON’s cameras and LG’s cockpit displays. These deep neural networks can be used both in the car itself, where they process information from the cameras and sensors, and inside a data center, from which information can be sent to the car.
Nvidia's Tesla chips are designed to handle the data center side of the operation. Their GPU accelerators decrease deep learning training time and enable a data center to handle more information, which matters when deep learning networks may be performing 24 trillion operations per second.
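The in-car half of that split is neural-network inference: running an already-trained model over incoming camera and sensor data. As a conceptual sketch only (toy numbers, not any shipping model), a single fully connected layer shows the multiply-accumulate pattern that hardware like DRIVE PX 2 runs billions of times per second:

```python
def dense_relu(inputs, weights, biases):
    """One fully connected layer with ReLU activation (pure-Python toy)."""
    outputs = []
    for w_row, b in zip(weights, biases):
        # Weighted sum of the inputs plus a bias term...
        acc = sum(w * x for w, x in zip(w_row, inputs)) + b
        # ...then ReLU: negative activations are clamped to zero.
        outputs.append(max(0.0, acc))
    return outputs

# Toy "sensor reading" with made-up trained weights; real networks stack
# many such layers (convolutions, mostly) over camera frames.
reading = [0.2, 0.7, 0.1]
weights = [[0.5, -1.0, 0.3], [1.2, 0.4, -0.2]]
biases = [0.1, -0.3]
activations = dense_relu(reading, weights, biases)
```

Training in the data center runs the same arithmetic plus its gradients over millions of examples, which is why GPU accelerators cut training time so dramatically.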
Other automotive companies showed off different solutions. Toyota's Concept-i runs on "Yui," a personal assistant system that includes biometric sensors to detect whether a driver is alert. (Osram is also predicting a future in which cameras inside the car may tell the system how the driver is feeling.) Yui is based on an in-house software platform.
Meanwhile, Nissan is looking to space for its personal assistant. The Seamless Autonomous Mobility (SAM) system demonstrated at the show uses NASA’s Visual Environment for Remote Virtual Exploration (VERVE), which helps Mars rovers determine which path to take. SAM will also ride onboard the next-generation Leaf, which will have its own autonomous technology platform.
One of the more out-there concepts was Chrysler's Portal, already infamous for targeting millennials. The 150-mile-range electric car includes facial recognition technology that stores a driver's preferences for seat configuration, music, and more.
Honda announced a self-driving concept car for the ride-sharing market, a two-seater with an electric powertrain optimized to reduce downtime and sell energy back to the grid. It's intended to work as both a personal car and a money-maker on the side. From the outside it looks like a Smart car, while the smarts inside come from AI technology built by SoftBank.
With self-driving cars comes the need for a system that can hand control back to the driver while maintaining awareness of its surroundings even when the driver isn't paying attention. Ford will be integrating Amazon Alexa into some of its vehicles, including the Ford Focus Electric. Along with displays, Amazon's voice assistant was another trend at this year's CES, with more and more product makers integrating Alexa voice commands into their smart home or smart automotive products. Meanwhile, Nissan and BMW will be integrating Cortana, Microsoft's digital assistant, into their smart cars.
With companies jockeying to set the standard for autonomous navigation and in-car assistance, chipmakers and parts manufacturers are poised to build the next generation of cars.