
Science

Giving self-driving cars the gift of sight

While self-driving cars may be able to see more of the road than humans, the trick is teaching them to interpret all of that data as well as, and eventually better than, humans can.

Chip companies are trying to make sensors and software that see the world as humans do, only better

This is how a sensor commonly used in self-driving cars saw visitors to Ford's booth at CES 2017. (Matthew Braga/CBC News)

When humans drive, they typically do it with two eyes. When cars drive themselves, they currently rely on several.

But while self-driving cars may be able to see more of the road than humans, the trick is teaching them to interpret all of that data as well as, and eventually better than, humans can.

At CES, the annual consumer technology show held in Las Vegas last week, a handful of companies were attempting to bring us closer to this reality. Some are the makers of sensors and chips that underpin many of the devices we use each day; others are not yet household names.

There are companies such as Velodyne and Quanergy, which have been developing LIDAR (Light Detection and Ranging) sensors that use lasers to measure the distance, shape, and size of objects near and far.
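The principle behind LIDAR ranging is time-of-flight: the sensor times how long a laser pulse takes to bounce off an object and return, then converts that round trip into a distance. A minimal sketch of the arithmetic (the function name and the example timing are illustrative, not from any vendor's hardware or API):

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2.
# The division by 2 accounts for the pulse travelling out AND back.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(round_trip_s: float) -> float:
    """Convert a laser pulse's round-trip time (in seconds) to distance (in metres)."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2

# A pulse that returns after roughly 667 nanoseconds hit something about 100 m away.
print(distance_from_round_trip(667e-9))
```

Real sensors fire thousands of such pulses per second across a rotating field of view, building the 3D "point cloud" from which shape and size are inferred.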

Others, such as Mobileye, make cameras that have traditionally been used for rear and side-view camera systems, but are increasingly used for object detection and 3D mapping as well. (Its sensors are perhaps best known as the eyes behind electric automaker Tesla's Autopilot hands-free driving feature.)

The Mobileye booth at CES, the annual consumer electronics show in Las Vegas. The company makes sensors commonly used for rear-view camera systems, but has also lent its expertise to Intel and BMW's still-in-development self-driving car. (Matthew Braga/CBC News)

And then there are the chip companies, such as Intel, NVIDIA and even BlackBerry, that are partnering with more traditional car companies to provide the processing power and next-generation internet connectivity required to make self-driving cars a reality.

James Kuffner, the chief technology officer of Toyota's artificial intelligence and self-driving vehicle research institute, said in an interview at CES that there's room for current sensors to improve in accuracy, range and cost.

"And I think we're going to see a dramatic shift in the next ten years as this technology starts to mature."

Cameras, maps, and semantic cues

In the near term, both Mobileye and Toyota have plans to leverage data from existing car cameras in an attempt to build more up-to-date 3D maps of the world.

By harvesting images from cars that are already on the road, or soon to be on the road, the two companies hope that they can more cheaply and regularly crowdsource the data required for self-driving vehicles to safely navigate ever-changing urban areas.

Erez Dagan, Mobileye's senior vice-president of advanced development and strategy, says the company's discussions are in "very advanced stages with multiple car manufacturers" to not only harvest data from their cars' cameras, but share that data with other automakers globally.

But longer term, cameras could be used for more nuanced types of sensing. Another area that still requires work is the understanding of semantic cues: all of the human behaviours that are easily recognizable to drivers, but that computers don't fully understand.

A sensor made by LIDAR company Velodyne is being used in Ford's fleet of self-driving test vehicles. (Matthew Braga/CBC News)

Today, the posture of a pedestrian, a cyclist's gaze, or the direction a parked car's wheels are pointed are all difficult for self-driving cars to discern and understand the way humans do.

"Car sensor systems are not yet sensitive enough to be able to interpret body language. But we do," explained Melissa Cefkin, Nissan's in-house design anthropologist and principal researcher. "We can tell that somebody standing like this" (here, Cefkin mimes a pedestrian looking down at her phone) "is not about to run across the street."

"So part of what we would like to do is continue to bring these insights," she added. "[But] we're kind of ahead of what the technology is capable of in terms of what we would try to teach it."

Long-range lasers

While camera sensors are good for some tasks, most companies opt for LIDAR when it comes to detecting things like size, shape, depth and speed: everything from a car's proximity to nearby cyclists to the velocity of a wayward soccer ball.
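Speed falls out of repeated ranging: if successive LIDAR scans place the same object at two known positions a fixed interval apart, displacement over time gives velocity. A toy illustration of that idea (the positions, scan rate and function name below are made up for the example, not taken from any sensor's output):

```python
import math

def estimate_speed(p1, p2, dt_s):
    """Estimate an object's speed (m/s) from two (x, y) positions
    observed dt_s seconds apart, e.g. in consecutive LIDAR scans."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    return math.hypot(dx, dy) / dt_s  # straight-line distance over elapsed time

# An object scanned 10 times a second that moves 0.5 m between
# frames is travelling at about 5 m/s.
print(estimate_speed((2.0, 1.0), (2.3, 1.4), 0.1))
```

Production perception stacks do this over dense point clouds with tracking and filtering, but the underlying displacement-over-time calculation is the same.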

What companies want is "additional range, they want additional resolution, they want additional quality within the data," said Michael Jellen, president and chief operating officer of Velodyne, perhaps the most well-known LIDAR company. Its sensors have been used by companies ranging from Google to Ford.

"A human can see all the things some of the time, or some of the things all the time, but it's never going to see all the things, all the time. And that's what our perception is going to enable you to do," said Jellen.

Inside the trunk of a self-driving Ford Fusion hybrid, which is packed full with computer equipment and high-end graphics chips from NVIDIA. (Matthew Braga/CBC News)

Quanergy uses a type of LIDAR sensor that the company says can be refocused on the fly to identify objects with greater accuracy and detail. "You can say, 'Okay, this is floating like a plastic bag, so I'm going to drive through it,'" explained Louay Eldada, the company's co-founder and CEO, or "'This is a moose, and I'm not going to mess with that.'"

The company says that it is working with Mercedes, Hyundai-Kia and Renault-Nissan, in addition to other manufacturers it declined to name.

But one perennial challenge is weather.

"When we think about what sensors and software it would take to drive in snow, or heavy rain, we're not there yet. And no one really is," says Kuffner of Toyota. "But I think there's a lot of sensing modalities you can explore that might be able to enable that."

Driving data

Another important piece of the self-driving puzzle is what to do with all of the data these sensors produce.

Chipmaker Intel's current focus is on building next-generation computer systems that can ingest "enormous amounts of data from a variety of sensors," explained Bridget Karlin, managing director of the company's Internet of Things Group. "And not just vision sensors, not just cameras. We are looking at motion, we're looking at temperature, LIDAR, radar."

Intel, in partnership with BMW and Mobileye, says it plans to have 40 autonomous vehicles on the road, with a human watching behind the wheel, in the second half of 2017, and a fully driverless car on the road by 2021.

Hyundai unveiled a version of its IONIQ vehicle at CES with less-expensive self-driving technology aimed at consumers, but no release date. (Matthew Braga/CBC News)

Competing chipmaker NVIDIA announced a similar platform at the show, and a partnership with Audi to have "advanced AI cars on the road starting in 2020," according to a press release.

And BlackBerry, once one of the world's most well-known smartphone makers, has been trying to re-position itself into a similar role with its automotive operating system, QNX, which the company says can be found in more than 60 million vehicles worldwide.

"When you think of the sensors, they're like your eyes and ears. Software will be the brains, powered by very high performance computer platforms," said Grant Courville, a senior director at QNX. "We want to provide the software platform, not only for those autonomous driving safety systems, but in fact, for the whole car."