TECHNOLOGY

Why Self-Driving Uber Cars Look So Geeky

By Russ Mitchell
Los Angeles Times

WWR Article Summary (tl;dr) Although self-driving Uber cars may currently look a little bulky, that is sure to change as the technology develops. Right now, the bulkiest single item on the roof is also the newest technology: 3-D lidar. Lidar is similar to radar, but instead of radio waves it beams out laser light. Those beams are bounced off objects to measure distance with enough accuracy to create three-dimensional images of anything from a face to a fire hydrant.

SAN FRANCISCO

Qawiyah Muhammad can see her own future. Literally.

An Uber driver in Pittsburgh, she knows that one day her job will be replaced by a robot car. She knows the robot cars are coming because she sometimes spots experimental models driving themselves around town.

“You can tell them apart,” she said, “because they have a thing on the top of the car, like ‘Back to the Future.’”

There’s a reason they stand out so much, and it’s not because Uber or anybody else thinks they look cool.

On top of Uber’s new driverless cars is an array of bulky sensors: cameras, radars and lidars that eventually will be shrunk into a more discreet system that will replace Muhammad and thousands of other Uber drivers. Inside the cars is a computer, which, when sufficiently advanced, will stand in for a human’s thinking, steering and pedal pushing.

Although the cars look geeky now, the driverless cars of the next decade won’t look anything like the clumsy agglomeration on the rooftop of Uber’s early-iteration driverless Fords.

“Over time, it will be harder and harder to differentiate an autonomous car from a conventional car,” said Aaron Steinfeld, an associate research professor at Carnegie Mellon University in Pittsburgh.

The fast-shrinking nature of tech stretches back to the transistor and the computer chip after World War II. Over the decades, more and more computer power has been crammed into less and less space, until it in effect suffuses the object itself. The hardware then can take whatever form works (or looks) best.

The first computers filled large rooms; they even scared some people. Today, a Fitbit fitness tracker is more powerful than any of them. Computer processing power in large part allows tiny iPhone speakers to sound as good as they do. The examples are endless.

Likewise, the rooftop sensors on autonomous cars will become smaller and cheaper; some won’t even be needed as “perception software” gets better at seeing and interpreting the world outside the car, Steinfeld said.

The bulkiest single item on the roof is also the newest technology: 3-D lidar. Lidar is similar to radar, but instead of radio waves it beams out laser light.

Those beams are bounced off objects to measure distance with enough accuracy to create three-dimensional images of anything from a face to a fire hydrant. The units are mounted high up to see over pedestrians and other cars.
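As a rough illustration of the time-of-flight idea described above, a lidar unit converts how long a laser pulse takes to bounce back into a range, and each beam's range and angle into a 3-D point. The short Python sketch below shows only that basic geometry; the pulse timing and beam angles are invented for illustration and this is not Uber's or Velodyne's actual processing pipeline.

```python
# Minimal time-of-flight sketch: turn a lidar pulse's round trip into a range,
# then turn one beam's range and angles into a 3-D point.
# All numbers below are made up for illustration.

import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second


def distance_from_round_trip(seconds: float) -> float:
    """Half the round-trip time, times the speed of light, gives the range."""
    return SPEED_OF_LIGHT * seconds / 2.0


def point_from_beam(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one beam's range and pointing angles into an (x, y, z) point,
    the building block of the three-dimensional images described above."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z


# Example: a pulse that returns after 100 nanoseconds hit something ~15 m away.
rng = distance_from_round_trip(100e-9)
print(round(rng, 2), point_from_beam(rng, azimuth_deg=30.0, elevation_deg=-2.0))
```

Sweeping many such beams around the car, many times per second, is what builds the detailed picture of faces, fire hydrants and everything else nearby.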

Lidars are more versatile than radars and less prone to error than optical cameras, which can be fooled by light, but they’re also bigger and a lot more expensive.

By the time the cars are ready for prime time, five years from now at least, according to automakers, all this scientific equipment will be shrunk down and integrated into the look of the vehicle.

Uber didn’t want to wait until then. In Pittsburgh earlier this month, the San Francisco company unveiled to media its tech-laden experimental driverless cars.

Uber showed off a small fleet of driverless Ford Fusions and said it will soon add Volvos with a more streamlined look to the mix. Eventually, Uber’s driverless fleet will expand to other cities, and the cars won’t need a human chaperone anymore.

For now, the main purpose of Uber’s cars is research: mapping Pittsburgh; identifying objects; figuring out how to program the car’s computer so it knows what’s in front of it, what’s coming in from the side and how fast; and how to react to it all.

There might be an unintended bonus too: The very visible nature of the car’s tech capabilities perched on the rooftop will clearly indicate to passengers, drivers and pedestrians that Uber’s newest cars are robots.

“People around those cars can see them coming,” Steinfeld said. “For people who are feeling a little nervous about being around such a vehicle, it makes it easier to avoid them.”

For others, it will be a chance to become more familiar with the rapidly approaching era of driverless cars.

Lidar makers at present are in a “chicken-and-egg” situation, said Mike Jellen, chief executive at 3-D lidar pioneer Velodyne, based in Morgan Hill, Calif. Right now, the price of lidar systems ranges from thousands to tens of thousands of dollars.

“When the (carmakers) are using those sensors in the millions, you’ll see those cost points fall down in the mid-hundreds or below,” he said. As they become less expensive and more advanced, they’ll also become less noticeable.

Earlier this month, Velodyne announced the latest in a series of “Puck” models, or lidars not much bigger than a hockey puck.

When autonomous cars go commercial, the lidars on board are likely to be that size or smaller. Before then, Velodyne and its competitors are aiming to sell lidars into the current car market, for semi-autonomous cars that offer automatic lane changing and other driver-assistance features today.

Velodyne is counting on other markets to help get costs down and volumes up. The company also makes lidars that can be used in drones, industrial robots and mapping systems.

Last month, Ford and China’s Baidu said they will invest a combined $150 million in Velodyne. Several 3-D lidar startups have popped up, including Quanergy, a Sunnyvale, Calif., company that is working closely with Mercedes.

The market research firm Frost & Sullivan estimates that the 3-D imaging market will grow from $5.71 billion last year to $15.15 billion by 2020.
