Ford Fusion autonomous research vehicles use LiDAR sensor technology to see in the dark

April 12, 2016, Ford Motor Company

Recently, under the cover of night, a Ford Fusion Hybrid autonomous research vehicle with no headlights on navigated along lonely desert roads, performing a task that would be perilous for a human driver.

Driving in pitch black at Ford Arizona Proving Ground marks the next step on the company's journey to delivering fully autonomous vehicles to customers around the globe. It's an important development, in that it shows that even without cameras, which rely on light, Ford's LiDAR – working with the car's virtual driver software – is robust enough to steer flawlessly around winding roads. While it's ideal to have all three modes of sensors – radar, cameras and LiDAR – the latter can function independently on roads without stoplights.

National Highway Traffic Safety Administration data has found the passenger vehicle occupant fatality rate during dark hours to be about three times higher than the daytime rate.

"Thanks to LiDAR, the test cars aren't reliant on the sun shining, nor cameras detecting painted white lines on the asphalt," says Jim McBride, Ford technical leader for autonomous vehicles. "In fact, LiDAR allows autonomous cars to drive just as well in the dark as they do in the light of day."

To navigate in the dark, Ford self-driving cars use high-resolution 3D maps – complete with information about the road, road markings, geography, topography and landmarks like signs, buildings and trees. The vehicle uses LiDAR pulses to pinpoint itself on the map in real time. Additional data from radar gets fused with that of LiDAR to complete the full sensing capability of the autonomous vehicle.
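
The article doesn't spell out Ford's localization algorithms, but the basic idea of pinpointing the vehicle by matching a live LiDAR scan against a prior map can be sketched. Below is a minimal, hypothetical Python example (the function names and the brute-force search are illustrative assumptions, not Ford's software): it scores candidate poses by how well transformed scan points line up with mapped landmark points.

```python
# Minimal sketch of map-based localization (not Ford's actual software):
# score candidate vehicle poses by how well a live LiDAR scan lines up
# with landmark points stored in a prior high-resolution map.
import numpy as np

def transform(scan_xy, pose):
    """Rotate and translate scan points (N, 2) from the vehicle frame
    into the map frame for a candidate pose (x, y, heading)."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return scan_xy @ R.T + np.array([x, y])

def score(scan_xy, map_xy, pose, sigma=0.5):
    """Score a pose: higher when transformed scan points fall close to
    mapped landmark points (a crude scan-matching likelihood)."""
    pts = transform(scan_xy, pose)
    # Distance from every scan point to its nearest mapped point.
    d = np.min(np.linalg.norm(pts[:, None, :] - map_xy[None, :, :], axis=2), axis=1)
    return np.exp(-(d / sigma) ** 2).sum()

def localize(scan_xy, map_xy, guess, search=1.0, step=0.25):
    """Brute-force search for the best pose near an initial guess
    (which would come from odometry or the previous estimate)."""
    best_pose, best_score = guess, -np.inf
    for dx in np.arange(-search, search + step, step):
        for dy in np.arange(-search, search + step, step):
            for dth in np.deg2rad([-4.0, -2.0, 0.0, 2.0, 4.0]):
                pose = (guess[0] + dx, guess[1] + dy, guess[2] + dth)
                s = score(scan_xy, map_xy, pose)
                if s > best_score:
                    best_pose, best_score = pose, s
    return best_pose
```

A production system would use something like a particle filter or ICP/NDT registration with spatial indexing rather than this dense grid search, and would fuse radar data as the article notes; the sketch only illustrates the map-matching principle.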

For the desert test, Ford engineers, sporting night-vision goggles, monitored the Fusion from inside and outside the vehicle. Night vision allowed them to see the LiDAR doing its job in the form of a grid of infrared laser beams projected around the vehicle as it drove past. LiDAR sensors shoot out 2.8 million laser pulses a second to precisely scan the surrounding environment.
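
For a sense of scale, at the quoted 2.8 million pulses per second, a spinning LiDAR completing 10 sweeps per second (an assumed, typical rotation rate, not a figure from the article) would return roughly 280,000 points per full rotation:

```python
# Back-of-the-envelope point throughput from the quoted pulse rate.
pulses_per_second = 2_800_000   # figure quoted in the article
sweeps_per_second = 10          # assumed rotation rate, typical for spinning LiDARs
points_per_sweep = pulses_per_second / sweeps_per_second
print(f"{points_per_sweep:,.0f} points per 360-degree sweep")  # 280,000
```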

"Inside the car, I could feel it moving, but when I looked out the window, I only saw darkness," describes Wayne Williams, a Ford research scientist and engineer. "As I rode in the back seat, I was following the car's progression in real time using computer monitoring. Sure enough, it stayed precisely on track along those winding roads."

After more than a decade of Ford autonomous vehicle research, the company is dedicated to achieving fully autonomous driving capability, which, as defined by SAE International Level 4, does not require the driver to intervene and take control of the vehicle.

This year, Ford will triple its autonomous vehicle test fleet – bringing the number to about 30 self-driving Fusion Hybrid sedans for testing on roads in California, Arizona and Michigan.

These developments are key elements of Ford Smart Mobility, the plan to take Ford to the next level in connectivity, mobility, autonomous vehicles, the customer experience, and data and analytics.


4 comments


Eikka, Apr 13, 2016
"In fact, LiDAR allows autonomous cars to drive just as well in the dark as they do in the light of day."


But not in rain, fog, dusty or snowy conditions, because that confuses the heck out of lidars, and they can get interference from other cars' lidars as well, which is a prospect that nobody has yet tested out.

"Ford self-driving cars use high-resolution 3D maps – complete with information about the road, road markings, geography, topography and landmarks like signs, buildings and trees. The vehicle uses LiDAR pulses to pinpoint itself on the map in real time."


A virtual railroad. The road is mapped in advance and the car is simply told where to drive. That's not exactly "self-driving", because where reality differs from what the car "knows", it has no understanding of what's happening or how to react appropriately.
Eikka, Apr 13, 2016
The problem is that the AI may see a blob in the radar that isn't in the virtual map, but it has no idea what that blob means. The map it understands, because it's been programmed with everything that's in it, but the real world around it is not programmed in, because it can't be. The AI is living in its own "imagination", as if sleepwalking.

Imagine you were the computer. You're walking around your room, and seeing everything that's in the room because you remember where things are, what things are, and what to do with everything. Suddenly something new comes in - it feels like an invisible finger is poking you on the cheek. What does it mean? What's causing it? How do you tell? The invisible poker comes and goes as you move and look around, but there's no identifiable cause or reason to it in your mental map of the room. It's just there, and nobody's told you what it is or what to do with it.

Meanwhile in the real world, an angry robber is pushing a gun to your face.
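
[Editor's note: as a rough illustration of the point this comment makes, the hypothetical sketch below (editorial, not from the article or any vendor's code) shows that comparing a live scan against the prior map can flag returns the map doesn't explain, but a flag is not an interpretation.]

```python
# Hypothetical sketch: flag LiDAR returns with no counterpart in the
# prior map. Detecting an "unmapped blob" is easy; knowing what it is
# and how to react is the hard part the comment describes.
import numpy as np

def unmapped_points(scan_xy, map_xy, threshold=1.0):
    """Return scan points (already in the map frame) that lie farther
    than `threshold` metres from every mapped landmark point."""
    dists = np.min(np.linalg.norm(scan_xy[:, None, :] - map_xy[None, :, :], axis=2), axis=1)
    return scan_xy[dists > threshold]
```
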
Eikka, Apr 13, 2016
Continuing with the example, to understand how little modern AI really understands, imagine that the robber in your room took your coffee cup and moved it slightly.

Whoops, now you're in a conflict. The coffee cup no longer looks like a coffee cup because it's in the wrong place, which is not encoded in your mental map of reality. The coffee cup has disappeared from the world!

Suddenly you're no longer sure where you are, because in your room there's supposed to be a coffee cup. You know there is a coffee cup, so this must be a different room, and suddenly nothing makes sense. After all, you don't have a hierarchy for object permanence, so a mug is as important as the walls around you in defining your location in the world, and your location in the world defines what everything is – because that's all you know.

So you might have some tolerance for missing objects or sensor noise, but still, if the robber messes with enough stuff, you're completely lost.
Eikka, Apr 13, 2016
There's supposed to exist a video, since removed from YouTube, in which a Google test driver describes how he had to grab the wheel of a Google car driving down a canyon pass because the car lost its bearings passing a truck and mistook the side of the trailer for the canyon wall.

Suddenly the car's "reality" shifted one lane width to the left, with obvious consequences: the car thinks it's in the wrong lane and tries to get back to the right lane, which in the real world happens to be down the cliffside.

That's the danger of running on virtual rails: if for any reason you get a glitch and a discrepancy between the programming and reality, the car will do some crazy things.
