Fujitsu Develops Video-Processing Technology Enabling World's First Wraparound View of Vehicles in Real Time

Nov 17, 2008
Results achieved employing Fujitsu Laboratories' new video-processing technology (using same-source raw video data)

Fujitsu Laboratories announced today the development of a new video-processing technology that enables a complete wraparound view of a vehicle's perimeter in real time, enhancing the driver's field of view. The new driver-assistance technology adapts to different driving situations, enabling the driver to view the vehicle's entire surroundings from the point of view and field of view best suited to each situation. The technology improves driving safety by assisting the driver in a variety of situations, such as parking, passing on a narrow street, and seeing around corners at intersections with poor sightlines.

Details of this new technology will be presented at the 15th World Congress on Intelligent Transport Systems & ITS America's 2008 Annual Meeting and Exposition, being held from November 16 to 20 in New York.

Background

In recent years, vehicle-mounted cameras have grown in popularity as a tool for enhancing driver safety. In 2007, sales of vehicle-mounted cameras in Japan grew to exceed 4 million units. In the U.S., a bill was passed in February of this year mandating improved rearview driver visibility, and studies conducted to support that legislation reported that cameras are the best way to achieve improved visibility. In short, there is growing global interest in car-mounted cameras as a means of giving drivers a better view and thereby facilitating safer driving.

Wraparound video image of a vehicle's periphery

Among the various car-mounted camera systems currently in practical use, some use individual cameras to cover what would otherwise be a blind spot - such as cameras for seeing behind the vehicle, at intersections with poor sightlines, or when passing on narrow streets. Also available are parking-assistance systems that use four cameras mounted around the vehicle to film the surrounding roadway; their images are processed into a virtual bird's-eye 2-D view shown on a monitor.
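As a rough illustration of how such four-camera parking-assist systems typically flatten a camera image onto the roadway to form a bird's-eye view (the article does not describe the exact method those systems use), the sketch below applies an inverse perspective mapping with a planar homography using Python and OpenCV. The image file, marker coordinates, and output scale are illustrative placeholders.

```python
# Minimal sketch of inverse perspective mapping (bird's-eye view) for a single
# camera using a planar homography. The image file, marker coordinates, and
# output scale are illustrative placeholders, not parameters of any real system.
import cv2
import numpy as np

frame = cv2.imread("rear_camera.png")             # placeholder: one rear-camera frame
if frame is None:                                 # synthetic fallback so the sketch runs as-is
    frame = np.full((480, 640, 3), 128, np.uint8)

# Pixel positions of four ground markers as seen by the camera (hand-picked here) ...
src = np.float32([[220, 480], [420, 480], [600, 300], [40, 300]])
# ... and where those markers should land in the top-down image (e.g. 1 px = 1 cm).
dst = np.float32([[150, 400], [350, 400], [350, 100], [150, 100]])

H = cv2.getPerspectiveTransform(src, dst)         # 3x3 homography onto the ground plane
topdown = cv2.warpPerspective(frame, H, (500, 500))
cv2.imwrite("rear_topdown.png", topdown)
```

A complete parking-assist system would typically repeat this warp for each of the four cameras and stitch the results into a single top-down composite around the vehicle.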

Technological Challenges

Drivers navigate their vehicles in a variety of situations - such as parking, turning, and merging into traffic - that demand immediate visual checks of the vehicle's perimeter. Providing visual assistance therefore requires addressing the following issues:

1. Reducing the burden of visual checks on the driver

In addition to the driver's own direct field of view, visual aids such as rearview and sideview mirrors and rearview monitors extend what the driver can see. Although these aids allow a wider field of view to be covered quickly, the need to refer to all of them instantly in order to drive safely imposes a considerable cognitive load on the driver.

The four-camera system that provides a bird's-eye view is likewise limited: it shows only the roadway within about two meters of the vehicle, so a rearview monitor must still be added. These cameras do not integrate the field-of-view information that demands the driver's immediate attention, and thus fail to adequately reduce the driver's cognitive load.

2. Difficulty for the driver in recognizing the point of view (perspective), sightline, and field of view shown on the monitor

With conventional technologies, each camera and each function differs in perspective and sightline. Because the display changes instantly, the driver must immediately recognize which view is being presented; this makes perimeter checks more difficult and limits the situations in which the technology can be used effectively. In addition, substantial time is required to become accustomed to such systems.

Newly Developed Technology

To address these problems, Fujitsu Laboratories developed the world's first video-processing technology that supplements the driver's field of view with views of the vehicle's periphery from any perspective and along any sightline, and that can transition smoothly and instantaneously from one view to another.

Four cameras were installed around the vehicle's perimeter, and their video images of the surroundings were synthesized using a "3-D virtual projection/point-of-view conversion technology" developed by Fujitsu Laboratories, which projects the video images onto a virtual 3-D curved surface to form a virtual 3-D video and then converts that video into views of the vehicle's surroundings from any desired perspective ("omni-view"). This technology enables drivers to easily check views from any direction or perspective, thereby alleviating the cognitive burden of recognizing which view is being displayed.
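To make the idea concrete, the following Python/NumPy sketch shows the general principle behind such a projection: points of a virtual curved surface (a bowl shape is assumed here; the article does not specify Fujitsu's actual surface geometry) are projected both into a physical camera, to find which recorded pixel lands there, and into a freely chosen virtual camera, to find where to draw it. All intrinsics and poses are illustrative placeholders, not the actual system parameters.

```python
# Sketch of the core idea behind "3-D virtual projection / point-of-view conversion":
# project points of a virtual curved surface into a physical camera (which recorded
# pixel lands there) and into a freely chosen virtual camera (where to draw it).
# The bowl shape, intrinsics, and poses are illustrative placeholders.
import numpy as np

def bowl_surface(n_r=64, n_a=128, flat_radius=2.0, rim_height=3.0, max_radius=8.0):
    """Sample points of a bowl: flat near the vehicle, curving upward farther out."""
    r = np.linspace(0.1, max_radius, n_r)
    a = np.linspace(0.0, 2.0 * np.pi, n_a, endpoint=False)
    R, A = np.meshgrid(r, a)
    X, Y = R * np.cos(A), R * np.sin(A)
    Z = np.where(R < flat_radius, 0.0,
                 rim_height * ((R - flat_radius) / (max_radius - flat_radius)) ** 2)
    return np.stack([X.ravel(), Y.ravel(), Z.ravel()], axis=1)      # (N, 3)

def project(points, K, R, t):
    """Project world points into a pinhole camera with intrinsics K and extrinsics (R, t)."""
    cam = points @ R.T + t                   # world -> camera coordinates
    in_front = cam[:, 2] > 0.1               # keep only points in front of the camera
    uv = cam @ K.T
    return uv[:, :2] / uv[:, 2:3], in_front  # perspective divide -> pixel coordinates

# Illustrative pinhole intrinsics for 640x480 images (same K reused for both cameras).
K = np.array([[400.0, 0.0, 320.0], [0.0, 400.0, 240.0], [0.0, 0.0, 1.0]])

surface = bowl_surface()
# Placeholder extrinsics: a real system would use each camera's calibrated mounting pose.
uv_phys, vis_phys = project(surface, K, np.eye(3), np.array([0.0, 0.0, 1.0]))
# The virtual camera pose is chosen freely per driving situation (the "omni-view" idea).
uv_virt, vis_virt = project(surface, K, np.eye(3), np.array([0.0, -4.0, 6.0]))

# For each surface point visible in both cameras, the recorded pixel at uv_phys would be
# drawn at uv_virt; in the real system this per-frame lookup runs on the graphics chip.
both = vis_phys & vis_virt
print(f"{both.sum()} of {len(surface)} surface points map between the two views")
```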

Additionally, when the driving situation changes and the display switches between views, the new technology transitions smoothly from one view to another by continuously interpolating the point of view, field of view, and sightline. This helps the driver quickly orient to the newly displayed view when making periphery checks.
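The article does not detail the interpolation scheme; one straightforward way to realize such a transition is to blend the virtual camera's position and field of view linearly while interpolating its orientation with spherical linear interpolation (slerp), as in the sketch below. The start and end views and the frame count are illustrative assumptions.

```python
# Sketch of smoothly transitioning a virtual camera between two predefined views
# by interpolating position, field of view, and orientation (quaternion slerp).
# The start/end poses and the number of transition frames are illustrative.
import numpy as np

def slerp(q0, q1, s):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    dot = np.dot(q0, q1)
    if dot < 0.0:                      # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:                   # nearly parallel: fall back to linear blend
        q = q0 + s * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - s) * theta) * q0 + np.sin(s * theta) * q1) / np.sin(theta)

def interpolate_view(view_a, view_b, s):
    """Blend two virtual-camera views; s runs from 0 (view_a) to 1 (view_b)."""
    pos = (1 - s) * view_a["pos"] + s * view_b["pos"]          # point of view
    fov = (1 - s) * view_a["fov"] + s * view_b["fov"]          # field of view
    rot = slerp(view_a["quat"], view_b["quat"], s)             # sightline
    return {"pos": pos, "fov": fov, "quat": rot}

# Example: move from a top-down parking view to a rear merging view over 15 frames.
top_down = {"pos": np.array([0.0, 0.0, 10.0]), "fov": 60.0,
            "quat": np.array([1.0, 0.0, 0.0, 0.0])}
rear_view = {"pos": np.array([0.0, -5.0, 2.0]), "fov": 90.0,
             "quat": np.array([0.924, 0.383, 0.0, 0.0])}       # ~45 deg pitch
frames = [interpolate_view(top_down, rear_view, s) for s in np.linspace(0.0, 1.0, 15)]
```

Rendering each intermediate pose in sequence produces the continuous camera sweep that keeps the driver oriented during the switch.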

Results

This new technology comprises the MB86R01 SoC graphics chip for automobiles from Fujitsu Microelectronics Limited - a general-purpose embeddable image-processing platform that supports OpenGL ES - and a video-processing chip that combines the video images from the four cameras. As a vehicle view-assistance system, it achieves real-time operation with a video-processing time of 30 milliseconds.
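For context, a 30-millisecond processing time corresponds to roughly 33 frames per second, meaning each synthesized view can be ready before the next camera frame arrives at typical 30 fps video rates. The small sketch below illustrates how such a per-frame budget can be checked; the stand-in workload and timing harness are purely illustrative, not part of the described system.

```python
# Sketch of checking that per-frame processing stays within a real-time budget.
# The 30 ms figure comes from the article; the workload below is a stand-in.
import time

FRAME_BUDGET_S = 1.0 / 30.0            # ~33 ms per frame at 30 fps video

def process_frame():
    time.sleep(0.030)                  # stand-in for a 30 ms synthesis pipeline

start = time.perf_counter()
process_frame()
elapsed = time.perf_counter() - start
print(f"frame took {elapsed * 1000:.1f} ms, "
      f"{'within' if elapsed <= FRAME_BUDGET_S else 'over'} the real-time budget")
```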

With this technology, the driver can obtain an overview of various driving situations from a single image, instantly grasping whatever around the vehicle requires immediate attention. For example, when parking, the driver can stay apprised not only of the roadway but also of nearby cars and people. At highway merging points, the driver can see in real time the view ahead of and behind the vehicle on both sides, and whether there is enough room to merge. When turning, the driver can also easily view the surroundings on the side opposite the driver's side, to ensure that no pedestrians are too close.

Provided by Fujitsu Laboratories
