
Project Report Details
Management number: 20170000000578
Title: FY2016 Interim Annual Report, International Research, Development and Demonstration Project / Co-funded Program / Japan-Israel R&D Cooperation Program "Research and Development of Object-Tracking AR Display Technology for In-Vehicle Display Equipment"
Publication date: 2018/1/26
Report fiscal year: 2016 - 2016
Contractor: Ricoh Company, Ltd.
Project number: P14005
Department: Robot and AI Department
Japanese abstract:
English abstract: Title: Japan-Israel R&D Cooperation Program "Development of the Augmented Reality indication technology for object tracking in the display equipment system" (FY2016-FY2017) FY2016 Annual Report

The in-vehicle display equipment system captures an object seen by the driver through the windshield, such as a vehicle, traffic lane, or pedestrian, and enables an augmented reality (hereinafter, AR) display of the object on the windshield. The system detects and measures the driver's eye position with the driver monitor, displays the object as seen by the driver based on the detected eye position, and continues to track the intended object. A prototype display equipment system is being designed and evaluated; its current development status is described below.
(1) External camera: We have developed a camera with a 71.6° horizontal angle of view, just above the 70° required to obtain external image information, by selecting an appropriate sensor size and lens (a field-of-view calculation sketch follows this list).
(2) Driver monitor: We have introduced a non-contact eye-position measuring instrument as the driver monitor. The driver monitor consists of two driver cameras, a synchronous control module, a USB hub, and a PC for calculation. Each driver camera consists of a near-infrared light source, a bandpass filter, an optical lens, and a USB-interface camera. This component design enables images to be captured while suppressing the effect of ambient light from the sun, street lights, and other sources that would otherwise interfere with outdoor use of the system.
(3) Object detection: We have developed an algorithm that detects objects in relative motion, and a software module that detects vehicles and traffic lanes. We have confirmed that vehicles and traffic lanes can be detected by feeding the software module driving video containing vehicles and traffic lanes that was previously recorded with the external camera (a generic motion-detection sketch follows this list).
(4) AR superimposing algorithm: We have developed an algorithm that changes the display position according to the driver's eye position. We have confirmed that the algorithm changes the output display position when given dummy object data, including vehicles and traffic lanes, together with the location coordinates of the driver's eye position (a line-of-sight projection sketch follows this list).
(5) Design of the AR superimposed display equipment system (prototype): We have designed an evaluation prototype of the display equipment system and built a unit evaluation environment for the algorithms that identify and track the object and move the AR display position to follow it according to the driver's eye position measured by the driver monitor.
(6) Development of wide-angle display equipment: We have specified conditions such as the display range as viewed from the driver, the optical characteristics of the windshield, and the layout of optical elements inside the display equipment system, in order to simulate the optical performance of the displayed images (a worked display-range example follows this list).
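
As a supplement to item (1), the horizontal angle of view of a camera follows from the sensor width and the lens focal length through the standard pinhole relation FOV = 2·arctan(w / 2f). The sketch below is a minimal Python illustration; the sensor width and focal length are assumed example values, not figures taken from this report.

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal angle of view of a pinhole camera model, in degrees."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Assumed example values -- the report does not state the sensor or lens used.
# A sensor about 7.2 mm wide combined with a 5 mm lens gives roughly 71.5 degrees,
# i.e. a combination just above the required 70 degrees.
print(horizontal_fov_deg(7.2, 5.0))
```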
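
The report does not describe the detection algorithm of item (3) itself; as one generic illustration of detecting objects in relative motion in recorded driving video, the following sketch uses simple frame differencing with OpenCV. The file name and the numeric thresholds are placeholders, and this is not presented as the method developed in the project.

```python
import cv2

cap = cv2.VideoCapture("driving_video.mp4")   # placeholder file name
ok, prev = cap.read()
prev_gray = cv2.GaussianBlur(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), (5, 5), 0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (5, 5), 0)
    diff = cv2.absdiff(prev_gray, gray)        # pixels that changed between frames
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Candidate regions of objects in relative motion (area threshold is arbitrary).
    boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
    prev_gray = gray

cap.release()
```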
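
For item (4), the geometric core is that the AR mark must lie on the line of sight from the driver's eye to the object, so the display position is the intersection of that line with the display (virtual image) plane. The sketch below assumes a flat plane at a fixed distance in vehicle coordinates; the plane distance and the sample coordinates are assumptions for illustration only.

```python
import numpy as np

def display_position(eye_xyz, object_xyz, plane_x=2.5):
    """Intersect the eye-to-object line of sight with a vertical plane at
    x = plane_x (vehicle coordinates: x forward, y left, z up) and return
    the (y, z) point where the AR mark should be drawn. The flat plane at
    2.5 m is an illustrative stand-in for the virtual image plane."""
    eye = np.asarray(eye_xyz, dtype=float)
    obj = np.asarray(object_xyz, dtype=float)
    t = (plane_x - eye[0]) / (obj[0] - eye[0])   # parameter along the sight line
    point = eye + t * (obj - eye)
    return point[1], point[2]

# Moving the eye position shifts the mark for the same object (dummy coordinates).
print(display_position(eye_xyz=(0.0, -0.40, 1.20), object_xyz=(30.0, 1.0, 0.5)))
print(display_position(eye_xyz=(0.0, -0.30, 1.25), object_xyz=(30.0, 1.0, 0.5)))
```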
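
For item (6), one of the specified conditions, the display range as viewed from the driver, is linked to the required virtual image size by the virtual image distance. The following sketch shows that relation; the angular range and distance are assumed example conditions, not values from the report.

```python
import math

def virtual_image_size_m(h_fov_deg: float, v_fov_deg: float, distance_m: float):
    """Width and height of a flat virtual image that fills the given angular
    display range when viewed from the eye at the given distance."""
    width = 2.0 * distance_m * math.tan(math.radians(h_fov_deg) / 2.0)
    height = 2.0 * distance_m * math.tan(math.radians(v_fov_deg) / 2.0)
    return width, height

# Assumed example conditions: a 20 deg x 5 deg display range at a 5 m distance.
print(virtual_image_size_m(20.0, 5.0, 5.0))   # roughly (1.76 m, 0.44 m)
```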
Download: Please download the report from the project report database (user registration required).
