Classifying Near-Miss Traffic Incidents through Video, Sensor, and Object Features
-
- YAMAMOTO Shuhei
- NTT Human Informatics Laboratories, NTT Corporation
-
- KURASHIMA Takeshi
- NTT Human Informatics Laboratories, NTT Corporation
-
- TODA Hiroyuki
- NTT Human Informatics Laboratories, NTT Corporation
Description
Front video and sensor data captured by vehicle-mounted event recorders are used not only as evidence in traffic accidents but also as near-miss traffic incident data for safe-driving education. However, most event recorder (ER) data shows only regular driving events. To utilize near-miss data for safe-driving education, we need to be able to easily and rapidly locate the appropriate data in large amounts of ER data through labels attached to the scenes/events of interest. This paper proposes a method that automatically identifies near-misses with objects such as pedestrians and bicycles by processing ER data. The proposed method extracts two deep feature representations that consider the car's status and its surrounding environment. The first feature representation is generated by considering the temporal transitions of the car's status. The second extracts the positional relationship between the car and surrounding objects by processing object detection results. Experiments on actual ER data demonstrate that the proposed method can accurately identify and tag near-miss events.
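The record gives no implementation details beyond the abstract. As a rough illustration only, below is a minimal sketch in PyTorch of a two-stream classifier in the spirit of the abstract: one recurrent encoder over the temporal transitions of car-status (sensor) signals and one over pooled per-frame object-detection features, fused for near-miss classification. The class name NearMissClassifier, the layer choices, the dimensions, and the mean-pooling fusion are all assumptions for illustration, not the authors' architecture.

# Minimal sketch (not the paper's implementation) of a two-stream near-miss
# classifier: sensor stream encodes temporal car status, object stream encodes
# per-frame detected objects. All sizes and the fusion scheme are assumptions.
import torch
import torch.nn as nn

class NearMissClassifier(nn.Module):
    def __init__(self, sensor_dim=6, obj_dim=5, hidden=64, num_classes=2):
        super().__init__()
        # Stream 1: temporal transitions of car status (e.g., speed, acceleration)
        self.sensor_encoder = nn.GRU(sensor_dim, hidden, batch_first=True)
        # Stream 2: per-frame object features (e.g., class id + bounding-box
        # geometry), pooled over objects, then encoded over time
        self.obj_proj = nn.Linear(obj_dim, hidden)
        self.obj_encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden * 2, num_classes)

    def forward(self, sensors, objects):
        # sensors: (batch, time, sensor_dim)
        # objects: (batch, time, max_objects, obj_dim), zero-padded
        _, h_sensor = self.sensor_encoder(sensors)        # (1, batch, hidden)
        obj_feat = self.obj_proj(objects).mean(dim=2)     # pool over objects
        _, h_obj = self.obj_encoder(obj_feat)             # (1, batch, hidden)
        fused = torch.cat([h_sensor[-1], h_obj[-1]], dim=-1)
        return self.head(fused)                           # near-miss logits

# Example: 8 clips, 100 frames, 6 sensor channels, up to 10 detected objects
model = NearMissClassifier()
logits = model(torch.randn(8, 100, 6), torch.randn(8, 100, 10, 5))
print(logits.shape)  # torch.Size([8, 2])

In such a setup, the object features would come from a detector run on the front video (for example, class and bounding-box geometry of each detected pedestrian or bicycle), which is what would give the second stream its positional information relative to the car.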
Journal
-
- IEICE Transactions on Information and Systems
-
IEICE Transactions on Information and Systems E105.D (2), 377-386, 2022-02-01
The Institute of Electronics, Information and Communication Engineers
Details
-
- CRID
- 1390009454809893888
-
- NII Article ID
- 130008149856
-
- ISSN
- 1745-1361
- 0916-8532
-
- Text Lang
- en
-
- Data Source
-
- JaLC
- Crossref
- CiNii Articles
-
- Abstract License Flag
- Disallowed