RESEARCH on Vehicle Control / Driving Behavior
Automated driving control based on optical flow (2013-2016)
To construct human-like automated driving systems, we focused on optical flow, the field of velocity vectors generated by motion relative to the surrounding environment. Humans can perceive the direction of self-motion from optical flow, and they can track a target path by matching that direction to the point they want to reach on the path. This is a human-like (animal-like) control strategy. We therefore modeled the optical flow generated by the vehicle state and verified that the model strictly reflects the direction of self-motion, which had previously been confirmed only through experiments. We then incorporated this mathematical model of optical flow into vehicle steering control based on a nonlinear control method in order to construct a human-like controller. Simulations and vehicle experiments confirmed that the method is effective as an automated steering controller and can reproduce human steering behavior in terms of steering accuracy.
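As background for the idea above (not our controller itself): for a translating observer, the direction of self-motion corresponds to the focus of expansion (FoE) of the flow field, the point all flow vectors radiate from. A minimal sketch, assuming a synthetic noise-free radial flow field, recovers the FoE by least squares:

```python
import numpy as np

def estimate_foe(points, flows):
    """Estimate the focus of expansion from sparse optical flow.

    Each flow vector lies on a line through its image point and the FoE,
    so the FoE is the least-squares intersection of all those lines.
    """
    # unit normals perpendicular to each flow vector
    n = np.stack([flows[:, 1], -flows[:, 0]], axis=1)
    n = n / np.linalg.norm(n, axis=1, keepdims=True)
    # constraint per vector: n_i . x = n_i . p_i
    b = np.sum(n * points, axis=1)
    foe, *_ = np.linalg.lstsq(n, b, rcond=None)
    return foe

# synthetic flow expanding radially from a known FoE (hypothetical values)
true_foe = np.array([12.0, -5.0])
rng = np.random.default_rng(0)
pts = rng.uniform(-50, 50, size=(200, 2))
flow = 0.1 * (pts - true_foe)

print(estimate_foe(pts, flow))  # ≈ [12. -5.]
```

In practice the flow field would come from a camera (e.g. a dense optical flow estimator) and carry noise, so the least-squares fit would only approximate the FoE.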
Examining the influence of optical flow on driving behavior (2017-2018)
The results of our automated driving control method led us to a hypothesis: the usefulness of different regions of optical flow (for instance, far versus near, or central versus peripheral vision) for a driver's steering behavior is not uniform. We therefore built a unique simulation environment that can selectively mask either the optical flow or the road-edge information, and analyzed how these two sources of information affect the driver's steering performance within the scope of the two-point steering model. Remarkably, part of the results matched those produced by the optical flow controller. This suggests an intriguing possibility: driver models that capture driver characteristics can reveal new driver behavior even without conducting psychological experiments.
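The two-point steering model mentioned above is commonly formulated (following Salvucci and Gray, 2004) as a steering rate driven by a near point, which signals lateral-position error, and a far point, which provides preview of the upcoming road. A minimal sketch of one control step, with illustrative gains that are assumptions rather than fitted parameters:

```python
def two_point_steering_rate(theta_n, dtheta_n, dtheta_f,
                            k_f=16.0, k_n=4.0, k_i=0.3):
    """Rate of change of the steering angle in the two-point model.

    theta_n  : visual angle to the near point (lateral-error cue)
    dtheta_n : rate of change of the near-point angle
    dtheta_f : rate of change of the far-point angle (preview cue)
    k_f, k_n, k_i are illustrative placeholder gains.
    """
    return k_f * dtheta_f + k_n * dtheta_n + k_i * theta_n

# on the lane center with no rotation, no correction is issued
print(two_point_steering_rate(0.0, 0.0, 0.0))  # 0.0
```

Masking the road edges or the flow field, as in our simulation environment, effectively degrades the near-point or far-point input to such a model, which is what lets steering performance be analyzed within its scope.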
Analyzing the driver's gaze behavior (2018-current)
Drivers' gaze behavior in naturalistic and simulated driving tasks has been investigated for decades. Many studies explain a driver's gaze from the geometry of the road environment. We instead focused on vehicle states, such as optical flow and vehicle position, to explain the driver's gaze behavior, and modeled that behavior with a view to constructing advanced intelligent vehicle systems. The resulting model can simulate the driver's gaze behavior.
Investigation of visual information influencing driver's control using machine learning (2019-current)
RESEARCH on Human Robot Interaction
Human motion prediction to reduce mechanical delays of robot (2018-2019)
Face-to-face contact is an important functional behavior for humanoid robots, but mechanical delay makes it difficult to achieve face-to-face contact without lag. We therefore generate robot motion with little or no delay during face-to-face contact by predicting the human face position, using both fast machine-learning methods and conventional image-processing methods.
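The exact predictor used in this work is not detailed here; as a minimal baseline illustrating the idea, a constant-velocity extrapolation of the tracked face position over the known mechanical delay might look like this (the function name and values are hypothetical):

```python
import numpy as np

def predict_face_position(history, dt, delay):
    """Extrapolate the face position `delay` seconds ahead.

    history : (N, 2) array of recent face centers in pixels, oldest first
    dt      : sampling interval between history samples, in seconds
    delay   : mechanical delay to compensate, in seconds

    Constant-velocity baseline (an assumption, not the method in the demo):
    estimate velocity from the last two samples and extrapolate linearly.
    """
    v = (history[-1] - history[-2]) / dt
    return history[-1] + v * delay

# a short hypothetical track of face centers at ~30 fps
track = np.array([[100.0, 80.0], [104.0, 82.0], [108.0, 84.0]])
print(predict_face_position(track, dt=0.033, delay=0.1))
```

By commanding the robot toward the predicted rather than the currently observed face position, the mechanical delay is absorbed into the prediction horizon; a learned predictor plays the same role with better accuracy on non-linear head motion.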
Robot behavior to stop pedestrians in commercial facilities (2019-current)
For robots to spread into the real world, they must be designed so that humans actually use them. However, it is known that even when a robot is deployed in a real environment, it tends to be ignored. We therefore installed a robot in a shopping mall and investigated what kinds of robot behavior could attract passersby and make them stop.
PROFESSION / EDUCATION
[Oral] Y. Okafuji, T. Wada, T. Sugiura, K. Murakami, H. Ishida, “Drivers' Gaze Behaviors are Influenced by Vehicle Position”, 64th Annual Meeting of Human Factors and Ergonomics Society, Chicago, USA, October 2020 (to appear)
[Poster] Y. Okafuji, Y. Ozaki, J. Baba, A. Kitahara, J. Nakanishi, K. Ogawa, Y. Yoshikawa, H. Ishiguro, “Please Listen to Me: How to Make Passersby Stop by a Humanoid Robot in a Shopping Mall”, ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, March 2020
[Oral] Y. Okafuji, T. Fukao, H. Inou, “Theoretical Interpretation of Driver’s Gaze Considering Optic Flow and Seat Position”, IFAC Symposium on Analysis, Design, and Evaluation of Human-Machine Systems, Tallinn, Estonia, September 16-19, 2019
[Oral] C. Mole, G. Markkula, O. Giles, Y. Okafuji, R. Romano, N. Merat, R. Wilkie, “Drivers fail to calibrate to optic flow speed changes during automated driving”, Driving Assessment Conference, New Mexico, USA, June 24-27, 2019
[Demo] Y. Okafuji, J. Baba, J. Nakanishi, “Face-to-Face contact method for humanoid robots using face position prediction”, ACM/IEEE International Conference on Human-Robot Interaction, Daegu, Korea, March 2019
Domestic Conference (in Japan)
[Poster] Y. Okafuji, T. Fukao, H. Inou, "Automated Steering System Based on Optical Flow", JSME Robotics and Mechatronics Conference (ROBOMECH2014), May 2014 (in Japanese)
AWARDS / GRANTS / PATENT