RESEARCH on Vehicle Control / Driving Behavior
Automated driving control based on optical flow
In order to construct human-like automated driving systems, we focused on optical flow, the field of velocity vectors generated on the retina by motion through the surrounding environment. Humans can perceive the direction of self-motion from optic flow, and they can track a target path by matching that direction to the point they want to reach on the path. This is a human-like (animal-like) control method. We therefore modeled the optic flow generated by the vehicle state and verified that the model accurately reflects the direction of self-motion, which previously had been confirmed only through experiments. We then introduced this mathematical model of optic flow into vehicle steering control based on a nonlinear control method, in order to construct a human-like controller. The results of simulations and vehicle experiments confirmed that this method is effective as an automated steering controller and can reproduce human steering behavior in terms of steering accuracy.
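As an illustration of the heading-perception idea above: under pure translation, optic flow vectors radiate outward from the focus of expansion (FoE), which marks the direction of self-motion. The following is a minimal sketch, not the model from this research; it recovers the FoE from a few sparse flow vectors by linear least squares, and all point and flow values are synthetic.

```python
import numpy as np

def estimate_foe(points, flows):
    """Estimate the focus of expansion (FoE) from sparse optic flow.

    For pure camera translation, each flow vector lies on the line
    through its image point and the FoE, i.e. (p - foe) x v = 0.
    Expanding the cross product gives, per point:
        v_y * foe_x - v_x * foe_y = v_y * p_x - v_x * p_y
    Stacking these rows yields a linear system solved by least squares.
    """
    points = np.asarray(points, dtype=float)
    flows = np.asarray(flows, dtype=float)
    A = np.column_stack([flows[:, 1], -flows[:, 0]])
    b = flows[:, 1] * points[:, 0] - flows[:, 0] * points[:, 1]
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe

# Synthetic expanding flow field: v = p - foe_true (noise-free).
foe_true = np.array([0.2, -0.1])
pts = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0], [0.5, 0.7]])
vecs = pts - foe_true
print(estimate_foe(pts, vecs))  # close to [0.2, -0.1]
```

With noise-free radial flow the recovered FoE matches the true heading point; real flow fields also contain a rotational component from yaw, which is what makes a model of flow driven by the full vehicle state necessary.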
Examining the influence of optic flow on driving behavior
The results of our proposed automated driving control method led us to a hypothesis: the usefulness of different regions of optic flow (for instance, far versus near, or central versus peripheral vision) for a driver's steering behavior is not uniform. We therefore built a simulation environment that can selectively mask either the optic flow or the road-edge information, and analyzed how these two sources of information affect the driver's steering performance within the framework of the two-point steering model. Interestingly, some of the results match those produced by the optical flow controller. This suggests that driver models capturing driver characteristics can uncover new driver behavior even without conducting psychological experiments.
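The two-point steering model referenced above (in Salvucci and Gray's well-known formulation) computes a steering rate from the visual angles to a near road point (compensatory lane keeping) and a far road point (anticipatory heading control). A minimal sketch of the control law follows; the gains are purely illustrative, not fitted values from this research.

```python
def two_point_steering_rate(theta_far_rate, theta_near_rate, theta_near,
                            k_f=15.0, k_n=10.0, k_i=5.0):
    """Two-point visual control law (after Salvucci & Gray, 2004).

    Steering rate responds to the rotation rate of the far point
    (anticipation), the rotation rate of the near point (compensation),
    and an integral-like term on the near-point angle that nulls
    steady-state lane offset. Gains k_f, k_n, k_i are illustrative.
    """
    return k_f * theta_far_rate + k_n * theta_near_rate + k_i * theta_near

# Stationary far/near points but a residual near-point angle of 0.1 rad
# still produce a corrective steering rate.
print(two_point_steering_rate(0.0, 0.0, 0.1))  # 0.5
```

Masking the road edges removes the near-point input, while masking the flow field degrades the far-point input, which is what makes this model a natural framework for the masking experiments described above.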
Analyzing the driver's gaze behavior
Drivers' gaze behavior in naturalistic and simulated driving tasks has been investigated for decades. Many studies explain a driver's gaze from the geometry of the road environment. In contrast, we focused on vehicle states, such as optical flow and vehicle position, to explain gaze behavior. We show that drivers' gaze strategies can be interpreted through optical flow theory, which quantifies the extent to which drivers can perceive the future path of the vehicle. In addition, we assume that gaze behavior is shaped by the relative importance of two demands: lane keeping and route prediction. We then modeled the driver's gaze behavior with a view to constructing advanced intelligent vehicle systems; the resulting model can simulate drivers' gaze behavior.
Investigation of visual region influencing driving behavior using machine learning
One aspect of drivers' cognitive behavior is which information, in which visual regions, is used for steering and throttle control. In previous research, this behavior could be analyzed only in a special simulator environment. We therefore proposed a CNN model incorporating human physical characteristics that can analyze cognitive behavior in a real environment. By verifying that the analysis produced by the proposed model correctly reflects human cognitive behavior, even though machine learning models are often considered black boxes, we established a new method for analyzing humans.
RESEARCH on Human Robot Interaction
Human motion prediction to reduce a robot's mechanical delay
Face-to-face contact is an important functional behavior for humanoid robots, but mechanical delay makes it difficult to achieve without lag. We therefore generate robot motion with little to no delay during face-to-face contact by predicting the human face position, using both fast machine learning methods and conventional image processing methods.
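A minimal sketch of the delay-compensation idea: if the robot's actuation lags by a known delay, commanding it toward an extrapolated face position rather than the last observed one cancels part of that lag. This sketch assumes a simple constant-velocity model over the last two observations; the actual system uses fast machine learning combined with conventional image processing, and the numbers below are made up.

```python
def predict_face_position(history, delay):
    """Extrapolate a tracked face position `delay` seconds ahead.

    history: list of (t, x, y) observations, newest last.
    Assumes constant velocity between the last two samples; a real
    predictor would smooth over more frames or use a learned model.
    """
    (t0, x0, y0), (t1, x1, y1) = history[-2], history[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return (x1 + vx * delay, y1 + vy * delay)

# Face moved from (0, 0) to (1, 2) over one second; with a 0.5 s
# actuation delay, aim where the face will be, not where it was.
print(predict_face_position([(0.0, 0.0, 0.0), (1.0, 1.0, 2.0)], 0.5))
```

The trade-off is the usual one for dead reckoning: longer delays amplify prediction error, which is why faster perception pipelines matter alongside prediction.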
Robot behavior to stop pedestrians in commercial facilities
For robots to spread into the real world, we need robots that humans will actually use. However, it is known that robots deployed in real environments tend to be ignored. We therefore installed a robot in a shopping mall and investigated what kinds of robot behavior could attract passersby and make them stop.
PROFESSION / EDUCATION
[Oral] Y. Okafuji, J. Baba, J. Nakanishi, J. Amada, Y. Yoshikawa, H. Ishiguro, “Persuasion Strategies for Social Robot to Keep Humans Accepting Daily Different Recommendations,” IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Virtual Conference, September 2021
[Oral] J. Amada, Y. Okafuji, T. Wada, J. Baba, J. Nakanishi, Y. Yoshikawa, “Behavioral Changes in Passersby by Expanding Embodiment of a Calling Robot,” IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), Virtual Conference, August 2021
[Oral] C. Zhang, Y. Okafuji, T. Wada, “Evaluation of visualization performance of CNN models using driver model,” IEEE/SICE International Symposium on System Integration (SII), Virtual Conference, January 2021
[Oral] Y. Okafuji, T. Sugiura, T. Wada, “Preliminary investigation of visual information influencing driver’s steering control based on CNN”, IEEE International Conference on Systems, Man, and Cybernetics (SMC), Virtual Conference, October 2020
[Oral] R. Ukita, Y. Okafuji, T. Wada, “A simulation study on lane-change control of automated vehicles to reduce motion sickness based on a computational model”, IEEE International Conference on Systems, Man, and Cybernetics (SMC), Virtual Conference, October 2020
[Oral] T. Wada, J. Kawano, Y. Okafuji, A. Takamatsu, M. Makita, “A computational model of motion sickness considering visual and vestibular information”, IEEE International Conference on Systems, Man, and Cybernetics (SMC), Virtual Conference, October 2020
[Oral] Y. Okafuji, T. Wada, T. Sugiura, K. Murakami, H. Ishida, “Drivers' gaze behaviors are influenced by vehicle position”, 64th Annual Meeting of Human Factors and Ergonomics Society (HFES), Virtual Conference, October 2020
[Poster] Y. Okafuji, Y. Ozaki, J. Baba, A. Kitahara, J. Nakanishi, K. Ogawa, Y. Yoshikawa, H. Ishiguro, “Please listen to me: How to make passersby stop by a humanoid robot in a shopping mall”, ACM/IEEE International Conference on Human-Robot Interaction (HRI), Cambridge, UK, March 2020
[Oral] Y. Okafuji, T. Fukao, H. Inou, “Theoretical interpretation of driver’s gaze considering optic flow and seat position”, IFAC Symposium on Analysis, Design, and Evaluation of Human-Machine Systems (IFAC-HMS), Tallinn, Estonia, September 16-19, 2019
[Oral] C. Mole, G. Markkula, O. Giles, Y. Okafuji, R. Romano, N. Merat, R. Wilkie, “Drivers fail to calibrate to optic flow speed changes during automated driving”, Driving Assessment Conference, New Mexico, USA, June 24-27, 2019
[Demo] Y. Okafuji, J. Baba, J. Nakanishi, “Face-to-Face contact method for humanoid robots using face position prediction”, ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, March 2019
Domestic Conference (in Japan)
[Poster] Y. Okafuji, T. Fukao, H. Inou, “Automated steering system based on optical flow”, Robotics and Mechatronics Conference (ROBOMECH2014), May 2014
AWARDS / GRANTS / PATENT
Principal Investigator
© 2017 Yuki Okafuji