Yuki Okafuji, Ph.D.
CyberAgent AI Lab
last updated: 15/Feb./2023
I am ...
Yuki Okafuji / 岡藤 勇希
Ph.D. in Engineering / 博士（工学）
Research Scientist at CyberAgent AI Lab
株式会社サイバーエージェント AI Lab、リサーチサイエンティスト
E-mail: okafuji_yuki_xd [at] cyberagent.co.jp
Visiting Associate Professor at Ritsumeikan University, Playful Lab
E-mail: yokafuji [at] fc.ritsumei.ac.jp
Visiting Researcher at Osaka University, Intelligent Robotics Lab
E-mail: okafuji.yuki [at] irl.sys.es.osaka-u.ac.jp
RESEARCH on Vehicle Control / Driving Behavior
Automated driving control based on optical flow
To construct human-like automated driving systems, we focused on optical flow, the field of velocity vectors generated on the retina by motion through the surrounding environment. Humans can perceive the direction of self-motion from optical flow, and they can track a target path by aligning that perceived direction with the point on the path they want to reach; this is a human-like (more generally, animal-like) control strategy. We therefore modeled the optical flow generated by the vehicle state and verified that the model accurately reflects the perceived direction of self-motion, which had previously been examined only through experiments. We then incorporated this mathematical model of optical flow into a vehicle steering controller based on a nonlinear control method. Simulations and vehicle experiments confirmed that the method is effective as an automated steering controller and can reproduce human steering behavior in terms of steering accuracy.
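The core idea, that optical flow encodes the direction of self-motion, can be sketched in a few lines. The example below is only an illustration of that principle, not our controller: the point layout, depth-dependent scales, and least-squares recovery are illustrative assumptions. For pure translation, flow vectors radiate from the focus of expansion (the heading point), so that point can be recovered from the flow field alone:

```python
import numpy as np

rng = np.random.default_rng(0)

def synth_flow(points, foe, scales):
    # pure-translation flow radiates from the focus of expansion (FOE);
    # per-point scales stand in for depth-dependent flow magnitudes
    return scales * (points - foe)

def estimate_foe(points, flows):
    # each flow vector f at point p is parallel to (p - FOE), giving one
    # linear equation per point:  fy*ex - fx*ey = fy*px - fx*py
    A = np.stack([flows[:, 1], -flows[:, 0]], axis=1)
    b = flows[:, 1] * points[:, 0] - flows[:, 0] * points[:, 1]
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe

points = rng.uniform(-200.0, 200.0, size=(50, 2))   # image positions
true_foe = np.array([30.0, -12.0])                  # heading point
flows = synth_flow(points, true_foe, rng.uniform(0.05, 0.5, size=(50, 1)))
est = estimate_foe(points, flows)
```

With noise-free flow the least-squares estimate recovers the focus of expansion essentially exactly.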
Examining the influence of optical flow on driving behavior
The results of our automated driving controller led us to a hypothesis: the usefulness of different regions of optical flow (for instance, far versus near, or central versus peripheral vision) for a driver's steering behavior is not uniform. We therefore built a simulation environment that can selectively mask either the optical flow or the road-edge information, and analyzed how these two sources of information affect the driver's steering performance within the framework of the two-point steering model. Remarkably, part of the results matched the results produced by the optical-flow-based controller. This suggests that driver models capturing driver characteristics can reveal new aspects of driver behavior even without psychological experiments.
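For context, a minimal sketch of the two-point steering model used as the analysis framework (in the spirit of Salvucci and Gray's formulation; the gains, look-ahead distances, and straight-road geometry here are illustrative assumptions, not the parameters of our study):

```python
import math

def two_point_steering(lat_err, heading, d_near=5.0,
                       k_near=0.3, k_far=0.8):
    # visual angle to a near point (lane centre just ahead) and a far
    # point (lead point on the horizon); both vanish when aligned
    theta_near = math.atan2(-lat_err, d_near) - heading
    theta_far = -heading  # straight road: far point lies dead ahead
    return k_near * theta_near + k_far * theta_far  # heading rate

# toy simulation: straight road, initial 1 m lateral offset at 15 m/s
dt, speed = 0.05, 15.0
lat_err, heading = 1.0, 0.0
for _ in range(400):
    heading += two_point_steering(lat_err, heading) * dt
    lat_err += speed * math.sin(heading) * dt
```

The near point drives correction of the lateral error while the far point stabilizes heading; with these toy gains the offset decays in a damped oscillation.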
Analyzing the driver's gaze behavior
Drivers' gaze behavior in naturalistic and simulated driving tasks has been investigated for decades, and many studies explain a driver's gaze in terms of the geometry of the road environment. We instead focused on vehicle states, such as optical flow and vehicle position, to explain drivers' gaze behavior. We showed that drivers' gaze strategies can be interpreted through optical flow theory, which quantifies the extent to which drivers can perceive the vehicle's future path. We further assume that gaze behavior is shaped by two factors: the importance of lane keeping and the importance of route prediction. On this basis, we modeled drivers' gaze behavior for use in advanced intelligent vehicle systems; the resulting model can reproduce drivers' gaze behavior.
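As a toy illustration of the two-factor idea (the functional form, distances, and sensitivity constant below are our illustrative assumptions, not the published model): when lane keeping becomes urgent, e.g. under a large lateral error, gaze shifts toward a near point, and otherwise it favors a far route-prediction point:

```python
def gaze_distance(lat_err, d_lk=8.0, d_rp=40.0, sensitivity=2.0):
    # weight on the near lane-keeping point grows with lateral error,
    # saturating at 1; the remainder goes to the far route-prediction point
    w_lk = min(1.0, sensitivity * abs(lat_err))
    return w_lk * d_lk + (1.0 - w_lk) * d_rp
```

A well-centred vehicle yields far, route-oriented gaze (`gaze_distance(0.0)` gives 40 m here), while a large offset pulls gaze down to the near lane-keeping point.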
Investigation of visual region influencing driving behavior using machine learning
One aspect of drivers' cognitive behavior is which information, in which visual regions, is used for steering and throttle control. Previous research analyzed this behavior only in specialized simulator environments. We therefore proposed a CNN model that incorporates human physical characteristics and can analyze cognitive behavior in real environments. By verifying that the model's analytical results correctly reflect human cognitive behavior, even though machine learning models are often considered black boxes, we established a new method for human analysis.
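One common way such "human physical characteristics" enter a vision model is foveated input. The sketch below is an illustrative assumption on our part, not the exact mechanism of the proposed model: it attenuates an image with distance from a gaze point, mimicking the fall-off of visual acuity away from the fovea, before the image would be fed to a CNN:

```python
import numpy as np

def foveate(image, gaze, sigma=30.0):
    # Gaussian acuity mask centred on the gaze point (x, y);
    # pixels far from the fovea are attenuated toward zero
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    acuity = np.exp(-((xs - gaze[0]) ** 2 + (ys - gaze[1]) ** 2)
                    / (2.0 * sigma ** 2))
    return image * acuity[..., None] if image.ndim == 3 else image * acuity

img = np.ones((100, 100))          # dummy grayscale frame
weighted = foveate(img, gaze=(50, 50))
```

The gaze point keeps full intensity while the periphery is suppressed, so the network can only "see" clearly where the human was looking.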
RESEARCH on Human Robot Interaction
Human motion prediction to reduce robots' mechanical delays
Face-to-face contact is an important functional behavior for humanoid robots, but mechanical delays make it difficult to achieve without lag. We therefore generate robot motion with little or no delay during face-to-face contact by predicting the human's face position, combining fast machine learning methods with conventional image processing.
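A minimal sketch of the delay-compensation idea: extrapolate the tracked face position over the robot's mechanical delay. The constant-velocity assumption and the helper below are illustrative; the actual system combines learned and image-based predictors:

```python
import numpy as np

def predict_face_position(history, delay):
    # constant-velocity extrapolation from the last two (time, position)
    # observations, `delay` seconds into the future
    (t0, p0), (t1, p1) = history[-2], history[-1]
    velocity = (np.asarray(p1) - np.asarray(p0)) / (t1 - t0)
    return np.asarray(p1) + velocity * delay

# face moving right at 0.2 m/s; compensate a 0.1 s actuation delay
track = [(0.0, (0.00, 1.0)), (0.1, (0.02, 1.0))]
target = predict_face_position(track, delay=0.1)
```

Commanding the robot toward `target` rather than the last observed position lets the motion arrive roughly when the face does.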
Robot behavior to stop pedestrians in commercial facilities
For robots to spread into the real world, they must be designed so that people actually use them. However, it is known that even robots deployed in real environments tend to be ignored. We therefore installed a robot in a shopping mall and investigated what kinds of robot behavior can attract passersby and make them stop.
Persistence of robots' persuasion strategies over long-term interaction
PROFESSION / EDUCATION
Domestic Conference (in Japan)
AWARDS / GRANTS / PATENT
Principal Investigator / 研究代表
Co-Investigator / 研究分担
© 2017 Yuki Okafuji