In this study, we propose an intelligent agricultural worker assistance system that uses autonomous robot vehicles. The system autonomously controls cameras mounted on the vehicles to track workers and identify workflows from worker behavior in the field. It also formulates assistance requests, specifying the timing, location, and task to be performed to assist workers, autonomously assigns navigation routes and tasks to robots so that they travel to the required locations, and autonomously sends the request commands to the robots, without requiring workers to operate the system. The robots travel autonomously along their navigation paths by recognizing field scenes such as farm roads, furrows, ridges, and crops. Operators at remote sites use the worker assistance system to monitor the working information presented graphically in real time, including the assistance requests and the workflows, positions, orientations, and behaviors of workers and robots, via the Internet. Through these assistance operations, the operators can receive an agricultural education.
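The dispatch step described above can be illustrated with a minimal sketch. The function name, data layout, and the greedy nearest-robot rule below are assumptions for illustration, not the system's actual scheduler:

```python
import math

def assign_requests(robots, requests):
    """Greedily assign each assistance request to the nearest free robot.

    robots:   dict of robot id -> (x, y) field position
    requests: list of (x, y) locations where help is needed
    Returns a list of (robot_id, request_index) pairs.
    """
    free = dict(robots)
    plan = []
    for i, (rx, ry) in enumerate(requests):
        if not free:
            break  # more requests than robots; leftover requests wait
        best = min(free, key=lambda rid: math.hypot(free[rid][0] - rx,
                                                    free[rid][1] - ry))
        plan.append((best, i))
        del free[best]  # each robot serves one request at a time
    return plan

robots = {"R1": (0.0, 0.0), "R2": (50.0, 10.0)}
requests = [(48.0, 12.0), (2.0, 1.0)]
print(assign_requests(robots, requests))  # [('R2', 0), ('R1', 1)]
```

Once a robot is assigned, the system would send it the navigation route to the request location; route planning itself is outside this sketch.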
Development of an Agricultural Robot for Intelligent Worker Assistance
In our previous studies, we developed a vanishing point detection method that navigates an autonomous robot with a single camera (Makino, 2012). The vanishing point in a field image could be detected; however, the method could neither segment the input image into furrow and non-furrow regions nor identify a single targeted furrow line along which to navigate a vehicle or robot. Therefore, in another study, we developed a vision-based furrow line detection method that navigates vehicles and robots without GNSS, in greenhouses as well as open fields. The method robustly detected a single targeted furrow line in 17 types of test fields: nine crop fields (sweet pea, green pea, snow pea, lettuce, Chinese cabbage, cabbage, green pepper, tomato, and tea) and eight tilled soil fields (Teramoto et al., 2015; Morio et al., 2016b; Morio et al., 2017b). Although this system could detect a furrow line inside fields with a ridge-and-furrow structure, it had no function to detect the global position of the vehicle. We also developed a vision-based vehicle position estimation system; however, it could identify the driving route with high precision but could not estimate the precise global position of the vehicle (Hanada et al., 2016).
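The core geometric idea behind vanishing point detection is that parallel furrow edges in the field project to image lines that converge at a single point. A minimal sketch of that step, assuming the line extraction has already been done and each line is given in the form a*x + b*y = c (the least-squares formulation here is an illustration, not the published method):

```python
def vanishing_point(lines):
    """Least-squares intersection of image lines given as (a, b, c)
    with a*x + b*y = c.  With noise-free input the furrow edges meet
    exactly at the vanishing point."""
    saa = sab = sbb = sac = sbc = 0.0
    for a, b, c in lines:
        saa += a * a; sab += a * b; sbb += b * b
        sac += a * c; sbc += b * c
    # Solve the 2x2 normal equations for (x, y).
    det = saa * sbb - sab * sab
    x = (sbb * sac - sab * sbc) / det
    y = (saa * sbc - sab * sac) / det
    return x, y

# Two furrow edges in pixel coordinates, both passing through (320, 100):
#   y = -x + 420  ->  x + y = 420
#   y =  x - 220  ->  x - y = 220
print(vanishing_point([(1.0, 1.0, 420.0), (1.0, -1.0, 220.0)]))  # (320.0, 100.0)
```

With more than two detected edges the same normal equations average out noise, which is why a least-squares intersection is a common choice for this step.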
In the current study, we focus on developing a field scene recognition system that robustly and precisely estimates the global position of an autonomous agricultural vehicle without using GNSS. The system recognizes the global position of the vehicle from a set of field scene images captured sequentially by three cameras: a front camera, a left camera, and a right camera.
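One simple way to frame this kind of vision-based localization is as matching the current three-camera observation against scene descriptors pre-recorded along the route. The descriptor format, map layout, and nearest-neighbor rule below are assumptions for illustration, not the system's actual recognition pipeline:

```python
def estimate_position(route_map, observation):
    """Match the current (front, left, right) scene descriptor against a
    pre-recorded route map and return the position of the best match.

    route_map:   list of (position, descriptor) pairs recorded along the route
    observation: descriptor vector computed from the three camera images
    Squared Euclidean distance scores the match.
    """
    def sqdist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    pos, _ = min(route_map, key=lambda entry: sqdist(entry[1], observation))
    return pos

# Toy map: positions in metres along the furrow, 3-element scene descriptors.
route_map = [((0.0, 0.0), [0.1, 0.9, 0.2]),
             ((5.0, 0.0), [0.7, 0.3, 0.8]),
             ((10.0, 0.0), [0.2, 0.2, 0.9])]
print(estimate_position(route_map, [0.68, 0.32, 0.79]))  # (5.0, 0.0)
```

In practice the descriptors would come from the field scene recognition step (farm road, furrow, ridge, crop), and temporal filtering along the route would stabilize the estimate; both are outside this sketch.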
Programming, image processing, study, education, and management.
Programming classes for C++, R, Python, gnuplot, Scilab, and VBA.
Coaching in communication, presentation style, and slide design.
Faculty development facilitator, design of e-portfolios, Moodle & e-portfolio help desk.
Student counseling, design of training programs and environments.
Previous affiliations and my roles
Mie University, Organization for the Development of Higher Education and Regional Human Resources
Mie University, Higher Education Development Center
Mie University, Graduate School of Bioresources
Mie University, Faculty of Bioresources
Kyoto University, Graduate School of Agriculture
My universities and degrees
Kyoto University
Kyoto University, left the master's program early without obtaining the master's degree because I was employed as an assistant professor at Kyoto University
Mie University
Development of autonomous agricultural robots for intelligent worker assistance.
Send me a message.
1577 Kurimamachiya-cho, Tsu, Mie, Japan