Obstacle avoidance using convolutional neural network for drone navigation in oil palm plantation
Date
2019-06
Authors
Lee Hui Yin
Abstract
In Malaysia, the oil palm plantation industry is one of the vital sectors contributing to the
country's economy. In recent years, drones have been widely applied in precision agriculture due to
their flexibility and capability. However, one of the challenges in a low-altitude flight
mission is the ability to avoid obstacles in order to prevent the drone from crashing. Most
previous studies demonstrated obstacle avoidance systems using active
sensors, which are not applicable to small aerial vehicles due to cost, weight, and
power consumption constraints. In this research, we present a novel system that enables
the autonomous navigation of a small drone in an oil palm plantation using only a single
camera. The system is divided into two main stages: vision-based obstacle detection,
in which the obstacles in the input images are detected, and motion control, in which the
avoidance decisions are made based on the results from the first stage. As monocular
vision does not provide depth information, a machine learning model, Faster R-CNN,
was trained and adapted for tree trunk detection. Subsequently, the heights of the
predicted bounding boxes were used to estimate the distances of the trunks from the drone.
The detection model's performance was validated on the testing images in terms of
average precision. In the system, the drone is programmed to move forward until the
detection model detects a close frontal obstacle. The avoidance direction
is then defined by commanding a yaw angle corresponding to the x-coordinate in
the image that indicates the optimal path direction, i.e., the widest obstacle-free space.
We demonstrated the performance of the system by carrying out flight tests in a real oil palm
plantation environment at two different locations, one of which was a previously
unseen place. The results showed that the proposed method was accurate and robust for
vision-based autonomous drone navigation in oil palm plantations.
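The motion-control stage described above can be sketched in a few lines: treat a tall bounding box as a close trunk, then yaw toward the center of the widest horizontal gap between the detected boxes. This is a minimal illustrative sketch only; the box format `(x1, y1, x2, y2)`, the height threshold, the proportional yaw gain, and the frame width are assumed values, not the calibration used in the thesis.

```python
# Sketch of the two-stage avoidance logic: a tall box is assumed to mean a
# close trunk; the yaw command steers toward the widest obstacle-free span.
IMAGE_WIDTH = 640        # assumed camera frame width in pixels
CLOSE_HEIGHT_PX = 200    # assumed threshold: taller box => closer trunk

def close_obstacles(boxes):
    """Keep only boxes whose pixel height suggests a nearby trunk."""
    return [b for b in boxes if (b[3] - b[1]) >= CLOSE_HEIGHT_PX]

def widest_gap_center(boxes, width=IMAGE_WIDTH):
    """x-coordinate at the middle of the widest horizontal span that is
    free of boxes, considering the image edges as boundaries."""
    edges = sorted([(0, 0)] + [(b[0], b[2]) for b in boxes] + [(width, width)])
    best_center, best_gap = width / 2, -1.0
    for (_, right), (left, _) in zip(edges, edges[1:]):
        gap = left - right          # free space between two neighbours
        if gap > best_gap:
            best_gap, best_center = gap, (left + right) / 2
    return best_center

def avoidance_yaw(boxes, width=IMAGE_WIDTH, gain=0.1):
    """Yaw command (degrees; assumed proportional mapping): zero when no
    close frontal obstacle, otherwise steer toward the widest gap."""
    near = close_obstacles(boxes)
    if not near:
        return 0.0                  # keep moving forward
    return gain * (widest_gap_center(near, width) - width / 2)
```

For example, a single close trunk on the left of the frame yields a positive yaw command, turning the drone toward the larger free space on the right.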