Sensor Fusion for Precision Agriculture

In Issue 28, by FIEA

Precision agriculture has benefited considerably from the use of unmanned aerial vehicles, often combined with active ranging sensors such as Lidar to gather information from beneath the crop canopy, but existing systems remain relatively expensive for the farming sector. In this project, researchers from the University of Calgary in Canada used an integrated sensor orientation method for the local referencing of Lidar measurements.

Their approach is based on loosely coupled, image-aided inertial navigation, in which the pose of the camera replaces GNSS measurements. The result is a low-cost solution for capturing topographic data of inaccessible areas as well as operating in GNSS-denied environments.
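To make the loosely coupled idea concrete, the sketch below shows a minimal Kalman-filter loop in which a camera-derived position takes the place of the GNSS update. This is an illustration only, assuming a simple constant-velocity state model; the class name PoseAidedINS and all noise values are hypothetical and are not taken from the University of Calgary system.

```python
# Minimal sketch of loosely coupled, image-aided inertial navigation.
# Illustrative only: the photogrammetric camera position plays the role
# that a GNSS fix would normally play in the filter update.
import numpy as np

class PoseAidedINS:
    def __init__(self, dt=0.01):
        self.dt = dt
        self.x = np.zeros(6)              # state: [px, py, pz, vx, vy, vz]
        self.P = np.eye(6)                # state covariance
        self.F = np.eye(6)                # constant-velocity transition
        self.F[:3, 3:] = dt * np.eye(3)
        self.Q = 1e-3 * np.eye(6)         # process noise (tuning assumption)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # position observed
        self.R = 0.05**2 * np.eye(3)      # camera-pose noise (assumption)

    def predict(self, accel):
        """Propagate the state with one IMU acceleration sample."""
        self.x = self.F @ self.x
        self.x[3:] += accel * self.dt
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update_camera_pose(self, cam_position):
        """Standard Kalman update, fed by a camera position instead of GNSS."""
        y = cam_position - self.H @ self.x            # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
```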

Integrating a low-cost 3D Lidar sensor with a DSLR camera and an industrial-grade INS allows the creation of high-resolution, complete point clouds of agricultural plots and the automatic generation of a terrain model to measure soil micro-topography. A critical challenge for this system is georeferencing the Lidar point clouds with an accuracy comparable to that of the image-based point clouds.
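Georeferencing a Lidar return typically chains two rigid transforms: the boresight calibration from the Lidar frame into the body (IMU) frame, and the time-interpolated navigation pose from the body frame into the local mapping frame. The sketch below shows that chain; the function and parameter names (georeference_point, R_ls_to_body, and so on) are hypothetical illustrations, not the authors' code.

```python
# Illustrative direct-georeferencing step (a sketch, not the project's code).
import numpy as np

def georeference_point(p_lidar, R_ls_to_body, lever_arm, R_body_to_map, t_body):
    """Map one Lidar return into the local mapping frame.

    p_lidar       : 3-vector in the Lidar sensor frame
    R_ls_to_body  : boresight rotation, Lidar frame -> body (IMU) frame
    lever_arm     : Lidar origin offset expressed in the body frame
    R_body_to_map : fused INS attitude at the point's timestamp
    t_body        : fused INS position at the point's timestamp
    """
    p_body = R_ls_to_body @ p_lidar + lever_arm   # sensor -> body frame
    return R_body_to_map @ p_body + t_body        # body -> local mapping frame
```

Because every point inherits the error of the interpolated pose, the accuracy of the fused trajectory directly bounds the accuracy of the georeferenced cloud, which is why matching image-based georeferencing accuracy is the critical challenge noted above.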

Read the full story at GIM.

Image Credit: University of Calgary
