Delineating game damage with drones

For the game damage use case (use case 3), we need to collect reference data to train the deep learning models. This use case focuses on detecting damage in agricultural fields and grasslands, primarily caused by wild boar. We decided to focus first on maize parcels, as wild boar damage is most easily detected in this type of crop. Ideally, we can derive the percentage of crop damage per Sentinel-2 pixel. But we as humans cannot easily see the maize from above. Luckily, drones can!

From September to December 2021, Lennert Destadsbader, a scientific collaborator at the KU Leuven ‘Forest, Nature and Landscape’ research group, collected drone imagery over more than 100 maize fields across the eastern part of Flanders, where crop damage by wild boar is most abundant.

The raw drone images were then converted into orthophotos in Pix4D.

Stitching raw drone images to an orthophoto (image credit: Sam Ottoy)

Crop damage can be manually digitized on these orthophotos in QGIS.

Digitizing crop damage on a drone-based orthophoto
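Once damage polygons have been digitized, they can be overlaid on the 10 m Sentinel-2 grid to estimate the damaged fraction of each pixel. Below is a minimal, dependency-free sketch that supersamples a pixel into 1 m subcells and tests each subcell centre against a polygon; in a real workflow you would rasterize the QGIS polygons with a GIS library such as GDAL or rasterio, and the coordinates here are made up for illustration.

```python
# Estimate the damaged fraction of a 10 m Sentinel-2 pixel from a
# digitized damage polygon, by supersampling the pixel into 1 m
# subcells and testing each subcell centre against the polygon.
# (Illustrative sketch; real workflows would rasterize the digitized
# polygons with a GIS library such as GDAL/rasterio instead.)

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is point (x, y) inside the polygon?"""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Toggle on each polygon edge crossed by a horizontal ray.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def damage_fraction(px_x0, px_y0, pixel_size, polygon, subcells=10):
    """Fraction of a pixel (lower-left corner at px_x0, px_y0) inside the polygon."""
    hit = 0
    step = pixel_size / subcells
    for row in range(subcells):
        for col in range(subcells):
            cx = px_x0 + (col + 0.5) * step  # subcell centre x
            cy = px_y0 + (row + 0.5) * step  # subcell centre y
            if point_in_polygon(cx, cy, polygon):
                hit += 1
    return hit / (subcells * subcells)

# Hypothetical damage polygon covering the left half of a 10 m pixel:
damage = [(0.0, 0.0), (5.0, 0.0), (5.0, 10.0), (0.0, 10.0)]
print(damage_fraction(0.0, 0.0, 10.0, damage))  # → 0.5
```

Running this per pixel over a whole parcel yields the per-pixel damage percentages that the deep learning model could be trained against.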

Over the next months, a group of five bachelor's students in Bioscience Engineering will digitize the damage in all available drone-based orthophotos. They will also assess whether the damage can be “seen” on Sentinel-2 images at 10 or 20 m spatial resolution. Their observations will help us conceptualize a deep learning model for detecting crop damage from Sentinel data.

Below, you can read an article from a Flemish newspaper about the detection of game damage in agricultural crops and how remote sensing can help.