Geo-4D recently became a member of Farm 491, an agritech business hub with links to the Royal Agricultural University and Cirencester College, with the aim of further developing our services in the agricultural sector. The team were tasked with mapping two sites close to the Rural Innovation Centre, with the final results to be used in dissertation projects investigating the potential for drone-gathered information to be utilised in arable agriculture.
The deliverables required from the survey were as follows:
- RGB Orthomosaic
- Vegetation Index Maps: NDVI, GNDVI & NDRE
- Variable Rate Maps
3D models of the sites were also to be generated as additional outputs, allowing interactive, online visualisation of both fields.
As Geo-4D manage a network of drone pilots across the country, the ability to share and mount our in-house sensors with as many of these operators as possible is essential. The vast majority of pilots in our network, along with many other UAV companies we've worked with, have a DJI Inspire 1 in their fleet, so we chose to develop our RedEdge mount for this platform. The RedEdge operates completely independently of the drone, with its own GPS and triggering system, and with the sensor powered by an external source, so all that was needed was a mount to let the Inspire fly with the multispectral sensor attached. After a fair bit of R&D, we modified a 3D-printed GoPro mount with a little carbon fibre, and the DJI DroneAg solution was created! One big advantage of the RedEdge's position on the Inspire is that the X3 camera could remain in place, allowing high-res RGB imagery to be gathered simultaneously with the multispectral images and enabling production of an RGB orthomosaic (without the need to build an RGB composite in GIS software after processing).
Automatic missions were planned for the DJI Inspire to map both fields, with the aim of guaranteeing acceptable image overlap for both the multispectral and RGB sensors. To ensure that the spectral images captured by the MicaSense RedEdge had sufficient overlap for processing, the camera's parameters had to be compared with those of the Inspire's X3 sensor, as the X3's parameters are what the flight planning app uses to generate the mission plan. For the best results with the RedEdge, it was important to capture images with 75% overlap (sidelap and frontlap) [NOTE: This can now be done using the Atlas Flight app, so no calculations are required]. As the 5-band sensor's parameters differ from those of the X3, comparative calculations were made to determine the flight track spacing on the Inspire's mission needed to still produce 75% sidelap for the RedEdge. The RedEdge was then set to trigger automatically at a 1 s interval, and the Inspire's flight speed was adjusted to give 75% frontlap based on the image footprint from an altitude of 100 m AGL. The result of these tweaks was a far higher sidelap for the X3 than would normally be required for a successful mapping mission, and therefore a longer survey time; however, this was essential to get the best results from the MicaSense RedEdge, the main purpose of the mission.
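The track spacing and flight speed calculations above can be sketched with a simple pinhole-camera footprint model. The RedEdge parameters below (4.8 × 3.6 mm sensor, 5.4 mm focal length) are published MicaSense figures, but treat them, and the resulting numbers, as illustrative assumptions rather than the exact values used on the day:

```python
# Sketch of the overlap calculations: from sensor geometry and altitude,
# derive the ground footprint, then the track spacing and ground speed
# that preserve 75% sidelap/frontlap for the RedEdge.

def ground_footprint_m(sensor_dim_mm: float, focal_mm: float, altitude_m: float) -> float:
    """Ground footprint of one image dimension via the pinhole camera model."""
    return sensor_dim_mm * altitude_m / focal_mm

ALTITUDE_M = 100.0        # flight altitude AGL from the mission plan
SIDELAP = 0.75            # required overlap between adjacent flight tracks
FRONTLAP = 0.75           # required overlap between successive images
TRIGGER_INTERVAL_S = 1.0  # RedEdge automatic trigger interval

# Across-track (width) and along-track (height) footprints at 100 m AGL
width_m = ground_footprint_m(4.8, 5.4, ALTITUDE_M)    # ~88.9 m
height_m = ground_footprint_m(3.6, 5.4, ALTITUDE_M)   # ~66.7 m

# Track spacing that still yields 75% sidelap for the RedEdge
track_spacing_m = width_m * (1 - SIDELAP)             # ~22.2 m

# Maximum ground speed so images 1 s apart retain 75% frontlap
max_speed_ms = height_m * (1 - FRONTLAP) / TRIGGER_INTERVAL_S  # ~16.7 m/s

print(f"track spacing: {track_spacing_m:.1f} m, max speed: {max_speed_ms:.1f} m/s")
```

Because the X3's wider across-track footprint is flown at the RedEdge's tighter spacing, the X3 ends up with far more sidelap than it needs, which matches the longer survey time noted above.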
Each set of spectral images, plus the additional RGB ones, was processed into separate georeferenced rasters, with the resulting reflectance maps for the Red, Green, Blue, RedEdge and NIR bands radiometrically calibrated against the MicaSense calibration panel to ensure that the data accounted for incident light conditions. The RGB orthomosaic returned an average GSD (ground sample distance) of 4.5 cm/pixel, with each spectral band raster generated at 7.5 cm/pixel, allowing both detailed visual scouting and production of vegetation index maps. For their projects, the students at RAU requested the following vegetation indices: NDVI, GNDVI and NDRE, applying the Red, Green, RedEdge and NIR spectral bands to the respective algorithms. Fertiliser information was then applied to these VI maps to generate variable rate application maps, detailing the quantity of fertiliser to apply to specific areas of the field to produce the best results while also saving money. The application maps can be exported as shapefiles and loaded straight into the tractor's software, completing the rapid drone-to-tractor workflow.
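The three indices are all normalised differences of two reflectance bands: NDVI = (NIR − Red)/(NIR + Red), GNDVI = (NIR − Green)/(NIR + Green) and NDRE = (NIR − RedEdge)/(NIR + RedEdge). A minimal sketch of the index and variable-rate step, using tiny hypothetical reflectance arrays in place of the real georeferenced rasters (the NDVI zone boundaries and fertiliser rates are illustrative assumptions, not the agronomic values used in the project):

```python
import numpy as np

def normalized_difference(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """(a - b) / (a + b), returning 0 where the denominator is 0."""
    denom = a + b
    return np.divide(a - b, denom, out=np.zeros_like(denom), where=denom != 0)

# Hypothetical calibrated reflectance values in [0, 1] for a 2x2 patch;
# in practice each band is a full raster loaded from the processed maps.
red      = np.array([[0.10, 0.12], [0.08, 0.30]])
green    = np.array([[0.15, 0.16], [0.12, 0.25]])
red_edge = np.array([[0.30, 0.28], [0.25, 0.32]])
nir      = np.array([[0.60, 0.55], [0.50, 0.35]])

ndvi  = normalized_difference(nir, red)       # general crop vigour
gndvi = normalized_difference(nir, green)     # more sensitive to chlorophyll
ndre  = normalized_difference(nir, red_edge)  # useful in denser canopies

# Hypothetical variable-rate step: bin NDVI into three fertiliser zones,
# applying more product where vigour is low (assumed rates in kg/ha).
rates_kg_ha = np.array([120, 90, 60])
zones = np.digitize(ndvi, bins=[0.4, 0.6])  # 0: low, 1: mid, 2: high vigour
rate_map = rates_kg_ha[zones]
```

In the real workflow the rate map would then be vectorised into zone polygons and exported as a shapefile for the tractor's software.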
In addition to the standard agricultural outputs, the team also generated 3D models of the fields from the RGB images, which were then uploaded to an interactive, online platform, giving the farmers the ability to view and analyse their fields in a completely different way.
All of the results will now be assessed, compared with ground-based data, and used as part of final dissertation projects investigating the uses of drones in agriculture.