UAV/UAS in Agriculture Nuffield Scholarship Follow-up (GRDC supported)

Today I thought I would pull together some of the information that has been produced as part of my GRDC sponsored Nuffield Scholarship on the use of unmanned aerial vehicles in the grains industry. It has been an amazing experience and I encourage farmers from around the world to check out the Nuffield organisation to see if a scholarship would suit them.

Some of the published information includes:

I am in the final stages of writing my report which will be available through Nuffield later this year.

Challenges facing UAS in agriculture

After just finishing a post on some of the applications for UAS in agriculture, I thought I would share what I believe to be some of the biggest challenges the industry faces. As I continue my Nuffield Scholarship studies (thanks to GRDC) I am learning that unmanned aerial systems, and the data they produce, are increasingly complex. In fact, many people will complete a PhD on just one aspect of these systems (e.g. flight characteristics or remote sensing). Some of the more obvious challenges include:

1. Repeatability

If you were to go out and map a paddock at 11am and then again at 2pm, the resulting pixel values would be different. Even using the Normalised Difference Vegetation Index (NDVI), which by definition gives a normalised value, the data will be different. This is probably because the atmosphere and clouds do not block/transmit/reflect all wavelengths in equal proportions. We see this effect in satellite imagery where the image is affected by cloud shadow. The table below shows the Red and NIR pixel values from two points in the same barley paddock, a paddock with minimal variability.

Cloud shadow vs no shadow NDVI, Landsat 8, barley

Despite crop growth activity and biomass being very similar, these areas produce significantly different NDVI values due to the cloud shadow.
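To make the effect concrete, here is a minimal sketch of the NDVI arithmetic. The Red and NIR numbers are hypothetical stand-ins, not the values from the Landsat table above, but they show how a shadow that attenuates the two bands by different proportions shifts the index even when the crop underneath is similar.

```python
# Minimal sketch: how cloud shadow can shift NDVI for similar vegetation.
# The Red/NIR digital numbers below are hypothetical, not from the table above.

def ndvi(red, nir):
    """Normalised Difference Vegetation Index."""
    return (nir - red) / (nir + red)

# Two sampling points in the same barley paddock on the same day
sunlit = {"red": 1200, "nir": 9500}   # point outside the cloud shadow
shaded = {"red": 700,  "nir": 4200}   # point under the cloud shadow

print("Sunlit NDVI:", round(ndvi(**sunlit), 3))
print("Shaded NDVI:", round(ndvi(**shaded), 3))
# The two NDVI values differ noticeably even though biomass is similar,
# because the shadow does not attenuate red and NIR by the same proportion.
```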

This applies equally to UAS imagery. The atmosphere, clouds and sun angle are constantly changing throughout the day, affecting the reflectance of each wavelength differently. Therefore, for data to be compared like for like, it needs to be calibrated.

2. Calibration

This section follows on nicely from repeatability. If data is going to be used for more than just scouting then some sort of calibration will need to take place. As discussed in a previous blog post, calibration of NDVI could potentially be achieved using an active ground device such as a GreenSeeker. There are also new payloads that are true multispectral and have an upward-looking sensor to measure irradiance, which could be used to calibrate data for true reflectance. Other forms of calibration/ground truthing include biomass cutting and weighing, tissue testing, and so on.
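As one illustration of what that calibration might look like, the sketch below fits a simple linear relationship between NDVI sampled from a UAS mosaic and GreenSeeker readings taken at the same points, then applies it to any map pixel. All of the values, and the choice of a linear fit, are assumptions for illustration, not a documented workflow.

```python
# A minimal calibration sketch, assuming paired UAS and GreenSeeker NDVI
# readings at the same ground points. All numbers are hypothetical.
import numpy as np

camera_ndvi      = np.array([0.42, 0.55, 0.61, 0.70, 0.78])  # sampled from the UAS mosaic
greenseeker_ndvi = np.array([0.38, 0.52, 0.58, 0.69, 0.75])  # ground readings at the same spots

# Fit a straight line mapping camera NDVI onto the GreenSeeker scale
slope, intercept = np.polyfit(camera_ndvi, greenseeker_ndvi, 1)

def calibrate(pixel_ndvi):
    """Convert a camera-derived NDVI value to the GreenSeeker scale."""
    return slope * pixel_ndvi + intercept

print(round(calibrate(0.65), 3))  # calibrated estimate for one map pixel
```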

3. Algorithms & Indices

When it comes to remote sensing and vegetation, NDVI is the most famous index used, and for good reason, but it is not without its concerns. If you look on Wikipedia, the authors cover some of the issues with NDVI:

Users of NDVI have tended to estimate a large number of vegetation properties from the value of this index. Typical examples include the Leaf Area Index, biomass, chlorophyll concentration in leaves, plant productivity, fractional vegetation cover, accumulated rainfall, etc. Such relations are often derived by correlating space-derived NDVI values with ground-measured values of these variables. This approach raises further issues related to the spatial scale associated with the measurements, as satellite sensors always measure radiation quantities for areas substantially larger than those sampled by field instruments. Furthermore, it is of course illogical to claim that all these relations hold at once, because that would imply that all of these environmental properties would be directly and unequivocally related between themselves.

Thankfully we are not pigeon-holed into NDVI. Agribotix claim they get better results using the Difference Vegetation Index (DVI). Another example is the Soil Adjusted Vegetation Index (SAVI).
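For reference, here is a short sketch of the three indices mentioned above using illustrative reflectance values between 0 and 1. The SAVI soil factor of 0.5 is a commonly quoted default, not something prescribed in this post.

```python
# Sketch of the three vegetation indices discussed above. Band values are
# illustrative reflectances (0-1), not measurements from any real paddock.

def ndvi(red, nir):
    # Normalised Difference Vegetation Index
    return (nir - red) / (nir + red)

def dvi(red, nir):
    # Difference Vegetation Index: a simple band difference
    return nir - red

def savi(red, nir, L=0.5):
    # Soil Adjusted Vegetation Index; L=0.5 is a common default soil factor
    return (nir - red) / (nir + red + L) * (1 + L)

red, nir = 0.08, 0.45
print(round(ndvi(red, nir), 3), round(dvi(red, nir), 3), round(savi(red, nir), 3))
```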

4. Position

Without ground control points, the positional accuracy of the data will be mediocre at best. Expect XY accuracy of a few meters, and even more error on the Z axis. GPS records the position at which each frame is captured (plus or minus delay error), but the pitch, yaw and roll of the UAS, which affect how the image is framed on the ground, are determined by an inertial measurement unit (IMU). The quality of the IMU will have a bearing on the positional accuracy if the processing software takes these variables into consideration. Expect to lay out a minimum of four ground control points for high accuracy data.

UAS image processed with no GCP

5. Reliable data collection

Several factors in the UAS data collection process affect how reliably data can be collected. The process usually involves the UAS following a set lawnmower-style track, conducting swaths up and back as it moves across the area of interest. As the vehicle is flying, it captures an image every few seconds depending on its speed. Forward and side overlap greater than 50% is required. Although processing software can handle images with some angle to the ground, extreme pitch and roll will affect the overall product.
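The sketch below shows the sort of back-of-envelope planning involved: estimating the image footprint, the ground sample distance (GSD) and the trigger interval needed to hold a target forward overlap. The camera figures (roughly those of a small compact camera) and the 75% overlap target are assumptions for illustration, not specifications from any particular system.

```python
# Rough mission-planning sketch: footprint, GSD and photo interval for a
# target forward overlap. All input figures are assumptions.

altitude_m      = 120.0    # flying height above ground
focal_mm        = 5.2      # lens focal length
sensor_w_mm     = 7.4      # sensor width (across track)
sensor_h_mm     = 5.6      # sensor height (along track)
image_w_px      = 4000
ground_speed_ms = 12.0
forward_overlap = 0.75     # 75% forward overlap between successive photos

footprint_w = altitude_m * sensor_w_mm / focal_mm        # metres across track
footprint_h = altitude_m * sensor_h_mm / focal_mm        # metres along track
gsd_cm      = footprint_w / image_w_px * 100             # centimetres per pixel

photo_spacing_m  = footprint_h * (1 - forward_overlap)   # distance between exposures
trigger_interval = photo_spacing_m / ground_speed_ms     # seconds between photos

print(f"Footprint: {footprint_w:.0f} x {footprint_h:.0f} m, GSD ~{gsd_cm:.1f} cm")
print(f"Trigger every {photo_spacing_m:.0f} m (~{trigger_interval:.1f} s)")
```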

If the UAS is hit by a gust of wind during data collection, it may put 2 or 3 images off target while the autopilot makes adjustments. This can lead to a hole in the data set, and it is not uncommon. Some UAS manufacturers allow you to rapidly transfer the raw data off the vehicle to a laptop in the paddock to check data coverage and quality in the field (e.g. Precision Hawk).

So remember: when someone claims their machine can fly in high winds, it is probably true, but the data collected in those conditions may not be of much use.

Sample flight path and image footprints

6. Data processing

Reliably collecting the data is just the first step in the process. Once all these images have been collected they need to be stitched together. It is amazing that stitching 300+ 12MP images, all taken at slightly different angles to the ground, is even possible – and even more so that 3D surface models can be constructed from these 2D images. Given the complexity of this task it takes a large amount of computing power and time (think several hours for 100ha). For this reason there are several cloud based platforms which offer this service (e.g. DroneMapper and PrecisionMapper). Processing on your own desktop computer and using online services both have their pros and cons. A downside to the cloud services is the internet bandwidth required to first transfer the raw data to the server and then retrieve it once it has been processed. A downside to the desktop solution is the upfront cost of hardware and software, and the required skill set may not be available in house.

7. Storage & sharing

Once the data is processed it needs to be stored somewhere and somehow distributed. One scene can often be more than a gigabyte, and if the processed data is not cut into tiles it can require a powerful machine just to view it. This is where online solutions come into play, although the same bandwidth issues as above apply. At some point, if the data is going to be used for more than just viewing, it will most likely need to be transferred onto a local machine. This is an online map of one of the first areas I mapped with my DIY Finwing Penguin UAS. If for some reason the data is to be printed, it needs to be formatted as such, which takes time and software.

8. System integration

Integrating data generated from UAS into existing precision agriculture software should be possible, but probably not at its highest available spatial resolution. Software such as Farmworks and SST was not designed for the dense data sets that UAS sensors can generate. Resampling data to 0.5m resolution may be required.
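As a hedged sketch of that resampling step, the snippet below down-samples a high resolution orthomosaic to 0.5 m using the rasterio library. The file names and the choice of average resampling are assumptions for illustration only.

```python
# Down-sample a UAS orthomosaic to 0.5 m so it can be loaded into farm
# management software. File names are hypothetical.
import rasterio
from rasterio.enums import Resampling

target_res = 0.5  # metres

with rasterio.open("uas_ndvi_3cm.tif") as src:
    scale = src.res[0] / target_res          # e.g. 0.03 / 0.5
    out_h = max(1, int(src.height * scale))
    out_w = max(1, int(src.width * scale))
    data = src.read(out_shape=(src.count, out_h, out_w),
                    resampling=Resampling.average)
    # Adjust the geotransform to match the new pixel size
    transform = src.transform * src.transform.scale(src.width / out_w,
                                                    src.height / out_h)
    profile = src.profile
    profile.update(height=out_h, width=out_w, transform=transform)

with rasterio.open("uas_ndvi_0p5m.tif", "w", **profile) as dst:
    dst.write(data)
```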

9. Safety and legal

A whole other post could be written on this, but basically in Australia we have the Civil Aviation Safety Authority (CASA). They require anyone who wants to fly UAS (they call them remotely piloted aircraft or RPA) commercially to have an Operator's Certificate and Controller's Certificate. This, among other things, requires a theory component equivalent to a private pilot's licence, comprehensive written manuals, and a stringent approvals process to ensure your vehicle is fit for service and your piloting skills are sufficient. CASA is currently reviewing this process and should hopefully have a revision out by the end of the year. The process seems strict, but it is important for keeping the UAS industry safe and professional.

Agricultural applications for UAS data

We can talk about how amazing this new technology is all we want but what most people are beginning to ask now is ‘how does it make me more money?’ I thought I would put together a list of some of the more common applications for data collected from small fixed wing UAS, particularly in broad acre agriculture.

1. Scouting

Probably the most talked about, and easiest to apply, is the ability to aid in scouting paddocks. Think about the perspective you get when you fly over farm land. The process goes something like this: fly over the paddock with a UAS carrying a NIR and visual capable sensor, stitch the images together to form a georeferenced mosaic, and then calculate the Normalised Difference Vegetation Index (NDVI) or Difference Vegetation Index (DVI). The resulting map shows the variability of crop health over the entire paddock and allows you to concentrate on areas of poor health. In addition, human-driven variation such as planter problems, compaction and chemical application issues is very obvious.

To make this process even easier there are a couple of iPad and Android tablet applications (e.g. PDF Maps) that allow you to import the map and use your GPS position to locate the areas on the map you are looking for. You are then able to add notes and photos by dropping place marks on the map. Below is a great demonstration of PDF Maps by Crop Tech Consulting.

2. Site specific weed control

Using a similar process to scouting, but making sure to use a sensor with high spatial resolution, the resulting map could be used to identify where large individual weeds are located in a paddock. This could work in a fallow or in-crop situation. In our farming environment we face glyphosate resistant weeds that require high chemical coverage to kill. Again, an app like PDF Maps could be used to find single weeds in a paddock and mechanically or chemically eradicate them.
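A minimal sketch of the idea, assuming a fallow paddock where any strongly vegetated pixel is treated as a weed: threshold an NDVI array and convert the flagged cells to ground coordinates. The array, the threshold, the paddock origin and the pixel size are all illustrative assumptions.

```python
# Flag likely weeds in a fallow by thresholding NDVI. All values are illustrative.
import numpy as np

ndvi = np.array([[0.08, 0.10, 0.09],
                 [0.11, 0.62, 0.12],   # 0.62 = a live plant among bare soil
                 [0.09, 0.10, 0.08]])

weed_threshold = 0.3
rows, cols = np.where(ndvi > weed_threshold)

# Convert pixel indices to ground position (hypothetical UTM origin, 5 cm pixels)
origin_e, origin_n, pixel_size = 500000.0, 6900000.0, 0.05
for r, c in zip(rows, cols):
    east = origin_e + c * pixel_size
    north = origin_n - r * pixel_size
    print(f"Possible weed at E {east:.2f}, N {north:.2f}")
```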

Feathertop Rhodes Grass in a tidy fallow

3. Variable rate applications

Precisely placing inputs where they are most needed, rather than blanket applications, should increase yield and reduce wastage. Variable rate spreaders, sprayers and air seeders have been around for a while now, but uptake has been less than many expected. A rapidly developed georeferenced NDVI or DVI map from a UAS, combined with in-paddock examination of what is causing the variability, puts you in a good position to generate a suitable VR fertiliser application map. Agribotix discuss their method for producing VRA maps here.
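One simple, hedged way to go from an NDVI map to a variable rate map is to bin NDVI into zones and attach a rate to each zone, as sketched below. This is not the Agribotix method, just one generic approach; the class breaks and nitrogen rates are purely illustrative, and the in-paddock examination described above is what should drive the real numbers.

```python
# Bin an NDVI map into zones and assign a nitrogen rate to each zone.
# Breaks and rates are illustrative assumptions only.
import numpy as np

ndvi = np.array([[0.35, 0.48, 0.62],
                 [0.41, 0.55, 0.70],
                 [0.30, 0.52, 0.66]])

breaks = [0.45, 0.60]                  # NDVI class boundaries
rates_kg_ha = {0: 80, 1: 60, 2: 40}    # e.g. lower NDVI gets more N (one possible strategy)

zones = np.digitize(ndvi, breaks)      # zone 0, 1 or 2 for each cell
rate_map = np.vectorize(rates_kg_ha.get)(zones)
print(rate_map)
```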

Variable rate application map

4. Insurance/Drift/Environmental regulations

I spent time at a few UAS conferences while traveling through the US & Canada and there seemed to be a consistent presence from insurance companies. I can understand why, especially in North America where insurance is such a major part of farming. If an adjustor from an insurance company is able to rapidly access a map that correlates closely with what they see on the ground, then they are able to adjust the claim area much more accurately than with a ground assessment alone, which is better for everyone involved.

Accurately mapping areas of spray drift damage and areas of environmental concern has similar benefits.

5. 3D/DEM/DSM

Employing structure-from-motion technology, digital surface models can be generated from still images collected with a UAS. Trimble discuss the surveying potential of this technology at length in a white paper available here. In summary, with ground control points, survey grade information can be rapidly generated in their photogrammetry software from data collected with a UAS. Agisoft Photoscan and Pix4D offer similar functions.

Bezmiechow airfield 3D Digital Surface Model data from Pteryx UAV

6. Plant stand

With corn being a pillar crop in the mid-west USA, many are talking about the ability of UAS data to determine site-specific plant stands in row crops. The application is to inform the decision to replant or not, and to evaluate planter performance.

Future applications to get excited about

Above are applications for UAS data that you can reliably apply right now for a reasonable amount of money. Some future applications include:

  • Thermal
  • Lidar
  • 7+ band multispectral
  • Hyperspectral

UAS for aerial mapping – a few options

Over the last 6 weeks during my Nuffield travels (sponsor GRDC) I have had the opportunity to meet with several different unmanned aerial system (UAS) manufacturers and discuss their products. I am looking at the potential for UAS in agriculture, particularly broad acre grains. Some systems I have had a closer look at include:

senseFly: eBee

The eBee is probably the least intimidating system. It is almost a true turnkey system – you just provide a laptop for the ground control station and a PC for the data processing software. It is a compact, lightweight (0.7kg) flying wing design which is hand launched. All packaged up, the case is small enough to be classed as carry-on baggage. There are a few payload options including the typical Canon S110, but more interestingly the Airinov Multispec 4C is also available (read here). The included processing software is top notch as it is based on Pix4D.

Interesting facts about the eBee:

  • The motor is turned off whenever an image is being captured to reduce image blur from vibration
  • Multiple eBee planes can be operated from the one ground control station with an automated collision avoidance system
  • The eBee can be optioned up to RTK GPS
senseFly eBee pair

SenseFly

Precision Hawk: Lancaster

The Precision Hawk Lancaster platform uses a traditional fixed wing design. Two processors running Linux handle flight management and any other in-flight processes such as real time data assessment. The platform is easily hand launched and weighs about 1.4kg not including the payload. Precision Hawk offer several sensors ranging from the humble Canon S series camera through to many of the Tetracam options, and the platform can also carry thermal, lidar and hyperspectral equipment. To Precision Hawk, the Lancaster platform is just a small part of the workflow. Data processing and sharing is an even bigger part of their business. They offer Precision Mapper, a cloud based system which allows raw data from any UAS to be uploaded, processed and shared.

Interesting facts about the Lancaster:

  • Precision Hawk plan to have an API system for sensor integration so 3rd parties can integrate their own sensor into the platform
  • The Lancaster creates its own flight plan after it has been launched and has determined the weather conditions
  • Precision Mapper is excellent value at about 25 cents a hectare! (Hope this lasts)

Precision Hawk

Precision Mapper

Farm Intelligence/Fourth Wing: Vireo

The Vireo is marketed as a tool to provide high quality data for the online farm management platform WingScan, but in its own right it is still a UAS worth looking at. It is a sort of hybrid between a flying wing and a traditional plane, but does not have any control surfaces on the tail. It is hand launched and weighs 1.4kg in total. The whole system including laptop packs into a provided travel case. They claim it can fly for an hour or more. The Vireo does not use a modified point and shoot; instead it carries a dual-imager sensor payload which captures NIR and visual (RGB) in a single pass at 10MP.

Interesting facts about the Vireo:

  • You can go onto the Fourth Wing online store to price their products. They sell the dual-imager sensor separately.
  • The Vireo does not use any foam in its construction, only carbon fiber and Kevlar

FI2 Sales and Leasing

Swift Radio Planes: Lynx

The Lynx is the largest of all the UAS mentioned here and also the system you would part with the least cash for. It weighs in at about 4.5kg, but is still hand launched, and will fly comfortably for 90 minutes. The plane is controlled by an APM 2.6, but the autopilot can be completely bypassed for full manual control. Despite the in-flight size of this system it packs down into a single case. Swift Radio Planes have developed a roll-stabilised camera mount with sensor options from Sony, Canon and Tetracam.

Interesting facts about the Lynx:

  • It has the unique ability to deep stall meaning it can land in very tight spaces
  • The plane is entirely encased in Kevlar
  • Swift Radio Planes offer a server based platform for data processing and sharing

 

Swift Radio Planes: Lynx (me and the Swift Radio Planes team)

Swift Radio Planes

AgEagle

AgEagle’s UAS is a flying wing that is launched from a slingshot style launcher to ensure a consistent take-off every time. They pride themselves on a system that is tough in design, built especially for agriculture. It comes standard with a modified Canon camera, and will soon be available with a true multispectral camera. AgEagle supply Agisoft Photoscan Standard and AgPixel in the standard package to process imagery.

Interesting facts about the AgEagle flying wing:

  • AgEagle describe a simple process of exporting a non geo-referenced JPEG from Photoscan, through AgPixel, into SMS for variable rate application maps
  • AgEagle are establishing a dealer network throughout the US and even sell their system in Australia
Bret Chilcott explaining the benefits of the AgEagle at PAAS

AgEagle

Falcon UAV – Australian dealer

Trimble: UX5

Most people would recognise the brand Trimble as they are well established in the surveying and precision agriculture market place. Trimble offer the UX5 UAS, a flying wing design weighing in at 2.5kg, made from EPP foam and carbon fiber, and catapult launched. At the time I looked at the system they offered both standard and modified versions of a Sony mirrorless 16MP camera. The system comes with a rugged handheld computer for the ground control station. Trimble provide their own software for data processing, a Photogrammetry Module for their Trimble Business Center office suite. This integrates with existing surveying processes but the link to Trimble’s ag products does not seem as complete (yet!).

Interesting facts about the UX5:

  • Trimble have published a white paper discussing survey accuracy of their photogrammetry software available here
  • The UX5 uses reverse thrust when landing to allow more predictable and accurate landings
Trimble UX5 at PAAS

Trimble UX5

Event38 / 3DR

Event38 and 3DR are separate companies but use similar components. They offer a much cheaper solution that is capable of performing many of the functions of the UAS above. The reality is that they take more learning to become familiar with, but as far as value for money is concerned these are good products.

Event38

3D Robotics

Agribotix: Hornet

Agribotix are worth a mention. They build a UAS based on similar technology to Event38 / 3DR, so it can be built in house quickly and cheaply. They believe that too much attention is given to the flying machines and not enough to the application of the data. Agribotix offer a drone lease structure where the UAS is essentially free to use and the cloud based data processing is what incurs a fee – minimal capital outlay to get a UAS up and running. A truly unique model.

Interesting facts about Agribotix:

  • Agribotix are extremely generous with information – their online blog has information on a lot of what they have learned in getting to where they are now

Agribotix

These are just some of the small UAS systems available on the market now. I have not included prices as they are always changing and each product is generally packaged up differently (e.g. processing software included or not). The alternative is to build your own UAS, read about my experiences here.

 

UAS in Ag: Sensors & Data

Traveling through the USA & Canada as part of my Nuffield Scholarship (thanks to GRDC) I have heard the word data more times than I could count. In this post I am going to set aside the platform side of the unmanned system and focus on sensors and the data they provide; a later post will cover how we can use that data. In this article I plan to cover some concepts not widely discussed in the current UAS environment.

The reason an unmanned system is flown is to collect data, then turn that data into information to help monitor, assess and ultimately make timely, cost effective decisions. The data collected needs to be of good quality, and it is important not to confuse data quality with data type. For example, many tend to gravitate straight to the number of megapixels a sensor captures, neglecting its spectral accuracy.

If we consider what our target is when collecting data from a UAS in a grain farming situation, it will most commonly be vegetation (not always, but let’s focus on that). Collecting spatially referenced data on vegetation is by no means a new endeavour. This information has been collected at scales as vast as Landsat satellite imagery and as specific as a GreenSeeker. Generally, for vegetation, similar wavelength bands are measured irrespective of proximity to the target, and the same is true for sensors used in UAS. Why is this the case? You can read the long answer here (Remote Sensing of Biomass). The short answer is photosynthesis. In a plant that is photosynthesising, the chlorophyll will absorb large amounts of ‘visual’ light, particularly blue and red, and the plant will reflect near infrared (NIR) light. The more photosynthetic activity, the more NIR light is reflected and the more visual light is absorbed. Conversely, inactive vegetation will reflect more visual light and less NIR.

The best contrast is between red and NIR light which is what is generally used when calculating the Normalised Difference Vegetation Index (NDVI). NDVI is a good indicator of plant health and measure of biomass. Consequently, most sensors used to determine NDVI look into the red and NIR bands – some more accurately than others. The chart below shows the reflectance curve of green grass over different wavelengths. Below the X axis is a rough spectral guide to some of the well-known sensors available to us.

Reflectance of green grass & sensor spectral bands

What is most notable is the wavelength spectrum, or band, at which each of the sensors reads. The GreenSeeker is extremely specific, capturing a narrow wavelength in the middle of the red spectrum and a similarly narrow band in the NIR. At the other end of this comparison you can see that the modified S100 camera has a very broad spectrum for each channel that it reads. Consider the S100’s original ‘red’ channel, which with the modified filter reads roughly from 0.67um to 0.76um. Post modification this channel is renamed NIR and measures reflectance in a region that covers red right through to NIR. The S100 modification retains the blue and green channels, one of which replaces red when calculating NDVI. Another significant point that this chart does not show is the interference that can occur between the different bands in a point and shoot camera. Check out the S100 reflectance chart about half way down the page in this link, which shows some NIR response in the blue and green channels.
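For clarity, the index arithmetic for a camera modified this way is just the standard normalised difference with blue (or green) standing in for the visible band, as in this small sketch with hypothetical 8-bit channel values.

```python
# Sketch of the index arithmetic for a modified point-and-shoot: the original
# red channel now records NIR, so blue stands in for the visible band.
# Channel values are hypothetical 8-bit numbers.

def modified_camera_ndvi(blue, nir):
    # Blue replaces red in the standard (NIR - VIS) / (NIR + VIS) form
    return (nir - blue) / (nir + blue)

print(round(modified_camera_ndvi(blue=60, nir=180), 2))  # ~0.5 for healthy vegetation
```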

It has to be noted that it is hardly fair to compare the S100 and GreenSeeker in a practical sense, for several reasons. The main one is that you would not mount a GreenSeeker on a UAV as it needs to be close to the target (the GreenSeeker is an active sensor, meaning it emits light and measures the reflectance of that light, while the S100 is a passive sensor that just reads reflected sunlight). In addition, the GreenSeeker measures only one point whereas the S100 collects data on 12 million pixels. The reason I do compare them is that they can both be used to produce an NDVI map. In fact, despite the spectral differences between these sensors and their different proximity to the target, RoboFlight claim from their tests that NDVI data collected from a GreenSeeker and a modified S100 correlate linearly very closely (r squared > 0.9). So we know that the two sensors correlate well, but the correlation will never be a fixed formula because the sunlight reflected will always differ with sun angle, atmospheric conditions, cloud and so on. The S100 and GreenSeeker would probably work best as complementary tools. For example, map a large area with the S100 on a UAS; the resulting dataset could then be calibrated using GreenSeeker data collected in the field at the same time as the flight. Potentially, if S100 data is always calibrated against the GreenSeeker, inter-paddock and inter-season comparisons can be made.

We are starting to see a new wave of sensor development designed specifically for UAS and the agricultural and environmental industries. An available example is the Airinov Multispec 4C. This sensor captures 4 distinct, narrow spectral bands with no interference: green, red, red edge and NIR. What makes this package special is that not only does it look down at the vegetation; it also looks up, measuring sunlight with a lux meter. This should allow us to generate data that can be calibrated without the need for ground truthing with a GreenSeeker or similar ‘active’ device. Another feature of this sensor is that it uses a global shutter, which means all pixels in a photo are captured at exactly the same instant, avoiding the distortion a rolling shutter introduces on a moving platform. The 4C has much less spatial resolution than the S100 (1.2MP vs 12MP). Expect to pay over US$10,000 for this sensor package, not including the UAV or processing software.

In summary, this article aims to explain that there is more to a UAS sensor than just megapixels. It is important to understand the spectral response of vegetation and how this can impact your sensor choice. A modified Canon camera such as the S100 is a great option for UAS, but its limitations must be understood. Work needs to be done to analyse the results and accuracy of new sensors such as the Multispec 4C.

S100 mounted in a DIY Finwing Penguin build

* Further notes: The most common sensor used in small UAS (mid 2014) is a Canon S100 or similar variant. This camera was never designed to be flown in a UAS, but the internal GPS, fast shutter speed (1/2000), relatively large sensor (12.1 MP 1/1.7″ Canon CMOS), low weight (198g), ability to be modified to detect NIR, CHDK compatibility for an intervalometer, and low cost (<$500) all make it a well suited sensor for this application. Flown at 120m this camera can provide a ground resolution of 3.5cm.

Canon S100

UAS Ag Conference Comment: Delta AgTech & PAAS

This week I had the privilege of attending two conferences in the US focusing on unmanned aerial systems (UAS) in agriculture. The first was the Delta AgTech Symposium in Memphis, Tennessee and the second was the Precision Ag Aerial Show at Decatur, Illinois. These conferences are the beginning of my private study as part of my Nuffield Scholarship, thanks to my sponsor the GRDC.

As a foreword to this discussion: I focus mostly on fixed wing UAS for large area mapping – see why here – and therefore generally do not comment on multi-rotor systems, GoPros, First Person View (FPV) and so on, even though these were covered at the conferences.

Both conferences were a combination of educated/independent speakers and vendors. There were a couple of one-hour multi-rotor demonstrations at Delta AgTech and two full days of demonstrations of fixed wings and multi-rotors at PAAS. The audience was extremely diverse – from curious farmers and agronomists all the way to sensor and software companies and, interestingly, a large presence from insurance companies.

Themes I picked up on:

  • Everyone agrees that data needs to be more than pretty pictures – it needs to be actionable. Furthermore, boots still need to be on the ground, ground truthing data and explaining WHY. No one is claiming UAS replaces clever farmers and informed agronomists.
  • There is broad acknowledgement that UAS in agriculture needs to show a return on investment but only one case study was mentioned with actual figures.
  • Data processing is an issue. Some are claiming the solution is server based i.e. ‘in the cloud’ but acknowledge internet bandwidth is a bottleneck.
  • Sensors are a hot topic. Most manufacturers claim they have vastly improved sensors coming. Talk at the moment is around affordable true multispectral combined with irradiance/incident light measurement.
  • We were reminded a couple times not to dismiss satellite imagery as an option. Satellite sensors are always improving and getting cheaper. Will we have 10cm GSD from multispectral satellite in 10 years?
  • Generally frustration with the FAA on the time it is taking to develop rules around UAS. In saying this, there is acknowledgement that rules are needed.

Some other interesting points picked up over the last week:

  • We are starting to see some UAS companies offering early versions of vectorization (i.e. points, lines and polygons) in their server solutions using imagery collected via UAS, with the main example being variable rate fertilizer application maps for in-crop use based on NDVI. This is not new technology – it has been done with satellite imagery for years.
  • With technology changing so rapidly, are we considering the upgrade path of current UAS? Can the sensor and GPS be upgraded without buying a whole new system?

Most of the commercial grade fixed wing UAS seem to fly well, have autopilots that just work, and come with acceptable ground control software. Product differentiation will probably be around sensor integration and options, support, innovation and whole system workflows, including speed of data processing. This is just a snapshot of what was covered in the last week. I will be compiling a report that encompasses my whole USA/Canada trip, which will be available towards the end of the year.

Precision Ag Aerial Show 2014: Full house

Budget UAV for aerial mapping: my experience in agriculture

Finwing Penguin ready for maiden flight

Built with a 3DR Pixhawk (APM:Plane 3.0.2), a Finwing Penguin fixed wing airframe with the standard 2815 motor, an S100 Canon camera and an RFD900 radio (an excellent Australian product), my budget agricultural unmanned aerial vehicle (aUAV) has kept me busy and learning lots in my spare time over the last few months.

In my last post, Unmanned Aerial Vehicles (UAV) in Precision Agriculture, I outlined the main components of a UAV for precision agriculture, focusing on a fixed wing platform for collecting high resolution paddock scale data. In this follow-up post I will attempt to log some of my experiences. Note that this is just a learning exercise – there are many commercial UAV options available for agriculture that are less time consuming and provide similar or better results right away.

Components

Platform

I needed a fixed wing platform that is readily available and cheap, with potential for long battery life, stability in the air and plenty of space for electrical components. I chose the Finwing Penguin. With the standard 2815 Finwing motor, 60 amp electronic speed controller (ESC) and 9×6 propeller, combined with a 4400mAh 3-cell LiPo battery, I was only able to achieve about 20 minutes of flight time, or enough to map about 40ha at 12m/s. I have a CXN engine mount which enables me to go to a 10″ prop, which some of the gurus recommend. I could also increase my battery capacity and cell count to get longer flight times.
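The area figure roughly follows from a back-of-envelope calculation like the one below: ground speed, times the effective swath (image footprint reduced by side overlap), times the usable flight time. The footprint and overlap figures are assumptions, and turns, climb-out and landing margins eat into the theoretical total, which is why the practical figure is noticeably lower.

```python
# Back-of-envelope coverage estimate per battery. All inputs are assumptions;
# turns, climb-out and landing reduce the real-world figure.

ground_speed_ms = 12.0
flight_time_min = 20.0
footprint_w_m   = 170.0    # across-track image footprint at the chosen height
side_overlap    = 0.6      # 60% side overlap between adjacent swaths

effective_swath = footprint_w_m * (1 - side_overlap)
area_ha = ground_speed_ms * flight_time_min * 60 * effective_swath / 10_000

print(f"Theoretical coverage: {area_ha:.0f} ha per flight (before turns and margins)")
```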

The Penguin is rather unique in its class as it has inbuilt landing gear. This consists of a pair of wheels at the front and a single wheel at the back. I think this assists in preserving the plane when there is nowhere soft to belly land. The landing gear also allows you to take off like a traditional plane rather than hand launching. After making weight or centre of gravity (CG) changes I will often take off from the ground. The downside is that these wheels block up with mud quite easily if you land in a wet paddock.

The wings (including the tail wing) come off for transport. I usually remove the main wings but leave the tail wing in place, as it is quite hard to get on and off due to the awkward wiring and attachment arrangement.

The Penguin UAV does come with a pre-cut hole to install a down facing camera, but it does not allow the camera to be placed with its top facing forward, which is desirable. It was also very awkward to get the camera in and out as I had the Pixhawk autopilot installed above the camera position. I decided to go at the plane with a hacksaw and build a camera mount that allows the camera to be installed from underneath the plane, with enough space to mount the camera top facing forward.

S100 down facing camera mount and landing gear
Finwing Penguin UAV wings off for transport
UAV Finwing Penguin internal shot

Autopilot, GPS & Radio modem

As far as the autopilot is concerned, the 3DR Pixhawk with APM:Plane 3.0.2 was the best option. At first I had issues getting my plane to fly well, but once I upgraded to version 3.0.2+ the autotune feature changed the game altogether. This allowed the APM to adjust the plane's PID settings as I manually flew it around. It works really well! During my latest flight I had an 8km/h cross wind that the APM was able to fly against successfully.

The GPS is a Ublox LEA-6M. It works well considering the price point. I did not attempt autonomous landing, which is when GPS accuracy matters most. This GPS is able to get a fix within seconds of start-up and generally has no issues throughout the flight.

I initially used the 3DR radio modems but had all sorts of problems keeping a solid connection with my GCS. I decided to bite the bullet and buy a quality radio modem that should last me a long time and exceed all my range requirements. The RFD900 radio pair is compatible with 3DR equipment and slotted in quite well. I did have to make a cable to connect it to the Pixhawk, and it took a bit of searching to figure out which wires went where, but I got it sorted within an hour or so. The RFD900 did have some driver issues on Windows; I had to install an old driver before I could get Mission Planner connected to the Pixhawk through the RFD900. This all equates to time spent mucking around… BUT once working this product is excellent and I always have a strong telemetry signal.

UAV actual flight path exported to Google Earth

Ground Control Station (GCS)

The Mission Planner software, which runs on the ground station laptop and allows you to program the UAV and monitor it in flight, is very good – especially the Auto Waypoint Survey Grid feature. This allows you to draw the area you want to map on the map, load in a photograph from the camera you will be using, and set the target elevation. From this information it draws a flight path with your desired overlap.

Mission Planner: Footprints Survey Grid
Mission Planner: In Flight Data
Ground Control Station

Sensor & Image Processing

The Canon S100 is my sensor of choice as it is a great balance of quality, price, functionality and size. I started with a Canon D10 but many of the photos came out underexposed. The S100 has a larger sensor and an inbuilt GPS, so it is a better choice for aerial mapping. The downside of the S100 is that the lens protrudes from the camera, which exposes it to damage in a rough landing.

With UAV aerial mapping you need a way for the camera to trigger every few seconds on its own. With a Canon camera this is easy using the Canon Hack Development Kit (CHDK). This updates the camera firmware, allowing you to use intervalometer scripts to trigger the camera every few seconds. CHDK also offers what seems like unlimited settings for the camera. It is difficult to find a complete set of settings to use with CHDK, but for my next flight I will try using the DroneMapper documentation to set up CHDK.

In my last flight approximately 30% of my photos came out blurry. I discarded the worst of them but still had to use some poor quality photos to ensure the map had no blank spots. This is probably due to a combination of camera settings, the camera mount, and a propeller slightly out of balance.
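For what it is worth, forward motion alone should not account for that much blur, as this quick check of distance travelled during the exposure versus the ground sample distance shows. The speed, shutter and GSD figures are rough assumptions in line with the setup described here.

```python
# Quick check of motion blur from forward motion alone: distance moved during
# the exposure compared to the ground sample distance. Figures are assumptions.

ground_speed_ms = 12.0
shutter_s       = 1 / 2000     # S100 fastest shutter speed noted earlier
gsd_m           = 0.035        # ~3.5 cm pixels at 120 m

blur_m = ground_speed_ms * shutter_s
print(f"Forward-motion blur: {blur_m * 1000:.0f} mm ({blur_m / gsd_m:.2f} of a pixel)")
# At 1/2000 s the plane moves only ~6 mm, a small fraction of a pixel, so blur
# like this usually comes from slower shutter speeds, vibration or the mount.
```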

Using a desktop trial of Agisoft Photoscan I was able to produce a 40ha orthomosaic. The application works surprisingly well considering that all the images are taken at slightly different angles and the software is only provided with one GPS point for each photo. It is a very computationally intense process, and if I were to do a significant amount of processing I would need to upgrade my computer. Alternatively I could use DroneMapper, but my dataset did not meet their requirements because I had to cull some images. I hope to try DroneMapper next time.

UAV imagery: Suntop wheat

I took my data a step further and set up a web server to host the data as tiles. You can check it out here. How to store and share data collected by UAVs is something I have been thinking about. An orthomosaic for a single paddock can be several gigabytes and take a powerful computer to view in its raw form. The web seems like a good way to display data: a server stores the data and only sends the bits of the image that the end user requests as they zoom in and move around.
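For anyone curious how the tiling side works, the sketch below shows the standard "slippy map" tile maths a web client uses to decide which z/x/y tile to request; the coordinates are an arbitrary illustrative point, not the actual paddock.

```python
# Sketch of web map tile maths: which z/x/y tile contains a given point.
# Standard slippy-map convention; the coordinates are illustrative only.
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Return the (x, y) tile indices containing a WGS84 point at a zoom level."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

x, y = latlon_to_tile(-31.95, 150.25, 17)
print(f"Tile to request at zoom 17: /17/{x}/{y}.png")
```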

The S100 can be modified to collect NDVI data – check here for example.

Always learning the hard way

This is my second flying UAV; my first was a Finwing Penguin as well. I spent a couple of days putting the plane and all the components together. It is a nervous time flying your brand new plane for the first time. The first time out my plane flew OK in manual mode, but since I am a very ordinary pilot I like to use assisted flying modes. I changed to Fly By Wire mode and, due to an APM setting (one I had to set myself), the autopilot had the elevator reversed, sending the plane crashing into the ground. This snapped the plane in half and bent up the fuselage. Thankfully this durable foam returns to shape when you put it in boiling water, and the pieces can be glued back together and reinforced with carbon fiber and fiberglass tape. Now I follow the suggested checks in the APM:Plane instructions more closely. I’ve had no crashes since, but have landed in mud, which can be painful to clean out of the landing gear.

Fuselage post crash on maiden flight

Conclusion

Putting together this UAV I have learned how all the components of a UAV fit together, gained an appreciation of the challenges faced by commercial suppliers, and developed a better understanding of the enormous potential on offer. I think the biggest challenge is not the UAV platform itself but collecting high quality, consistent data that can be quickly processed and given a useful, profitable application. The setup I have discussed here, not including the laptop or the countless hours of time, comes to about AU$1200. Obviously for mapping large areas on a consistent basis a commercial UAV would be preferred, or even essential.

UAV Finwing Penguin: Clocked up some hours