UAS in Ag: Sensors & Data

Traveling through the USA & Canada as part of my Nuffield Scholarship (thanks to GRDC), I have heard the word 'data' more times than I can count. In this post I am going to set aside the platform aspect of the unmanned system and focus on sensors and the data they provide; a later post will cover how we can use that data. I also plan to cover some concepts not widely discussed in the current UAS environment.

The reason an unmanned system is flown is to collect data, then turn that data into information that helps us monitor, assess and ultimately make timely, cost-effective decisions. The data collected needs to be good quality, and it is important not to confuse data quality with data type. For example, many gravitate straight to the number of megapixels a sensor captures, neglecting its spectral accuracy.

If we consider what our target is when collecting data from a UAS in a grain farming situation, it will most commonly be vegetation (not always, but let's focus on that). Collecting spatially referenced data on vegetation is by no means a new endeavour. It has been collected at scales as broad as Landsat satellite imagery and as specific as a GreenSeeker. Generally, for vegetation, reflectance in similar wavelength bands is measured irrespective of proximity to the target, and the same is true for the sensors used in UAS. Why is this the case? You can read the long answer here (Remote Sensing of Biomass). The short answer is photosynthesis. In a plant that is photosynthesising, the chlorophyll absorbs large amounts of 'visual' light, particularly blue and red, and reflects near infrared (NIR) light. The more photosynthetic activity, the more NIR light is reflected and the more visual light is absorbed. Conversely, inactive vegetation reflects more visual light and less NIR.

The best contrast is between red and NIR light, which is why these two bands are generally used when calculating the Normalised Difference Vegetation Index (NDVI). NDVI is a good indicator of plant health and a measure of biomass. Consequently, most sensors used to determine NDVI measure the red and NIR bands – some more accurately than others. The chart below shows the reflectance curve of green grass over different wavelengths. Below the X axis is a rough spectral guide to some of the well-known sensors available to us.
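NDVI itself is a simple ratio of the two bands. A minimal sketch with NumPy, assuming you already have red and NIR reflectance values (per pixel or per point):

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    # Guard against division by zero where both bands read zero
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Healthy vegetation: high NIR, low red reflectance -> NDVI well above zero
print(ndvi([0.50], [0.08]))   # ~0.72
# Bare soil: similar NIR and red reflectance -> NDVI near zero
print(ndvi([0.30], [0.25]))   # ~0.09
```

The same function works on whole image arrays, which is how an NDVI map is built from a modified camera's channels.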

Reflectance of green grass & sensor spectral bands

What is most notable is the wavelength spectrum, or band, that each sensor reads. The GreenSeeker is extremely specific, capturing a narrow wavelength in the middle of the red spectrum and a similarly specific band in the NIR. At the other end of this comparison, the modified S100 camera has a very broad spectrum for each channel it reads. Consider the S100's (pre-modification) 'red' channel, which with the modified filter reads roughly from 0.67um to 0.76um. Post-modification this channel is renamed NIR and measures reflectance across a region spanning red right through to NIR. The modification retains the blue and green channels, and blue replaces red when calculating NDVI. Another significant point this chart does not show is the interference that can occur between the different bands in a point-and-shoot camera. Check out the S100 reflectance chart about halfway down the page in this link, which shows some NIR response in the blue and green channels.

It has to be noted that it is hardly fair to compare the S100 and GreenSeeker in a practical sense, for several reasons. The main one is that you would not mount a GreenSeeker on a UAV, as it needs to be close to the target (the GreenSeeker is an active sensor, meaning it emits light and measures the reflectance of that light; the S100 is a passive sensor, simply reading reflected sunlight). In addition, the GreenSeeker measures only one point, whereas the S100 collects data across 12 million pixels. The reason I compare them is that both can be used to produce an NDVI map. In fact, despite the spectral differences between these sensors and their different proximity to the target, RoboFlight claim from their tests that NDVI data collected from a GreenSeeker and a modified S100 correlate very closely and linearly (r squared > 0.9). So the two sensors correlate well, but the correlation will never be a fixed formula, because the sunlight reflected will always differ with sun angle, atmospheric conditions, cloud and so on. The S100 and GreenSeeker would probably work best as complementary tools. For example, map a large area with the S100 on a UAS, then calibrate the resulting dataset using GreenSeeker data collected in the field at the time of the flight. If S100 data is always calibrated against the GreenSeeker, inter-paddock and inter-season comparisons can potentially be made.
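That calibration step amounts to fitting a straight line between paired readings taken at the same spots. A sketch of the idea, where the S100 and GreenSeeker values below are made-up illustrative numbers, not RoboFlight's data:

```python
import numpy as np

# Hypothetical paired samples: S100-derived NDVI vs GreenSeeker NDVI
# collected at the same ground locations (illustrative values only)
s100 = np.array([0.20, 0.35, 0.50, 0.65, 0.80])
greenseeker = np.array([0.25, 0.42, 0.55, 0.71, 0.88])

# Least-squares line: greenseeker ~= slope * s100 + intercept
slope, intercept = np.polyfit(s100, greenseeker, 1)

# r-squared of the fit, the statistic quoted in such comparisons
pred = slope * s100 + intercept
ss_res = np.sum((greenseeker - pred) ** 2)
ss_tot = np.sum((greenseeker - greenseeker.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Apply the fitted line to calibrate any S100 NDVI value (here 0.45)
calibrated = slope * 0.45 + intercept
```

Because sun angle and atmosphere shift from flight to flight, the fit would need to be redone with fresh ground readings each time, as the post suggests.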

We are starting to see a new wave of sensors developed specifically for UAS and the agricultural and environmental industries. An available example is the Airinov Multispec 4C. This sensor captures 4 distinct, narrow spectral bands with no interference: green, red, red edge and NIR. What makes this package special is that it not only looks down at the vegetation; it also looks up, measuring incoming sunlight with a lux meter. This should allow us to generate data that can be calibrated without ground truthing from a GreenSeeker or similar 'active' device. Another feature is a global shutter, meaning all pixels in a photo are captured at exactly the same instant, eliminating motion blur within the frame. The 4C has much lower spatial resolution than the S100 (1.2MP vs 12MP). Expect to pay over US$10,000 for this sensor package, not including the UAV or processing software.

In summary, this article has aimed to show that there is more to a UAS sensor than megapixels. It is important to understand the spectral response of vegetation and how it should inform your sensor choice. A modified Canon camera such as the S100 is a great option for UAS, but its limitations must be understood. Work remains to be done analysing the results and accuracy of newer sensors such as the Multispec 4C.

S100 mounted in a DIY Finwing Penguin build

* Further notes: The most common sensor used in small UAS (mid 2014) is the Canon S100 or a similar variant. This camera was never designed to be flown in a UAS, but its internal GPS, fast shutter speed (1/2000), relatively large sensor (12.1 MP 1/1.7″ Canon CMOS), low weight (198g), ability to be modified to detect NIR, CHDK compatibility for an intervalometer, and low cost (<$500) all make it well suited to this application. Flown at 120m, this camera can provide a ground resolution of around 3.5cm.
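The ground resolution figure can be estimated from basic camera geometry. A sketch using assumed S100-like values (roughly 5.2 mm focal length at the wide end, ~7.4 mm sensor width, 4000-pixel image width – check your own camera's specifications):

```python
def ground_sample_distance(altitude_m, sensor_width_mm, focal_length_mm, image_width_px):
    """Metres of ground covered by one pixel (nadir view, flat terrain)."""
    return (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)

# Assumed wide-angle S100 values at 120 m altitude
gsd = ground_sample_distance(120, 7.44, 5.2, 4000)
print(f"{gsd * 100:.1f} cm/pixel")
```

With these assumed values the estimate lands a little above 3.5cm; the exact figure depends on the true focal length, sensor dimensions and flying height.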

Canon S100

Budget UAV for aerial mapping: my experience in agriculture

Finwing Penguin ready for maiden flight

Built with a 3DR Pixhawk (APM:Plane 3.0.2), Finwing Penguin 2815 fixed wing air frame, S100 Canon camera and RFD900 Radio (excellent Australian product), my budget agricultural unmanned aerial vehicle (aUAV) has kept me busy and learning lots in any spare time over the last few months.

In my last post, Unmanned Aerial Vehicles (UAV) in Precision Agriculture, I outlined the main components of a UAV for precision agriculture, focusing on a fixed wing platform for collecting high resolution paddock-scale data. In this follow-up post I will log some of my experiences. Note that this is just a learning exercise – there are many commercial UAV options for agriculture that are less time consuming and provide similar or better results right away.

Components

Platform

I needed a fixed wing platform that is readily available and cheap, with the potential for long battery life, stability in the air and plenty of space for electrical components. I chose the Finwing Penguin. With the standard Finwing 2815 motor, 60 amp electronic speed controller (ESC) and 9×6 propeller, combined with a 4400mAh 3-cell LiPo battery, I was only able to achieve about 20 minutes of flight time, or enough to map about 40ha at 12m/s. I have a CXN engine mount which lets me go to a 10″ prop, which some of the gurus recommend. I could also increase my battery capacity and cell count to get longer flight time.
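Those endurance numbers can be sanity-checked with some back-of-envelope arithmetic. The line spacing and efficiency factor below are assumptions, not measured values:

```python
def mappable_area_ha(speed_ms, flight_time_min, line_spacing_m, efficiency=0.75):
    """Rough area estimate: distance flown x line spacing, discounted for
    turns, climb-out and landing. The efficiency factor is a guess."""
    distance_m = speed_ms * flight_time_min * 60 * efficiency
    return distance_m * line_spacing_m / 10_000  # m^2 -> hectares

# 12 m/s for 20 minutes with an assumed 40 m flight-line spacing
print(mappable_area_ha(12, 20, 40))  # lands in the ballpark of 40 ha
```

Under these assumptions the 20-minute, 40ha figure is self-consistent; more battery or wider line spacing (at the cost of overlap) would extend coverage.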

The Penguin is rather unique in its class in that it has inbuilt landing gear: a pair of wheels at the front and a single wheel at the back. I think this helps preserve the plane when there is nowhere soft to belly land. The landing gear also allows you to take off like a traditional plane rather than hand launching; after making weight or centre of gravity (CG) changes I will often take off from the ground. The downside is that the wheels block up with mud quite easily if you land in a wet paddock.

The wings (including the tail wing) come off for transport. I usually remove the main wings but leave the tail wing in place, as it is quite hard to get on and off due to the awkward wiring and attachment arrangement.

The Penguin UAV does come with a pre-cut hole for installing a down-facing camera, but it does not allow the camera to be mounted with its top facing forward, which is desirable. It was also very awkward to get the camera in and out, as I had the Pixhawk autopilot installed above the camera position. I decided to go at the plane with a hacksaw and build a camera mount that allows the camera to be installed from underneath, with enough space to mount it top facing forward.

S100 down facing camera mount and landing gear
Finwing Penguin UAV wings off for transport
UAV Finwing Penguin internal shot

Autopilot, GPS & Radio modem

As far as the autopilot is concerned, the 3DR Pixhawk with APM:Plane 3.0.2 was the best option. At first I had issues getting my plane to fly well, but once I upgraded to version 3.0.2+ the autotune feature changed the game altogether. It allows the APM to adjust the plane's PID settings as I fly it around manually. It works really well! During my latest flight the APM flew successfully against an 8km/h crosswind.

The GPS is a Ublox LEA-6M. It works well considering the price point. I did not attempt autonomous landing, which is when GPS accuracy matters most. This GPS gets a fix within seconds of start-up and generally has no issues throughout the flight.

I initially used the 3DR radio modems but had all sorts of problems keeping a solid connection with my GCS. I decided to bite the bullet and buy a quality radio modem that should last a long time and exceed all my range requirements. The RFD900 radio pair is compatible with 3DR equipment and slotted in quite well. I did have to manufacture a cable to connect it to the Pixhawk, and it took a bit of searching to figure out which wires went where, but I got it sorted within an hour or so. The RFD900 did have some driver issues on Windows: I had to install an older driver before Mission Planner could connect to the Pixhawk through the RFD900. This all equates to time spent mucking around… BUT once working this product is excellent, and I always have a strong telemetry signal.

UAV actual flight path exported to Google Earth

Ground Control Station (GCS)

The Mission Planner software, which runs on the ground station laptop and lets you program the UAV and monitor it in flight, is very good – especially the Auto Waypoint Survey Grid feature. Simply draw the area you want to map, load in a photograph from the camera you will be using, and set the target elevation; from this information it generates a flight path with your desired overlap.
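The geometry behind that survey grid is straightforward: the photo footprint follows from altitude and the camera's optics, and line spacing follows from the sidelap you ask for. A sketch with assumed S100-like camera values:

```python
def footprint_width_m(altitude_m, sensor_width_mm, focal_length_mm):
    """Ground width covered by one photo at nadir."""
    return altitude_m * sensor_width_mm / focal_length_mm

def line_spacing_m(footprint_w, sidelap=0.6):
    """Distance between parallel flight lines for a given side overlap."""
    return footprint_w * (1 - sidelap)

# Assumed values: 120 m altitude, 7.44 mm sensor width, 5.2 mm focal length
fw = footprint_width_m(120, 7.44, 5.2)
print(round(fw, 1), "m footprint;", round(line_spacing_m(fw), 1), "m line spacing")
```

Mission Planner does this (and the turn geometry) for you, but knowing the numbers helps when estimating flight time for a paddock.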

Mission Planner: Footprints Survey Grid
Mission Planner: In Flight Data
Ground Control Station

Sensor & Image Processing

The Canon S100 is my sensor of choice as it strikes a great balance of quality, price, functionality and size. I started with a Canon D10, but many of the photos came out underexposed. The S100 has a larger sensor and inbuilt GPS, so it is a better choice for aerial mapping. The downside of the S100 is that the lens protrudes from the body, exposing it to damage in a rough landing.

With UAV aerial mapping you need a way for the camera to trigger on its own every few seconds. With a Canon camera this is easy using the Canon Hack Development Kit (CHDK). CHDK loads from the memory card alongside the stock firmware, allowing you to run intervalometer scripts that trigger the camera every few seconds, and offers what seems like unlimited camera settings. It is difficult to find a complete set of settings to use with CHDK, but for my next flight I will try the DroneMapper documentation to set it up.
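The interval the script needs follows from ground speed and the along-track photo footprint. A sketch with assumed values (the 129 m footprint is what an S100-like camera would see along-track at roughly 120 m altitude):

```python
def trigger_interval_s(footprint_length_m, forward_overlap, ground_speed_ms):
    """Seconds between shots so consecutive photos share forward_overlap."""
    return footprint_length_m * (1 - forward_overlap) / ground_speed_ms

# Assumed 129 m along-track footprint, 75% forward overlap, 12 m/s cruise
print(round(trigger_interval_s(129, 0.75, 12), 1), "seconds between shots")
```

The result is well within the S100's shot-to-shot capability, which is part of why point-and-shoot cameras work at all for this job.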

In my last flight approximately 30% of my photos came out blurry. I discarded the worst of them but still had to use some poor quality photos to ensure the map had no blank spots. This is probably due to a combination of camera settings, the camera mount and a slightly out-of-balance propeller.
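One contributor worth separating out is forward motion during the exposure, which can be estimated directly. Assuming the 1/2000 shutter, 12 m/s cruise and ~3.5 cm ground resolution mentioned elsewhere in these posts:

```python
def motion_blur_pixels(ground_speed_ms, shutter_s, gsd_m):
    """Pixels the image smears along-track during one exposure."""
    return ground_speed_ms * shutter_s / gsd_m

# 12 m/s at 1/2000 s shutter with a 3.5 cm ground sample distance
blur = motion_blur_pixels(12, 1 / 2000, 0.035)
print(round(blur, 2), "pixels of forward-motion smear")
```

At these numbers forward motion smears well under one pixel, which points the finger at vibration from the unbalanced propeller and mount rather than the aircraft's speed.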

Using a desktop trial of Agisoft PhotoScan I was able to produce a 40ha orthomosaic. The application works surprisingly well considering that the images are all taken at slightly different angles and it is only provided with one GPS point per photo. It is a very computationally intensive process, and if I were to do a significant amount of processing I would need to upgrade my computer. Alternatively I could use DroneMapper, but my dataset did not meet their requirements because I had to cull some images. I hope to try DroneMapper next time.

UAV imagery : Suntop Wheat

I took my data a step further and set up a web server to host it as tiles – you can check it out here. How to store and share data collected by UAVs is something I have been thinking about. An orthomosaic of a single paddock can run to several gigabytes and take a powerful computer to view in its raw form. The web seems like a good way to display such data: a server stores the imagery and sends only the parts of the image the end user requests as they zoom and pan.

The S100 can be modified to collect NDVI data – check here for example.

Always learning the hard way

This is my second flying UAV; my first was a Finwing Penguin as well. I spent a couple of days putting the plane and all its components together. Flying a brand new plane for the first time is a nervous moment. On the first outing my plane flew OK in manual mode, but since I am a very ordinary pilot I like to use assisted flying modes. I changed to Fly By Wire mode and, due to an APM setting (one I had to set myself), the autopilot had the elevator reversed, sending the plane crashing into the ground. This snapped the plane in half and bent up the fuselage. Thankfully this durable foam returns to shape when you put it in boiling water, and the pieces can be glued back together and reinforced with carbon fiber and fiberglass tape. I now follow the suggested checks in the APM:Plane instructions more closely. I've had no crashes since, but I have landed in mud, which can be painful to clean out of the landing gear.

Fuselage post crash on maiden flight

Conclusion

Putting together this UAV, I have learned how all the components of a UAV fit together, seen the challenges faced by commercial suppliers, and gained a better understanding of the enormous potential on offer. I think the biggest challenge is not the UAV platform itself but collecting high quality, consistent data that can be quickly processed and given a useful, profitable application. The setup discussed here, not including the laptop or the countless hours of time, comes to about AU$1200. Obviously for mapping large areas on a consistent basis, a commercial UAV would be preferred or even essential.

UAV Finwing Penguin: Clocked up some hours