UAS in Ag: Sensors & Data

Travelling through the USA and Canada as part of my Nuffield Scholarship (thanks to GRDC), I have heard the word 'data' more times than I can count. In this post I am going to set the platform side of the unmanned system aside and focus on sensors and the data they provide; a later post will look at how we can use that data. I also plan to cover some concepts that are not widely discussed in the current UAS environment.

The reason an unmanned system is flown is to collect data, then turn that data into information that helps us monitor, assess and ultimately make timely, cost-effective decisions. The data collected needs to be of good quality, and it is important not to confuse data quality with data type. For example, many people gravitate straight to the number of megapixels a sensor captures, neglecting its spectral accuracy.

If we consider what our target is when collecting data from a UAS in a grain farming situation, it will most commonly be vegetation (not always, but let's focus on that). Collecting spatially referenced data on vegetation is by no means a new endeavour. This information has been collected at scales as broad as Landsat satellite imagery and as targeted as a handheld GreenSeeker. Generally, for vegetation, similar reflectance bands are measured irrespective of proximity to the target, and the same is true for the sensors used on UAS. Why is this the case? You can read the long answer here (Remote Sensing of Biomass). The short answer is photosynthesis. In a plant that is photosynthesizing, the chlorophyll absorbs large amounts of visible light, particularly blue and red, and reflects near infrared (NIR) light. The more photosynthetic activity, the more NIR light is reflected and the more visible light is absorbed. Conversely, inactive vegetation reflects more visible light and less NIR.

The best contrast is between red and NIR light, which is why these bands are generally used when calculating the Normalised Difference Vegetation Index (NDVI). NDVI is a good indicator of plant health and a measure of biomass. Consequently, most sensors used to determine NDVI measure the red and NIR bands, some more accurately than others. The chart below shows the reflectance curve of green grass over different wavelengths. Below the X axis is a rough spectral guide to some of the well-known sensors available to us.

Reflectance of green grass & sensor spectral bands
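For reference, NDVI is calculated as (NIR − Red) / (NIR + Red), which gives values between −1 and 1, with dense, healthy vegetation sitting towards the top of that range. Below is a minimal sketch of the calculation; the band arrays and sample values are purely illustrative, not tied to any particular sensor.

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index from two reflectance bands."""
    nir = nir.astype(float)
    red = red.astype(float)
    # Guard against division by zero where both bands are dark.
    return (nir - red) / np.maximum(nir + red, 1e-9)

# Illustrative reflectance values: healthy vegetation reflects strongly in NIR
# and absorbs red, so its NDVI approaches 1.
nir_band = np.array([[0.45, 0.50], [0.30, 0.55]])
red_band = np.array([[0.08, 0.06], [0.20, 0.05]])
print(ndvi(nir_band, red_band))
```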

What is most notable is the wavelength band over which each of the sensors reads. The GreenSeeker is extremely specific, capturing a narrow wavelength in the middle of the red spectrum and a similarly narrow band in the NIR. At the other end of this comparison, the modified S100 camera has a very broad spectrum for each channel it reads. Consider the channel that was, before modification, the S100's red channel: with the modified filter it reads roughly from 0.67 µm to 0.76 µm. Post-modification this channel is renamed NIR and measures reflectance in a region that covers red right through to NIR. The modification retains the blue and green channels, one of which is used in place of red when calculating NDVI. Another significant point that this chart does not show is the interference that can occur between the different bands in a point-and-shoot camera. Check out the S100 reflectance chart about halfway down the page at this link, which shows some NIR response in the blue and green channels.
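To make the channel shuffle concrete, here is a rough sketch of how an image from an NIR-converted point-and-shoot might be turned into an NDVI-style index, treating the converted 'red' channel as NIR and substituting blue for red in the formula. The channel ordering and the blue substitution are assumptions that depend on the particular filter fitted, and a synthetic array stands in for a real photo.

```python
import numpy as np

# Assumption: an H x W x 3 array from an NIR-converted camera, where the
# filter swap means channel 0 now records NIR and channel 2 is still
# (roughly) blue. Random values stand in for a real image file.
img = np.random.randint(0, 256, size=(480, 640, 3)).astype(float)
nir, green, blue = img[..., 0], img[..., 1], img[..., 2]

# "Blue NDVI": blue stands in for the red band the camera no longer records.
blue_ndvi = (nir - blue) / np.maximum(nir + blue, 1e-9)
print(blue_ndvi.min(), blue_ndvi.max())
```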

It has to be noted that it is hardly fair to compare the S100 and GreenSeeker in a practical sense, the main reason being that you would not mount a GreenSeeker on a UAV, as it needs to be close to the target (the GreenSeeker is an active sensor, meaning it emits its own light and measures the reflected portion; the S100 is a passive sensor that simply reads reflected sunlight). In addition, the GreenSeeker measures only one point, whereas the S100 collects data across 12 million pixels. The reason I compare them is that they can both be used to produce an NDVI map. In fact, despite the spectral differences between these sensors and their different proximity to the target, RoboFlight claim from their tests that NDVI data collected from a GreenSeeker and a modified S100 correlate very closely in a linear fashion (r² > 0.9). So we know the two sensors correlate well, but the correlation will never be a fixed formula because the sunlight reflected will always differ with sun angle, atmospheric conditions, cloud cover and so on. The S100 and GreenSeeker would probably work best as complementary tools. For example, map a large area with the S100 on a UAS, then calibrate the resulting dataset using GreenSeeker data collected in the field at the same time as the flight. If S100 data is always calibrated against the GreenSeeker, inter-paddock and inter-season comparisons potentially become possible.
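A minimal sketch of that calibration idea follows: fit a straight line between GreenSeeker readings taken in the field and the S100-derived NDVI values at the same locations, then apply the fit to the whole map. The paired sample values here are invented for illustration, and in practice matching ground points to map pixels is the fiddly part.

```python
import numpy as np

# Paired NDVI samples at the same ground locations (illustrative values only).
s100_ndvi = np.array([0.22, 0.35, 0.41, 0.55, 0.63, 0.71])
greenseeker_ndvi = np.array([0.30, 0.44, 0.52, 0.66, 0.74, 0.83])

# Least-squares linear fit: greenseeker ≈ slope * s100 + intercept
slope, intercept = np.polyfit(s100_ndvi, greenseeker_ndvi, 1)
r = np.corrcoef(s100_ndvi, greenseeker_ndvi)[0, 1]
print(f"slope={slope:.2f}, intercept={intercept:.2f}, r^2={r**2:.2f}")

# Rescale the whole S100 NDVI map onto the GreenSeeker scale.
calibrated_map = slope * s100_ndvi + intercept
```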

We are starting to see a new wave of sensor development designed specifically for UAS and the agricultural and environmental industries. An available example is the Airinov Multispec 4C. This sensor captures four distinct, narrow spectral bands with no cross-band interference: green, red, red edge and NIR. What makes this package special is that it not only looks down at the vegetation; it also looks up, measuring incoming sunlight with a lux meter. This should allow us to generate data that can be calibrated without the need for ground truthing with a GreenSeeker or similar 'active' device. Another feature is its global shutter, which captures all pixels in a photo at exactly the same instant, avoiding the rolling-shutter distortion a moving aircraft can introduce. The 4C has much lower spatial resolution than the S100 (1.2 MP vs 12 MP). Expect to pay over US$10,000 for the sensor package, not including a UAV or processing software.
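Conceptually, the upward-looking light measurement lets each image band be divided by the sunlight recorded at the moment of capture, so that a paddock imaged under cloud and the same paddock imaged in full sun produce comparable values. The sketch below is only my simplified picture of that idea, not the 4C's actual processing chain; the single-scalar irradiance reading and calibration gain are assumptions.

```python
import numpy as np

def to_reflectance(band_dn, irradiance, gain=1.0):
    """Normalise raw band values by the incoming sunlight measured at capture
    time, so flights under different light conditions can be compared."""
    return gain * band_dn.astype(float) / float(irradiance)

# Same crop imaged under bright and dull conditions (illustrative numbers):
sunny = to_reflectance(np.array([1200.0, 1500.0]), irradiance=1000.0)
cloudy = to_reflectance(np.array([600.0, 750.0]), irradiance=500.0)
print(sunny, cloudy)  # the normalised values come out the same
```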

In summary, this article aims to show that there is more to a UAS sensor than megapixels. It is important to understand the spectral response of vegetation and how it can influence your sensor choice. A modified Canon camera such as the S100 is a great option for UAS, but its limitations must be understood. Work still needs to be done to analyse the results and accuracy of newer sensors such as the Multispec 4C.

S100 mounted in a DIY Finwing Penguin build

* Further notes: The most common sensor used in small UAS (as of mid 2014) is a Canon S100 or similar variant. This camera was never designed to be flown in a UAS, but its internal GPS, fast shutter speed (1/2000 s), relatively large sensor (12.1 MP, 1/1.7″ Canon CMOS), low weight (198 g), ability to be modified to detect NIR, CHDK compatibility for intervalometer scripting, and low cost (<$500) all make it well suited to this application. Flown at 120 m, this camera can provide a ground resolution of around 3.5 cm.
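For anyone curious where figures like that come from, ground resolution is a simple ground sample distance (GSD) calculation: flying height times the sensor's pixel pitch, divided by the focal length. The sketch below uses my own approximate S100 numbers for pixel pitch and focal length, so it only lands in the same ballpark as the figure quoted above.

```python
# Ground sample distance: GSD = height * pixel_pitch / focal_length
height_m = 120.0          # flying height above ground
pixel_pitch_m = 1.86e-6   # approx. pixel size of a 12 MP 1/1.7" sensor
focal_length_m = 5.2e-3   # S100 at its widest zoom (approximate)

gsd_m = height_m * pixel_pitch_m / focal_length_m
print(f"GSD ≈ {gsd_m * 100:.1f} cm per pixel")  # a few centimetres per pixel
```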

Canon S100

3 thoughts on “UAS in Ag: Sensors & Data”

  1. Hi Ben, yesterday I heard from Darrin Lee at Mingenew in WA. He has a programmer processing data to define wheat heads and weeds. He claims to be able to calculate yield and spot-spray small areas using data from a UAV hexacopter. Cheers, Neil

  2. Hi Ben, I'm getting a little confused with the options available for software to analyse images post-flight. It seems the best option to stitch images (georeferenced JPEG or RAW, from either an RGB or converted NDVI camera) is Agisoft PhotoScan Standard edition? And then you need to use software like AgPixel to interpret or convert to NDVI images. Is this correct? Can you buy AgPixel outright, or are there alternative options? Would appreciate a chat soon: 0427688028. Regards, Leighton Pearce, Riverland, South Australia

  3. Hi Ben, I'm also interested in the difference between converted cameras (NIR/NDVI etc.) vs multispectral cameras. What extra applications can the multispectral cameras offer in terms of agronomy advice? Or can the Canon S110 & SX260 HS really achieve similar results from a farmer's point of view?
    Regards, Leighton Pearce
