Satellite imagery for precision agriculture: Satamap

Satamap is a web-based satellite imagery service for precision agriculture, available at satamap.com.au. This is a project I am part of, so the following is not an independent review, just a quickly written explanation of the app. I understand my audience is fairly schooled in most things precision agriculture, so I'll skip the marketing talk and get straight to the point.

Today we are launching Satamap. This is a brand new service making up-to-date satellite imagery available to everyone. Our focus is on agriculture, so all imagery is paired with a vegetation index we call the Satamap Vegetation Index (SVI). It is similar to NDVI, but we believe it is better at showing variability in high biomass crops and less affected by soil colour. The colour ramp we use to represent SVI values, while in your face at first, is designed to show biomass variability in all crops, at all stages of growth, at any time of year. The colours remain consistent year round, so that blue always represents the same level of biomass, and red the same, no matter the location or time of year. This is important because the Satamap slider allows any two image dates to be laid one over the other, with the ability to slide between the two for a direct comparison. The same can be done with the standard colour imagery as well.

Satamap screenshot

This service does not require drawing in paddock boundaries, nor does it limit you to a small area of interest. Subscriptions are based on a tile of more than 3 million hectares. It takes 5 minutes to subscribe, and you get access to the whole area plus an archive back to winter 2013. Imagery is captured at a 16 day interval. Cloud can get in the way at times, which can be frustrating, but we are working on increasing our imagery availability to reduce cloud impacts. The colour imagery has a resolution of 15 m and the SVI is 30 m. We cover all major cropping regions of Australia.

Satamap works best on an iPad or similar tablet device, but functions equally well on a desktop computer. Other standard features include custom markers, area measurement tools, imagery export and GPS location on the map. Each of these features could warrant its own article, but it's best to just watch the video to see some of them in action.

Satellite imagery has been available to agriculture and related industries for decades, and those who have invested the time and money will attest to the value and significance of this technology, but will also admit that the time and money are all too often the biggest hindrance. We are aiming to solve these problems with Satamap and unlock the potential of satellite imagery for agriculture. Agronomists, grain traders, farmers, suppliers and more can all benefit from rapid, cost effective access to up-to-date satellite imagery.

We are in constant development, working on higher resolution imagery, ground truthing data points, exporting with post-processing and more. Satamap is currently only available in Australia, but very soon we will be opening up to other parts of the world. Thanks for checking in.

Please check it out at satamap.com.au.

UAS in Ag: Sensors & Data

Travelling through the USA and Canada as part of my Nuffield Scholarship (thanks to GRDC), I have heard the word data more times than I can count. In this post I am going to set aside the platform aspect of the unmanned system and focus on sensors and the data they provide; a later post will cover how we can use that data. I do plan to try to cover some concepts not widely discussed in the current UAS environment.

The reason an unmanned system is flown is to collect data, then turn that data into information that helps monitor, assess and ultimately make timely, cost effective decisions. The data collected needs to be good quality, and it is important not to confuse data quality with data type. For example, many tend to gravitate straight to the number of megapixels a sensor captures, neglecting its spectral accuracy.

If we consider what our target is when collecting data from a UAS in a grain farming situation, it will most commonly be vegetation (not always, but let's focus on that). Collecting spatially referenced data on vegetation is by no means a new endeavour; this information has been collected at scales as vast as Landsat satellite imagery and as specific as a GreenSeeker. Generally, for vegetation, similar bandwidth reflectance is measured irrespective of proximity to the target, and the same is true for sensors used in UAS. Why is this the case? You can read the long answer here (Remote Sensing of Biomass). The short answer is photosynthesis. In a plant that is photosynthesising, the chlorophyll will absorb large amounts of 'visual' light, particularly blue and red, and reflect near infrared (NIR) light. The more photosynthetic activity, the more NIR light is reflected and the more visual light absorbed. Conversely, inactive vegetation will reflect more visual light and less NIR.

The strongest contrast is between red and NIR light, which is what is generally used when calculating the Normalised Difference Vegetation Index (NDVI). NDVI is a good indicator of plant health and a measure of biomass. Consequently, most sensors used to determine NDVI look into the red and NIR bands – some more accurately than others. The chart below shows the reflectance curve of green grass over different wavelengths. Below the X axis is a rough spectral guide to some of the well-known sensors available to us.

Reflectance of green grass & sensor spectral bands
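The calculation itself is simple band arithmetic. Here is a minimal sketch of NDVI over red and NIR reflectance arrays; the reflectance values are illustrative only, not from any real sensor:

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index: (NIR - red) / (NIR + red).

    Ranges from -1 to 1. Healthy, photosynthesising vegetation reflects
    strongly in NIR and absorbs red, pushing NDVI towards 1.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    # Guard against division by zero where both bands read zero.
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1.0, denom))

# Illustrative reflectance fractions: vigorous crop, sparse crop, bare soil.
print(ndvi([0.50, 0.30, 0.15], [0.05, 0.15, 0.15]))
```

The same arithmetic applies whether the bands come from Landsat, a GreenSeeker or a modified camera – what differs is how wide and how clean each sensor's red and NIR bands are.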

What is most notable is the wavelength spectrum, or band, at which each sensor reads. The GreenSeeker is extremely specific, capturing a narrow wavelength in the middle of the red spectrum and a similarly narrow band in the NIR. At the other end of this comparison, the modified S100 camera has a very broad spectrum for each channel it reads. Consider the S100's (pre-modification) 'red' channel, which with the modified filter reads roughly from 0.67 µm to 0.76 µm. Post modification, this channel is renamed NIR and measures reflectance from red right through to NIR. The modification retains the blue and green channels, with blue replacing red when calculating NDVI. Another significant point this chart does not show is the interference that can occur between the different bands in a point and shoot camera. Check out the S100 reflectance chart about half way down the page in this link, which shows some NIR response in the blue and green channels.

It has to be noted that it is hardly fair to compare the S100 and GreenSeeker in a practical sense, for several reasons. The main one is that you would not mount a GreenSeeker on a UAV, as it needs to be close to the target (the GreenSeeker is an active sensor, meaning it emits light and measures the reflectance; the S100 is a passive sensor, simply reading reflected sunlight). In addition, the GreenSeeker measures only one point, whereas the S100 collects data on 12 million pixels. The reason I compare them is that both can be used to produce an NDVI map. In fact, despite the spectral differences between these sensors and their differing proximity to the target, RoboFlight claim from their tests that NDVI data collected from a GreenSeeker and a modified S100 correlate linearly and very closely (r² > 0.9). So the two sensors correlate well, but the correlation will never be a fixed formula, because the sunlight reflected will always differ with sun angle, atmospheric conditions, cloud and so on. The S100 and GreenSeeker would probably work best as complementary tools. For example, map a large area with the S100 on a UAS, then calibrate the resulting dataset using GreenSeeker data collected in the field at the time of the flight. If S100 data is always calibrated against the GreenSeeker, inter-paddock and inter-season comparisons become possible.
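That calibration step could be as simple as a least-squares line fitted per flight. A sketch, with invented paired readings (the numbers are purely illustrative, not RoboFlight's data):

```python
import numpy as np

# Hypothetical paired NDVI readings: GreenSeeker values logged at ground
# points during the flight, and the modified-S100 NDVI at the same points
# in the stitched mosaic.
greenseeker = np.array([0.35, 0.48, 0.55, 0.62, 0.71, 0.80])
s100 = np.array([0.30, 0.44, 0.50, 0.60, 0.68, 0.79])

# Fit a least-squares line mapping S100 NDVI onto GreenSeeker NDVI.
# The fit is redone per flight, since sun angle, atmosphere and cloud
# shift the passive camera's response while the active GreenSeeker's does not.
slope, intercept = np.polyfit(s100, greenseeker, 1)
calibrated = slope * s100 + intercept  # S100 map rescaled to GreenSeeker terms
```

Applying the fitted line to every pixel of the S100 mosaic puts the whole map in GreenSeeker terms, which is what makes season-to-season comparison plausible.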

We are starting to see a new wave of sensors designed specifically for UAS and the agricultural and environmental industries. An available example is the Airinov Multispec 4C. This sensor captures 4 distinct, narrow spectral bands with no interference: green, red, red edge and NIR. What makes this package special is that it not only looks down at the vegetation; it also looks up, measuring incoming sunlight with a lux meter. This should allow us to generate data that can be calibrated without the need for ground truthing with a GreenSeeker or similar 'active' device. Another feature is its global shutter, which captures all pixels in a photo at exactly the same instant, avoiding the distortion a rolling shutter introduces on a moving platform. The 4C has much lower spatial resolution than the S100 (1.2 MP vs 12 MP). Expect to pay over US$10,000 for this sensor package, not including the UAV or processing software.

In summary, this article aims to show that there is more to a UAS sensor than megapixels. It is important to understand the spectral response of vegetation and how it should drive your sensor choice. A modified Canon camera such as the S100 is a great option for UAS, but its limitations must be understood. Work remains to analyse the results and accuracy of new sensors such as the Multispec 4C.

S100 mounted in a DIY Finwing Penguin build

* Further notes: The most common sensor used in small UAS (mid 2014) is a Canon S100 or similar variant. This camera was never designed to be flown in a UAS, but its internal GPS, fast shutter speed (1/2000), relatively large sensor (12.1 MP 1/1.7″ Canon CMOS), low weight (198 g), ability to be modified to detect NIR, CHDK compatibility for an intervalometer, and low cost (<$500) all make it well suited to this application. Flown at 120 m, this camera can provide a ground resolution of around 3.5 cm.
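The ground resolution figure follows from the standard pinhole relation. A sketch with ballpark S100 numbers (the sensor width, pixel count and focal length below are assumptions, not a spec sheet, so the result lands near but not exactly on the 3.5 cm quoted above – slightly different focal length or sensor figures shift it):

```python
def ground_resolution_cm(altitude_m, focal_mm, sensor_width_mm, image_width_px):
    """Ground sample distance (cm per pixel) from the pinhole camera relation:
    GSD = altitude * sensor_width / (focal_length * image_width)."""
    gsd_m = altitude_m * (sensor_width_mm / 1000.0) / ((focal_mm / 1000.0) * image_width_px)
    return gsd_m * 100.0

# Ballpark S100 figures (assumed): 1/1.7" sensor roughly 7.6 mm wide,
# ~4000 px across, 5.2 mm focal length at the wide end.
print(round(ground_resolution_cm(120, 5.2, 7.6, 4000), 1))  # → 4.4
```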

Canon S100

Unmanned Aerial Vehicles (UAV) in Precision Agriculture

Technology in farming is constantly evolving. Collecting accurate, reliable georeferenced (location in terms of GPS coordinates) data is essential to capitalise on technologies such as variable rate application of chemicals and fertiliser and aid in crop monitoring at a level once not imagined. Some current forms of collecting georeferenced paddock data include:

  • Combine harvester – yield maps (crop yield as harvester works through paddock)
  • Satellite imagery – colour and near infrared (NIR) bands to produce natural images & vegetation indices such as Normalised Difference Vegetation Index (NDVI)
  • Aerial imagery – similar to satellite but offers higher resolution at higher price & some other sensor options
  • Tractor – GreenSeeker (plant biomass), digital elevation model (DEM) collected from high accuracy GPS
  • Utility vehicles – soil sampling for pH & nutrition, electromagnetic conductivity, GreenSeeker, DEM
  • Handheld with GPS – GreenSeeker, soil sampling
  • Stationary – moisture probe, weather station

Unmanned Aerial Vehicles (UAVs) are emerging as a cost effective way to collect data, with many advantages over the traditional forms listed above. UAVs are, as the name suggests, unmanned vehicles that fly over the paddock to collect data. These machines are generally compact, can be cheap and mechanically simple, fly below cloud cover and are on their way to being easy to operate thanks to advanced autopilot systems.

Over the last 6 months I have been researching civilian UAVs and their application in agriculture as part of my Nuffield Scholarship. I have also been testing a budget UAV platform, which I will discuss in a later post. The aim of this post is to aggregate key information and ideas on the topic into one space. It is by no means comprehensive – more of a beginning. Note that I am not a pilot or a lawyer; this article is general in nature and does not give permission to fly or legal advice. Let's start with a sky-high view.

The Agricultural UAV Solution

It is important to consider all aspects of the agricultural UAV (aUAV) Solution, which I define as a robust, timely, cost effective way to collect usable data to improve yields and overall profitability in sustainable farming systems. Consider the following formula:

aUAV Solution = platform + GPS + autopilot & communication + sensor + data processing & integration + legal & operation

All components of the formula need to be working well and working together for the product to be successful technology. Now enough of inventing acronyms and formulas that will inevitably change, it’s time to flesh out the components of the aUAV Solution.

Platforms

There are two main platforms available: fixed wing and multi-rotor. A fixed wing platform has the advantage of covering large areas efficiently, whereas a multirotor shines in being able to remain very stable in challenging conditions with large payloads.

Due to the scale of broadacre grain growing in Australia, my interest lies predominantly with the fixed wing platform type, as paddocks often exceed 250 ha (~620 ac). As an example, ConservationDrones has an excellent list of budget fixed wing platforms they have used.

GPS

Global Positioning Systems (GPS) are the backbone of most spatial technologies. GPS on the UAV tells the autopilot where it is at all times. In addition, GPS links the data collected to its spatial position (aka geo-referencing).

Many UAVs are equipped with a u-blox GPS receiver or similar, which is compact and provides <5 m horizontal accuracy. These systems are affordable and accurate enough for most situations.

An exciting development is the Piksi by Swift Navigation, a low cost Real Time Kinematic (RTK) GPS receiver that promises to sell for around $1000, which is unheard of in the world of RTK GPS. The Piksi offers centimetre level accuracy in a compact design ideal for small UAVs. The improved accuracy will be invaluable for autonomous landings and for more accurate geo-referencing of data.

Autopilot

We are seeing UAV autopilots improve very quickly, with increased reliability, especially within the open source community. Autopilots are essential for effortlessly flying over a whole area to collect the desired data. DIY Drones' APM:Plane is often the autopilot of choice for hobbyists and entry to mid level platforms. It uses the same hardware and similar software to the APM:Rover I built last year.

There are several other autopilots available, commercial and open source, that are worth checking out. Google it.

Usually the UAV communicates with a ground control station (GCS) via a radio link. The GCS is usually just a laptop computer running software such as Mission Planner, which is also used to set the flight paths for UAV missions.

Sensors

The most complex part of collecting good data is having the correct sensor. For plant biomass data, the most important spectral range is the near infrared. The two most common options are the Tetracam ADC Lite, built specifically for UAVs, and a digital camera modified to capture within this spectrum (MaxMax, for example). The latter is the most cost effective solution, and preliminary studies show that good results can be achieved.

Researchers are working hard on improving sensors for UAVs. For example, TerraLuma is a research group at the University of Tasmania. Projects of interest include high accuracy geo-referencing of imagery 'on the fly' and the use of a hyperspectral pushbroom scanner to collect data.

Public Lab (an open source community) is also modifying cameras, similar to MaxMax, but also working with cheaper devices such as webcams. They recently achieved funding through a Kickstarter campaign, so maybe we will have another cost effective solution soon. See also Pi NoIR.

It is worth mentioning that it is very common for UAVs to carry a GoPro camera (or similar) to capture high definition video footage. This footage is valuable for visually monitoring crops from the sky but is generally not processed into geo-referenced data. There are always exceptions, such as this video over a construction site where video footage is used to generate a 3D model.

Data Processing & Integration

Although collecting good data is the most challenging part, the most time consuming (and/or expensive) part can be processing it to a point where it can be integrated into precision agriculture systems. Generally the UAV will follow a lawnmower track, collecting images at a defined interval with a generous, defined overlap. The raw data will usually be images (up to several hundred – think gigabytes), each with a single GPS position and perhaps a bearing. The challenge for the data processing is to stitch these images together into one homogeneous data set, since every image is affected by the differing roll, yaw and pitch of the UAV at the moment of capture. Some of the more common applications include:

  • Drone Mapper is a successful web based startup which effectively filled the affordable yet professional data processing gap
  • Agisoft Photoscan
  • Pix4D
  • Microsoft ICE is free to use but only stitches images, without the geo-referencing or 3D modelling offered by the above mentioned applications
  • VisualSFM, CMVS and PMVS – Flight Riot does the hard work of explaining how to use this software to generate a 3D model from digital camera photos. This is probably one of the more complex workflows, but it uses all free(ish) software.
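The "defined interval with a generous defined overlap" mentioned above is simple geometry: the camera must fire before the aircraft has flown past (1 − overlap) of its image footprint. A sketch of the trigger-interval calculation, with all flight numbers assumed for illustration:

```python
def trigger_interval_s(altitude_m, focal_mm, sensor_along_mm,
                       ground_speed_ms, overlap_frac):
    """Seconds between shutter triggers for a target forward overlap.

    The along-track image footprint follows the pinhole relation
    footprint = altitude * sensor_dimension / focal_length; each new
    frame should advance only (1 - overlap) of that footprint.
    """
    footprint_m = altitude_m * sensor_along_mm / focal_mm
    advance_m = footprint_m * (1.0 - overlap_frac)
    return advance_m / ground_speed_ms

# Illustrative numbers (all assumed): 120 m altitude, 5.2 mm lens,
# 5.7 mm sensor height, 15 m/s cruise, 75% forward overlap.
print(round(trigger_interval_s(120, 5.2, 5.7, 15, 0.75), 2))  # → 2.19
```

Generous overlap costs more images and more processing time, but it is what gives the stitching software enough common features between frames to correct for roll, yaw and pitch.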

Once a geo-referenced, homogeneous data set over a paddock is achieved, it can undergo further post processing, for example to determine NDVI. That raster data may then be used to define zones for in crop variable rate fertiliser application.
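One simple way to turn such a raster into management zones is to split it into equal-area classes by quantile. A sketch on a synthetic NDVI grid (the data is randomly generated purely for illustration):

```python
import numpy as np

# Hypothetical NDVI raster for one paddock (cloud/nodata masked upstream).
rng = np.random.default_rng(7)
ndvi = rng.uniform(0.2, 0.9, size=(200, 200))

# Split the paddock into three equal-area management zones by NDVI terciles:
# zone 0 = lowest biomass, zone 2 = highest.
low, high = np.quantile(ndvi, [1 / 3, 2 / 3])
zones = np.digitize(ndvi, bins=[low, high])
```

In practice you would likely ground truth each zone before attaching fertiliser rates to it, since low NDVI can mean poor soil, waterlogging or weeds rather than low nitrogen.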

As mentioned, some of the above software can create 3D models from 2D photographs. These 3D models can be used to create digital elevation models (DEMs), which are valuable in farming for determining water movement.
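Once you have a DEM grid, a first step towards water movement is slope: water tends to move in the downslope direction, at a rate related to the gradient. A minimal sketch on a toy DEM (the terrain below is invented, a uniform 2% grade):

```python
import numpy as np

# Toy DEM (assumed data): a 100 x 100 grid at 1 m resolution with a
# uniform 2% grade falling towards the east.
x = np.arange(100)
dem = np.tile(100.0 - 0.02 * x, (100, 1))

# numpy.gradient returns elevation change per metre along each axis;
# slope magnitude combines the two, and the downslope direction is the
# way water will tend to move.
dz_dy, dz_dx = np.gradient(dem, 1.0)
slope_pct = 100.0 * np.hypot(dz_dx, dz_dy)
```

Real drainage analysis (flow accumulation, catchments) builds on exactly this gradient information, usually via dedicated GIS tools rather than hand-rolled code.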

Legal & Operation

In Australia, the Civil Aviation Safety Authority (CASA) rules the sky. CASA has rules governing the use of UAVs (which it calls unmanned aerial systems, or UAS) and is in the process of re-evaluating some regulations. See a summary of a recent speech from CASA here.

To operate a UAV/UAS commercially in Australia you need a UAS operator's certificate. A list of those certified is available here.

CASA have done well to have a system set up for UAS. The USA is lagging behind and is only now establishing rules and regulations around UAVs.

Getting into it

There are many companies focusing on developing UAVs for the ag industry that fulfil many components of the aUAV Solution, including AG-Wing, AgEagle and PrecisionHawk. Get your link here.

Edit: See also MarcusUAV

Alternatively, you can buy a calibrated, tested, ready to fly system built from budget, readily available components and an open source autopilot – for example from Event38 or Flight Riot.

The third option is to go fully DIY. I have tried this using a Finwing Penguin fixed wing platform, the APM:Plane autopilot and an ordinary Canon digital camera as the sensor. I am yet to process any images into geo-referenced datasets, and will post more about this soon. Here is an image from one of my first flights.


2013/2014 Sorghum from UAV, captured with Canon Powershot D10.