It’s an exciting time to come up with ideas in ag

There has never been a better time to come up with creative ideas in ag that involve hardware and software, thanks to the mobile phone, the maker movement, open source and the enormous amount of free education online. Just about anyone with passion, grit and time (the hardest of all) can build just about anything.

I am not thanking the mobile phone because it gives us communication and access to apps etc – that's all pretty boring and taken for granted now. I thank the mobile phone because it makes everything so damn cheap. Think mass production of GPS, cameras, wifi, gyros etc – the economics work in our favour.

The maker movement basically takes advantage of these cheaper parts, packaging them up especially for curious hobbyists. This makes sourcing parts for hands-on prototyping easy, even in Australia. It also gives us access to cool technology like 3D printing.

Open source is generally something able to be copyrighted – such as computer code, a 3D model or information – that is licensed in a way that basically anyone can use it for free as long as they adhere to the licence terms. Generally, large, successful open source projects have thousands of people in their ‘community’ and sometimes hundreds of contributing developers. What this means is that in many situations a lot of the complex core work is already done. The APM autopilot code base is a fantastic example. Build a drone or rover OR EVEN A SUBMARINE and there are tested, working autopilots ready to control them for free, with great communities to help you out.

Now all this is really neat, because people with qualifications in, say, agronomy, or experience in farm management – people with real problems to solve – can upskill and end up with an overlapping understanding of agriculture and the technology they are interested in.

Once you have an understanding of what is out there (or know where to look), it doesn’t take long before you can stack together some cheap hardware and open source software and ideas become a reality.

Here are some ideas that may be more simple to build:

  • Vehicle tracking devices
  • Moisture probes
  • Grain flow meters
  • Weather stations

More ambitious ideas floating around in my head:

  • RTK (repeatable, centimetre accurate GPS) is getting very cheap, and we have had high quality long range radios for a while (for sending data between base and rover). We could build an open source autosteer controller for tractors as an alternative to options from Trimble and John Deere. We could use all open formats and generic hardware. See what Matt Reimer built using the APM autopilot controller. The end result is about as incredible as the support he got on the DIY Drones discussion forum. I expect RTK GPS prices to fall even further in the next couple of years.
  • Using our RTK GPS and the rover version of the APM we could build small autonomous vehicles to do spot spraying. I took my first step a few years ago by building a small prototype. I have not got around to building a larger one (yet! – there is still time).
  • We could mount WeedSeeker or WeedIT cameras on our open source autonomous vehicles, or we could build our own! Computer vision and AI that ‘learn’ what objects look like are all the rage at the moment, with some good open source software packages such as Caffe. Rather than being limited to whether an object contains chlorophyll (via a vegetation index), how great would it be to determine whether it is grass or broadleaf? On-board computers are fast and cheap, and cameras are probably not as expensive as you think. This is getting very complex, but not impossible.
  • Let’s take it one step further. Why not have a drone tethered to the rover? It could sit up high, constantly scanning for weeds, and use some sort of travelling salesman algorithm to find the most efficient way to spot spray all the weeds with the ground vehicle (see the sketch below). Now this is getting somewhere, albeit very ambitious.
Solution of a travelling salesman problem: the black line shows the shortest possible loop that connects every red dot. Source: Wikipedia 2017
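
To make the routing idea concrete, here is a minimal sketch of a nearest-neighbour approach to ordering weed locations for the spray rover. It is not an optimal travelling salesman solution, just a simple greedy heuristic; the coordinates, function name and the assumption that weeds arrive as (x, y) points from the drone are all hypothetical.

```python
import math

def nearest_neighbour_route(start, weeds):
    """Greedy route: always drive to the closest unsprayed weed next.
    Not optimal (a true TSP solver would do better), but simple and fast."""
    route = []
    current = start
    remaining = list(weeds)
    while remaining:
        nxt = min(remaining, key=lambda w: math.dist(current, w))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route

# Hypothetical weed positions (metres east/north of the rover's start point)
weeds = [(120, 35), (40, 80), (200, 10), (60, 150), (180, 140)]
print(nearest_neighbour_route((0, 0), weeds))
```

For hundreds of weeds a proper solver would shave a bit more distance off the route, but even a greedy pass like this beats spraying them in the order they were detected.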

All sound exciting? Where do you start?

  • Remember – small steps and time.
  • I learned to write computer code at Udacity for free by doing the Intro to Computer Science and Web Development courses. Both are a few years old now but still fantastic. edX now exists too, which looks exciting for high quality free online learning. I’d love to do a course in software/machine integration, computer vision or AI next.
  • Maybe hop straight to it and start working on Raspberry Pi or Arduino projects. My first project was a fixed wing UAV.
  • Putting this post together I discovered Farm Hack – a website that basically shares this idea that it’s a great time to innovate in ag.

Do some reading and Google searches and have a think. You may just be able to make that thing you have been dreaming of yourself!

Satellite imagery for precision agriculture: Satamap

Satamap is a web based satellite imagery service for precision agriculture. It’s available at satamap.com.au. This is a project I am part of, so the following is not an independent review, just a quickly written explanation of this innovative app. I understand my audience is fairly schooled in most things precision agriculture, so I’ll skip the marketing talk and get straight to the point.

Today we are launching Satamap. This is a brand new service making up to date satellite imagery available to everyone. Our focus is on agriculture, therefore all imagery is paired with a vegetation index called the Satamap Vegetation Index (SVI). It is similar to NDVI, but we believe it is better at showing variability in high biomass crops and less impacted by soil colour. The colour ramp we use to represent SVI values, while in your face at first, is designed to show biomass variability in all crops, at all stages of crop growth, at all times of year. The colours remain consistent year round, so that blue, for example, always represents the same value and red the same, no matter the location or time of year. This is important because the Satamap slider allows any two image dates to be laid one over the other, with the ability to slide between the two for a direct comparison. The same can be done with the standard colour imagery as well.

Satamap screenshot

This service does not require drawing in paddock boundaries or limit you to a small area of interest. Subscriptions are based on a tile of more than 3 million hectares. It takes 5 minutes to subscribe and you have access to the whole area and an archive back to winter 2013. Imagery is captured at a 16 day interval. Cloud can get in the way at times, which can be frustrating, but we are working on increasing our imagery availability to reduce cloud impacts. The colour imagery has a resolution of 15 m and the SVI is 30 m. We cover all major cropping regions of Australia.

Satamap works best on an iPad or similar tablet device, but functions equally well on a desktop computer. Other standard features in Satamap include custom markers, area measurement tools, imagery export and GPS location on the map. All these features could warrant an article themselves, but it is best to just watch the video to see some of them in action.

Satellite imagery has been available to agriculture and related industries for decades, and those who have invested the time and money will attest to the value and significance of this technology, but will admit that the time and money required are all too often the biggest hindrance. We are aiming to solve these problems with Satamap and bring out the potential of satellite imagery for agriculture. Agronomists, grain traders, farmers, suppliers and more can all benefit from rapid, cost effective access to up to date satellite imagery.

We are in constant development. We are working on offering higher resolution imagery, ground truthing data points, exporting with post-processing and more. Satamap is currently only available in Australia, but very soon we will be opening it up to other parts of the world. Thanks for checking in.

Please check it out at satamap.com.au.

Challenges facing UAS in agriculture

After just finishing a post on some of the applications for UAS in agriculture, I thought I would share what I believe to be some of the biggest challenges the industry faces. As I continue my Nuffield Scholarship studies (thanks to GRDC) I am learning that unmanned aerial systems and the data they produce are increasingly complex. In fact, many people will do a PhD on just one aspect of these systems alone (e.g. flight characteristics or remote sensing). Some of the more obvious challenges include:

1. Repeatability

If you were to go out and map a paddock at 11am and then again at 2pm, the resulting pixel values would be different. Even using the Normalised Difference Vegetation Index (NDVI), which by definition gives a normalised value, the data will be different. This is probably because the atmosphere and clouds do not block/transmit/reflect all wavelengths by the same amount. We see this effect in satellite imagery where the image is affected by cloud shadow. The table below shows Red and NIR pixel values from two points in the same barley paddock, which has minimal variability.

Cloud shadow vs no shadow NDVI, Landsat 8, barley

Despite crop growth activity and biomass being very similar, these areas produce significantly different NDVI values due to the cloud shadow.
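
As a quick illustration of why this happens, here is a minimal sketch using made-up reflectance numbers (not the values from the table above). If cloud shadow attenuated Red and NIR by the same factor, the NDVI ratio would not change; because the attenuation is wavelength dependent, the index shifts even though the crop is identical.

```python
def ndvi(red, nir):
    """Normalised Difference Vegetation Index."""
    return (nir - red) / (nir + red)

# Hypothetical surface reflectances for the same barley crop
sunlit = {"red": 0.06, "nir": 0.42}
# Cloud shadow attenuates the two bands by different (made-up) factors
shadowed = {"red": 0.06 * 0.5, "nir": 0.42 * 0.35}

print(round(ndvi(**sunlit), 3))    # ~0.75
print(round(ndvi(**shadowed), 3))  # ~0.66, noticeably lower despite an identical crop
```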

This applies to UAS imagery too. The atmosphere, clouds and sun angle are constantly changing throughout the day, affecting the reflectance of each wavelength differently. Therefore, for data to be compared like for like, it needs to be calibrated.

2. Calibration

This section follows on nicely from repeatability. If data is going to be used for more than just scouting then some sort of calibration will need to take place. As discussed in a previous blog post, calibration of NDVI could potentially be achieved using an active ground device such as a GreenSeeker. There are also new payloads that are true multispectral and have an upward looking sensor to measure irradiance, which could be used to calibrate the data to true reflectance. Other forms of calibration/ground truthing include biomass cutting and weighing, tissue testing – the list goes on.
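
One simple way to picture this kind of calibration is an empirical linear correction: take GreenSeeker NDVI readings at a handful of georeferenced points, pull the UAS NDVI for the same spots, fit a line and apply it to the whole mosaic. The numbers and variable names below are hypothetical; this is only a sketch of the idea, not how any particular sensor vendor does it.

```python
import numpy as np

# Hypothetical paired samples at ground-truth points
uas_ndvi = np.array([0.31, 0.44, 0.52, 0.63, 0.71])          # from the UAS mosaic
greenseeker_ndvi = np.array([0.38, 0.49, 0.58, 0.70, 0.78])  # active sensor on the ground

# Least-squares straight line: greenseeker ~= gain * uas + offset
gain, offset = np.polyfit(uas_ndvi, greenseeker_ndvi, 1)

# Apply the correction to every pixel of the UAS NDVI raster
uas_raster = np.random.uniform(0.2, 0.8, size=(100, 100))    # stand-in for the real mosaic
calibrated = gain * uas_raster + offset
print(gain, offset)
```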

3. Algorithms & Indices

When it comes to remote sensing and vegetation, NDVI is the most famous index used, and for good reason, but it is not without its concerns. If you look on Wikipedia, the authors cover some of the issues with NDVI:

Users of NDVI have tended to estimate a large number of vegetation properties from the value of this index. Typical examples include the Leaf Area Index, biomass, chlorophyll concentration in leaves, plant productivity, fractional vegetation cover, accumulated rainfall, etc. Such relations are often derived by correlating space-derived NDVI values with ground-measured values of these variables. This approach raises further issues related to the spatial scale associated with the measurements, as satellite sensors always measure radiation quantities for areas substantially larger than those sampled by field instruments. Furthermore, it is of course illogical to claim that all these relations hold at once, because that would imply that all of these environmental properties would be directly and unequivocally related between themselves.

Thankfully we are not pigeon-holed into NDVI. Agribotix claim they get better results using the Difference Vegetation Index (DVI). Another example is the Soil Adjusted Vegetation Index (SAVI).
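
For reference, these alternatives are simple band arithmetic too. A minimal sketch of the standard formulas is below, using a soil adjustment factor of L = 0.5 (the commonly quoted default); the reflectance values are made up.

```python
def dvi(red, nir):
    """Difference Vegetation Index: a plain band difference."""
    return nir - red

def savi(red, nir, L=0.5):
    """Soil Adjusted Vegetation Index; L dampens soil background effects."""
    return (nir - red) / (nir + red + L) * (1 + L)

red, nir = 0.08, 0.45   # hypothetical reflectances
print(dvi(red, nir), savi(red, nir))
```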

4. Position

Without ground control points, the positional accuracy of the data will be mediocre at best. Expect XY accuracy of a few metres, and worse again on the Z axis. GPS records the position at which each frame is captured (+/- delay error), but the pitch, yaw and roll of the UAS, which affect how the image is framed on the ground, are determined by an inertial measurement unit (IMU). The quality of the IMU will have a bearing on the positional accuracy if the processing software takes these variables into consideration. Expect to have to lay out a minimum of 4 ground control points for high accuracy data.

UAS image processed with no GCP

5. Reliable data collection

Several factors in the collection process affect how reliably a UAS can gather data. The process usually involves the UAS following a set lawnmower style track, conducting swaths up and back as it moves across the area of interest. As the vehicle flies, it captures an image every few seconds depending on its speed. Forward and side overlap greater than 50% is required. Although processing software can handle images taken at some angle to the ground, extreme pitch and roll will affect the overall product.
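
To get a feel for the numbers, here is a rough back-of-the-envelope flight planning sketch: the image footprint from simple pinhole-camera geometry, then the camera trigger spacing and swath spacing needed for a chosen overlap. All the camera and flight parameters are hypothetical, and real mission planning software does this for you.

```python
# Hypothetical camera and flight parameters
altitude_m = 120.0                     # flying height above ground
focal_mm = 5.0                         # lens focal length
sensor_w_mm, sensor_h_mm = 6.2, 4.6    # sensor dimensions (width is across track)

# Ground footprint of a single image (pinhole geometry: size scales with altitude/focal)
footprint_w = sensor_w_mm / focal_mm * altitude_m   # across track
footprint_h = sensor_h_mm / focal_mm * altitude_m   # along track

forward_overlap = 0.70    # 70 % between consecutive photos
side_overlap = 0.60       # 60 % between adjacent swaths

trigger_spacing = footprint_h * (1 - forward_overlap)   # metres between photos
swath_spacing = footprint_w * (1 - side_overlap)        # metres between flight lines

speed_ms = 15.0           # cruise speed
print(f"footprint {footprint_w:.0f} x {footprint_h:.0f} m")
print(f"photo every {trigger_spacing:.0f} m (~{trigger_spacing / speed_ms:.1f} s at {speed_ms} m/s)")
print(f"flight lines {swath_spacing:.0f} m apart")
```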

If the UAS is hit by a gust of wind during data collection, it may put 2 or 3 images off target while the autopilot makes adjustments. This can leave a hole in the data set, and it is not uncommon. Some UAS manufacturers allow you to rapidly transfer the raw data off the vehicle to a laptop in the paddock to check data coverage and quality in the field (e.g. Precision Hawk).

So remember, when someone claims their machine can fly in high winds, it is probably true, but the quality of the data being collected may not be of much use.

Sample flight path and image footprints

6. Data processing

Reliably collecting the data is just the first step in the process. Once all these images have been collected they need to be stitched together. It is amazing that stitching 300+ 12MP images, all taken at slightly different angles to the ground, is even possible – even more so that 3D surface models can be constructed from these 2D images. Given the complexity of this task it takes large amounts of computing power and time (think several hours for 100 ha). For this reason there are several cloud based platforms which offer this as a service (e.g. DroneMapper and PrecisionMapper). Processing on your own desktop computer and using an online service both have their pros and cons. A downside of the cloud services is the internet bandwidth required to first transfer the raw data to the server and then retrieve the product once it has been processed. A downside of the desktop solution is the upfront cost of hardware and software, and the required skill set may not be available in house.

7. Storage & sharing

Once the data is processed it needs to be stored somewhere and somehow distributed. Often one scene can be more than a gigabyte. If the processed data is not cut into tiles it can require a powerful machine just to view it. This is where online solutions come into play, though the same bandwidth issues as above apply. At some point, if the data is going to be used for more than just looking at, it will most likely need to be transferred onto a local machine. This is an online map of one of the first areas I mapped with my DIY Finwing Penguin UAS. If for some reason the data is to be printed, it needs to be formatted as such, which takes time and software.
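
Tiling does not have to mean an online service. A minimal local sketch, assuming the GDAL Python bindings are installed and with hypothetical file names, is to rewrite the mosaic as an internally tiled GeoTIFF and add overview levels so it can be viewed comfortably on a modest machine:

```python
from osgeo import gdal

SRC = "paddock_ortho.tif"   # hypothetical processed mosaic from the stitching step

# Rewrite as an internally tiled, compressed GeoTIFF so viewers only read the
# blocks they need, rather than the whole multi-gigabyte file
gdal.Translate(
    "paddock_ortho_tiled.tif",
    SRC,
    creationOptions=["TILED=YES", "COMPRESS=DEFLATE"],
)

# Add overview (pyramid) levels so zoomed-out views render quickly
ds = gdal.Open("paddock_ortho_tiled.tif", gdal.GA_Update)
ds.BuildOverviews("AVERAGE", [2, 4, 8, 16])
ds = None   # close the dataset to flush everything to disk
```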

8. System integration

Integrating data generated from a UAS into existing precision agriculture software should be possible, but probably not at its highest available spatial resolution. Software such as Farmworks and SST was not designed for the intense data sets that UAS sensors can generate. Resampling the data to 0.5 m resolution may be required.
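
As an illustration of what that resampling involves, here is a minimal block-averaging sketch in plain NumPy: it aggregates a hypothetical 5 cm resolution array down by a factor of 10 to 0.5 m by averaging each block of pixels. A real workflow would use GIS tooling and handle georeferencing and nodata properly; the array here is made up.

```python
import numpy as np

def block_average(raster, factor):
    """Downsample a 2D array by averaging non-overlapping factor x factor blocks."""
    rows, cols = raster.shape
    rows, cols = rows - rows % factor, cols - cols % factor   # trim to whole blocks
    trimmed = raster[:rows, :cols]
    return trimmed.reshape(rows // factor, factor, cols // factor, factor).mean(axis=(1, 3))

# Hypothetical 5 cm resolution NDVI raster, downsampled 10x to 0.5 m
fine = np.random.uniform(0.2, 0.9, size=(2000, 3000))
coarse = block_average(fine, 10)
print(fine.shape, "->", coarse.shape)   # (2000, 3000) -> (200, 300)
```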

9. Safety and legal

A whole other post could be written on this, but basically in Australia we have the Civil Aviation Safety Authority (CASA). They require anyone who wants to fly UAS (they call them remotely piloted aircraft or RPA) commercially to have an Operator’s Certificate and Controller’s Certificate. This, among other things, requires a theory component equivalent to a private pilot’s licence, comprehensive manuals to be written, and a stringent approvals process to ensure your vehicle is fit for service and your piloting skills are sufficient. CASA is currently reviewing this process and should hopefully have a revision out by the end of the year. The process seems strict, but it is important for keeping the UAS industry safe and professional.

#SocialWeatherFeed – Twitter App

#SocialWeatherFeed allows anyone to create a Twitter account (or use your own) to give a weather update for a selected weather station in Australia. It uses BoM weather data to give text updates every 3 hours and a 3 day charted history every morning.
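
For the curious, the guts of something like this is just fetching the station's latest observation and formatting a short message. The sketch below assumes a BoM-style JSON observations feed; the URL and station IDs are placeholders, the field names are my assumption about the feed format, and the actual posting to Twitter (which the app does every 3 hours) is not shown.

```python
import json
from urllib.request import urlopen

# Placeholder URL: BoM publishes per-station JSON observation feeds,
# but the product/station IDs here are made up for illustration.
FEED_URL = "http://www.bom.gov.au/fwo/IDX12345/IDX12345.99999.json"

def latest_update(url):
    """Fetch the feed and format the most recent observation as a short text update."""
    with urlopen(url) as resp:
        data = json.load(resp)
    obs = data["observations"]["data"][0]        # most recent observation first (assumed)
    return ("{name} at {local_date_time}: {air_temp}C, "
            "wind {wind_dir} {wind_spd_kmh} km/h, "
            "{rain_trace} mm rain since 9am").format(**obs)

print(latest_update(FEED_URL))   # the real app would post this string to Twitter
```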

It has taken me a while to figure out the best way to implement it and I am not sure if I have found it – but this is it for the moment.

Here is a list of the active Twitter accounts using #SocialWeatherFeed:

To add to this list you need to:

  1. Create a new Twitter account (you will need to use a different email address to your original account to open a second) and call it something like ‘Townsville Weather’
  2. Go to http://socialweatherfeed.appspot.com
  3. Select the weather station from the drop down box and ‘Submit Query’
  4. Authorise the Twitter account and you’re away!
  5. Let me know and I’ll add it to the list

#SocialWeatherFeed

Unmanned Aerial Vehicles (UAV) in Precision Agriculture

Technology in farming is constantly evolving. Collecting accurate, reliable, georeferenced (located in terms of GPS coordinates) data is essential to capitalise on technologies such as variable rate application of chemicals and fertiliser, and to aid crop monitoring at a level once not imagined. Some current forms of collecting georeferenced paddock data include:

  • Combine harvester – yield maps (crop yield as harvester works through paddock)
  • Satellite imagery – colour and near infrared (NIR) bands to produce natural images & vegetation indices such as Normalised Difference Vegetation Index (NDVI)
  • Aerial imagery – similar to satellite but offers higher resolution at higher price & some other sensor options
  • Tractor – Greenseeker (plant biomass), digital elevation model (DEM) collected from high accuracy GPS
  • Utility vehicles – e.g. soil sampling for pH & nutrition, electromagnetic conductivity, Greenseeker, DEM
  • Handheld with GPS – Greenseeker, soil sampling
  • Stationary – moisture probe, weather station

Unmanned Aerial Vehicles (UAVs) are emerging as a cost effective way to collect data, with many advantages over the traditional forms listed above. A UAV is, as the name suggests, an unmanned vehicle which flies over the paddock to collect data. These machines are generally compact, can be cheap and mechanically simple, fly below cloud cover, and are on their way to being easy to operate with advanced autopilot systems.

Over the last 6 months I have begun researching civilian UAVs and their application in agriculture as part of my Nuffield Scholarship. Furthermore, I have been testing a budget UAV platform which I will discuss in a later post. The aim of this post is to aggregate key information and ideas on the topic into one space. It is by no means comprehensive – more of a beginning. Note that I am not a pilot or a lawyer; this article is general in nature and does not give permission to fly or legal advice. Let’s start with a sky-high view.

The Agricultural UAV Solution

It is important to consider all aspects pertaining to the agricultural UAV (aUAV) Solution, which I define as a robust, timely, cost effective way to collect usable data to improve yields and overall profitability in sustainable farming systems. Consider the following formula:

aUAV Solution = platform + GPS + autopilot & communication + sensor + data processing & integration + legal & operation

All components of the formula need to be working well, and working together, for the product to be a successful technology. Now, enough of inventing acronyms and formulas that will inevitably change; it’s time to flesh out the components of the aUAV Solution.

Platforms

There are two main platforms available: fixed wing and multirotor. A fixed wing platform has the advantage of covering large areas efficiently, whereas a multirotor shines in remaining very stable in challenging conditions with large payloads.

Due to the scale of broadacre grain growing in Australia, my interest lies predominantly with the fixed wing platform type, as paddocks often exceed 250 ha (~620 ac). ConservationDrones has an excellent list of budget fixed wing platforms they have used, as an example.

GPS

Global Positioning Systems (GPS) are the backbone of most spatial technologies. GPS on the UAV tells the autopilot where it is at all times. In addition, GPS links the data collected to its spatial position (aka geo-referencing).

Many UAVs are equipped with a u-blox GPS receiver or similar, which is compact and provides <5 m horizontal accuracy. These systems are affordable and accurate enough for most situations.

An exciting development is the Piksi by Swift Navigation, a low cost Real Time Kinematic (RTK) GPS receiver that promises to sell for around $1000, which is unheard of in the world of GPS. The Piksi offers centimetre level accuracy in a compact design ideal for small UAVs. The improved accuracy will be invaluable for autonomous landings and for improving the accuracy of geo-referenced data.

Autopilot

We are seeing UAV autopilots improve very quickly, with increased reliability, especially within the open source community. Autopilots are essential for being able to effortlessly fly over a whole area to collect the desired data. DIY Drones’ APM:Plane is often the autopilot of choice for hobbyists and entry to mid level platforms. It uses the same hardware and similar software to the APM:Rover I built last year.

There are several other autopilots available, commercial and open source, that are worth checking out. Google it.

Usually the UAV communicates with a ground control station (GCS) via a radio link. The GCS is usually just a laptop computer running software such as Mission Planner. Mission Planner is also used to set the flight paths for the UAV missions.
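
Mission planning tools generate the lawnmower survey pattern for you, but the underlying idea is simple enough to sketch. The function below lays out back-and-forth waypoints over a rectangular paddock in local east/north metres; converting to latitude/longitude and exporting in a format the GCS accepts is left out, and all the dimensions are hypothetical.

```python
def lawnmower_waypoints(width_m, height_m, swath_spacing_m, altitude_m):
    """Back-and-forth survey waypoints over a width x height block (local metres)."""
    waypoints = []
    x, direction = 0.0, 1
    while x <= width_m:
        start_y, end_y = (0.0, height_m) if direction > 0 else (height_m, 0.0)
        waypoints.append((x, start_y, altitude_m))   # fly up (or down) this line
        waypoints.append((x, end_y, altitude_m))
        x += swath_spacing_m                          # step across to the next swath
        direction *= -1                               # reverse direction each line
    return waypoints

# Hypothetical 1000 m x 800 m paddock, 60 m between flight lines, flown at 120 m
for wp in lawnmower_waypoints(1000, 800, 60, 120):
    print(wp)
```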

Sensors

The most complex part of collecting good data is having the correct sensor. For plant biomass data, the most important spectral range is the near infrared. The two most common options are the Tetracam ADC Lite, built specifically for UAVs, or a digital camera modified to capture within this spectrum (MaxMax, for example). The latter option is the most cost effective, and some preliminary studies show that good results can be achieved.

Researchers are working hard on improving sensors for UAVs. For example, TerraLuma is a research group at the University of Tasmania. Projects of interest include high accuracy geo-referencing of imagery ‘on the fly’ and the use of a hyperspectral pushbroom scanner to collect data.

Public Lab (an open source community) is also working on modifying cameras, similar to MaxMax, but also on cheaper devices such as webcams. They recently achieved funding through a Kickstarter campaign. Maybe we will have another cost effective solution soon. See also Pi NoIR.

It is worth mentioning that it is very common for UAVs to have a GoPro camera (or similar) mounted to capture high definition video footage. This footage is valuable for visually monitoring crops from the sky but is generally not processed into geo-referenced data. There are always exceptions, such as this video over a construction site where video footage is used to generate a 3D model.

Data Processing & Integration

Although collecting good data is the most challenging part, the most time consuming (and/or expensive) part can be processing it to a point where it can be integrated into precision agriculture systems. Generally the UAV will follow a lawnmower track, collecting images at a defined interval with a generous defined overlap. The raw data will usually be images (up to several hundred – think gigabytes) with a single GPS position and maybe a bearing per image. The challenge for the data processing is to stitch these images together to generate one homogeneous data set. Every image is affected by the differing roll, yaw and pitch of the UAV as it is captured. Some of the more common applications include:

  • Drone Mapper is a successful web based startup which effectively filled the affordable yet professional data processing gap
  • Agisoft Photoscan
  • Pix4D
  • Microsoft ICE is free to use but only stitches images; it does not offer geo-referencing or 3D modelling like the applications mentioned above
  • VisualSFM, CMVS and CMPMVS – Flight Riot does the hard work of explaining how to use this software to generate a 3D model from digital camera photos. This is probably one of the more complex processes but uses all free(ish) software.

Once a geo-referenced, homogeneous data set over a paddock is achieved, it could undergo further post-processing to determine NDVI. This raster data may then, for example, be used to define zones for in-crop variable rate fertiliser application.
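
A very simple way to turn an NDVI raster into zones is to split it at percentile breaks, so each zone covers roughly an equal area, then assign a rate per zone. Real zoning involves agronomic judgement, cleaning and smoothing; the thresholds and rates below are made up.

```python
import numpy as np

ndvi = np.random.uniform(0.2, 0.9, size=(200, 300))   # stand-in for a processed NDVI raster

# Three zones split at the 33rd and 66th percentiles (roughly equal areas)
breaks = np.percentile(ndvi, [33, 66])
zones = np.digitize(ndvi, breaks)          # 0 = low, 1 = medium, 2 = high biomass

# Hypothetical nitrogen rates (kg/ha) per zone for a variable rate prescription
rates = np.array([80, 60, 40])
prescription = rates[zones]
print(np.unique(zones, return_counts=True))
```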

As mentioned, some of the above software is able to create 3D models from 2D photographs. These 3D models can be used to create digital elevation models (DEMs), which are valuable in farming for determining water movement.

Legal & Operation

In Australia, the Civil Aviation Safety Authority (CASA) rules the sky. CASA has rules governing the use of UAVs (which it calls unmanned aerial systems or UAS) and is in the process of re-evaluating some regulations. See a summary of a recent speech from CASA here.

To operate a UAV/UAS commercially in Australia you need a certified operator’s certificate. A list of those certified is available here.

CASA has done well to have a system set up for UAS. The USA is lagging behind and is only now establishing rules and regulations around UAVs.

Getting into it

There are many companies focusing on developing UAVs for the ag industry that fulfil many of the components of the aUAV Solution, including AG-Wing, AgEagle and PrecisionHawk. Get your link here.

Edit: See also MarcusUAV

A second option is to buy a calibrated, tested, ready to fly system built from budget, readily available components and an open source autopilot. For example Event38 and Flight Riot.

The third option is to go fully DIY. I have tried this using a Finwing Penguin fixed wing platform, the APM:Plane autopilot and an ordinary Canon digital camera as the sensor. I am yet to process any images into geo-referenced datasets. I will post more about this soon. Here is an image from one of my first flights.


2013/2014 Sorghum from UAV, captured with Canon Powershot D10.

Full Season Chickpeas Photo Log

Here is another photo log, from a couple of years ago. Chickpeas, once introduced as a break crop in a rotation dominated by cereals (i.e. wheat and barley), are now just as important and can be just as profitable as wheat and barley. Although chickpea varieties are getting better all the time, they are still susceptible to waterlogging and disease caused by too much rain. The variety used this season was ‘Jimbour’. We have now switched to ‘Hat-Trick’.

The season was set up to be a bumper crop, with great crop establishment and a full moisture profile, but then it just kept on raining and we ended up with a lot of plant and not many peas in the pods due to waterlogging, disease and wind. Enjoy viewing some of the hardships of farming.

The video is of me harvesting chickpeas, though not this particular crop.