There has never been a better time to come up with creative ideas in ag that involve hardware and software, thanks to the mobile phone, the maker movement, open source and the enormous amount of free education online. Just about anyone with passion, grit and time (the hardest of all) can build just about anything.
I am not thanking the mobile phone because it gives us communication and access to apps; that's all pretty boring and taken for granted now. I thank the mobile phone because it makes everything so damn cheap. Think mass production of GPS, cameras, wifi, gyros and so on – the economics work in our favour.
The maker movement basically takes advantage of this flood of cheaper parts, packaging them up especially for curious hobbyists. This makes sourcing parts for hands-on prototyping easy, even in Australia. It also gives us access to cool technology like 3D printing.
Open source is generally something – such as computer code, a 3D model or information – that is licensed in a way that lets anyone use it for free as long as they adhere to the licence terms. Large, successful open source projects often have thousands in their 'community' and sometimes hundreds of contributing developers. What this means is that a lot of the complex core work is, in many situations, already done. The APM autopilot code base is a fantastic example. Build a drone or rover OR EVEN A SUBMARINE and there are tested, working autopilots ready to control them for free, with great communities to help you out.
Now all this is really neat, because people with qualifications in, say, agronomy, or experience in farm management, who have real problems to solve, can upskill and end up with an overlapping understanding of agriculture and the technology that interests them.
Once you have an understanding of what is out there (or know where to look), it doesn't take long before stacking together some cheap hardware with open source software turns ideas into reality.
Here are some ideas that may be simpler to build:
Vehicle tracking devices
Grain flow meters
More ambitious ideas floating around in my head:
RTK (repeatable, centimetre accurate GPS) is getting very cheap, and we have had high quality long range radios for a while (for sending data between base and rover). We could build an open source autosteer controller for tractors as an alternative to the options from Trimble and John Deere, using all open formats and generic hardware. See what Matt Reimer built using the APM autopilot controller. The end result is about as incredible as the support he got on the DIY Drones discussion forum. I expect RTK GPS prices to fall even further in the next couple of years.
Using our RTK GPS and rover version of the APM we could build small autonomous vehicles to do spot spraying. I took my first step a few years ago by building a small prototype. I have not got round to building a larger one (yet! – there is still time).
We could mount WeedSeeker or WeedIT cameras on our open source autonomous vehicles, or we could build our own! Computer vision and AI that 'learn' what objects look like are all the rage at the moment, with some good open source software packages such as Caffe. Rather than being limited to whether an object has chlorophyll via a vegetation index, how great would it be to determine whether it is grass or broadleaf? On board computers are fast and cheap, and cameras are probably not as expensive as you think. This is getting very complex, but not impossible.
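As a toy illustration of index-based plant detection – not what WeedSeeker or WeedIT actually implement, and a long way short of grass-vs-broadleaf classification – here is a minimal sketch using the standard Excess Green index on an RGB image. The threshold value is an assumption you would tune for your own imagery.

```python
import numpy as np

def excess_green(rgb):
    """Excess Green index (ExG = 2g - r - b) on an RGB array.

    A simple visible-band vegetation index: pixels with high ExG are
    likely green plant material. Channels are normalised so the index
    is less sensitive to overall brightness.
    """
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1)
    total[total == 0] = 1.0  # avoid divide-by-zero on black pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2 * g - r - b

def vegetation_mask(rgb, threshold=0.1):
    """Boolean mask of probable vegetation pixels."""
    return excess_green(rgb) > threshold

# Tiny synthetic image: one green (plant-like) pixel, one grey (soil-like).
img = np.array([[[40, 180, 30], [120, 110, 100]]], dtype=np.uint8)
mask = vegetation_mask(img)
```

Real systems add calibration, NIR bands and machine-learned classifiers on top of ideas like this, but the thresholded-index core is the same.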
Let's take it one step further. Why not have a drone tethered to the rover? It could stay up high, constantly scanning for weeds, and use some sort of smart travelling salesman algorithm to find the most efficient way for the ground vehicle to spot spray all the weeds. Now this is getting somewhere, albeit very ambitious.
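The routing idea above can be sketched with the classic nearest-neighbour heuristic – not an optimal travelling salesman solve, just the simplest sensible ordering. The weed coordinates here are made up for illustration.

```python
import math

def nearest_neighbour_route(start, weeds):
    """Greedy nearest-neighbour ordering of weed locations.

    From the current position, always drive to the closest unvisited
    weed. Coordinates are (x, y) metres in a local paddock frame.
    """
    route, pos, remaining = [], start, list(weeds)
    while remaining:
        nxt = min(remaining, key=lambda w: math.dist(pos, w))
        route.append(nxt)
        remaining.remove(nxt)
        pos = nxt
    return route

# Hypothetical weed positions detected by the drone, rover starts at origin.
weeds = [(50, 10), (5, 5), (60, 40), (8, 30)]
route = nearest_neighbour_route((0, 0), weeds)
```

A production system would use a proper TSP solver or at least 2-opt improvement, but nearest-neighbour is an easy first step and often within a few tens of percent of optimal.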
All sound exciting? Where do you start?
Remember – small steps and time.
I learned to write computer code for free at Udacity by doing Intro to Computer Science and Web Development. Both courses are a few years old now but still fantastic. edX now exists too, which looks exciting for high quality free online learning. I'd love to do a course in software machine integration, computer vision or AI next.
Maybe hop straight to it and start working on Raspberry Pi or Arduino projects. My first project was a fixed wing UAV.
Putting this post together I discovered Farm Hack – a website that basically shares this idea that it’s a great time to innovate in ag.
Do some reading, Google searches and have a think. You may just be able to make that thing you have been dreaming of yourself!
Today I thought I would pull together some of the information produced as part of my GRDC sponsored Nuffield Scholarship on the use of unmanned aerial vehicles in the grains industry. It has been an amazing experience, and I encourage farmers from around the world to check out the Nuffield organisation to see if a scholarship would suit them.
Satamap is a web based satellite imagery service for precision agriculture. It’s available at satamap.com.au. This is a project I am part of so the following is not an independent review, just a quickly written explanation of this innovative app. I understand my audience is fairly schooled in most things precision agriculture so I’ll skip the marketing talk and get straight to the point.
Today we are launching Satamap. This is a brand new service making up to date satellite imagery available to everyone. Our focus is on agriculture, therefore all imagery is paired with a vegetation index called the Satamap Vegetation Index (SVI). It is similar to NDVI, but we believe it is better at showing variability in high biomass crops and less impacted by soil colour. The colour ramp we use to represent SVI values, while in your face at first, is designed to show biomass variability in all crops, at all stages of growth, at all times of year. The colours remain consistent year round so that, for example, blue always represents the same level of biomass, and red likewise, no matter the location or time of year. This is important because the Satamap slider allows any two image dates to be laid one over the other, with the ability to slide between the two for a direct comparison. The same can be done with the standard colour imagery as well.
This service does not require drawing in paddock boundaries or limit you to a small area of interest. Subscriptions are based on a tile of more than 3 million hectares. It takes 5 minutes to subscribe, and you get access to the whole area plus an archive back to winter 2013. Imagery is captured at a 16 day interval. Cloud can get in the way at times, which can be frustrating, but we are working on increasing our imagery availability to reduce cloud impacts. The colour imagery has a resolution of 15 m and the SVI is 30 m. We cover all major cropping regions of Australia.
Satamap works best on an iPad or similar tablet device, but functions equally well on a desktop computer. Other standard features in Satamap include custom markers, area measurement tools, imagery export and GPS location on the map. These features could each warrant an article, but it's best to just watch the video to see some of them in action.
Satellite imagery has been available to agriculture and related industries for decades, and those who have invested the time and money will attest to the value and significance of this technology, but admit that all too often the time and money are the biggest hindrance. We are aiming to solve these problems with Satamap and bring out the potential of satellite imagery for agriculture. Agronomists, grain traders, farmers, suppliers and more can all benefit from rapid, cost effective access to up to date satellite imagery.
We are in constant development. We are working on offering higher resolution imagery, ground truthing data points, exporting with post-processing and more. Currently only available in Australia, very soon we will be opening up to other parts of the world. Thanks for checking in.
After just finishing a post on some of the applications for UAS in agriculture, I thought I would share what I believe to be some of the biggest challenges the industry faces. As I continue my Nuffield Scholarship studies (thanks to GRDC), I am learning that unmanned aerial systems and the data they produce are increasingly complex. In fact, many people will study a PhD on just one aspect of these systems alone (e.g. flight characteristics or remote sensing). Some of the more obvious challenges include:
1. Repeatability
If you were to go out and map a paddock at 11am and then again at 2pm, the resulting pixel values would be different. Even using the Normalised Difference Vegetation Index (NDVI), which by definition gives a normalised value, the data will differ. This is probably because the atmosphere and clouds do not block/transmit/reflect all wavelengths in equal proportion. We see this effect in satellite imagery where the image is affected by cloud shadow. The table below shows Red and NIR pixel values from two points in the same barley paddock, which has minimal variability.
Despite crop growth activity and biomass being very similar, these areas produce significantly different NDVI values due to the cloud shadow.
This applies equally to UAS imagery. The atmosphere, clouds and sun angle are constantly changing throughout the day, affecting the reflectance of each wavelength differently. For data to be compared like for like, it therefore needs to be calibrated.
2. Calibration
This section follows on nicely from repeatability. If data is going to be used for more than just scouting, then some sort of calibration will need to take place. As discussed in a previous blog post, calibration of NDVI could potentially be achieved using an active ground device such as a GreenSeeker. There are also new payloads that are true multispectral and have an upward looking sensor to measure irradiance, which could be used to calibrate data for true reflectance. Other forms of calibration/ground truthing include biomass cutting and weighing, tissue testing; the list goes on.
3. Algorithms & Indices
When it comes to remote sensing of vegetation, NDVI is the most famous index, and for good reason, but it is not without its concerns. On Wikipedia, the authors cover some of the issues with NDVI:
Users of NDVI have tended to estimate a large number of vegetation properties from the value of this index. Typical examples include the Leaf Area Index, biomass, chlorophyll concentration in leaves, plant productivity, fractional vegetation cover, accumulated rainfall, etc. Such relations are often derived by correlating space-derived NDVI values with ground-measured values of these variables. This approach raises further issues related to the spatial scale associated with the measurements, as satellite sensors always measure radiation quantities for areas substantially larger than those sampled by field instruments. Furthermore, it is of course illogical to claim that all these relations hold at once, because that would imply that all of these environmental properties would be directly and unequivocally related between themselves.
Thankfully we are not pigeonholed into NDVI. Agribotix claim they get better results using the Difference Vegetation Index (DVI). Another example is the Soil Adjusted Vegetation Index (SAVI).
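For reference, the three indices mentioned have simple, standard formulas. Here is a minimal sketch; the reflectance values at the end are illustrative, not measured data.

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index, in [-1, 1]."""
    return (nir - red) / (nir + red)

def dvi(nir, red):
    """Difference Vegetation Index: the raw band difference."""
    return nir - red

def savi(nir, red, L=0.5):
    """Soil Adjusted Vegetation Index. L is the soil-brightness
    correction factor; 0.5 is the usual choice for intermediate
    vegetation cover (L=0 reduces SAVI to NDVI)."""
    return (nir - red) / (nir + red + L) * (1 + L)

# Illustrative reflectances (0-1) for a healthy-crop pixel.
nir, red = 0.50, 0.08
values = {"NDVI": ndvi(nir, red), "DVI": dvi(nir, red), "SAVI": savi(nir, red)}
```

Because DVI is unnormalised, it is more sensitive to overall illumination than NDVI, which is part of why different vendors report better results with different indices on different data.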
4. Positional accuracy
Without ground control points, the positional accuracy of the data will be mediocre at best. Expect XY accuracy of a few metres, and worse again on the Z axis. GPS records the position at which each frame is captured (plus or minus delay error), but the pitch, yaw and roll of the UAS, which affect how the image is framed on the ground, are determined by an inertial measurement unit (IMU). The quality of the IMU will have a bearing on positional accuracy if the processing software takes these variables into consideration. Expect to lay out a minimum of 4 ground control points for high accuracy data.
5. Reliable data collection
Collecting data with a UAS involves several factors that affect how reliably it can be done. The process usually involves the UAS following a set lawnmower-style track, flying swaths up and back as it moves across the area of interest. As the vehicle flies, it captures an image every few seconds depending on its speed. Forward and side overlap greater than 50% is required. Although processing software can handle images taken at some angle to the ground, extreme pitch and roll will affect the overall product.
If the UAS is hit by a gust of wind during data collection, it may put 2 or 3 images off target while the autopilot makes adjustments. This can leave a hole in the data set, and it is not uncommon. Some UAS manufacturers allow you to rapidly transfer the raw data off the vehicle to a laptop in the paddock to check data coverage and quality in the field (e.g. Precision Hawk).
So remember, when someone claims their machine can fly in high winds, it is probably true, but the quality of the data being collected may not be of much use.
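The overlap geometry above reduces to a quick calculation: line spacing and photo trigger distance both follow from the ground footprint of a single image. The footprint figures here are assumptions for illustration.

```python
def survey_spacing(footprint_w, footprint_h, side_overlap, forward_overlap):
    """Flight-line spacing and photo trigger distance (metres) for a
    lawnmower survey, given the ground footprint of one image.

    footprint_w/h: across-track and along-track footprint (m).
    Overlaps are fractions, e.g. 0.6 for 60%.
    """
    line_spacing = footprint_w * (1 - side_overlap)
    trigger_distance = footprint_h * (1 - forward_overlap)
    return line_spacing, trigger_distance

# Assumed 160 m x 120 m image footprint with 60% overlap both ways.
spacing, trigger = survey_spacing(160, 120, 0.6, 0.6)
```

With 60% overlap each image point appears in several photos, which is what lets the stitching software tolerate the occasional off-target frame described above.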
6. Data processing
Reliably collecting the data is just the first step in the process. Once all these images have been collected, they need to be stitched together. It is amazing that stitching together 300+ 12MP images, all taken at slightly different angles to the ground, is even possible; more amazing still that 3D surface models can be constructed from these 2D images. Given the complexity of this task, it takes large amounts of computing power and time (think several hours for 100ha). For this reason there are several cloud based platforms offering this service (e.g. DroneMapper and PrecisionMapper). Processing on your own desktop computer and using online services both have their pros and cons. A downside of the cloud services is the internet bandwidth required to transfer the raw data to the server and then retrieve it once processed. A downside of the desktop solution is the upfront cost of hardware and software, and the required skill set may not be available in house.
7. Storage & sharing
Once the data is processed, it needs to be stored somewhere and somehow distributed. Often one scene can be more than a gigabyte, and if the processed data is not cut into tiles it can require a powerful machine just to view it. This is where online solutions come into play, though the same bandwidth issues as above apply. At some point, if the data is going to be used for more than just viewing, it will most likely need to be transferred onto a local machine. This is an online map of one of the first areas I mapped with my DIY Finwing Penguin UAS. If for some reason the data is to be printed, it needs to be formatted as such, which takes time and software.
8. System integration
Integrating data generated from a UAS into existing precision agriculture software should be possible, but not likely at its highest available spatial resolution. Software such as Farmworks and SST was not designed for the intense data sets that UAS sensors can generate. Resampling data to 0.5m resolution may be required.
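Resampling to a coarser grid can be as simple as block averaging. Here is a minimal numpy sketch; the factor of 10 assumes a 0.05 m/pixel source layer being taken to 0.5 m, so adjust it for your own resolution.

```python
import numpy as np

def block_mean(raster, factor):
    """Downsample a 2-D raster by averaging factor x factor blocks.

    e.g. a 0.05 m/pixel UAS layer becomes 0.5 m/pixel with factor=10.
    Edge rows/columns that don't fill a whole block are dropped.
    """
    h, w = raster.shape
    h, w = h - h % factor, w - w % factor
    trimmed = raster[:h, :w]
    return trimmed.reshape(h // factor, factor,
                           w // factor, factor).mean(axis=(1, 3))

# Tiny 4x4 example downsampled by a factor of 2.
fine = np.arange(16, dtype=float).reshape(4, 4)
coarse = block_mean(fine, 2)
```

Real GIS tools (GDAL and friends) also handle georeferencing and nodata values during resampling, but the averaging itself is no more than this.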
9. Safety and legal
A whole other post could be written on this, but basically in Australia we have the Civil Aviation Safety Authority (CASA). They require anyone who wants to fly a UAS (they call them remotely piloted aircraft, or RPA) commercially to hold an Operator's Certificate and Controller's Certificate. This, among other things, requires a theory component equivalent to a private pilot's licence, comprehensive written manuals, and a stringent approvals process to ensure your vehicle is fit for service and your piloting skills are sufficient. CASA is currently reviewing this process and should hopefully have a revision out by the end of the year. The process seems strict, but it is important for keeping the UAS industry safe and professional.
Travelling through the USA and Canada as part of my Nuffield Scholarship (thanks to GRDC), I have heard the word data more times than I can count. In this post I am going to set aside the platform aspect of the unmanned system and focus on sensors and the data they provide; a later post will cover how we can use it. I do plan to try to cover some concepts not widely discussed in the current UAS environment.
The reason an unmanned system is flown is to collect data, then turn that data into information to help monitor, assess and ultimately make timely, cost effective decisions. The data collected needs to be good quality, and it is important not to confuse data quality with data type. For example, many tend to gravitate straight to the number of megapixels a sensor captures, neglecting its spectral accuracy.
If we consider what our target is when collecting data from a UAS in a grain farming situation, it will most commonly be vegetation (not always, but let's focus on that). Collecting spatially referenced data on vegetation is by no means a new endeavour. This information has been collected at scales as broad as Landsat satellite imagery and as specific as a GreenSeeker. Generally, for vegetation, similar bandwidth reflectance is measured irrespective of proximity to the target, and the same is true for sensors used in UAS. Why is this the case? Well, you can read the long answer here (Remote Sensing of Biomass). The short answer is photosynthesis. In a plant that is photosynthesizing, the chlorophyll will absorb large amounts of 'visual' light, particularly blue and red, and reflect near infrared (NIR) light. The more photosynthetic activity, the more NIR light is reflected and the less visual light reflected (more is absorbed). Conversely, inactive vegetation will reflect more visual light and less NIR.
The best contrast is between red and NIR light, which is what is generally used when calculating the Normalised Difference Vegetation Index (NDVI). NDVI is a good indicator of plant health and a measure of biomass. Consequently, most sensors used to determine NDVI look at the red and NIR bands – some more accurately than others. The chart below shows the reflectance curve of green grass across different wavelengths. Below the X axis is a rough spectral guide to some of the well-known sensors available to us.
What is most notable is the wavelength spectrum, or band, at which each sensor reads. The GreenSeeker is extremely specific, capturing a narrow wavelength in the middle of the red spectrum and a similarly specific band in the NIR. At the other end of this comparison, the S100 modified camera has a very broad spectrum for each channel it reads. Consider the S100's (pre-modification) 'red' channel, which with the modified filter reads roughly from 0.67um to 0.76um. Post modification, this channel is renamed NIR and measures reflectance across an area covering red right through to NIR. The S100 modification retains the blue and green channels, which replace red when calculating NDVI. Another significant point this chart does not show is the interference that can occur between the different bands in a point and shoot camera. Check out the S100 reflectance chart about halfway down the page in this link, which shows some NIR response in the blue and green channels.
It has to be noted that it is hardly fair to compare the S100 and GreenSeeker in a practical sense, for several reasons, the main one being that you would not mount a GreenSeeker on a UAV as it needs to be close to the target (the GreenSeeker is an active sensor, meaning it emits light and measures the reflectance; the S100 is a passive sensor, just reading reflected sunlight). In addition, the GreenSeeker measures only one point, whereas the S100 collects data on 12 million pixels. The reason I do compare them is that both can be used to produce an NDVI map. In fact, despite the spectral differences between these sensors and their different proximity to the target, RoboFlight claim from their tests that NDVI data collected from a GreenSeeker and a modified S100 correlate linearly very closely (r squared > 0.9). So we know the two sensors correlate well, but the correlation will never be a fixed formula, because the sunlight reflected will always differ with sun angle, atmospheric conditions, cloud and so on. The S100 and GreenSeeker would probably work best as complementary tools. For example, map a large area with the S100 on a UAS, then calibrate the resulting dataset using GreenSeeker data collected in the field at the time of the flight. If S100 data is always calibrated against the GreenSeeker, inter-paddock and inter-season comparisons can potentially be made.
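The calibration workflow suggested above could be sketched as a simple least-squares fit. The paired readings below are hypothetical, standing in for GreenSeeker samples taken at known spots during the flight.

```python
import numpy as np

# Hypothetical paired NDVI samples: modified-S100 values and
# GreenSeeker readings from the same spots at the time of flight.
s100 = np.array([0.20, 0.35, 0.50, 0.65, 0.80])
greenseeker = np.array([0.25, 0.38, 0.55, 0.68, 0.84])

# Fit greenseeker ~= slope * s100 + intercept by least squares.
slope, intercept = np.polyfit(s100, greenseeker, 1)

def calibrate(s100_ndvi):
    """Map an S100-derived NDVI value onto the GreenSeeker scale
    using the per-flight linear fit."""
    return slope * s100_ndvi + intercept
```

Because lighting changes between flights, the fit has to be re-derived each time from that flight's ground samples; it is the per-flight recalibration, not the coefficients themselves, that makes inter-season comparison plausible.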
We are starting to see a new wave of sensor development designed specifically for UAS and the agricultural and environmental industries. An available example is the Airinov Multispec 4C. This sensor captures 4 distinct, narrow spectral bands with no interference: green, red, red edge and NIR. What makes this package special is that not only does it look down at the vegetation; it also looks up, measuring sunlight with a lux meter. This should allow us to generate data that can be calibrated without ground truthing from a GreenSeeker or similar 'active' device. Another feature of this sensor is its global shutter, which captures all pixels in a photo at exactly the same instant, eliminating motion blur. The 4C has much lower spatial resolution than the S100 (1.2MP vs 12MP). Expect to pay over US$10,000 for this sensor package, not including the UAV or processing software.
In summary, this article aims to show that there is more to a UAS sensor than megapixels. It is important to understand the spectral response of vegetation and how it should inform your sensor choice. A modified Canon camera such as the S100 is a great option for UAS work, but its limitations must be understood. Work remains to analyse the results and accuracy of new sensors such as the Multispec 4C.
* Further notes: The most common sensor used in small UAS (mid 2014) is a Canon S100 or similar variant. This camera was never designed to be flown in a UAS, but the internal GPS, fast shutter speed (1/2000), relatively large sensor (12.1 MP 1/1.7″ Canon CMOS), low weight (198g), ability to be modified to detect NIR, CHDK compatibility for an intervalometer, and low cost (<$500) all contribute to a well suited sensor for this application. Flown at 120m, this camera can provide a ground resolution of 3.5cm.
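The quoted ground resolution follows from the standard ground sample distance (GSD) relationship. Here is a sketch using approximate S100 figures; treat the focal length, sensor width and image width as assumptions rather than exact specs.

```python
def ground_sample_distance(altitude_m, focal_mm, sensor_width_mm, image_width_px):
    """Ground sample distance (metres per pixel) for a nadir photo.

    GSD = altitude * sensor_width / (focal_length * image_width).
    """
    return altitude_m * sensor_width_mm / (focal_mm * image_width_px)

# Approximate Canon S100 figures: 5.2 mm focal length (wide end),
# ~7.4 mm wide 1/1.7" sensor, ~4000 px image width.
gsd = ground_sample_distance(120, 5.2, 7.4, 4000)  # metres/pixel
```

With these assumed figures the result is a little over 4 cm/pixel; the exact value depends on the true focal length and image width used, so it lands in the same few-centimetre range as the quoted 3.5cm rather than matching it exactly.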
Built with a 3DR Pixhawk (APM:Plane 3.0.2), Finwing Penguin 2815 fixed wing air frame, S100 Canon camera and RFD900 Radio (excellent Australian product), my budget agricultural unmanned aerial vehicle (aUAV) has kept me busy and learning lots in any spare time over the last few months.
In my last post, Unmanned Aerial Vehicles (UAV) in Precision Agriculture, I outlined the main components of a UAV for precision agriculture focusing on a fixed wing platform for collecting high resolution paddock scale data. In this followup post I will attempt to log some of my experiences. Note that this is just a learning experience – there are many commercial UAV options available for agriculture that are less time consuming and provide similar or better results right away.
I needed a fixed wing platform that is readily available and cheap, with potential for long battery life, stability in the air and plenty of space for electrical components. I chose the Finwing Penguin. With the standard 2815 Finwing motor, 60 amp electronic speed controller (ESC) and 9×6 propeller, combined with a 4400mAh 3 cell LiPo battery, I was only able to achieve about 20 minutes of flight time, or enough to map about 40ha at 12m/s. I have a CXN engine mount which enables me to move to a 10″ prop, which some of the gurus recommend. I could also increase my battery capacity and cell count to get longer flight time.
The Penguin is rather unique in its class in having inbuilt landing gear: a pair of wheels at the front and a single wheel at the back. I think this helps preserve the plane when there is nowhere soft to belly land. The landing gear also allows you to take off like a traditional plane rather than hand launching; after making weight or centre of gravity (CG) changes I will often take off from the ground. The downside is that these wheels block up with mud quite easily if you land in a wet paddock.
The wings (including the tail wing) come off for transport. I usually remove the main wings but leave the tail wing in place, as it is quite hard to get on and off due to the awkward wiring and attachment arrangement.
The Penguin UAV does come with a pre-cut hole to install a down facing camera, but it does not allow the camera to be mounted with its top facing forward, which is desirable. It was also very awkward to get the camera in and out, as I had the Pixhawk autopilot installed above the camera position. I decided to go at the plane with a hacksaw and build a camera mount that allows the camera to be installed from underneath the plane, with enough space to mount it top face forward.
Autopilot, GPS & Radio modem
As far as the autopilot is concerned, the 3DR Pixhawk with APM:Plane 3.0.2 was the best option. At first I had issues getting my plane to fly well, but once I upgraded to version 3.0.2+ the autotune feature changed the game altogether. It allowed the APM to adjust the plane's PID settings as I manually flew it around. It works really well! During my latest flight the APM successfully flew against an 8km/h crosswind.
The GPS is a u-blox LEA-6M. It works well considering the price point. I did not attempt autonomous landing, which is when GPS accuracy matters most. This GPS gets a fix within seconds of start-up and generally has no issues throughout flight.
I initially used the 3DR radio modems but had all sorts of problems keeping a solid connection with my GCS. I decided to bite the bullet and buy a quality radio modem that should last me a long time and exceed all my range requirements. The RFD900 radio pair is compatible with 3DR equipment and slotted in quite well. I did have to manufacture a cable to connect it to the Pixhawk, and it took a bit of searching to figure out which wires went where, but I got it sorted within an hour or so. The RFD900 did have some driver issues on Windows; I had to install an old driver before Mission Planner could connect to the Pixhawk through the RFD900. This all equates to time spent mucking around... BUT once working this product is excellent and I always have a strong telemetry signal.
Ground Control Station (GCS)
The Mission Planner software, which runs on the ground station laptop and lets you program the UAV and monitor it in flight, is very good – especially the Auto Waypoint Survey Grid feature. This allows you to draw the area you want to map on the map. Simply load in a photograph from the camera you will be using and the target elevation, and from this information it draws a flight path with your desired overlap.
Sensor & Image Processing
The Canon S100 is my sensor of choice as it is a great balance of quality, price, functionality and size. I started with a Canon D10, but many of the photos came out underexposed. The S100 has a larger sensor and inbuilt GPS, so it is a better choice for aerial mapping. The downside of the S100 is that the lens protrudes from the body, which exposes it to damage in a rough landing.
With UAV aerial mapping you need a way for the camera to trigger every few seconds on its own. With a Canon camera this is easy using the Canon Hack Development Kit (CHDK). This updates the camera firmware, allowing you to run intervalometer scripts that trigger the camera every few seconds. CHDK also offers what seems like unlimited camera settings. It seems difficult to find a complete set of settings to use with CHDK, but for my next flight I will try using the DroneMapper documentation to set up CHDK.
In my last flight approximately 30% of my photos came out blurry. I discarded the worst of them but still had to use some poor quality photos to ensure the map had no blank spots. This is probably due to a combination of camera settings, the camera mount and a propeller slightly out of balance.
Using a desktop trial of Agisoft PhotoScan I was able to produce a 40ha orthomosaic. The application works surprisingly well considering all images are taken at slightly different angles and only one GPS point is provided per photo. It is a very computer intensive process, and if I were to do a significant amount of processing I would need to upgrade my computer. Alternatively I could use DroneMapper, but my dataset did not meet their requirements because I had to cull some images. I hope to try DroneMapper next time.
I took my data a step further and set up a web server to host the data as tiles. You can check it out here. How to store and share data collected by UAVs is something I have been thinking about. An orthomosaic of a single paddock can be several gigabytes and take a powerful computer to view in its raw form. The web seems like a good way to display data: a server stores it and only sends the bits of the image that the end user requests as they zoom and pan.
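Tile serving of this kind relies on the standard Web Mercator (XYZ, or "slippy map") tile scheme, where each zoom level doubles the grid in each direction. A minimal sketch of the lat/lon to tile-index conversion; the sample coordinate is just an illustrative point in the WA grain belt.

```python
import math

def deg2tile(lat_deg, lon_deg, zoom):
    """Convert WGS84 lat/lon to XYZ (slippy map) tile indices.

    This is the standard Web Mercator tiling used by most web map
    servers; the server only sends the tiles the viewer requests.
    """
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# An illustrative point in the Australian grain belt at zoom level 12.
tile = deg2tile(-31.0, 117.0, 12)
```

Cutting an orthomosaic into 256-pixel tiles indexed this way is exactly why a multi-gigabyte paddock map can be browsed smoothly from a modest machine.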
This is my second flying UAV; my first was a Finwing Penguin as well. I spent a couple of days putting the plane and all its components together. It is a nervous time flying your brand new plane for the first time. On the first outing my plane flew OK in manual mode, but since I am a very ordinary pilot I like to use assisted flying modes. I changed to Fly By Wire mode and, due to an APM setting (that I had to set), the autopilot had reversed the elevator, sending the plane crashing into the ground. This snapped it in half and bent up the fuselage. Thankfully this durable foam returns to shape when you put it in boiling water, and the pieces can be glued back together, reinforced with carbon fibre and fibreglass tape. Now I follow the suggested checks in the APM:Plane instructions more closely. I've had no crashes since, but have landed in mud, which can be painful to clean out of the landing gear.
Putting together this UAV, I have learned how all the components fit together, seen the challenges faced by commercial suppliers, and gained a better understanding of the enormous potential on offer. I think the biggest challenge is not the UAV platform itself but collecting high quality, consistent data that can be quickly processed and given a useful, profitable application. The setup discussed here, not including the laptop or countless hours of time, comes to about AU$1200. Obviously, for mapping large areas on a consistent basis, a commercial UAV would be preferred or even essential.
Technology in farming is constantly evolving. Collecting accurate, reliable georeferenced (location in terms of GPS coordinates) data is essential to capitalise on technologies such as variable rate application of chemicals and fertiliser and aid in crop monitoring at a level once not imagined. Some current forms of collecting georeferenced paddock data include:
Combine harvester – yield maps (crop yield as harvester works through paddock)
Satellite imagery – colour and near infrared (NIR) bands to produce natural images & vegetation indices such as Normalised Difference Vegetation Index (NDVI)
Aerial imagery – similar to satellite but offers higher resolution at higher price & some other sensor options
Tractor – Greenseeker (plant biomass), digital elevation model (DEM) collected from high accuracy GPS
Utility vehicles e.g. Soil sampling pH & nutrition, electromagnetic conductivity, Greenseeker, DEM
Handheld with GPS – Greenseeker, soil sampling
Stationary – moisture probe, weather station
Unmanned Aerial Vehicles (UAVs) are emerging as a cost effective way to collect data, with many advantages over the traditional forms listed above. A UAV is, as the name suggests, an unmanned vehicle which flies over the paddock to collect data. These machines are generally compact, can be cheap, are mechanically simple, fly below cloud cover, and are on their way to being easy to operate with advanced autopilot systems.
Over the last 6 months I have begun researching civilian UAVs and their application in agriculture as part of my Nuffield Scholarship. Furthermore, I have been testing a budget UAV platform which I will discuss in a later post. The aim of this post is to aggregate key information and ideas on the topic into one space. It is by no means comprehensive – more of a beginning. Note that I am not a pilot or lawyer. This article is general in nature and does not give permission to fly or legal advice. Let's start with a sky-high view.
The Agricultural UAV Solution
It is important to consider all aspects pertaining to the agricultural UAV (aUAV) Solution which I define as a robust, timely, cost effective way to collect usable data to improve yields and overall profitability in sustainable farming systems. Consider the following formula:
All components of the formula need to be working well, and working together, for the product to be a successful technology. Now, enough of inventing acronyms and formulas that will inevitably change – it's time to flesh out the components of the aUAV Solution.
There are two main platforms available: fixed wing and multi-rotor. A fixed wing platform has the advantage of covering large areas efficiently, whereas a multi-rotor shines at remaining very stable in challenging conditions with large payloads.
Due to the scale of broadacre grain growing in Australia, my interest lies predominantly with the fixed wing platform type, as paddocks often exceed 250ha (~620ac). As an example, ConservationDrones has an excellent list of budget fixed wing platforms they have used.
Global Positioning Systems (GPS) are the backbone of most spatial technologies. GPS on the UAV tells the autopilot where it is at all times. In addition, GPS links the data collected to its spatial position (aka geo-referencing).
Many UAVs are equipped with a u-blox GPS receiver or similar which is compact and provides <5m horizontal accuracy. These systems are affordable and are accurate for most situations.
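Geo-referencing the collected data usually boils down to matching each image's timestamp against the autopilot's GPS log. A rough sketch of the idea, with a made-up three-point log (real tools such as Mission Planner's geotagging do this more carefully, interpolating between fixes):

```python
from bisect import bisect_left

# Hypothetical autopilot GPS log: (unix_time, lat, lon) fixes
gps_log = [
    (1000.0, -35.0000, 142.0000),
    (1002.0, -35.0001, 142.0004),
    (1004.0, -35.0002, 142.0008),
]

def geotag(image_time, log):
    """Assign the nearest-in-time GPS fix to an image timestamp."""
    times = [t for t, _, _ in log]
    i = bisect_left(times, image_time)
    if i == 0:                 # before the first fix
        return log[0][1:]
    if i == len(log):          # after the last fix
        return log[-1][1:]
    before, after = log[i - 1], log[i]
    nearest = before if image_time - before[0] <= after[0] - image_time else after
    return nearest[1:]

print(geotag(1001.2, gps_log))  # nearest fix is at t=1002.0
```

With <5m GPS this is good enough for whole-paddock mosaics; RTK units like the Piksi would tighten the same workflow to centimetres.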
An exciting development is the Piksi by Swift Navigation, a low cost Real Time Kinematic (RTK) GPS receiver that promises to sell for around $1000 – unheard of in the world of RTK GPS. The Piksi offers centimetre level accuracy inside a compact design ideal for small UAVs. The improved accuracy will be invaluable for autonomous landings and improved accuracy of geo-referenced data.
We are seeing UAV autopilots improve very quickly with increased reliability, especially within the open source community. Autopilots are essential for being able to effortlessly fly over a whole area to collect the desired data. DIY Drones' APM:Plane is often the autopilot of choice for hobbyists and entry to mid level platforms. It uses the same hardware and similar software to the APM:Rover I built last year.
There are several other autopilots available, commercial and open source, that are worth checking out. Google it.
Usually the UAV is communicating with a ground control station (GCS) via radio link. GCS is usually just a laptop computer with software such as Mission Planner. Mission Planner is also used to set the flight paths for the UAV missions.
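The flight paths the GCS generates are typically simple back-and-forth survey grids. A toy sketch of the idea in local metres (Mission Planner's survey tool builds these for you, with camera and wind settings on top; the numbers here are illustrative only):

```python
def survey_waypoints(width_m, height_m, line_spacing_m):
    """Generate a back-and-forth ('lawnmower') waypoint list covering a
    rectangular paddock, alternating direction on each flight line."""
    waypoints = []
    x, direction = 0.0, 1
    while x <= width_m:
        start, end = (x, 0.0), (x, float(height_m))
        waypoints += [start, end] if direction == 1 else [end, start]
        x += line_spacing_m
        direction *= -1
    return waypoints

wps = survey_waypoints(100, 200, 50)
print(wps)  # three flight lines, six waypoints
```

The line spacing is chosen from the camera's ground footprint and the desired sidelap, which is where the sensor and platform choices above start interacting.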
The most complex part of collecting good data is having the correct sensor. For plant biomass data, the most important spectral range is the near infrared spectrum. The two most common options are the Tetracam ADC Lite, built specifically for UAVs, or a digital camera modified to capture within this spectrum (MaxMax, for example). The latter is the more cost effective solution, and preliminary studies show that good results can be achieved.
Researchers are working hard on improving sensors for UAVs. For example, TerraLuma is a research group at the University of Tasmania. Projects of interest include high accuracy geo-referencing of imagery 'on the fly' and the use of a hyperspectral pushbroom scanner to collect data.
Public Lab (an open source community) is also working on modifying cameras, similar to MaxMax, but including cheaper devices such as webcams. They recently achieved funding through a Kickstarter campaign, so maybe we will have another cost effective solution soon. See also Pi NoIR.
It is worth mentioning that it is very common for UAVs to have a GoPro camera (or similar) mounted to capture high definition video footage. This footage is valuable for visually monitoring crops from the sky but is generally not processed into geo-referenced data. There are always exceptions, such as this video over a construction site, where the footage is used to generate a 3D model.
Data Processing & Integration
Although collecting good data is the most challenging part, the most time consuming (and/or expensive) part can be processing it to a point where it can be integrated into precision agriculture systems. Generally the UAV will follow a lawnmower track, capturing images at a defined interval with a generous defined overlap. The raw data will usually be images (up to several hundred – think gigabytes), with a single GPS position and maybe a bearing per image. The challenge for the data processing is to stitch these images together to generate one homogeneous data set. Every image is affected by the differing roll, yaw and pitch of the UAV as it is captured. Some of the more common applications include:
Drone Mapper is a successful web based startup which effectively filled the affordable yet professional data processing gap
Microsoft ICE is free to use but only stitches images, without offering the geo-referencing or 3D modelling of the applications mentioned above
VisualSFM, CMVS, and CMPMVS – Flight Riot does the hard work of explaining how to use this software to generate a 3D model from digital camera photos. This is probably one of the more complex workflows but uses all free(ish) software.
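The "defined interval with a generous overlap" above falls out of a simple footprint calculation. A rough sketch using a pinhole camera model – the sensor width, focal length and altitude below are illustrative numbers only, not a recommendation:

```python
def camera_footprint(sensor_width_mm, focal_length_mm, altitude_m):
    """Ground width covered by a single image (simple pinhole model)."""
    return altitude_m * sensor_width_mm / focal_length_mm

def trigger_spacing(footprint_m, overlap_fraction):
    """Distance flown between shots for a chosen forward overlap."""
    return footprint_m * (1.0 - overlap_fraction)

# Illustrative numbers: a small compact camera flown at 120 m
footprint = camera_footprint(sensor_width_mm=6.2, focal_length_mm=5.0,
                             altitude_m=120)
print(round(footprint, 1))                       # metres on the ground
print(round(trigger_spacing(footprint, 0.7), 1)) # spacing at 70% overlap
```

Stitching software generally wants 60-80% overlap so that every ground point appears in several photos, which is why the raw data runs to gigabytes.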
Once a geo-referenced, homogeneous data set over a paddock is achieved, it could undergo further post processing to determine NDVI. This raster data may then, for example, be used to define zones for in-crop variable rate fertiliser application.
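Turning an NDVI raster into application zones can be as simple as thresholding. A minimal sketch – the raster values and the 0.3/0.6 break points are made up for illustration, and in practice zones would be smoothed and ground-truthed:

```python
import numpy as np

# Toy NDVI raster over a paddock (real rasters are millions of cells)
ndvi_raster = np.array([[0.15, 0.35, 0.62],
                        [0.28, 0.55, 0.71],
                        [0.10, 0.44, 0.66]])

# Hypothetical break points splitting the paddock into management zones
breaks = [0.3, 0.6]
zones = np.digitize(ndvi_raster, breaks)  # 0 = low, 1 = medium, 2 = high biomass
print(zones)
```

Each zone can then be assigned its own fertiliser rate and exported to the variable rate controller.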
As mentioned, some of the above software is able to create 3D models from 2D photographs. These 3D models could be used to create digital elevation models (DEMs), which are valuable in farming for determining water movement.
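Once you have a DEM grid, slope (and hence likely water movement) falls out of a simple gradient. A sketch on a toy 3x3 grid – the elevations and 10 m cell size are invented for illustration:

```python
import numpy as np

# Toy 3x3 DEM (elevation in metres) on a 10 m grid, sloping down to the east
dem = np.array([[12.0, 11.0, 10.0],
                [12.0, 11.0, 10.0],
                [12.0, 11.0, 10.0]])

dy, dx = np.gradient(dem, 10.0)   # elevation change per metre, per axis
slope = np.degrees(np.arctan(np.hypot(dx, dy)))
print(np.round(slope, 2))         # degrees; water runs down the steepest descent
```

Real GIS packages add flow accumulation and watershed delineation on top, but the slope raster is the starting point.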
Legal & Operation
In Australia, the Civil Aviation Safety Authority (CASA) rules the sky. CASA has rules governing the use of UAVs (which they call unmanned aerial systems, or UAS) and is in the process of re-evaluating some regulations. See a summary of a recent speech from CASA here.
To operate a UAV/UAS commercially in Australia you need to hold a certified operator's certificate. A list of those certified is available here.
CASA have done well to have a system set up for UAS. The USA is lagging behind and is only now establishing rules and regulations for UAVs.
Another option is to buy a calibrated, tested, ready-to-fly system built from budget, readily available components and an open source autopilot – for example, Event38 and Flight Riot.
The third option is to go fully DIY. I have tried this using a Finwing Penguin fixed wing platform, an APM:Plane autopilot and an ordinary Canon digital camera as the sensor. I am yet to process any images into geo-referenced datasets; I will post more about this soon. Here is an image from one of my first flights.