Few activities are more tied to location and the geospatial landscape than agriculture. Agricultural businesses, researchers and policy makers rely on quantitative data about soils, water, weather, inputs, productivity, outputs, and markets. This summit will tackle the big question on big data for agriculture in New Zealand and globally: how do we make it really work for farmers, policy-makers, markets and consumers?
Presentations and workshops will cover:
Precision Agriculture
Environmental Data and Information
The Internet of Things and new sensor technologies
Applications and mobile
Privacy, security and protections
Maps and models – current and future
Collaborations and standards in action
Join international geospatial experts along with local innovators in Palmerston North for this one day Summit.
Dan Bloomer attended the 20th Symposium on Precision Agriculture in Sydney.
The PA Symposium brings together farmers, growers, researchers, advisors and industry to discuss and absorb developments. Speakers covered cutting edge research, on-farm application by researchers, advisors and farmers, and industry background information such as the state of telecommunications and data ownership.
As Brett Whelan told delegates, “The purpose of precision agriculture has always been to increase the number of correct decisions made in the businesses of crop and animal management. It is a logical step in the evolution of agricultural management systems toward increased efficiency of inputs relative to production, minimized waste and improved product quality, traceability and marketability.”
Crop and soil sensing continues to develop, and there is increasing use of new approaches. Canopy assessment has relied heavily on NDVI, the 1970s vegetation index chosen for distinguishing forest from desert and ocean. In recent years a wider range of sensors capturing more light bands (blue, green, red and infrared) have become affordable and available. Some look at red-edge and thermal infra-red, two bands often related to crop stress of some form. Off the shelf cameras that fit simple UAVs are within farm budgets now.
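As a minimal sketch of the vegetation index mentioned above: NDVI is just a normalised ratio of near-infrared to red reflectance, which is why a healthy green canopy (high NIR, low red) scores near 1 while bare soil or water scores near 0 or below. The reflectance values below are invented for illustration.

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Dense green canopy reflects strongly in NIR and absorbs red, giving
    values near 1; bare soil and water give values near 0 or below.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# Hypothetical per-pixel reflectances for a 2 x 2 patch of a paddock image.
nir_band = np.array([[0.60, 0.55], [0.20, 0.50]])
red_band = np.array([[0.10, 0.12], [0.18, 0.11]])

print(np.round(ndvi(nir_band, red_band), 2))
```

The same pattern extends to the newer bands: a red-edge or thermal band simply replaces one of the terms in an analogous index.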
Ian Yule described research with hyperspectral sensors that capture very detailed images with hundreds of light bands. Hundreds of ground control samples provide “real” information, and enormous amounts of data get analysed to identify relationships. The capacity of this to determine species, plant nutrient status and other useful information is remarkable. The current research equipment and processing are very expensive, but prices can be expected to drop as commercialisation progresses.
Machine vision including object shape, texture and colour is being used to recognise individual objects such as plants, parts of plants or specific weeds. Discussing robotics research to guide decision making on vegetable farms Zhe Xu noted, “If a human can recognise something, a machine can be taught to as well.” Get used to artificial intelligence, neural programming and autonomous phenotyping!
We presented our own onions research which is using smartphone cameras to capture very useful crop development information quickly and cost effectively. Combined with crop models and web based calculation we can predict final yields with fair accuracy early enough to support crop management decisions.
An Australian vegetable research project is using similar approaches to support decision making in carrot crops and investigating others with promise. That team includes researchers and farmers, and is increasingly using yield monitors for crops such as potatoes and carrots. Converting yield data to value allows farmers to estimate costs of variability and how much to invest to fix problem areas.
Data capture, communications and analysis was a key theme. Kim Bryceson described the establishment of a sensor network and analytics using IoT (internet of things) tools at Queensland University Gatton. Rob Bramley explained a process that predicted sugar yields at regional scale to promote better fertiliser management in that industry. Patrick Filippi presented a “big data” approach to predicting grain yield.
The data revolution is changing our world in ways we can’t yet imagine. The increasing number of things measured, the spatial scale and time span of collection, and the development of data science to analyse huge streams of information are revolutionising our understanding. These are exciting times. Some jobs are going to go, but others will be created as we require completely new skills for jobs not heard of a decade ago.
“We are all in the position of making decisions from a limited understanding or a particular perspective, working with biological systems that are incredibly complex and impossible to fully understand,” said Ian Yule. “Recent experience with new sensing technologies and data processing has produced new information that challenges our preconceived ideas and understandings,” he said.
The PA Symposium is presented by SPAA, the Society for Precision Agriculture Australia, and the Precision Agriculture Laboratory at the University of Sydney. There has always been a New Zealand presence because while some details are unique, the tools and processes are for the most part generic.
In 2017 our 15th Annual Conference focuses on automated tools for data collection, decision making and doing actual tasks on the farm (and beyond).
What do you want?
What’s on offer?
How will farms and management have to change?
We have a comprehensive programme. We’ve gone a bit outside the box to bring variety, including speakers from outside the horticultural and arable sectors. We find cross-pollination and hybrid vigour valuable!
So register, come along and listen to excellent presenters, discuss the ideas with colleagues and go away with new understanding and plans.
We also welcome our Gold Sponsors, meal sponsors and trade displays new and old. These are the organisations that make conferences like this possible and affordable.
Join them and us at the Havelock North Function Centre on 24-25 May 2017 to mix with leading practitioners, farmers, growers, researchers, technology developers and providers.
GrowMaps principal Luke Posthuma completed the survey, and says his observations as it progressed suggest a reasonable spread of pH across our relatively small area.
As well as Veris sampling, Luke took a number of soil samples for verification and calibration checks.
The Veris equipment also maps soil electrical conductivity (EC) down to 60cm. Soil EC is a measurement of how much electrical current soil can conduct. It is often an effective way to map soil texture because smaller soil particles such as clay conduct more current than larger silt and sand particles.
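The EC-to-texture idea above can be sketched as a simple lookup. The thresholds below are placeholders to show the principle only: real surveys calibrate EC against physical soil samples, exactly as the verification cores described above allow.

```python
def texture_class(ec_ms_per_m):
    """Rough, illustrative mapping from bulk soil EC (mS/m) to texture.

    Clays hold more water and ions, so they conduct more than silts,
    which conduct more than sands. The threshold values here are
    invented for illustration, not calibrated survey figures.
    """
    if ec_ms_per_m < 10:
        return "sandy"
    elif ec_ms_per_m < 30:
        return "silty / loam"
    else:
        return "clayey"

for ec in (5, 20, 45):
    print(ec, "mS/m ->", texture_class(ec))
```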
Part of the Veris pH mapping is post-survey processing to create the most reliable result. We await the processed maps with considerable interest.
We previously had a similar soil conductivity map provided by AgriOptics and it will be interesting to compare the results.
Now in year two of our OnionsNZ SFF project, we have trials at the MicroFarm and monitoring sites at three commercial farms in Hawke’s Bay and three more in Pukekohe.
2015-16
A summary of Year 1 is on our website. A key aspect was testing a range of sensors and camera systems for assessing crop size and variability. Because onions are like needles poking from the ground, all sensors struggled, especially when plants were small. This is when we most want to know about the developing crop, as it is the time we make decisions and apply management.
By November our sensing was more satisfactory. At this stage we captured satellite, UAV, smartphone and GreenSeeker data and created a series of maps.
We used the satellite image to create canopy maps and identify zones. We sampled within the zones at harvest, and used the relationship between November canopy and February yield to create yield maps and profit maps.
We also developed relationships between photographs of ground cover, laboratory measurements of fresh weight and leaf area and the final crop yield.
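The canopy-to-yield relationship described above amounts to fitting a line through zone samples and using it for prediction. A minimal sketch follows; the cover and yield figures are invented for illustration and are not project data.

```python
import numpy as np

# Hypothetical zone samples: November canopy ground cover (%) and the
# final February yield (t/ha) measured at harvest in the same zones.
cover_nov = np.array([25, 35, 45, 55, 65], dtype=float)
yield_feb = np.array([48, 62, 74, 85, 96], dtype=float)

# Least-squares straight line: yield = a * cover + b
a, b = np.polyfit(cover_nov, yield_feb, 1)

def predict_yield(cover_pct):
    """Predict final yield (t/ha) from November canopy cover (%)."""
    return a * cover_pct + b

print(f"yield = {a:.2f} * cover + {b:.1f}")
print(f"predicted yield at 50% cover: {predict_yield(50):.1f} t/ha")
```

Applied pixel-by-pixel to a canopy map, the same fitted line turns a November image into a predicted yield map, and a price assumption turns that into a profit map.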
In reviewing the season’s worth of MicroFarm plot measurements, we noticed there were areas where yield reached its potential, areas where yield was limited by population (establishment), some where yield was limited by canopy growth (development) and some by both population and development.
This observation helped us form a concept of Management Action Zones, based on population and canopy development assessments.
2016-17
Our aims for Year 2 are on the website. We set out to confirm the relationships we found in Year 1.
This required developing population expectations and determining estimates of canopy development as the season progressed, against which field measurement could be compared.
We had to select our “zones” before the crop got established as we did a lot of base line testing of the soil. So our zones were chosen based on paddock history and a fair bit of guess work. Really, we need to be able to identify zones within an establishing or developing crop, then determine what is going on so we can try to fix it as quickly as possible.
In previous seasons we experimented with smartphone cameras and image processing to assess canopy size and relate that to final yields. We are very pleased that photographs of sampling plots processed using the “Canopeo” app compare very well with Leaf Area Index again this season.
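Canopeo-style cover assessment classifies each pixel as green canopy or background and reports the green fraction. The sketch below shows the general idea using red/green and blue/green ratios plus an excess-green term; the threshold values are illustrative assumptions, not Canopeo’s calibrated parameters.

```python
import numpy as np

def canopy_cover(rgb):
    """Fraction of pixels classified as green canopy.

    A pixel counts as canopy when red and blue are both small relative
    to green and the excess-green term 2G - R - B is clearly positive.
    Thresholds are illustrative, not Canopeo's calibrated values.
    """
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-9
    green = (r / (g + eps) < 0.95) & (b / (g + eps) < 0.95) & (2 * g - r - b > 20)
    return green.mean()

# Tiny synthetic image: two green-canopy pixels, two bare-soil pixels.
img = np.array([[[40, 120, 30], [50, 140, 45]],
                [[120, 100, 80], [150, 130, 110]]])
print(f"canopy cover: {canopy_cover(img):.0%}")
```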
Through the season we tracked crop development in the plots, using plant counts and canopy cover assessments to try to separate the effects of population (establishment) and soil or other management factors.
We built a web calculator to do the maths, aiming for a tool any grower or agronomist can use to aid decision making. The web calculator was used to test our theories about yield prediction and management zones.
ASL Software updated the “CoverMap” smartphone application and we obtained consistent results from it. The app calculates canopy ground cover and logs data against GPS position in real time. Because we have confidence that ground cover from image processing is closely related to Leaf Area Index we are working to turn our maps into predictions of final yields.
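A CoverMap-style log is essentially a stream of cover readings tagged with GPS positions. The sketch below shows one way such points might be screened for low-cover areas worth inspecting; the coordinates and readings are invented for illustration.

```python
# Hypothetical stream of (time, lat, lon, cover_fraction) readings, as a
# CoverMap-style app might log them while driving the paddock.
readings = [
    ("2017-01-10T09:00:01", -39.6410, 176.8460, 0.42),
    ("2017-01-10T09:00:03", -39.6411, 176.8462, 0.48),
    ("2017-01-10T09:00:05", -39.6412, 176.8464, 0.31),
]

# Mean paddock cover, then flag points well below it as candidate
# problem areas to mark on the resulting map.
covers = [cover for _, _, _, cover in readings]
mean_cover = sum(covers) / len(covers)
low_points = [(lat, lon) for _, lat, lon, cover in readings
              if cover < 0.8 * mean_cover]
print(f"mean cover {mean_cover:.0%}, {len(low_points)} low point(s)")
```

Because cover relates closely to Leaf Area Index, each logged point can also be pushed through a cover-to-yield relationship to map predicted yield.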
The current season’s MicroFarm crop is certainly variable. Some is deliberate: we sat the irrigator over some areas after planting to simulate heavy rain events, and we have a poorly irrigated strip. We know some relates to different soil and cover crop histories.
But some differences are unexpected and so far unexplained.
Together with Plant and Food Research we have been taking additional soil samples to try and uncover the causes of patchiness.
We’ve determined one factor is our artificial rain storm: some crop loss is probably due to runoff from it, and some to historic compaction. We’ve even identified where a shift in our GPS AB line has left 300mm strips of low production where plants are on last year’s wheel tracks!
But there is a long way to go before this tricky crop gives up its secrets.
This project is in collaboration with Plant and Food Research and is funded by OnionsNZ and the MPI Sustainable Farming Fund.
A version of this article previously appeared in The Grower
Dan Bloomer has been travelling in Australia and Europe asking, “How ready are robots for farmers and how ready are farmers for robots?”
Notable areas of active research and development globally are scouting, weeding and fruit picking. Success requires machines that can determine and follow a route traversing whatever terrain it must, capture information, identify and selectively remove weeds, and identify, pick and transport fruit. They have to sense, analyse, plan and act.
Robotics is widespread in industries such as car manufacturing, where exactly the same task is repeated over and over again. With the possible exception of robotic milking, farm operations are not like that. Virtually every case is unique, and a unique response is needed.
Many groups around the world are looking at robotic weeding. There are many items needing attention. How do we tell weeds from crop plants? Can we do that fast enough and reliably enough to make a robot commercially viable on-farm? Once identified, how do we optimise robotic arm movement to best attack a patch of weeds?
The Australian Centre for Field Robotics (ACFR) at the University of Sydney is well known for its field robots such as the solar powered Ladybird. The new generation Ladybird is known as Rippa, and is currently undergoing endurance testing. Look on YouTube for ACFR videos and you’ll even see SwagBot moving around rolling hill country.
A key theme for Rob Fitch and colleagues is Active Perception: perception being what we can detect with what accuracy and confidence; active meaning in real time and including planning actions. They invest heavily in developing mathematics to get fast results. And they are succeeding.
Using Intel’s RealSense structured light camera it takes them less than half a second to identify and precisely locate groups of apples on a trellis. Within that time they also calculate exactly where to place the camera to get a second confirming view.
Cheryl McCarthy and colleagues at the National Centre for Engineering in Agriculture (NCEA) are conducting a range of research projects that integrate autonomous sensing and control with on-farm operations to robotically manage inputs within a crop. Major projects include automation for weed spot spraying, adaptive control for irrigation optimisation, and remote crop surveillance using cameras and remotely piloted aircraft.
Now Cheryl is using UAVs to capture photos of crops, stitching the pictures to get a whole paddock image, then splitting it up again to efficiently identify and locate individual plants and weeds. This is enabling her to create accurate maps some other weed destroying robot can use.
SwarmFarm founders Andrew and Jocie Bate grow cereals and pulses near Emerald. Spray-fallow is used to conserve water in this dryland environment, and WeedSeeker® and Weedit® technologies reduce chemical use to a very small percentage of traditional broadcast application.
With large areas, most growers move to bigger machinery to maximise labour efficiency. This has a number of adverse effects including significant soil damage and inability to work small areas or work efficiently around obstacles such as trees.
SwarmFarm chose robots as practical lightweight equipment. They reason that several small machines working together reduce soil impact and match the work rate of one big machine. Andrew estimates that adopting 8 m booms instead of 34 m booms could increase the effective croppable area in Queensland by 2%.
Are these robots ready for farmers? Are farmers ready for these robots?
Only SwarmFarm has multiple machines currently working on farm in Australia. They are finalising a user interface that will allow non-graduate engineers (smart farmers) to manage the machines.
The question that remains is, “Why would I buy a specialised machine when I can put a driver on a cheaper conventional tractor or higher work rate sprayer and achieve the same?”
Is it the same?
Travel to Australia was supported by a Trimble Foundation Study Grant
A desire to reduce soil compaction and avoid high and inefficient use of chemicals and energy inspired Steve Tanner and Aurelien Demaurex to found ecoRobotix in Switzerland.
Their solution is a light-weight fully solar-powered weeding robot, a 2 wheel drive machine with 2D camera vision and basic GPS. Two robotic arms position herbicide nozzles or a mechanical device for precision weed control.
The ecoRobotix design philosophy is simplicity and value: avoiding batteries cuts weight, technology requirements and slashes capital costs. It is a step towards their vision of cheap autonomous machines swarming around the farm.
Bought by small farms, Naio Technologies’ Oz440 is a small French robot designed to mechanically weed between rows. The robots are left weeding while the farmer spends time on other jobs or serving customers. Larger machines for vegetable cropping and viticulture are in development.
Naio co-founder Gaetan Severac notes Oz440 has no GPS, relying instead on cameras and LiDAR range finders to identify rows and navigate. These are small machines with a total price similar to a conventional agricultural RTK-GPS system, so alternatives are essential.
Tech companies have responded and several “RTK-GPS” systems are now available under $US1000. Their accuracy and reliability is not known!
Broccoli is one of the world’s largest vegetable crops and is almost entirely manually harvested, which is costly. Team leader Tom Duckett says robotic equipment being developed at the University of Lincoln in England is as good as human pickers at detecting broccoli heads of the right size, especially if the robot can pick through the night. With identification in hand, development is now focused on mechanical cutting and collecting.
In 1996, Tillett and Hague Technologies demonstrated an autonomous roving machine selectively spraying individual cabbages. Having done that, they determined that tractors were effective and concentrated on automating implements. They are experts in vision systems and integration with row and plant identification and machinery actuation, technology embedded in Garford row crop equipment.
Parrish Farms has its own project adapting a Garford machine to strip spray between onion rows. Nick Parrish explained that black grass control was difficult, and because available graminicides strip wax off onions, boom spraying prevents use of other products for up to two weeks.
Safety measures observed included:
Route planning to avoid hazards and known obstacles
Laser range finder to sense objects and define them as obstacles
Wide area safety curtain sensing ground objects at 2m
Dead man’s handle possibly via smartphone
Collapsible bumper as a physical soft barrier that activates Stop
Big Red Buttons anyone close can see and use to stop the machine
Machines that are small, slow and light minimise inertia
“Hands Free Hectare” is Harper Adams University’s attempt to grow a commercial crop using open source software and commercially available equipment in an area no-one enters.
Harper Adams research to develop a robotic strawberry harvester is notable for the integration of genetics for varieties with long stalks, a growing system that has plants off the ground, and the robotic technologies to identify, locate and assess the ripeness of individual berries and pick them touching only the peduncle (stalk).
So what have I learned about farm robotics?
People believe our food production systems have to change
Farm labour is in short supply throughout the western world
Machines can’t get bigger as the soil can’t support that
Robotics has huge potential, but when?
Safety is a key issue but manageable
There is huge investment in research at universities, but also in industry
It’s about rethinking the whole system not replacing the driver
There are many technologies available, but probably not the mix you want for your application.
As Simon Pearson at the National Centre for Food Manufacturing says, “It’s a Frankenstein thing, this agrobotics. There are all sorts of great bits available but you have to seek them out and stitch them together yourself to make the creature you want.”
Dan’s travel was supported by a Trimble Foundation Study Grant
After identifying areas within paddocks that had yields limited by different probable causes, we conceived the idea of Management Action Zones (MAZs).
Some areas showed that yield was limited by plant number: establishment was poor. Others had the expected population, but low biomass: the plants were small due to some other limiting factor.
If we can identify zones easily, and determine the causes, we should be able to target a management response accordingly. So for this season, we set out a revised research aim.
What we want to know:
Can we successfully determine a management action zone in a field?
Why do we need to know this?
Develop a tool to increase uniformity and yield outcomes
Develop a tool to evaluate management practices and crop productivity
If we want to successfully determine a management action zone in a field then there are two main steps to achieve in this year’s work:
Confirm the relationship between digital data and crop model parameters
Does the relationship stay constant over time and sites?
How early in growth can a difference be detected?
Can the relationship be used to show a growth map across a field?
Develop an approach to gather information and ways to input and display results, initially using a website approach.
Can we integrate a plant count and yield information to start developing a management action zone?
How should this be put together in a way growers can start to use to gather information about their crops?
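A first sketch of integrating plant counts and canopy information into a Management Action Zone classification follows. The 90% thresholds and the example numbers are illustrative assumptions standing in for whatever calibrated expectations the project settles on.

```python
def management_action_zone(population, cover, target_pop, expected_cover):
    """Classify a sampling plot into a Management Action Zone.

    Low population points to an establishment problem; adequate
    population with low canopy cover points to a development problem.
    The 90%-of-target thresholds are illustrative assumptions only.
    """
    low_pop = population < 0.9 * target_pop
    low_canopy = cover < 0.9 * expected_cover
    if low_pop and low_canopy:
        return "establishment + development limited"
    if low_pop:
        return "establishment limited (low population)"
    if low_canopy:
        return "development limited (small plants)"
    return "on track"

print(management_action_zone(population=48, cover=0.35,
                             target_pop=60, expected_cover=0.40))
```

Run per plot and mapped, these labels would give a grower a zone map pointing to the most likely limiting factor in each area.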
At the MicroFarm, we established six research zones based on paddock history and excessive wetness at establishment.
We have three paddock histories: two years of onion production with autumn cover crops of Caliente mustard, two years of onion production with autumn cover crops of oats, and no previous onion crops planted after previous summer sweetcorn and autumn sown rye grass. In each of these areas, we deliberately created sub-zones by applying about 45mm of spray irrigation as a “large rain event”.
The impact of the artificial rainstorm is evident on images taken at the end of November.
The Precision Agriculture Association NZ is presenting workshops focused on technologies available to help reduce nitrogen leaching. There are two North Island workshops being offered at:
Massey University on Thursday 1st September 2016 [PDF here]
and
Ellwood Centre, Hastings on Friday 2nd September 2016 [PDF here]
Programme
The ‘Technology to Reduce N Leaching’ workshops are similar to the well-received programme conducted in Ashburton in March 2016, and will address where we are and what we can do about nitrate leaching limits in a North Island context, utilising a range of technologies and farm systems options.
The particular areas of focus for the programme are:
Variable rate technologies and systems
Precision irrigation
Precision spreading systems and services
Soil mapping
Soil moisture monitoring, sensors, metering
Nutrient budgeting and environmental monitoring
A Q&A slot in the afternoon session lets attendees interact with members and presenters to share learnings and understandings about the issues. This will also be possible over the lunch break on both days, with one and a half hours devoted to it.
Offer to PAANZ Members
As part of the Hastings programme only, on 2nd September, PAANZ members are offered the opportunity to participate as trade/sector participants presenting technologies and products appropriate to support the programme.
Unfortunately, PAANZ is not able to offer trade/sector stand space at the Palmerston North venue due to space restrictions, so only the Hastings venue can accommodate this option for members.
If you would like to participate, please advise Jim Grennell (email: jim@paanz.co.nz, mobile: 021 330 626). Trade/sector places at the Hastings workshop are limited to ten organisations, so participation will be on a first-come basis.
The cost of participation will be $100.00 plus GST per stand, with an additional attendance fee of $100.00 per person.
As these are indoor workshops with a technology focus, and space at the Hastings venue is limited, no large equipment or hardware can be accommodated.
Confirmation of members wishing to take up this opportunity is required by Monday 22nd August 2016, after which time the opportunity to participate will be made available to non-members.
Effective and reliable sensing for the performance of robotic tasks, such as manipulation in the outdoor environment, remains a challenging problem.
While commercially available solutions such as ASA-LIFT exist for specific tasks and crops, and for operation in specific conditions, these systems are either not cost-effective and/or physically unsuitable for specific farming conditions and practices.
This research proposed to develop a mobile robot system with the flexibility to adapt and the intelligence to cope with natural variability, through a two-fold aim: utilising vision for both navigation and manipulation. This talk discussed some of the recent developments on these aspects.
In particular, the talk focused on a novel approach that analyses point cloud information from a time-of-flight (ToF) camera to identify the location of the foremost spring onions along the crop bed, with the intention of robotic manipulation. The process uses a combination of 2D image processing on the amplitude data and 3D spatial analysis of the point cloud extracted from the camera to locate the desired object.
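A much-simplified sketch of that two-step idea, filtering on amplitude (the 2D step) and then selecting the foremost valid point along the bed (the 3D step), is shown below with invented numbers; the real system works on full point clouds and clusters of points, not single pixels.

```python
import numpy as np

# Hypothetical ToF output: per-pixel return amplitude and (x, y, z)
# coordinates. x runs along the crop bed, so the "foremost" plant is
# the valid point with the smallest x.
amplitude = np.array([0.90, 0.10, 0.80, 0.85, 0.05])
points = np.array([[0.80, 0.0, 0.5],
                   [0.10, 0.1, 0.5],   # low amplitude: noisy, rejected
                   [0.35, 0.0, 0.5],
                   [0.37, 0.1, 0.5],
                   [0.05, 0.2, 0.5]])  # low amplitude: noisy, rejected

valid = amplitude > 0.5                      # 2D step: keep confident returns
foremost = points[valid][np.argmin(points[valid][:, 0])]  # 3D step
print("foremost target at", foremost)
```

The located point then becomes the target pose handed to the manipulation system for grasping.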
Whilst the experimental results demonstrated the robustness of this approach, further testing was required to determine the ability of a system to cope with different scenarios that exist in the naturally varying environment.
For validation, the vision system was integrated with a robotic manipulation system and initial results of the investigation were presented.