How many crop and soil sensors are needed?
August 11, 2008 – On-the-go crop and soil sensors may have increased the precision with which farmers can manage their crops, but recently released research in Agronomy Journal questions whether more precise management is necessarily more efficient.
The researchers examined how fine sensor and application grids need to be for optimal efficiency. They found that the law of diminishing returns applies to precision agriculture and calculated the largest application area that still supports effective precision management. According to the authors, moving to coarser grids could yield significant cost savings for farmers.
To determine the ideal amount of data needed for precision management, the researchers calculated the optimal combination of physical sensor density (number of sensors along the applicator apparatus) and sensor output density (sensor readings per unit distance along the travel path).
The researchers found that sensor grid size can be increased from the current smallest size of 0.5 square meters to 5.1 square meters with no significant impact on the overall mapping of a crop's canopy or field variation. The larger grid requires fewer sensors and makes fertilizer application easier and more cost-efficient. This roughly tenfold increase in grid size could translate into significant cost savings for farmers using precision management techniques.
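To make the scale of that reduction concrete, the back-of-the-envelope sketch below (not from the study; the one-hectare field size is an illustrative assumption) counts how many grid cells, and hence sensor readings, are needed to cover a field at the two grid sizes mentioned above:

```python
import math

def readings_needed(field_area_m2: float, cell_area_m2: float) -> int:
    """Number of grid cells (one sensor reading each) required to cover the field."""
    return math.ceil(field_area_m2 / cell_area_m2)

# Illustrative field: one hectare (10,000 square meters).
field = 10_000.0

fine = readings_needed(field, 0.5)    # current smallest grid size: 0.5 m^2
coarse = readings_needed(field, 5.1)  # grid size the study found adequate: 5.1 m^2

print(fine, coarse)          # 20000 vs 1961 readings
print(round(fine / coarse))  # roughly a tenfold reduction
```

The actual savings depend on field geometry, sensor cost, and applicator design, but the ratio of cell areas drives the headline result: about ten times fewer readings for the same coverage.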
An abstract of the article can be viewed at http://agron.scijournals.org/cgi/content/abstract/100/2/454.