Old MacDonald had a robot: The future of automated agriculture

01 January 2024
By Mara Johnson-Groh
The Model B Smart Sprayer. Photo credit: Verdant Robotics

At farms across America, you might notice something new in the fields: robots. Not unlike Roombas, these robots are automating some of the most tedious tasks in agriculture. Some robots till land, others carry crops, and some power entirely autonomous tractors. A product called the Model B Smart Sprayer—a 19-foot-long machine that rides on the back of tractors—can identify, target, and spray up to 700,000 weeds with chemical or organic herbicides every hour with submillimeter precision. It also simultaneously collects valuable information on how each plant is growing over the season.

This is not your grandparents’ farming.

These farming robots are built by Verdant Robotics, one of many companies in a young and burgeoning sector leveraging cutting-edge automation to revolutionize agriculture. In 2022, venture investors put nearly a billion dollars into ag-tech start-ups.

This investment comes at a pivotal time. Farming has never been easy, yet now more than ever, it’s a demanding job. Costs are skyrocketing. The climate is becoming harsher and more unpredictable. In many places, there is also an ongoing labor shortage for a career that is known to be more punishing than profitable. But new technologies are trying to alter that trajectory.

From weeding to detecting disease and harvesting crops, robots are helping improve yields, reduce labor requirements, and lessen environmental impact. This adoption of technology is playing a necessary role in ushering farming into a coming era in which new sustainable practices will be needed to feed the globe without further environmental damage. By 2050, the global population is expected to reach 9.7 billion people, and crop yields are simultaneously expected to see a climate change-driven decline, according to scientific models.

At the heart of this robot revolution are camera systems that will serve as the eyes of agriculture. Optics has already had a place at the agricultural table, from designing better greenhouses to improving artificial LED lighting sources. Now, paired with image analysis software, optical technologies are helping farmers see better and see more.

“Computer vision is going to fundamentally change agriculture,” says Curtis Garner, co-founder and chief commercial officer for Verdant Robotics.

The Smart Sprayer’s success comes from multiple high-resolution cameras paired with an intricate computational program that combines neural networks, video processing, and complex algorithms to determine plant location and qualities on the go as the robot bounces down dusty rows of crops. The machine not only automates the work of many farmhands, but also tailors its care to the needs of each plant in terms of water, fertilizer, herbicide, and so on.

“Historically, we farmed on a whole field basis. We would apply everything uniformly across the field—water, fertilizer, herbicide, pesticides,” says Alex Thomasson, a professor of agricultural and biological engineering as well as director of the Agricultural Autonomy Institute at Mississippi State University. “With automation comes the ability to be more precise.”

An imaging spectrograph, RGB camera, and a multispectral camera being tested for detecting disease in grape plants. Photo credit: Gerrit Polder

Today, farmers can use systems like Verdant Robotics’ to apply water and chemicals only where needed. This optimizes costs for the farmer, but also improves environmental stewardship. Verdant Robotics says its Smart Sprayer uses 96 percent fewer chemicals than standard methods and can reduce the cost of farming an acre from $3,000 to just $30.

“Fundamentally, it’s about doing more with less,” says Gabe Sibley, Verdant Robotics CEO and co-founder. “The amount of efficiency that can be gained with the technology is actually huge.”

The role of cameras in replacing the eyes of farmers dates to the 1920s, when pilots realized airplane flyovers could help detect disease in cotton fields. Today, cameras in agriculture run the gamut from off-the-shelf units to custom-designed models costing thousands of dollars.

The most common, and the ones employed by the Smart Sprayer, are RGB cameras—the same type found in phones, which use red, green, and blue channels to make full-color images. Their ubiquity means the cameras are relatively cheap and accessible. As a result, many scientists are researching their use in various agricultural automations.
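One reason plain RGB imagery is so useful in the field is that vegetation can often be separated from soil with simple arithmetic on the color channels. The sketch below illustrates the idea with the classic Excess Green index (ExG = 2g − r − b on normalized channels), a long-standing technique in agricultural computer vision; the threshold value and the toy pixel data are illustrative assumptions, not parameters from any system described in this article.

```python
import numpy as np

def excess_green_mask(rgb: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Flag vegetation pixels in an RGB image using the Excess Green index.

    ExG = 2g - r - b, computed on channels normalized so r + g + b = 1.
    Green vegetation scores high; brown soil scores near zero.
    """
    rgb = rgb.astype(float)
    total = rgb.sum(axis=2)
    total = np.where(total == 0, 1e-9, total)  # guard against black pixels
    r = rgb[..., 0] / total
    g = rgb[..., 1] / total
    b = rgb[..., 2] / total
    exg = 2 * g - r - b
    return exg > threshold  # True where a pixel looks like vegetation

# Toy 1x2-pixel image: one green leaf pixel, one brown soil pixel
img = np.array([[[40, 180, 50], [120, 90, 60]]], dtype=np.uint8)
mask = excess_green_mask(img)
```

In a real sprayer or weeding robot, a mask like this would be only the first stage; a trained neural network would then classify the vegetation regions as crop or weed.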

At Wageningen University in the Netherlands, Gerrit Polder works on machine-vision and robotics projects in agriculture. He and a team recently developed a disease detection robot for vineyards. It’s a coffee-table-sized wheeled box that follows behind a tractor and uses RGB cameras and computer vision software to detect disease in grape plants at the earliest stages.

While RGB cameras can work well for disease detection in plants, researchers have known since the 1930s that other wavelengths are even more useful for measuring plant wellbeing. For example, wavelengths from the red edge of visible light to the near infrared are important for calculating vegetation indexes, a proxy for plant health. Similarly, shortwave infrared wavelengths are useful for probing a plant’s water content, which can reveal the level of crop stress.
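The vegetation indexes mentioned above are typically simple ratios of band reflectances. The best known is the Normalized Difference Vegetation Index (NDVI), which contrasts near-infrared and red reflectance; a minimal sketch follows, where the sample reflectance values are invented for illustration.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Healthy leaves reflect strongly in the near infrared and absorb red
    light, so dense vegetation scores near +1, bare soil near 0, and
    water tends to score negative.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    denom = np.where(nir + red == 0, 1e-9, nir + red)  # avoid divide-by-zero
    return (nir - red) / denom

# Toy 2x2 reflectance patches: top row healthy canopy, bottom row bare soil
nir_band = np.array([[0.50, 0.48], [0.22, 0.20]])
red_band = np.array([[0.08, 0.10], [0.18, 0.17]])
index = ndvi(nir_band, red_band)
```

Multispectral and hyperspectral cameras extend this idea, swapping in other band pairs (such as shortwave infrared for water content) to probe different aspects of plant stress.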

In his work to develop a disease detector for grapes, Polder also tested multispectral cameras, which combine several specific wavelength ranges that include visible light and beyond. While these cameras can detect diseases earlier, they are also more challenging to work with.

That’s because, for a camera to detect a disease or a weed, it must know what it’s looking at. This is typically achieved with artificial intelligence or neural network software trained on a large dataset of images to tell the computer what is and isn’t a disease or a weed. Training datasets for ubiquitous RGB cameras are easy to come by; those for multispectral cameras, however, are not. This means that, contrary to expectations, the resulting software for multispectral cameras can actually be worse at detecting disease because it is trained on smaller datasets.

“In the end we decided to switch back to just [using RGB cameras] because the neural networks work so well for them,” Polder says. Additionally, multispectral cameras remain prohibitively expensive at around $10,000 apiece, whereas RGB cameras cost in the hundreds of dollars range.

“I think in the future this will change,” Polder says, citing the rapid development of photonics technologies in recent years. “More multispectral data will help develop deep learning networks as well.”

Other researchers are testing the limits of hyperspectral imaging, which is a step up from multispectral imaging and covers hundreds of even narrower wavelength ranges. However, because these cameras cost an incredible $75,000 to $220,000 apiece, only a limited amount of research has been done on what they can achieve, though prices are starting to drop.

“Hyperspectral imaging is not a new technology. It has been around for 30 years or so,” says Bing Lu, an assistant professor of geography at Simon Fraser University, who studies the use of hyperspectral imaging in agriculture. “But the sensor price is quite high, so it’s not even widely studied in academia and less in the industry to support farm practice.”

Hyperspectral cameras are more commonly flown on satellites to image large areas. That doesn’t help the farmer who is interested in mapping their fields at a per-crop level. Lu and others have completed studies, however, that show hyperspectral cameras could work well when flown on drones or used from ground-based platforms that pass over single fields. The cameras were able to identify weeds and gauge plants’ levels of nutrients such as nitrogen, phosphorus, and potassium. While promising, it could be a while before they’re widely used.

“The hyperspectral and multispectral [cameras] have yet to be successfully commercially deployed into agriculture,” Garner says. But when they are, “There’s going to be a lot of value derived in that color space for early disease detection [and] pest-damage quantification.”

Hyperspectral cameras are also enhancing the building blocks of farming—the plants themselves—by revolutionizing plant phenomics, the study of a plant’s visible traits that are dependent on the organism’s environment as well as its genes. A large group of researchers are studying crop phenotypes, for example, to design plants that can better weather droughts, floods, and the coming weather challenges associated with climate change.

Hyperspectral camera being outfitted on a drone for testing. Photo credit: Bing Lu

In recent years, advances in genomic sequencing have allowed researchers to quickly develop new crop test varieties. However, plant traits—such as biochemistry, morphology, and production levels—are highly dependent on both environment and genetics. This means researchers have to test large numbers of genetic strains under different growing conditions to truly assess which varieties can meet specific needs.

Traditionally, phenomics relies on time- and labor-intensive methods that destroy the plant in the process. But in recent years, many types of cameras have been employed to automate the data collection and actively track plant growth all the way through harvest—a technique dubbed high-throughput plant phenotyping.

“Improving and advancing plant phenotyping has the potential to impact many lines of research aimed at improving and advancing agriculture,” says Malia Gehan, a principal investigator at the Donald Danforth Plant Science Center, who works with one such robot system called the Transportation Energy Resources from Renewable Agriculture Phenotyping Reference Platform, or TERRA-REF.

TERRA-REF, which is the largest agricultural robot in the world, works in a field south of Phoenix, Arizona, that is typically planted with sorghum. These plants are highly studied for their ability to adapt to adverse climate conditions. At the farm, the giant robot moves on long rails to scan more than one thousand plants across a one-acre field every day. The robot is equipped with a host of sensors, including hyperspectral, thermal infrared, and RGB cameras.

The images are analyzed to determine traits like carbon uptake, water use, tissue chemistry, and more, which helps predict the stress resistance and crop yield for various sorghum varieties. Researchers with the project have made their image analysis software, PlantCV, open source to encourage its use.

“When we started PlantCV nine years ago, we could analyze RGB, near infrared, and grayscale fluorescence images, but have since expanded to hyperspectral and thermal image data,” Gehan says. “We quickly realized that there was a need for open-source, well-documented, user-friendly, highly flexible image analysis software.”

By speeding up the phenotyping process with automation, the TERRA-REF team hopes to streamline the development of new crops that can help farmers combat the effects of climate change.

As robotics works its way into agriculture, it promises to utterly upend farming while simultaneously helping return it to its roots. The ability to map and deliver care on an individual plant level could help farmers turn away from monocropping to mixed-species fields, which would be better for pollinators and the soil.

Autonomous agricultural machines could prove especially beneficial to small farmers, Thomasson says. Although costs for farming robots are currently high, there is room for fee-for-service operations and rentals, as Verdant Robotics has done with its Model B Smart Sprayer. Thomasson recently began studying the socio-economic changes that might result from automation in farming. He already suspects there might be a change in the labor force with the use of robotics.

“I think the types of jobs in agriculture will be upskilled,” Thomasson says. “I think instead of having people sitting on tractors and driving them through the field, we’re more likely to have people sitting either in an office or in a pickup truck in the field, overseeing multiple machines at the same time.”

Cameras and optical technology are showing up in other sectors of agriculture as well. They have been used to analyze soil chemistry, pick ripe produce, prune fruit trees, semi-automate sheep shearing, and sort produce by quality. It seems likely that eventually no area of agriculture will be untouched by automation.

For now, robots are largely limited to a few applications, start-ups, and scientific studies. The adoption of robotic technology, Sibley says, is in its early days. “I think that it’s sort of a pre-Cambrian explosion and there’s still lots of evolution to come.”

Mara Johnson-Groh is a freelance science writer and photographer who writes about everything under the sun, and even things beyond it.
