Plant biologists welcome their robot overlords

Thursday, February 16, 2017

Old-school areas of plant biology are getting tech upgrades that herald more detailed, faster data collection.

by Heidi Ledford

25 January 2017
A robot measures the crops in an agricultural field near Columbia, Missouri (credit: DeSouza/Fritschi/Shafiekhani/Suhas/University of Missouri)
As a postdoc, plant biologist Christopher Topp was not satisfied with the usual way of studying root development: growing plants on agar dishes and placing them on flatbed scanners to measure root lengths and angles. Instead, he would periodically stuff his car with plants in pots dripping with water and drive more than 600 kilometres from North Carolina to Georgia to image his specimens in 3D, using an X-ray machine in a physics lab.

Five years later, the idea of using detailed imaging to study plant form and function has caught on. The use of drones and robots is also on the rise as researchers pursue the ‘quantified plant’ — one in which each trait has been carefully and precisely measured from nearly every angle, from the length of its root hairs to the volatile chemicals it emits under duress. Such traits are known as an organism’s phenotype, and researchers are looking for faster and more comprehensive ways of characterizing it.

From 10 to 14 February, scientists will gather in Tucson, Arizona, to compare their methods. Some will describe drones that buzz over research plots armed with hi-tech cameras; others will discuss robots that lumber through fields bearing equipment to log each plant’s growth.

The hope is that such efforts will speed up plant breeding and basic research, uncovering new aspects of plant physiology that can determine whether a plant will thrive in the field. “Phenotype is infinite,” says Topp, who now works at the Donald Danforth Plant Science Center in St Louis, Missouri. “The best we can do is capture an aspect of it — and we want to capture the most comprehensive aspect we can.”

The plummeting cost of DNA sequencing has made it much easier to find genes, but working out what they do remains a challenge, says plant biologist Ulrich Schurr of the Jülich Research Centre in Germany. “It is very easy now to sequence a lot of stuff,” he says. “But what was not developed with the same kind of speed was the analysis of the structure and function of plants.”

Plant breeders are also looking beyond the traits they used to focus on — such as yield and plant height — for faster ways to improve crops. “Those traits are useful but not enough,” says Gustavo Lobos, an ecophysiologist at the University of Talca in Chile. “To cope with what is happening with climate change and food security, some breeders want to be more efficient.” Researchers aiming to boost drought tolerance, for example, might look at detailed features of a plant’s root system, or at the arrangement of its leaves.

False-colour images of a bean-breeding trial captured by a camera mounted on a drone (credit: Lav R. Khot/Washington State University & Phillip N Miklas/USDA-ARS)

A need for speed

The needs of these researchers have bred an expanding crop of phenotyping facilities and projects. In 2015, the US Department of Energy announced a US$34-million project to generate the robotics, sensors and methods needed to characterize sorghum, a biofuel crop. Last year, the European Union launched a project to create a pan-European network of phenotyping facilities. And academic networks have sprung up around the globe as plant researchers attempt to standardize approaches and data analyses.

Large-scale phenotyping has long been used in industry, but was too expensive for academic researchers, says Fiona Goggin, who studies plant–insect interactions at the University of Arkansas in Fayetteville. Now, the falling prices of cameras and drones, as well as the rise of the ‘maker’ movement that focuses on homemade apparatus, are enticing more academics to enter the field, she says.

At Washington State University in Pullman, biological engineer Sindhuja Sankaran’s lab is preparing to deploy drones carrying lidar, the laser equivalent of radar. The system will scan agricultural fields to gather data on plant height and the density of leaves and branches. Sankaran also uses sensors to measure the volatile chemicals that plants give off, particularly when they are under attack from insects or disease. She hopes eventually to mount the sensors on robots.

A drone loaded with thermal imaging equipment flies over grapevines (credit: Lav R. Khot/Washington State University)

Sankaran’s mechanical minions return from their field season with hundreds of gigabytes of raw data, and analysing the results keeps her team glued to computers for the better part of a year, she says. Many researchers do not realize the effort and computing savvy it takes to pick through piles of such data, says Edgar Spalding, a plant biologist at the University of Wisconsin–Madison. “The phenotyping community has rushed off to collect data and the computing is an afterthought.”

Standardizing the technology is another barrier, says Nathan Springer, a geneticist at the University of Minnesota in St Paul. The lack of equipment everyone can use means that some researchers have to rely on slower data-collection methods. Springer has been working with 45 research groups to characterize 1,000 varieties of maize (corn) grown in 20 different environments across the United States and Canada. The project has relied heavily on hand measurements rather than on drones and robots, he says.

Topp now has his own machine to collect computed tomography (CT) images, but processing samples is still a little slow for his liking. He speaks with reverence of a facility at the University of Nottingham, UK, that speeds up its scans by using robots to feed the plants through the CT machine. But he’s pleased that he no longer has to haul his soggy cargo across three states to take measurements. “It’s just endless, the number of possibilities.”

Nature 541, 445–446 (26 January 2017) | doi:10.1038/541445a

NSF Dear Colleague Letter: Supporting Fundamental Research in Unmanned Aerial Systems (UAS)

Tuesday, August 23, 2016

August 8, 2016

Dear Colleagues,

With this Dear Colleague Letter (DCL), the National Science Foundation’s (NSF) Directorates for Computer and Information Science and Engineering (CISE) and Engineering (ENG) announce their intention to support, foster, and accelerate fundamental research that advances the positive use of Unmanned Aerial Systems (UAS) to save lives, increase safety and efficiency, and enable more effective science and engineering research. These research investments will be made through existing CISE and ENG core and crosscutting research programs.

NSF-funded advancements are enabling a wide variety of beneficial applications of UAS in areas such as monitoring and inspection of physical infrastructure, prevention of airport bird strikes, smart emergency/disaster response, natural gas leak detection, agriculture support, personal services, and observation and study of weather phenomena including severe storms. These advances are made possible through fundamental investments in theoretical principles of UAS, including intelligent sensing, perception, and control; estimation; communications; collaboration and teaming; UAS adaptation and learning; human-UAS interaction; and safety, security, and privacy of UAS. These novel fundamental approaches enable increased understanding of how to intelligently and effectively design, control, and apply UAS to beneficial applications.

NSF welcomes proposals that accelerate fundamental technological advances in UAS; these proposals should be submitted to existing CISE and ENG core and crosscutting research programs, following all proposal preparation instructions specified in the corresponding program announcements and solicitations. All proposals must meet the requirements of NSF’s Grant Proposal Guide (GPG), along with any program- or solicitation-specific proposal preparation instructions and review criteria. Proposals must be synergistic with the goals of the programs to which they are submitted.

For further information, interested PIs may contact:

  • Dr. Reid Simmons, CISE/IIS, Program Director, at resimmon [at] nsf [dot] gov; and
  • Dr. Jordan M. Berg, ENG/CMMI, Program Director, at jberg [at] nsf [dot] gov.


Jim Kurose
Assistant Director, CISE

Grace Wang
Acting Assistant Director, ENG


EPSCoR Imaging Workshop - April 13, 2017

Event date(s): Thursday, April 13, 2017, 8:45 a.m.–5:30 p.m.
Location: Donald Danforth Plant Science Center, 975 North Warson Road, Saint Louis, MO 63132

The EPSCoR imaging workshop will be held on April 13, 2017, at the Donald Danforth Plant Science Center in Saint Louis, MO. The purpose of the workshop is to link imaging and image processing to phenotyping and to share collective expertise in plant imaging technology and data. Presentations and work groups will give researchers the chance to engage with each other’s data, and imaging equipment demonstrations and presentations will take place during breaks and lunch. There is a $25.00 registration fee to attend this workshop.

Click here for the workshop agenda.

Click here for hotel accommodation at the Drury Inn and Suites–Creve Coeur, located at 11980 Olive Blvd, Creve Coeur, MO 63141. If you require shuttle service between the Drury and the Danforth Center, please request it at check-in.

For questions, contact:

Kathleen Mackey, kmackey [at] danforthcenter [dot] org, 314-587-1203
