
RootNav 2.0: Deep learning for automatic navigation of complex plant root architectures

Yasrab, Robail; Atkinson, Jonathan A; Wells, Darren M; French, Andrew P; Pridmore, Tony P; Pound, Michael P

Authors

Robail Yasrab

Darren Wells (darren.wells@nottingham.ac.uk)
Principal Research Fellow

Andrew French (andrew.p.french@nottingham.ac.uk)
Professor of Computer Science

Tony Pridmore (tony.pridmore@nottingham.ac.uk)
Professor of Computer Science



Abstract

© The Author(s) 2019. Published by Oxford University Press.

Background: In recent years, quantitative analysis of root growth has become increasingly important as a way to explore the influence of abiotic stresses such as high temperature and drought on a plant's ability to take up water and nutrients. Segmentation and feature extraction of plant roots from images present a significant computer vision challenge. Root images contain complicated structures, variations in size and background, occlusion, clutter, and variation in lighting conditions. We present a new image analysis approach that provides fully automatic extraction of complex root system architectures from a range of plant species in varied imaging set-ups. Driven by modern deep-learning approaches, RootNav 2.0 replaces previously manual and semi-automatic feature extraction with an extremely deep multi-task convolutional neural network architecture. The network also locates seeds and first- and second-order root tips to drive a search algorithm that seeks optimal paths throughout the image, extracting accurate architectures without user interaction.

Results: We develop and train a novel deep network architecture to explicitly combine local pixel information with global scene information in order to accurately segment small root features across high-resolution images. The proposed method was evaluated on images of wheat (Triticum aestivum L.) from a seedling assay. Compared with semi-automatic analysis via the original RootNav tool, the proposed method demonstrated comparable accuracy with a 10-fold increase in speed. The network was able to adapt to different plant species via transfer learning, offering similar accuracy when transferred to an Arabidopsis thaliana plate assay. A final instance of transfer learning, to images of Brassica napus from a hydroponic assay, still demonstrated good accuracy despite far fewer training images.

Conclusions: We present RootNav 2.0, a new approach to root image analysis driven by a deep neural network. The tool can be adapted to new image domains with a reduced number of images, and offers substantial speed improvements over semi-automatic and manual approaches. The tool outputs root architectures in the widely accepted RSML standard, for which numerous analysis packages exist (http://rootsystemml.github.io/), as well as segmentation masks compatible with other automated measurement tools. The tool will provide researchers with the ability to analyse root systems at larger scales than ever before, at a time when large-scale genomic studies have made this more important than ever.
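The path-extraction step described in the abstract can be pictured as a shortest-path search over a cost map derived from the network's root-probability output, run from the detected seed towards each detected root tip. The sketch below is a minimal, hypothetical illustration of that idea using Dijkstra's algorithm; the function name, cost function, and input conventions are assumptions for illustration only and are not taken from the RootNav 2.0 code base.

```python
# Hypothetical sketch: shortest-path tracing over a CNN root-probability map.
# Assumes `root_prob` is an H x W array of values in [0, 1] and that the seed
# and tip pixel coordinates are already known (e.g. from detection heatmaps).
import heapq
import numpy as np

def trace_root(root_prob, seed, tip, eps=1e-6):
    """Dijkstra search from seed to tip, preferring high-probability pixels."""
    h, w = root_prob.shape
    cost = -np.log(root_prob + eps)   # low cost where root likelihood is high
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[seed] = 0.0
    heap = [(0.0, seed)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == tip:
            break
        if d > dist[r, c]:
            continue  # stale queue entry
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1),
                       (-1, -1), (-1, 1), (1, -1), (1, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk back from tip to seed to recover the root path as pixel coordinates.
    path, node = [tip], tip
    while node != seed:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

For example, `trace_root(prob_map, seed=(10, 200), tip=(480, 230))` would return a list of (row, col) pixel coordinates from the seed to that tip. RootNav 2.0's actual search and post-processing (and its conversion of paths to RSML) are more sophisticated than this sketch.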

Citation

Yasrab, R., Atkinson, J. A., Wells, D. M., French, A. P., Pridmore, T. P., & Pound, M. P. (2019). RootNav 2.0: Deep learning for automatic navigation of complex plant root architectures. GigaScience, 8(11), Article giz123. https://doi.org/10.1093/gigascience/giz123

Journal Article Type: Article
Acceptance Date: Sep 22, 2019
Online Publication Date: Nov 8, 2019
Publication Date: Nov 8, 2019
Deposit Date: Jan 22, 2020
Publicly Available Date: Jan 23, 2020
Journal: GigaScience
Electronic ISSN: 2047-217X
Publisher: Oxford University Press
Peer Reviewed: Peer Reviewed
Volume: 8
Issue: 11
Article Number: giz123
DOI: https://doi.org/10.1093/gigascience/giz123
Public URL: https://nottingham-repository.worktribe.com/output/3747889
Publisher URL: https://academic.oup.com/gigascience/article/8/11/giz123/5614712
