Classification-based mapping of trees in commercial orchards and natural forests

Giorgi Kozhoridze, Nikolai Orlovsky, Leah Orlovsky, Dan G. Blumberg, Avi Golan-Goldhirsh

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

Hyperspectral remote sensing (RS) and images of various spatial resolutions open new vistas for the classification and mapping of trees. These approaches would improve plant classification in complex natural forest populations of diverse species, genera, and families, as well as the monitoring of commercial orchards. In this work, we used new RS indices for cellulose, lignin, wax, chlorophyll, carotenoid, and anthocyanin for plant species classification in natural forests and commercial orchards. As proof of concept, the indices were applied to the classification and mapping of various horticultural crop orchards, where error due to the spatial mixing of different trees is minimal. The classification accuracy of the maps varied between 65% and 82%. This wide range resulted from the following factors: the RS index used, the season, and the spatial resolution of the hyperspectral images. Classification quality was highest when the full set of RS indices was used, and the effect of the wax index on accuracy was significant. Furthermore, seasonality played an important role in the classification; the target species were better resolved in spring than in summer. Higher spatial resolution of the images did not necessarily yield better classification and mapping results; the outcome appeared to be case-specific and depended greatly on the species/crop and the unique environment.

Original language: English
Pages (from-to): 8784-8797
Number of pages: 14
Journal: International Journal of Remote Sensing
Volume: 39
Issue number: 23
DOIs
State: Published - 2 Dec 2018

ASJC Scopus subject areas

  • General Earth and Planetary Sciences

