Publications

Detailed Information

Identifying and extracting bark key features of 42 tree species using convolutional neural networks and class activation mapping

Cited 0 times in Web of Science; cited 10 times in Scopus
Authors

Kim, Tae Kyung; Hong, Jeonghyun; Ryu, Daun; Kim, Sukyung; Byeon, Si Yeon; Huh, Woojin; Kim, Kunhyo; Baek, Gyu Heon; Kim, Hyun Seok

Issue Date
2022-12
Publisher
Nature Publishing Group
Citation
Scientific Reports, Vol. 12, No. 1, p. 4772
Abstract
© 2022, The Author(s). The significance of automatic plant identification has already been recognized by academia and industry. Several studies have attempted identification from leaves and flowers; bark, however, can also be beneficial, especially for trees, because it remains consistent across seasons and is easily accessible even under high crowns. Previous studies on bark identification have mostly contributed quantitatively, by increasing classification accuracy. However, since computer vision algorithms surpassed the identification ability of humans, an open question has been how machines successfully interpret and unravel the complicated patterns of bark. Here, we trained two convolutional neural networks (CNNs) with distinct architectures on a large-scale bark image dataset and applied class activation mapping (CAM) aggregation to investigate the diagnostic keys for identifying each species. The CNNs identified the bark of 42 species with > 90% accuracy, and overall accuracy differed little between the two models. Diagnostic keys matched salient shapes that are also easily recognized by the human eye, typified as blisters, horizontal and vertical stripes, lenticels of various shapes, and vertical crevices and clefts. The two models differed in the quality of their diagnostic features: the older, less complex model produced more general, well-matching patterns, while the better-performing model with much deeper layers highlighted local patterns less relevant to bark. The CNNs also assigned untrained species to the correct genus and family in 41.98% and 48.67% of cases, respectively. Our methodology and findings are potentially applicable to identifying and visualizing crucial traits of other plant organs.
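The class activation mapping (CAM) technique mentioned in the abstract produces a spatial heatmap for a given class by taking a weighted sum of the last convolutional layer's feature maps, using the weights of the final linear classifier that follows global average pooling. A minimal NumPy sketch of that weighted-sum step is shown below; the function name, array shapes, and toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def class_activation_map(feature_maps, fc_weights, class_idx):
    """Compute a class activation map (weighted sum of conv feature maps).

    feature_maps: (C, H, W) activations from the last convolutional layer.
    fc_weights:   (num_classes, C) weights of the final linear layer that
                  follows global average pooling.
    Returns an (H, W) map, normalized to [0, 1], highlighting the regions
    most responsible for the score of class `class_idx`.
    """
    w = fc_weights[class_idx]                    # (C,) weights for this class
    cam = np.tensordot(w, feature_maps, axes=1)  # sum_k w_k * F_k -> (H, W)
    cam -= cam.min()                             # shift so minimum is 0
    if cam.max() > 0:
        cam /= cam.max()                         # scale to [0, 1]
    return cam

# Toy example: 4 feature channels on a 7x7 grid, 42 classes (one per species)
rng = np.random.default_rng(0)
fmaps = rng.random((4, 7, 7))
weights = rng.random((42, 4))
cam = class_activation_map(fmaps, weights, class_idx=3)
print(cam.shape)
```

In practice the map would be upsampled to the input image size and overlaid on the bark photograph; aggregating such maps over many correctly classified images of a species is what the abstract refers to as CAM aggregation.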
ISSN
2045-2322
URI
https://hdl.handle.net/10371/195530
DOI
https://doi.org/10.1038/s41598-022-08571-9
Files in This Item:
There are no files associated with this item.
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.