A Deep-Learning Ensemble Method to Detect Atmospheric Rivers and Its Application to Projected Changes in Precipitation Regime
Cited 8 times in Web of Science; cited 6 times in Scopus.
- Authors
- Issue Date
- 2023-06
- Publisher
- John Wiley & Sons, Inc.
- Citation
- Journal of Geophysical Research: Atmospheres, Vol. 128, No. 12
- Abstract
- This study aims to detect atmospheric rivers (ARs) around the world by developing a deep-learning ensemble method using the AR catalogs of the ClimateNet data set. The ensemble method, based on 20 semantic segmentation algorithms, notably reduces the bias on the testing data set, with its intersection-over-union (IoU) score 1.7%–10.1% higher than that of the individual algorithms. The method is then applied to Coupled Model Intercomparison Project Phase 6 (CMIP6) data sets to quantify AR frequency and AR-related precipitation in the historical period (1985–2014) and the future period (2070–2099) under the Shared Socioeconomic Pathways 5–8.5 warming scenario. Six key regions, distributed across different continents and strongly influenced by ARs, are particularly highlighted. The results show that the CMIP6 multi-model mean combined with the deep-learning ensemble method reasonably reproduces the observed AR frequency. In most key regions, both heavy precipitation (90th–99th percentile) and extremely heavy precipitation (>99th percentile) are projected to increase in a warming climate, mainly due to increased AR-related precipitation. The AR contributions to the increases in future heavy and extremely heavy precipitation range from 145.1% to 280.5% and from 36.2% to 213.5%, respectively, indicating that ARs should be taken into account to better understand future changes in extreme precipitation. (An illustrative sketch of the mask-ensemble and IoU computation follows the item details below.)
- ISSN
- 2169-897X
- Files in This Item:
- There are no files associated with this item.
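
Illustrative sketch: the abstract describes an ensemble of 20 semantic-segmentation models evaluated with an intersection-over-union (IoU) score. The minimal Python example below is not the authors' implementation; the majority-vote rule, grid sizes, and all names are illustrative assumptions about how such a mask ensemble and IoU score could be computed.

# Illustrative sketch (not the authors' code): combine binary AR masks from
# several segmentation models by majority vote and score the result with IoU.
import numpy as np

def ensemble_vote(masks: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Majority-vote ensemble of binary masks shaped (n_models, H, W)."""
    # Fraction of member models that flag each grid cell as part of an AR.
    vote_fraction = masks.mean(axis=0)
    return (vote_fraction >= threshold).astype(np.uint8)

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection over union between two binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return float(np.logical_and(pred, target).sum() / union)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical data: a labeled AR mask on a small lat/lon grid and
    # 20 noisy member predictions, standing in for the 20 segmentation models.
    truth = rng.random((64, 128)) > 0.7
    member_masks = np.stack(
        [truth ^ (rng.random(truth.shape) > 0.9) for _ in range(20)]
    ).astype(np.uint8)
    ensemble_mask = ensemble_vote(member_masks)
    print(f"Ensemble IoU vs. labels: {iou(ensemble_mask, truth):.3f}")
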
Related Researcher
- College of Natural Sciences
- Department of Earth and Environmental Sciences
Item View & Download Count
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.