An example template for conducting PAWN global sensitivity analysis on parameters of the PRMS model using the PRMS-Python framework
DOI: 10.4211/hs.bae3f93a5dc54dd886729265eecc784f
Created: Sep 20, 2018 at 10:50 p.m.
Last updated: Sep 22, 2018 at midnight by John Volk
Global sensitivity analysis (GSA) is a useful tool for diagnosing and quantifying uncertainty within hydrologic models. Facilitating advanced model analyses such as GSA of parameters has the potential to advance our fundamental understanding of hydrologic process representations. This document is a working template for applying a GSA method to parameters of the well-known Precipitation-Runoff Modeling System (PRMS) hydrologic model maintained by the United States Geological Survey. Specifically, it documents a workflow for PAWN, a moment-independent GSA method based on empirical cumulative distribution functions. The template is a Jupyter notebook that uses an open-source Python package called PRMS-Python; installation instructions for PRMS-Python and links to both the PAWN method and the Python software are included. PRMS-Python has a built-in routine for Monte Carlo parameter resampling that this template demonstrates and uses to implement PAWN. The template is written so that it can be modified for an arbitrary set of PRMS parameters and is heavily commented for clarity. As such, this template and the open-source Python package aim to encourage and enable the greater hydrologic modeling community to conduct advanced model analyses such as GSA. PRMS-Python also has tools for self-generation of metadata files that track data provenance of large model ensembles, which is useful for sharing model results on platforms such as HydroShare.
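To illustrate the core idea behind PAWN, the sketch below computes an approximate PAWN sensitivity index for one parameter from generic Monte Carlo samples: the parameter range is split into equal-probability conditioning bins, the conditional output distribution in each bin is compared to the unconditional distribution with the Kolmogorov-Smirnov statistic, and the bin-wise KS distances are summarized. This is a minimal, generic sketch using NumPy and SciPy only; the function name `pawn_index` and its arguments are illustrative assumptions, not the PRMS-Python API demonstrated in the notebook.

```python
import numpy as np
from scipy.stats import ks_2samp


def pawn_index(x, y, n_cond=10, summary=np.median):
    """Approximate PAWN sensitivity index for a single parameter.

    x : 1-D array of Monte Carlo samples of the parameter
    y : 1-D array of the corresponding model outputs

    The parameter samples are split into ``n_cond`` equal-probability
    bins; within each bin the conditional output sample is compared to
    the full (unconditional) output sample via the two-sample KS
    statistic, and the bin-wise distances are reduced with ``summary``
    (median by default, as commonly done in PAWN applications).
    """
    x, y = np.asarray(x), np.asarray(y)
    # Quantile-based bin edges give each conditioning bin ~equal sample size.
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_cond + 1))
    ks_distances = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x <= hi)
        if mask.sum() > 1:
            # KS distance between conditional and unconditional output CDFs.
            ks_distances.append(ks_2samp(y[mask], y).statistic)
    return summary(ks_distances)
```

An influential parameter yields a large index (conditioning on it reshapes the output distribution), while a non-influential one yields an index near zero, so ranking parameters by this index screens for importance without assuming anything about output moments.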
Keywords: template, PAWN, Python, sensitivity analysis, Jupyter notebook, PRMS
This resource is shared under the Creative Commons Attribution (CC BY) license: http://creativecommons.org/licenses/by/4.0/
Author: John Volk, University of Nevada, Reno