Reflective error: a metric for assessing predictive performance at extreme events
When using machine learning to model environmental systems, it is often a model’s ability to predict extreme behaviors that yields the highest practical value to policy makers. However, most existing error metrics used to evaluate the performance of environmental machine learning models weight error equally across the test data. Thus, routine performance is prioritized over a model’s ability to robustly quantify extreme behaviors. In this work, we present a new error metric, termed Reflective Error, which quantifies the degree to which a model’s error is distributed around its extremes, in contrast to existing model evaluation methods that aggregate error over all events. The suitability of our proposed metric is demonstrated on a real-world hydrological modeling problem, where extreme values are of particular concern.
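The abstract does not give the exact definition of Reflective Error, but the contrast it draws can be illustrated with a simple sketch: an aggregate metric such as RMSE over all test points versus the same metric restricted to the most extreme observed events. The `extreme_rmse` function and the synthetic data below are hypothetical illustrations, not the paper's actual metric.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error aggregated over all events."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def extreme_rmse(y_true, y_pred, q=0.9):
    """Hypothetical illustration: RMSE restricted to the top
    (1 - q) quantile of observed values, i.e. the extreme events."""
    thresh = np.quantile(y_true, q)
    mask = y_true >= thresh
    return rmse(y_true[mask], y_pred[mask])

# Synthetic example: a model that fits routine values well but
# systematically underestimates the largest (extreme) events.
rng = np.random.default_rng(0)
y_true = rng.gamma(shape=2.0, scale=10.0, size=1000)
thresh = np.quantile(y_true, 0.9)
y_pred = np.where(
    y_true >= thresh,
    0.8 * y_true,                              # 20% underestimate at extremes
    y_true + rng.normal(0.0, 1.0, size=1000),  # small noise on routine events
)

print(f"aggregate RMSE: {rmse(y_true, y_pred):.2f}")
print(f"extreme-only RMSE: {extreme_rmse(y_true, y_pred):.2f}")
```

On this synthetic data the aggregate RMSE looks modest because routine events dominate the average, while the extreme-only RMSE exposes the systematic failure at large events — the gap the abstract argues standard metrics hide.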
Details
Publication status:
Published
Author(s):
Rouse, Robert Edwin; Moss, Henry; Hosking, Scott; McRobie, Allan; Shuckburgh, Emily