Title: Distributed Sensitivity Analysis of Flood Inundation Model Calibration
Authors: HALL Jim; TARANTOLA Stefano
Citation: JOURNAL OF HYDRAULIC ENGINEERING-ASCE, vol. 131, no. 2, pp. 117-126
Publisher: ASCE-AMER SOC CIVIL ENGINEERS
Type: Articles in Journals
Abstract: Uncertainties in hydrodynamic model calibration and boundary conditions can have a significant influence on flood inundation predictions. Uncertainty analysis involves quantifying these uncertainties and propagating them through to inundation predictions. In this paper the inverse problem of sensitivity analysis is tackled, in order to diagnose the influence that model input variables, individually and in combination, have on the uncertainty in the inundation model prediction. Variance-based global sensitivity analysis is applied to simulation of a flood on a reach of the River Thames (United Kingdom) for which a synthetic aperture radar image of the extent of flooding was available for model validation. The sensitivity analysis using the method of Sobol' quantifies the significant influence of variance in the Manning channel roughness coefficient in raster-based flood inundation model predictions of flood outline and flood depth. The spatial influence of the Manning channel roughness coefficient is analyzed by dividing the channel into subreaches and calculating variance-based sensitivity indices for each subreach. Replicated Latin hypercube sampling is used for sensitivity analysis with correlated input variables. The methodology identifies the subreaches of channel that have the most influence on variance in the model predictions, demonstrating how far boundary effects propagate into the model and indicating where further data acquisition and nested higher-resolution model studies should be targeted.
JRC Institute: Institute for the Protection and Security of the Citizen
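The variance-based approach named in the abstract decomposes the variance of a model output into contributions from each input, with the first-order Sobol' index S_i = V_i / V(Y) measuring the share of output variance explained by input i alone. The following is a minimal sketch of that idea, not the paper's code: it uses a generic pick-freeze (Saltelli-style) Monte Carlo estimator on a hypothetical additive toy function, in which the first input dominates, loosely mimicking the strong influence the paper reports for the Manning channel roughness coefficient. The function `toy_model` and all parameter choices are illustrative assumptions.

```python
import random

def sobol_first_order(model, dim, n=20000, seed=0):
    """Estimate first-order Sobol' indices S_i = V_i / V(Y) with a
    pick-freeze Monte Carlo estimator: two independent sample matrices
    A and B of uniform [0, 1] inputs, plus one mixed matrix per input."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [model(row) for row in A]
    yB = [model(row) for row in B]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n  # total variance V(Y)
    indices = []
    for i in range(dim):
        # AB_i: matrix A with its i-th column replaced by B's i-th column
        yABi = [model([B[k][j] if j == i else A[k][j] for j in range(dim)])
                for k in range(n)]
        # Saltelli-style estimator of V_i = Var_{x_i}( E[Y | x_i] )
        Vi = sum(yb * (yab - ya) for yb, yab, ya in zip(yB, yABi, yA)) / n
        indices.append(Vi / var)
    return indices

def toy_model(x):
    # Hypothetical stand-in for an inundation model output: the first
    # input carries most of the variance (analytic S = a_i^2 / sum a_j^2).
    return 4.0 * x[0] + 1.0 * x[1] + 0.5 * x[2]

S = sobol_first_order(toy_model, dim=3)
```

For this additive toy function the indices are known analytically (S_1 ≈ 0.93, S_2 ≈ 0.06, S_3 ≈ 0.01, summing to 1), so the estimator can be checked directly. The paper's actual analysis applies such indices per subreach of the channel, and uses replicated Latin hypercube sampling rather than plain random sampling to handle correlated inputs.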