Methodologieën voor de reductie van de onzekerheid op de resultaten van rivierwaterkwaliteitsmodellen = Methodologies for reduction of output uncertainty of river water quality models
Vandenberghe, V. (2008). Methodologieën voor de reductie van de onzekerheid op de resultaten van rivierwaterkwaliteitsmodellen = Methodologies for reduction of output uncertainty of river water quality models. PhD Thesis. Universiteit Gent: Gent. ISBN 978-90-5989-221-7. 267 pp.
European legislation, in particular the Water Framework Directive (WFD), states that all water bodies need to reach a good ecological status, close to pristine conditions. Pristine conditions are those of a natural, healthy water body, as it was before any human influence. In the Clean Water Act (CWA), the principal governing law for water in the United States, the objective is the restoration and maintenance of the chemical, physical and biological integrity of the nation's water. To reach the goals of the WFD or the CWA, many measures are still needed to improve the water quality.
To evaluate the present situation and to predict the effects of measures taken to improve the river water quality, models are used. For several reasons, the uncertainty on the model results is sometimes very high. First, there is the problem of poor-quality data used as input or for the calibration of model parameters. Second, the models used nowadays are more complex and as such contain more parameters. Because of correlation and parameter dependencies, it is not possible to estimate all parameters. Hence, some parameters need to be fixed while only the most important parameters are changed during the calibration. As a consequence, the uncertainty on the model predictions becomes even larger when the wrong subset of parameters is fixed, or when a subset of the model parameters is set to wrong values taken from literature.
It is the aim of this dissertation to promote good modelling practice and to provide, in a systematic way, methods that help the water manager or engineer to minimise the uncertainty on the model results. The methods developed and applied in this work are simple and straightforward, relying on easy-to-use software or on software that can easily be developed by the user. The methods and tools were applied to real case studies on either the river Dender or the river Nete, both in Belgium.
Reduction of the output uncertainty of a model can be achieved during many stages of the modelling process. In this work, for every modelling step, important issues related to model reliability were addressed by discussing and applying methods and tools that help to analyse the behaviour of the model and to take actions that reduce output uncertainty.
The first step in the modelling process is the model study plan and the selection of the most adequate model for the problem under consideration. It is not only necessary to choose a model that describes the current state of the river well; the model must also be capable of evaluating the probable changes in the system when it is used for scenario analysis. With a sensitivity analysis, two different water quality model concepts, QUAL2E and RWQM1, were evaluated with regard to their use in management decisions. It was shown that, for the case study on the river Dender, the DO results of QUAL2E-based water quality models mainly relate to the algae processes, whereas RWQM1 also takes sedimentation into account and stresses processes performed by different microbial communities. This study shows that managers should be aware of the possibilities and limitations of the model they use and should choose a model that fits their problem and expectations. Also, knowing which processes will become important after execution of a scenario means that extra attention can be paid to those processes during model set-up, in order to obtain more reliable results.
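As a minimal sketch of the kind of sensitivity analysis meant here (not taken from the thesis), the snippet below computes local relative sensitivities of a simple Streeter-Phelps dissolved-oxygen deficit model to its parameters by central finite differences. The model, parameter names and values are hypothetical stand-ins for the full QUAL2E/RWQM1 comparison.

```python
# Illustrative sketch (not from the thesis): local relative sensitivity analysis
# of a toy Streeter-Phelps DO-deficit model; all values are hypothetical.
import numpy as np

def do_deficit(t, kd, ka, L0, D0):
    """DO deficit [mg/l] along travel time t [d] (Streeter-Phelps)."""
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) + D0 * np.exp(-ka * t)

t = np.linspace(0.0, 5.0, 101)                     # travel time [d]
nominal = {"kd": 0.35, "ka": 0.70, "L0": 10.0, "D0": 1.0}

# Relative sensitivity S_p = (p / y) * dy/dp, approximated by central differences.
for name, value in nominal.items():
    pert = 0.01 * value
    up, down = dict(nominal), dict(nominal)
    up[name] += pert
    down[name] -= pert
    dy_dp = (do_deficit(t, **up) - do_deficit(t, **down)) / (2.0 * pert)
    y = do_deficit(t, **nominal)
    rel_sens = value * dy_dp / np.where(np.abs(y) > 1e-9, y, np.nan)
    print(f"{name}: mean |relative sensitivity| = {np.nanmean(np.abs(rel_sens)):.2f}")
```

Ranking parameters by such sensitivity measures indicates which processes dominate the DO output of a given model concept, which is the basis for comparing model structures with respect to a management question.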
Sometimes one needs to build a model of a basin with few or no available data. In such a situation it is difficult to decide which processes are important and need to be included in the model. The only data that are easily obtained are data that can be gathered by direct observation, for instance whether the region is hilly, whether algal blooms occur in summer, or whether summer temperatures are high. In this work a study for model application in ungauged basins was performed, in which the most important parameters are identified for different circumstances by using, in effect, a sensitivity analysis of a sensitivity analysis. It is concluded that the model shows different sensitivities to the parameters under different external circumstances. A table could be established in which external circumstances, here called soft data, i.e. data that are easily collected, are related to the importance of the parameters. This table gives a first indication of which parameters and processes one should focus on in a particular catchment characterised by the soft data. Knowing the most influential set of parameters is important for model calibration, optimal experimental design, uncertainty estimation and scenario analysis, where other processes can become important compared to the base case.
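The following sketch (not from the thesis) illustrates how such a soft-data table can be built: parameter sensitivities of a toy minimum-DO response are ranked under two hypothetical sets of external circumstances (water temperature, presence of an algal bloom). The model, parameters and circumstance values are illustrative assumptions only.

```python
# Illustrative sketch (not from the thesis): ranking parameter importance under
# different external circumstances ("soft data"); all values are hypothetical.
import numpy as np

def min_do(params, temp, algae):
    """Toy minimum-DO response combining degradation, reaeration and algae."""
    kd, ka, kalg = params["kd"], params["ka"], params["kalg"]
    theta = 1.07 ** (temp - 20.0)                  # temperature correction of kd
    do_sat = 14.6 - 0.4 * temp                     # rough DO saturation [mg/l]
    deficit = 8.0 * kd * theta / (ka + kd * theta)
    algae_effect = kalg * (2.0 if algae else 0.2)
    return do_sat - deficit - algae_effect

nominal = {"kd": 0.35, "ka": 0.70, "kalg": 0.5}
circumstances = {"cool, no bloom": (12.0, False), "warm, bloom": (24.0, True)}

for label, (temp, algae) in circumstances.items():
    sens = {}
    for name, value in nominal.items():
        up, down = dict(nominal), dict(nominal)
        up[name] = value * 1.01
        down[name] = value * 0.99
        dy = min_do(up, temp, algae) - min_do(down, temp, algae)
        sens[name] = abs(value * dy / (0.02 * value) / min_do(nominal, temp, algae))
    ranking = sorted(sens, key=sens.get, reverse=True)
    print(label, "-> parameter ranking:", ranking)
```

The resulting rankings per circumstance are exactly the rows of the soft-data table described above: they differ between circumstances, which shows where calibration effort should be concentrated in a given catchment.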
The second step involves data gathering and measurement campaigns. It is important to know which input data are needed and which kind of measurement data (frequency, location, amount, ...) for calibration is best suited to minimise the uncertainty of the output results. A method of iterative optimal experimental design (OED) was proposed to minimise the uncertainty of parameter estimates during calibration. It is shown that OED methods can be used for an iterative, sequential design of a strategy for measuring water quality variables in a river, in view of the calibration of water quality models. In a first stage a relatively extensive set of measurements is needed to set up a model for the river. Using this initial model, the OED method enables the definition of efficient measurement strategies to find better model parameter estimates and to reduce the uncertainty of those estimates. For the collection of input data, an uncertainty analysis was performed to guide measurement campaigns. Parameters, diffuse pollution inputs and point pollution inputs were considered separately, providing information on the model sensitivity towards these three sources of uncertainty. This study showed the important periods and locations for measurements.
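A common way to operationalise OED for parameter estimation is to select the measurement layout that maximises a scalar criterion of the Fisher information matrix, for example its determinant (D-optimality). The sketch below (not from the thesis) selects four sampling times for a toy two-parameter DO-deficit model; the model, candidate times and parameter values are assumptions for illustration only.

```python
# Illustrative sketch (not from the thesis): D-optimal selection of sampling times
# from the Fisher information matrix of a toy two-parameter model.
import numpy as np
from itertools import combinations

def model(t, kd, ka):
    """BOD-driven DO deficit [mg/l] for a fixed initial load (Streeter-Phelps core)."""
    L0 = 10.0
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t))

def jacobian(t, kd=0.35, ka=0.70, eps=1e-4):
    """Output sensitivities dy/dkd and dy/dka at measurement times t (finite differences)."""
    base = model(t, kd, ka)
    return np.column_stack([(model(t, kd + eps, ka) - base) / eps,
                            (model(t, kd, ka + eps) - base) / eps])

candidates = np.linspace(0.25, 5.0, 20)            # candidate sampling times [d]
best = max(combinations(candidates, 4),
           key=lambda ts: np.linalg.det(jacobian(np.array(ts)).T @ jacobian(np.array(ts))))
print("D-optimal sampling times [d]:", np.round(best, 2))
```

In an iterative, sequential design the model is recalibrated after each campaign and the criterion is re-evaluated with the updated parameter estimates before planning the next set of measurements.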
The third step comprises the set-up of the model. Also in this step of the modelling process, different actions exist to assure minimal output uncertainty: thorough checks of the input files, test runs and checks of the mass balances. In this work, no additional research related to this step in the modelling process was performed.
A calibration and validation of the model is performed in the fourth step of the modelling process. Two problems arise during the calibration. The first one is that not all parameters can be estimated, because of correlation and dependencies. In this work a sensitivity analysis was applied to identify the most important parameters related to a modelling problem. The sensitivity analysis revealed that only around 10 parameters need to be changed during calibration to obtain good fits between simulated and measured values for the periods with dissolved oxygen concentrations below a critical value. The second problem is that of fixing the wrong parameter subsets to literature values. A practical example shows the consequences of using the wrong parameter subset for the calibration of the model. It was demonstrated that calibrating with different subsets of parameters gives very different model predictions and can lead to different conclusions.
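A minimal sketch of such a subset calibration is given below (not from the thesis): only the sensitive parameters kd and ka are estimated by least squares, while the remaining parameters are fixed at assumed literature values. The model, the synthetic "measurements" and all numbers are hypothetical.

```python
# Illustrative sketch (not from the thesis): calibrating only a small, sensitive
# parameter subset while fixing the rest at assumed literature values.
import numpy as np
from scipy.optimize import least_squares

def do_model(t, kd, ka, L0, D0):
    """Streeter-Phelps DO deficit used as a stand-in for the river model."""
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) + D0 * np.exp(-ka * t)

t_obs = np.linspace(0.2, 4.0, 12)
rng = np.random.default_rng(1)
y_obs = do_model(t_obs, kd=0.40, ka=0.75, L0=10.0, D0=1.0) + rng.normal(0.0, 0.05, t_obs.size)

FIXED = {"L0": 10.0, "D0": 1.0}                    # subset fixed at "literature" values

def residuals(theta):
    kd, ka = theta                                 # only the sensitive subset is estimated
    return do_model(t_obs, kd, ka, **FIXED) - y_obs

fit = least_squares(residuals, x0=[0.3, 0.6], bounds=([0.05, 0.1], [1.0, 2.0]))
print("estimated kd, ka:", np.round(fit.x, 3))
```

Repeating such a fit with a different choice of fixed versus estimated parameters illustrates how strongly the resulting predictions, and hence the conclusions, can depend on the selected subset.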
In the last step, a simulation and evaluation of the model results is performed. Once the model is calibrated and validated, it can be used for scenario analysis and the comparison of different scenarios. In addition to the simulation results, uncertainty calculations are needed, because the uncertainty on the results can be too high to find a significant difference between the results of two scenarios. Two practical examples are presented: the evaluation of the cost-effectiveness of in-stream aeration for the river Dender and the assessment of the effect of shading along the river Nete. The uncertainty analysis on the results for the Dender, for the scenario 'with aeration', shows that it is possible for the dissolved oxygen content of the river to drop below a critical level even when the designed aeration system is installed, albeit for very short periods. The uncertainty analysis also shows that the two options, 'with' and 'without' aeration, are significantly different. For the Nete case study on shading, it could be concluded that shading can effectively influence the water quality of a surface water body, in particular in streams that suffer from excessive algal growth during the summer periods, because shading reduces algal growth by up to 20%. In this case study, however, no significant positive effects of shading on the minimum, maximum and average concentrations of DO, COD, phosphates, ammonium and nitrates in the water were identified.
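The sketch below (not from the thesis) shows the principle of such a scenario comparison under uncertainty: parameter uncertainty is propagated by Monte Carlo simulation for a 'with' and a 'without aeration' case, and the scenarios are compared on the probability of violating a critical DO level. The toy model, the parameter distributions, the aeration effect and the criterion value are all assumptions for illustration.

```python
# Illustrative sketch (not from the thesis): Monte Carlo uncertainty propagation
# to compare two scenarios ("with" vs "without" in-stream aeration).
import numpy as np

def min_do(kd, ka, aeration):
    """Toy minimum DO [mg/l]; aeration is represented as an added reaeration rate."""
    ka_eff = ka + (0.8 if aeration else 0.0)
    do_sat, load = 9.0, 8.0
    return do_sat - load * kd / (ka_eff + kd)

rng = np.random.default_rng(42)
n = 5000
kd = rng.lognormal(mean=np.log(0.35), sigma=0.2, size=n)   # uncertain degradation rate
ka = rng.lognormal(mean=np.log(0.70), sigma=0.2, size=n)   # uncertain reaeration rate

without = min_do(kd, ka, aeration=False)
with_aer = min_do(kd, ka, aeration=True)

critical = 5.0                                             # hypothetical DO criterion [mg/l]
print("P(min DO < critical), without aeration:", np.mean(without < critical))
print("P(min DO < critical), with aeration   :", np.mean(with_aer < critical))
print("95% interval of the improvement [mg/l]:",
      np.round(np.percentile(with_aer - without, [2.5, 97.5]), 2))
```

If the interval of the difference between the two scenarios excludes zero, the scenarios can be called significantly different despite the uncertainty; if it does not, the uncertainty is too high to support a preference.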
The overall conclusion of this PhD study is that by applying different methodologies to the models, the uncertainty on the results can be reduced. In addition, a number of suggestions for better measurement campaigns were formulated. Measurement campaigns that aim to calibrate the model better for the low DO concentrations in the river should preferably be organised in spring. When calibrating the model for the river Dender for point pollution, measurements during dry periods are needed. For calibration of the model to study diffuse pollution, measurements during periods with rainfall and high flows are needed.