1 |
Statistical techniques for regional flood-frequency analysis
Wiltshire, S. E. January 1987
No description available.
|
2 |
Application of Monte Carlo Simulation Technique with URBS Runoff-Routing Model for design flood estimation in large catchments
Charalambous, James, University of Western Sydney, College of Science, Technology and Environment, School of Engineering and Industrial Design, January 2004
In recent years, there has been significant research on holistic approaches to design flood estimation in Australia. The Monte Carlo Simulation Technique, an approximate form of the Joint Probability Approach, has been developed and tested on small gauged catchments. This thesis presents the extension of the Monte Carlo Simulation Technique to large catchments using the URBS runoff-routing model. The URBS-Monte Carlo Technique (UMCT) has been applied to the Johnstone River and Upper Mary River catchments in Queensland. The thesis shows that the UMCT can be applied to large catchments and be readily used by hydrologists and floodplain managers. Further, the proposed technique provides deeper insight into the hydrologic behaviour of large catchments and allows assessment of the effects of errors in input variables on design flood estimates. The research also highlights the problems and potential of the UMCT for application in practical situations. / Masters of Engineering (Hons.)
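The core of the joint probability idea, sampling the flood-producing inputs from distributions and simulating many events rather than fixing single design values, can be sketched as follows. This is a minimal sketch only: a lumped depth-to-peak conversion stands in for the URBS runoff-routing model, and the distributions, catchment area and event rate are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                      # number of synthetic storm events
area_km2 = 1_300.0               # illustrative catchment area

# Sample the input "hazards" jointly rather than fixing them at single
# design values (the essence of the joint probability approach).
depth_mm = rng.gamma(shape=2.0, scale=30.0, size=n)       # storm rainfall depth
duration_h = rng.uniform(6.0, 72.0, size=n)               # storm duration
initial_loss_mm = rng.exponential(scale=15.0, size=n)     # random initial loss

# Toy transfer function standing in for the URBS runoff-routing model:
# rainfall excess spread over the storm duration gives a peak discharge.
excess_mm = np.maximum(depth_mm - initial_loss_mm, 0.0)
peak_m3s = excess_mm / 1000.0 * area_km2 * 1e6 / (duration_h * 3600.0)

# Empirical flood frequency curve from the simulated peaks: with ~1.6 storm
# events per year, convert annual AEP to a per-event exceedance probability.
events_per_year = 1.6
for T in (2, 10, 50, 100):
    aep = 1.0 / T
    p_event = 1.0 - (1.0 - aep) ** (1.0 / events_per_year)
    q_T = np.quantile(peak_m3s, 1.0 - p_event)
    print(f"{T:>4}-year design flood ~ {q_T:8.1f} m³/s")
```

In the full UMCT, each sampled event would instead be routed through the calibrated URBS model, with rainfall intensity, temporal pattern and losses each carrying their own distributions.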
|
3 |
Design Flood Criteria toward Integrated Watershed Management in the Johor River Watershed, Malaysia
Yazawa, Taishi 23 March 2017
Kyoto University / 0048 / New-system doctoral programme / Doctor of Engineering / Degree No. Kō 20352 / Eng. Doc. No. 4289 / 新制||工||1664 (University Library) / Department of Urban and Environmental Engineering, Graduate School of Engineering, Kyoto University / (Examiners) Prof. Yoshihisa Shimizu, Prof. Minoru Yoneda, Assoc. Prof. Sunmin Kim / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Philosophy (Engineering) / Kyoto University / DFAM
|
4 |
Evaluation of the SDF method using a customised design flood estimation tool
Gericke, Ockert Jacobus 12 1900
Thesis (MScEng (Civil Engineering))--University of Stellenbosch, 2010. / ENGLISH ABSTRACT: The primary aim of this study was to evaluate, calibrate and verify the SDF run-off
coefficients at a quaternary catchment level in the C5 secondary drainage region
(SDF basin 9) and other selected SDF basins in South Africa by establishing the
catchment parameters and SDF/probability distribution-ratios. The probability
distribution-ratios were based on the comparison between the flood peaks
estimated by the SDF method and statistical analyses of observed flow data.
These quaternary run-off coefficients were then compared with the existing
regional SDF run-off coefficients, whilst the run-off coefficient adjustment factors
as proposed by Van Bladeren (2005) were also evaluated.
It was evident from this study that the calibrated run-off coefficients obtained were
spread around those of Alexander (2003), but were generally lower in magnitude.
The adjusted run-off coefficients (Van Bladeren, 2005) had a tendency to
decrease in magnitude with increasing recurrence interval, whilst some of the
adjusted run-off coefficients exceeded unity.
The extent to which the original SDF method overestimated the magnitude and
frequency of flood peaks varied from basin to basin, with the SDF/probability
distribution-ratios the highest in the Highveld and southern coastal regions with
summer convective precipitation. In these regions the flood peak-ratios
occasionally differed by a factor of up to 3 or even more. The southern coastal
regions with winter orographic/frontal precipitation demonstrated the best flood
peak-ratios, varying from 0.78 to 1.63.
The adjusted SDF method results (Van Bladeren, 2005) were only better in 26%
of all the basins under consideration when compared to those estimated by the
original SDF method. On average, the adjusted SDF/probability distribution-ratios
varied between 0.30 and 6.58, which is unacceptable.
The calibrated version of the SDF method proved to be the most accurate in all
the basins under consideration. On average, the calibrated SDF/probability distribution-ratios varied between 0.85
and 1.15, whilst at some basins and individual return periods, less accurate
results were evident.
Verification tests were conducted in catchments not considered during the
calibration process with a view to establishing whether the calibrated run-off
coefficients are predictable and to confirm that the method is reliable. The
verification results showed that the calibrated/verified SDF method is the most
accurate and similar trends were evident in all the basins under consideration. On
average, the verified SDF/probability distribution-ratios varied between 0.82 and
1.19, except in SDF basins 6 and 21 where the 5 to 20-year return period flood
peaks were overestimated by 41% and 56% respectively, which is still
conservative.
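The SDF/probability distribution-ratio comparison at a single site can be sketched as follows, assuming the rational-formula form Q_T = C_T * I_T * A / 3.6 for the SDF peak and using a log-normal fit to a synthetic record in place of the study's statistical analyses of observed flows; the coefficient, intensity and record are all illustrative, not calibrated values.

```python
import numpy as np
from scipy import stats

# Illustrative SDF-style peak: rational formula Q_T = C_T * I_T * A / 3.6
# (Q in m³/s, I in mm/h, A in km²). The values below are placeholders,
# not published SDF basin coefficients.
def sdf_peak(c_t: float, intensity_mm_h: float, area_km2: float) -> float:
    return c_t * intensity_mm_h * area_km2 / 3.6

# Statistical benchmark: a quantile from a log-normal fit to observed
# annual maximum floods (standing in for the study's distribution fits).
rng = np.random.default_rng(1)
annual_max = rng.lognormal(mean=4.0, sigma=0.6, size=45)   # synthetic record
T = 50
log_q = np.log(annual_max)
q_stat = float(np.exp(stats.norm.ppf(1 - 1 / T, loc=log_q.mean(),
                                     scale=log_q.std(ddof=1))))

q_sdf = sdf_peak(c_t=0.45, intensity_mm_h=25.0, area_km2=180.0)
print(f"SDF estimate: {q_sdf:.0f} m³/s, statistical: {q_stat:.0f} m³/s, "
      f"SDF/probability distribution-ratio: {q_sdf / q_stat:.2f}")
```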
The secondary aim of this study was to develop a customised, user-friendly
Design Flood Estimation Tool (DFET) in a Microsoft Office Excel/Visual Basic
for Applications environment in order to assess the use and applicability of the
various design flood estimation methods.
The developed DFET will provide designers with a software tool for the rapid
investigation and evaluation of alternative design flood estimation methods either
at a regional or site-specific scale. The focus user group of the application will comprise engineering technicians, engineering technologists and engineers employed at civil engineering consultancies, who are not necessarily specialists in the field of flood hydrology. The DFET processes all the catchment, meteorological (precipitation) and hydrological (observed flow) data used as input for the various
design flood estimation methods.
|
5 |
Evaluation of the Catchment Parameter (CAPA) and Midgley and Pitman (MIPI) empirical design flood estimation methods
Smal, Ruan 12 1900
Thesis (MScEng)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: The devastating effects floods have on both social and economic levels make effective flood risk management an essential part of rural and urban development. A major part of effective flood risk management is the application of reliable design flood estimation methods. Research over the years has illustrated that current design flood estimation methods, as a norm, show large discrepancies, which can mainly be attributed to the fact that these methods are outdated (Smithers, 2007).
The research presented focused on the evaluation and updating of the Midgley and Pitman (MIPI) and the Catchment Parameter (CAPA or McPherson) empirical design flood estimation methods. The evaluation was done by means of comparing design floods estimated by each method with more reliable probabilistic design floods derived from historical flow records.
Flow gauging stations were selected as drainage data points based on the availability of flow data and available catchment characteristics. A selection criterion was developed resulting in 53 gauging stations. The Log Normal (LN) and Log Pearson Type III (LP III) distributions were used to derive the probabilistic floods for each gauging station.
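The probabilistic benchmark floods can be derived along these lines: a minimal sketch fitting LN and LP III to log10-transformed annual maxima by the method of moments, with a synthetic record standing in for a gauging station.

```python
import numpy as np
from scipy import stats

def ln_quantile(ams: np.ndarray, T: float) -> float:
    """Log-Normal design flood: fit a normal to log10-flows, back-transform."""
    logq = np.log10(ams)
    return 10 ** stats.norm.ppf(1 - 1 / T, loc=logq.mean(),
                                scale=logq.std(ddof=1))

def lp3_quantile(ams: np.ndarray, T: float) -> float:
    """Log-Pearson Type III via method of moments on log10-flows."""
    logq = np.log10(ams)
    g = stats.skew(logq, bias=False)        # sample skew of the log-flows
    return 10 ** stats.pearson3.ppf(1 - 1 / T, g,
                                    loc=logq.mean(), scale=logq.std(ddof=1))

# Synthetic 40-year annual maximum series standing in for a gauged record.
rng = np.random.default_rng(7)
ams = rng.lognormal(mean=5.0, sigma=0.5, size=40)
for T in (10, 50, 100):
    print(f"T={T:>3}: LN {ln_quantile(ams, T):7.0f} m³/s, "
          f"LP III {lp3_quantile(ams, T):7.0f} m³/s")
```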
The flow gauging stations were used to delineate catchments and to quantify catchment characteristics using Geographic Information Systems (GIS) software and their associated applications.
The two methods were approximated by means of derived formulae instead of being evaluated and updated from first principles. This was done as a result of the constraints of time and of access to the relevant literature. The formulae were derived by plotting method inputs and results in graphs, fitting a trendline through the points and deriving a formula that best describes the trendline. The derived formulae and the catchment characteristics were used to estimate the design floods for each method. A comparison was then done between the design flood results of the two methods and the probabilistic design floods. The results of these comparisons were used to derive correction factors which could potentially increase the reliability of the two methods used to estimate design floods.
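The trendline derivation described above amounts to fitting a power law in log-log space. Below is a minimal sketch with hypothetical digitized points, not values from the MIPI or CAPA charts.

```python
import numpy as np

# Hypothetical points read off a design-flood nomograph:
# catchment area (km²) vs. the peak given by the curve (m³/s).
area = np.array([10, 25, 60, 150, 400, 1000], dtype=float)
peak = np.array([32, 61, 110, 205, 410, 760], dtype=float)

# Fit a power-law trendline Q = a * A**b by ordinary least squares
# in log-log space, mirroring the curve-fitting described above.
b, log_a = np.polyfit(np.log(area), np.log(peak), deg=1)
a = np.exp(log_a)
print(f"Derived formula: Q ~ {a:.2f} * A^{b:.3f}")

# The fitted formula can then replace the graphical lookup:
print(f"Q(300 km²) ~ {a * 300 ** b:.0f} m³/s")
```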
The effectiveness of any updating would be the degree (or level) to which the reliability of a method could be increased. It was proven that the correction factors did decrease the difference between the assumed, more reliable probabilistic design floods and the methods' estimates.
However, the increase in reliability of the methods through the use of the recommended correction factors is questionable due to factors such as the reliability of the flow data as well as the methods which had to be used to derive the correction factors.
|
6 |
Modelling Losses in Flood Estimation
Ilahee, Mahbub January 2005
Flood estimation is often required in hydrologic design and has important economic significance. For example, in Australia, the annual spending on infrastructure requiring flood estimation is of the order of $650 million (ARR; I.E. Aust., 1998). Rainfall-based flood estimation techniques are most commonly adopted in practice. These require several inputs to convert design rainfalls to design floods. Of all the inputs, loss is an important one, defined as the amount of precipitation that does not appear as direct runoff. The concept of loss includes moisture intercepted by vegetation, infiltration into the soil, retention on the surface, evaporation and loss through the streambed and banks. As these loss components are dependent on topography, soils, vegetation and climate, loss exhibits a high degree of temporal and spatial variability during the rainfall event. In design flood estimation, simplified lumped conceptual loss models are used because of their simplicity and ability to approximate catchment runoff behaviour. In Australia, the most commonly adopted conceptual loss model is the initial loss-continuing loss (IL-CL) model. For a specific part of the catchment, the initial loss occurs prior to the commencement of surface runoff, and can be considered to be composed of the interception loss, depression storage and infiltration that occur before the soil surface saturates. ARR (I.E. Aust., 1998) defines the continuing loss as the average rate of loss throughout the remainder of the storm. At present, there is inadequate information on design losses in most parts of Australia, and this is one of the greatest weaknesses in Australian flood hydrology. Currently recommended design losses are not compatible with design rainfall information in Australian Rainfall and Runoff. Design losses for observed storms also show wide variability, and it is always difficult to select an appropriate loss value from this wide range for a particular application. Despite this wide variability, the widely used Design Event Approach adopts a single value of initial and continuing loss. Because of the non-linearity in the rainfall-runoff process, this is likely to introduce a high degree of uncertainty and possible bias in the resulting flood estimates. In contrast, the Joint Probability Approach can consider probability-distributed losses in flood estimation. ARR (I.E. Aust., 1998) recommends a constant continuing loss value within rainfall events. In this research it was observed that the continuing loss values in the rainfall events were not constant; rather, they decayed with the duration of the rainfall event. The loss values derived from the 969 rainfall and streamflow events of Queensland catchments would provide better flood estimation than the design loss values recommended in ARR (I.E. Aust., 1998). In this research, both the initial and continuing losses were computed using the IL-CL loss model, and a single median loss value was used to estimate floods with the Design Event Approach. Both the initial and continuing losses were also treated as random variables and their probability distribution functions were determined. The research thus showed that probability-distributed loss values can be used for Queensland catchments in the near future for better flood estimates. The research hypothesis tested was whether the new loss values for Queensland catchments provide a significant improvement in design flood estimation.
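A minimal sketch of the IL-CL loss model described above; the storm hyetograph and the IL and CL values are invented for illustration and are not ARR design values.

```python
import numpy as np

def rainfall_excess(hyetograph_mm: np.ndarray, il_mm: float, cl_mm_h: float,
                    dt_h: float = 1.0) -> np.ndarray:
    """Apply an initial loss - continuing loss (IL-CL) model to a storm.

    The first `il_mm` of rain is absorbed entirely (initial loss);
    afterwards a constant rate `cl_mm_h` is subtracted from each step
    (continuing loss).
    """
    excess = np.zeros_like(hyetograph_mm)
    remaining_il = il_mm
    for i, p in enumerate(hyetograph_mm):
        absorbed = min(p, remaining_il)          # satisfy initial loss first
        remaining_il -= absorbed
        after_il = p - absorbed
        excess[i] = max(after_il - cl_mm_h * dt_h, 0.0)
    return excess

storm = np.array([2.0, 8.0, 15.0, 22.0, 12.0, 5.0, 1.0])  # mm per hour
ex = rainfall_excess(storm, il_mm=18.0, cl_mm_h=2.5)
print("excess (mm):", np.round(ex, 1), "| runoff depth:", round(ex.sum(), 1))
```

Treating `il_mm` and `cl_mm_h` as draws from fitted probability distributions, instead of fixed medians, gives the probability-distributed losses discussed above.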
A total of 48 catchments, 82 pluviograph stations and 24 daily rainfall stations were selected from all over Queensland to test the research hypothesis. The research improved the recommended design loss values, which will result in more precise design flood estimates. This will ultimately save millions of dollars in the construction of hydraulic infrastructure.
|
7 |
Catchment- and event-type specific synthetic design hydrographs. Estimation, characterization, regionalization, and uncertainty
Brunner, Manuela 29 January 2018
Design flood estimates are needed in hydraulic design for the construction of dams and retention basins and in flood management for drawing hazard maps or modeling inundation areas. Traditionally, such design floods have been expressed in terms of peak discharge estimated in a univariate flood frequency analysis. However, design or flood management tasks involving storage, in addition to peak discharge, also require information on hydrograph volume, duration, and shape. A bivariate flood frequency analysis allows the joint estimation of peak discharge and hydrograph volume and the consideration of their dependence. While such bivariate design quantiles describe the magnitude of a design flood, they lack information on its shape. An attractive way of modeling the whole shape of a design flood is to express a representative normalized hydrograph shape as a probability density function. The combination of such a probability density function with bivariate design quantiles allows the construction of a synthetic design hydrograph for a certain return period which describes the magnitude of a flood along with its shape. Such synthetic design hydrographs have the potential to be a useful and simple tool in design flood estimation. However, they currently have some limitations. First, they rely on the definition of a bivariate return period which is not uniquely defined. Second, they usually describe the specific behavior of a catchment and do not express process variability represented by different flood types. Third, they are neither available for ungauged catchments nor are they usually provided together with an uncertainty estimate. This thesis therefore explores possibilities for the construction of synthetic design hydrographs in gauged and ungauged catchments and ways of representing process variability in design flood construction. It proposes tools for both catchment- and flood-type specific design hydrograph construction and regionalization and for the assessment of their uncertainty. The thesis shows that synthetic design hydrographs are a flexible tool allowing for the consideration of different flood or event types in design flood estimation. A comparison of different regionalization methods, including spatial, similarity, and proximity based approaches, showed that catchment-specific design hydrographs can be best regionalized to ungauged catchments using linear and nonlinear regression methods. It was further shown that event-type specific design hydrograph sets can be regionalized using a bivariate index flood approach. In such a setting, a functional representation of hydrograph shapes was found to be a useful tool for the delineation of regions with similar flood reactivities. An uncertainty assessment showed that the record length and the choice of the sampling strategy are major uncertainty sources in the construction of synthetic design hydrographs and that this uncertainty propagates through the regionalization process. This thesis highlights that an ensemble-based design flood approach allows for the consideration of different flood types and runoff processes. This is a step from flood frequency statistics to flood frequency hydrology which allows better-informed decision making.
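The construction described above (a normalized shape density combined with bivariate peak/volume quantiles) can be sketched as follows. The Beta(2, 4) shape and the quantile values are assumptions for illustration, not fitted results from the thesis.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Bivariate design quantiles for a chosen return period; the numbers are
# illustrative stand-ins for results of a bivariate peak/volume analysis.
q_peak = 350.0      # design peak discharge (m³/s)
volume = 40e6       # design hydrograph volume (m³)

# Representative normalized hydrograph shape expressed as a probability
# density on [0, 1]; Beta(2, 4) is an assumed stand-in for a fitted shape.
shape = stats.beta(2, 4)
mode = (2 - 1) / (2 + 4 - 2)      # mode of Beta(a, b) is (a-1)/(a+b-2)
s_max = shape.pdf(mode)

# Scale the unit shape so that area = volume and peak = q_peak:
# q(t) = (V / D) * s(t / D); the peak constraint fixes the base duration D.
duration_s = volume * s_max / q_peak
t = np.linspace(0.0, duration_s, 500)
q = volume / duration_s * shape.pdf(t / duration_s)

print(f"base duration: {duration_s / 3600:.0f} h")
print(f"peak ordinate: {q.max():.1f} m³/s (target {q_peak:.0f})")
print(f"volume check:  {trapezoid(q, t) / 1e6:.1f} Mm³ (target {volume / 1e6:.0f})")
```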
|
8 |
Cartography of extreme rainfalls and use of the SCHADEX method for ungauged sites
Penot, David 17 October 2014
Since 2006, at EDF, extreme flood estimations have been computed with the SCHADEX method (Climatic-Hydrological Simulation of extreme floods). This method relies on a MEWP probabilistic model (a seasonal rainfall distribution using a weather pattern concept) and on a stochastic simulation crossing the rainy-event hazard and the catchment saturation hazard. Simulation approaches such as SCHADEX have shown good performance in estimating extreme flood distributions (ANR ExtraFlo project, 2013). However, the use of SCHADEX without data (rain, temperature, runoff) for the catchment under study remains a main issue. This thesis suggests an adaptation of the method to the ungauged context, trying to keep its key points:
- the spatial and probabilistic structure of rainfall conditioned by weather patterns;
- the crossing of rainfall and catchment saturation hazards by stochastic simulation.
This work is limited to a daily time step in order to address the regionalization problem with a maximum of data. The approach is structured around four main points:
- regionalize point daily extreme precipitation and construct maps of rainfall for notable return periods; evaluate the contribution of a weather type classification to the regionalization of extreme rainfall distributions and qualify the SPAZM interpolator for the estimation of extreme rainfall;
- examine the construction of an areal (catchment) rainfall, and in particular the impact of the construction choices on the estimation of extreme precipitation for the catchment;
- develop a regional stochastic simulation method to estimate a distribution of daily runoff that crosses the rainy-event and catchment saturation hazards;
- study the transposition from a daily runoff distribution to a peak flow distribution.
The main contributions of this thesis are:
- taking weather types into account improves the description of the spatial patterns of extreme precipitation;
- the information provided by the SPAZM rainfall interpolator proves valuable for the estimation of extreme rainfall at ungauged sites;
- a sensitivity analysis of the areal rainfall calculation as a function of the number of stations used (comparison of SPAZM and Thiessen areal rainfalls) gives an indication of the estimation bias;
- the SAMPO turning-bands rainfall field generator is used to study the areal reduction factor of extreme precipitation and to implement a correction model for high quantiles of SPAZM areal rainfall;
- a simplified, sparsely parameterized stochastic simulation method analogous to SCHADEX (crossing a rainfall hazard with a catchment saturation hazard) is proposed to produce a distribution of daily flows at ungauged sites;
- finally, preliminary work gives first elements on the transition to the peak flow distribution using a hydrograph generator adapted to the sequence of simulated daily flows.
All these developments and conclusions are detailed and justified in the thesis.
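The MEWP idea, a rainfall distribution conditioned on weather patterns, can be sketched as a mixture of exponential tails weighted by pattern frequencies. All numbers below (patterns, weights, scales, threshold, wet-day count) are invented for illustration.

```python
import numpy as np

# MEWP-style marginal: heavy-rainfall exceedance as a mixture of exponential
# tails, one per weather pattern, weighted by each pattern's frequency.
patterns = {
    "anticyclonic":  {"weight": 0.55, "scale_mm": 8.0},
    "westerly":      {"weight": 0.30, "scale_mm": 14.0},
    "mediterranean": {"weight": 0.15, "scale_mm": 26.0},
}
threshold_mm = 20.0   # rainfall threshold above which the tails are fitted

def exceedance(x_mm: float) -> float:
    """P(daily rainfall > x) for x above the threshold, mixed over patterns."""
    return sum(p["weight"] * np.exp(-(x_mm - threshold_mm) / p["scale_mm"])
               for p in patterns.values())

# Return level by bisection on the exceedance probability, assuming
# ~60 wet days per year contribute to the tail.
wet_days = 60
for T in (10, 100, 1000):
    target = 1 - (1 - 1 / T) ** (1 / wet_days)   # per-day exceedance prob.
    lo, hi = threshold_mm, 1000.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if exceedance(mid) > target else (lo, mid)
    print(f"{T:>5}-year daily rainfall ~ {0.5 * (lo + hi):6.1f} mm")
```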
|
9 |
A New Mathematical Framework for Regional Frequency Analysis of Floods
Basu, Bidroha January 2015
Reliable estimates of design flood quantiles are often necessary at sparsely gauged/ungauged target locations in river basins for various applications in water resources engineering. Development of effective methods for use in this task has been a long-standing challenge in hydrology for over five decades. Hydrologists often consider various regional flood frequency analysis (RFFA) approaches that involve (i) use of a regionalization approach to delineate a homogeneous group of watersheds resembling the watershed of the target location, and (ii) use of a regional frequency analysis (RFA) approach to transfer peak-flow-related information from gauged watersheds in the group to the target location, and considering that information as the basis to estimate flood quantile(s) for the target site. The work presented in the thesis is motivated by the need to address various shortcomings/issues associated with widely used regionalization and RFA approaches.
Regionalization approaches often determine regions by grouping data points in a multidimensional space of attributes depicting a watershed's hydrology, climatology, topography, land-use/land-cover and soils. There are no universally established procedures to identify appropriate attributes, and modelers use subjective procedures to choose a set of attributes that is considered common for the entire study area. This practice may not be meaningful, as different sets of attributes could influence the extreme flow generation mechanism in watersheds located in different parts of the study area. Another issue is that practitioners usually give equal importance (weight) to all the attributes in regionalization, though some attributes could be more important than others in influencing peak flows. To address this issue, a two-stage clustering approach is developed in the thesis. It facilitates identification of appropriate attributes and their associated weights for use in regionalization of watersheds in the context of flood frequency analysis. Effectiveness of the approach is demonstrated through a case study on Indiana watersheds.
Conventional regionalization approaches could prove effective for delineating regions when data points (depicting watersheds) in the watershed-related attribute space can be segregated into disjoint groups using straight lines or linear planes. They prove ineffective when (i) data points are not linearly separable, (ii) the number of attributes and watersheds is large, (iii) there are outliers in the attribute space, and (iv) most watersheds resemble each other in terms of their attributes. In the real world scenario, most watersheds resemble each other, regions may not always be segregated using straight lines or linear planes, and dealing with outliers and high-dimensional data is inevitable in regionalization. To address this, a fuzzy support vector clustering approach is proposed in the thesis, and its effectiveness over the commonly used region-of-influence approach and different cluster analysis based regionalization methods is demonstrated through a case study on Indiana watersheds. For the purpose of regional frequency analysis (RFA), the index-flood approach has been widely used over the past five decades. The conventional index-flood (CIF) approach assumes that values of scale and shape parameters of the frequency distribution are identical across all the sites in a homogeneous region. In the real world scenario, this assumption may not be valid even if a region is statistically homogeneous. Logarithmic index-flood (LIF) and population index-flood (PIF) methodologies were proposed to address the problem, but even those methodologies make unrealistic assumptions. The PIF method assumes that the ratio of scale to location parameters is a constant for all the sites in a region. On the other hand, the LIF method assumes that an appropriate frequency distribution to fit peak flows can be found in log-space, but in reality the distribution of peak flows in log-space may not be close to any of the known theoretical distributions. To address this issue, a new mathematical approach to RFA is proposed in L-moment and LH-moment frameworks that can overcome shortcomings of the CIF approach and its related LIF and PIF methods, which make various assumptions but cannot ensure their validity in RFA. For use with the proposed approach, transformation mechanisms are proposed for five commonly used three-parameter frequency distributions (GLO, GEV, GPA, GNO and PE3) to map the random variable being analyzed from the original space to a dimensionless space where the distribution of the random variable does not change, and deviations of regional estimates of all the distribution's parameters (location, scale, shape) with respect to their population values as well as at-site estimates are minimal. The proposed approach ensures validity of all the assumptions of the CIF approach in the dimensionless space, and this makes it perform better than the CIF approach and the related LIF and PIF methods. Monte-Carlo simulation experiments revealed that the proposed approach is effective even when the form of the regional frequency distribution is mis-specified. A case study on watersheds in the conterminous United States indicated that the proposed approach outperforms methods based on the index-flood approach in the real world scenario.
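For context, the conventional index-flood computation in the L-moment framework (the CIF baseline that the proposed framework improves on) can be sketched as follows, using Hosking's approximate GEV estimators and a synthetic four-site region; the index-flood value and all records are invented.

```python
import numpy as np
from math import gamma, log

def sample_lmoments(x):
    """First three sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2          # mean, L-scale, L-skewness

def gev_from_lmoments(l1, l2, t3):
    """Hosking's approximate GEV parameter estimators from L-moments."""
    c = 2.0 / (3.0 + t3) - log(2) / log(3)
    k = 7.8590 * c + 2.9554 * c * c
    alpha = l2 * k / ((1 - 2.0 ** -k) * gamma(1 + k))
    xi = l1 - alpha * (1 - gamma(1 + k)) / k
    return xi, alpha, k

def gev_quantile(xi, alpha, k, T):
    return xi + alpha / k * (1 - (-log(1 - 1 / T)) ** k)

# Index-flood use: pool scaled records (Q / site mean) from a region,
# fit one regional growth curve, rescale by the target site's index flood.
rng = np.random.default_rng(3)
region = [rng.gumbel(100 * s, 30 * s, size=40) for s in (0.5, 1.0, 2.0, 4.0)]
pooled = np.concatenate([q / q.mean() for q in region])   # dimensionless
xi, alpha, k = gev_from_lmoments(*sample_lmoments(pooled))
index_flood = 250.0    # at-site mean annual flood of the target site (m³/s)
print(f"100-year flood ~ {index_flood * gev_quantile(xi, alpha, k, 100):.0f} m³/s")
```

The thesis's transformation-based approach replaces the raw pooling step so that the CIF assumptions hold in a dimensionless space.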
In recent decades, the fuzzy clustering approach has gained recognition for regionalization of watersheds, as it can account for partial resemblance of several watersheds in the watershed-related attribute space. In working with this approach, formation of regions and quantile estimation requires discerning information from the fuzzy-membership matrix. However, there are currently no effective procedures available for discerning the information. Practitioners often defuzzify the matrix to form disjoint clusters (regions) and use them as the basis for quantile estimation. The defuzzification approach (DFA) results in loss of the information discerned about the partial resemblance of watersheds. The lost information cannot be utilized in quantile estimation, owing to which the estimates could have significant error. To avert the loss of information, a threshold strategy (TS) was considered in some prior studies, but it results in under-prediction of quantiles. To address this, a mathematical approach is proposed in the thesis that allows discerning information from the fuzzy-membership matrix derived using the fuzzy clustering approach for effective quantile estimation. Effectiveness of the approach in estimating flood quantiles relative to DFA and TS was demonstrated through Monte-Carlo simulation experiments and a case study on the mid-Atlantic water resources region, USA.
Another issue with the index-flood approach and its related RFA methodologies is that they assume a linear relationship between each of the statistical raw moments (SMs) of peak flows and watershed-related attributes in a region. Those relationships form the basis to arrive at estimates of SMs for the target ungauged/sparsely gauged site, which are then utilized to estimate parameters of the flood frequency distribution and quantiles corresponding to target return periods. In reality, non-linear relationships could exist between SMs and watershed-related attributes. To address this, simple-scaling and multi-scaling methodologies have been proposed in the literature, which assume that a scaling (power law) relationship exists between each of the SMs of peak flows at sites in a region and the drainage areas of the watersheds corresponding to those sites. In the real world scenario, drainage area alone may not completely describe a watershed's flood response, and flood quantile estimates based on such scaling relationships can therefore have large errors. To address this, a recursive multi-scaling (RMS) approach is proposed that facilitates construction of a scaling (power law) relationship between each of the SMs of peak flows and a set of the site's region-specific watershed-related attributes chosen/identified in a recursive manner. The approach is shown to outperform the index-flood based region-of-influence approach, simple- and multi-scaling approaches, and a multiple linear regression method through a leave-one-out cross validation experiment on watersheds in and around Indiana State, USA.
The conventional approaches to flood frequency analysis (FFA) are based on the assumption that peak flows at the target site represent a sample of independent and identically distributed realizations drawn from a stationary homogeneous stochastic process. This assumption is not valid when flows are affected by changes in climate and/or land use/land cover, or by regulation of rivers through dams, reservoirs and other artificial diversions/storages. In situations where evidence of non-stationarity in peak flows is strong, it is not appropriate to use quantile estimates obtained from conventional FFA approaches for hydrologic designs and other applications. Downscaling is one of the options to arrive at future projections of flows at target sites in a river basin for use in FFA. Conventional downscaling methods attempt to downscale General Circulation Model (GCM) simulated climate variables to streamflow at target sites. In the real world scenario, a correlation structure exists between records of streamflow at sites in a study area. An effective downscaling model must be parsimonious, and it should ensure preservation of the correlation structure in downscaled flows to a reasonable extent, though exact reproduction/mimicking of the structure may not be necessary in a climate change (non-stationary) scenario. A few recent studies attempted to address this issue based on the assumption of spatiotemporal covariance stationarity. However, there is a dearth of meaningful efforts, especially for multisite downscaling of flows. To address this, a multivariate support vector regression (MSVR) based methodology is proposed to arrive at flood return levels (quantile estimates) for target locations in a river basin corresponding to different return periods in a climate change scenario. The approach involves (i) use of MSVR relationships to downscale GCM simulated large scale atmospheric variables (LSAVs) to monthly time series of streamflow at multiple locations in a river basin, (ii) disaggregation of the downscaled streamflows corresponding to each site from monthly to daily time scale using a k-nearest neighbor disaggregation methodology, and (iii) fitting a time-varying generalized extreme value (GEV) distribution to annual maximum flows extracted from the daily streamflows and estimating flood return levels for different target locations in the river basin corresponding to different return periods.
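Step (iii), fitting a time-varying GEV, can be sketched as a maximum-likelihood fit with the location parameter linear in time; the synthetic record, the linear trend form and all values below are illustrative assumptions, not the thesis's results.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

# Synthetic annual maxima with an upward trend in the location parameter,
# standing in for flows affected by climate or land-use change.
rng = np.random.default_rng(11)
years = np.arange(1970, 2020)
t = (years - years[0]) / 10.0                     # decades since start
true_loc = 100.0 + 8.0 * t                        # +8 m³/s per decade
ams = genextreme.rvs(c=-0.1, loc=true_loc, scale=25.0, random_state=rng)

def nll(params):
    """Negative log-likelihood of a GEV with location linear in time."""
    loc0, trend, scale, c = params
    if scale <= 0:
        return np.inf
    return -genextreme.logpdf(ams, c=c, loc=loc0 + trend * t, scale=scale).sum()

res = minimize(nll, x0=[ams.mean(), 0.0, ams.std(), -0.1], method="Nelder-Mead")
loc0, trend, scale, c = res.x
print(f"fitted trend: {trend:+.1f} m³/s per decade (true +8.0)")

# Time-varying 100-year return level at the start and end of the record.
for yr, tt in ((years[0], t[0]), (years[-1], t[-1])):
    rl = genextreme.ppf(0.99, c=c, loc=loc0 + trend * tt, scale=scale)
    print(f"100-year return level in {yr}: {rl:.0f} m³/s")
```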
|