  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Determinación de Hidrogramas Unitarios Utilizando el Modelo de Simulación Distribuido GSSHA [Determination of Unit Hydrographs Using the GSSHA Distributed Simulation Model]

Pérez Martínez, Osvaldo Antonio January 2007 (has links)
No description available.
2

A Cloud-Based GSSHA Index Map Editor Utility for Watershed Decision Support

Anderson, Jocelynn Marie 01 July 2015 (has links) (PDF)
Preventing damages from flooding is critically important for city managers and planners. Efforts in protecting infrastructure from flooding are often coupled with building hydrologic models to provide predictions of what is likely to happen during storm events. As land use changes, these models must be updated, which is more challenging with sophisticated models. A team of researchers from universities in Utah and Wyoming has been developing tools for water management in the Intermountain West as part of a collaborative NSF research grant called CI-WATER. In particular, a free and open-source web platform called Tethys has been developed to support the development and hosting of hydrologic web applications. Tethys was used to develop a prototype application that uses a GSSHA runoff model and allows users to change land-use inputs to simulate the impact on a watershed for any type of land-use change. The application also provides a method to run the edited model and produces a comparison report of before-and-after runoff and water depth as part of a decision-support framework.
3

A Comprehensive Python Toolkit for Harnessing Cloud-Based High-Throughput Computing to Support Hydrologic Modeling Workflows

Christensen, Scott D. 01 February 2016 (has links)
Advances in water resources modeling are improving the information that can be supplied to support decisions that affect the safety and sustainability of society, but these advances result in models being more computationally demanding. To facilitate the use of cost-effective computing resources to meet the increased demand through high-throughput computing (HTC) and cloud computing in modeling workflows and web applications, I developed a comprehensive Python toolkit that provides the following features: (1) programmatic access to diverse, dynamically scalable computing resources; (2) a batch scheduling system to queue and dispatch the jobs to the computing resources; (3) data management for job inputs and outputs; and (4) the ability for jobs to be dynamically created, submitted, and monitored from the scripting environment. To compose this comprehensive computing toolkit, I created two Python libraries (TethysCluster and CondorPy) that leverage two existing software tools (StarCluster and HTCondor). I further facilitated access to HTC in web applications by using these libraries to create powerful and flexible computing tools for Tethys Platform, a development and hosting platform for web-based water resources applications. I tested this toolkit while collaborating with other researchers to perform several modeling applications that required scalable computing. These applications included a parameter sweep with 57,600 realizations of a distributed, hydrologic model; a set of web applications for retrieving and formatting data; a web application for evaluating the hydrologic impact of land-use change; and an operational, national-scale, high-resolution, ensemble streamflow forecasting tool. In each of these applications the toolkit was successful in automating the process of running the large-scale modeling computations in an HTC environment.
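The batch-scheduling pattern this abstract describes (create jobs, queue them, dispatch them to scalable compute resources, and track their status) can be sketched in plain Python. This is an illustrative sketch only, assuming hypothetical `Job` and `BatchScheduler` names; it is not the actual TethysCluster or CondorPy API.

```python
import uuid
from collections import deque

class Job:
    """A single modeling job: a command plus its input/output files."""
    def __init__(self, command, inputs=None, outputs=None):
        self.id = uuid.uuid4().hex[:8]
        self.command = command
        self.inputs = list(inputs or [])
        self.outputs = list(outputs or [])
        self.status = "created"

class BatchScheduler:
    """Queues jobs and dispatches them to a pool of compute resources."""
    def __init__(self, resources):
        self.resources = resources          # e.g. names of cloud nodes
        self.queue = deque()

    def submit(self, job):
        job.status = "queued"
        self.queue.append(job)
        return job.id

    def dispatch_all(self, run):
        """Pop each queued job and run it on the next node (round robin)."""
        finished = []
        while self.queue:
            job = self.queue.popleft()
            node = self.resources[len(finished) % len(self.resources)]
            job.status = "running"
            run(job, node)                  # caller supplies the executor
            job.status = "complete"
            finished.append(job)
        return finished
```

A parameter sweep like the 57,600-realization study mentioned above would submit one job per realization and let the scheduler drain the queue across however many nodes are provisioned.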
4

Automated Calibration of the GSSHA Watershed Model: A Look at Accuracy and Viability for Routine Hydrologic Modeling

Shurtz, Kayson M. 23 November 2009 (has links) (PDF)
The goal of hydrologic models is to accurately predict a future event of a given magnitude. Historic data are often used to calibrate models to increase their ability to forecast accurately. The GSSHA model is a distributed model that uses physical parameters and physics-based computations to route water flow from cell to cell on a two-dimensional grid. The goal of calibration is to obtain good estimates for the actual parameters of the watershed. These parameters should then transfer to other storm events of different magnitudes more easily than an empirical model. In conducting this research, three watersheds were selected in different parts of the United States, and the required data were collected to develop and run single-event hydrologic models. The WMS software was used to preprocess digital spatial data for model creation before calibrating the models with the GSSHA model. A calibrated HEC-HMS model was also developed for each watershed for comparative purposes. Establishing GSSHA's usability in routine hydrologic modeling is the primary objective of this research. This has been accomplished by developing guidelines for GSSHA calibrations, assisted by WMS, testing model accuracy in the calibration and verification phases, and comparing results with HEC-HMS, a model widely accepted for routine hydrologic modeling. As a result of this research, the WMS interface has become well equipped to set up and run GSSHA model calibrations. The focus has been on single-event, or routine, hydrologic model simulations, but continuous simulation calibrations, an important strength of GSSHA, can also be developed. Each of the model simulations in the study calibrated well in terms of matching peak and volume. However, the verification for two out of the three watersheds used in the study was less than ideal. The results of this research indicate that the physical factors, which GSSHA should represent well, are particularly sensitive for single-event storms.
Calibration of single events is therefore difficult in some cases and may not be recommended. Further research could be done to establish guidelines for situations (e.g., watershed conditions, storm type) where single-event calibration is plausible.
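The "matching peak and volume" calibration criterion mentioned in this abstract can be made concrete as a pair of error metrics driven toward zero during calibration. The following is a generic sketch under assumed function names, not the WMS/GSSHA calibration routine.

```python
def peak_error(observed, simulated):
    """Relative error in peak discharge between two hydrographs."""
    return (max(simulated) - max(observed)) / max(observed)

def volume_error(observed, simulated, dt=1.0):
    """Relative error in total runoff volume (trapezoidal integration)."""
    def volume(q):
        return sum((q[i] + q[i + 1]) * dt / 2 for i in range(len(q) - 1))
    vo = volume(observed)
    return (volume(simulated) - vo) / vo

def objective(observed, simulated, w_peak=0.5, w_vol=0.5):
    """Weighted sum of absolute peak and volume errors; a calibration
    loop would adjust model parameters to minimize this value."""
    return (w_peak * abs(peak_error(observed, simulated))
            + w_vol * abs(volume_error(observed, simulated)))
```

Verification against an independent storm event then repeats the same comparison with the calibrated parameters held fixed.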
5

An Examination of Distributed Hydrologic Modeling Methods as Compared with Traditional Lumped Parameter Approaches

Paudel, Murari 06 July 2010 (has links) (PDF)
Empirically based lumped hydrologic models have an extensive track record of use, whereas physically based, multi-dimensional distributed models are evolving for various engineering applications. Despite the availability of high-resolution data, better computational resources, and robust numerical methods, the usage of distributed models is still limited. The purpose of this research is to establish the credibility and usability of distributed hydrologic modeling tools of the United States Army Corps of Engineers (USACE) in order to promote the extended use of distributed models. Two of the USACE models were used as the modeling tools for the study, with Gridded Surface and Subsurface Hydrologic Analysis (GSSHA) representing a distributed model and Hydrologic Engineering Center-Hydrologic Modeling System (HEC-HMS) representing a lumped model. Watershed Modeling System (WMS) was used as the pre- and post-processing tool. The credibility of distributed models has been established by validating that the distributed models are efficient in solving complex hydrologic problems. The distributed and lumped models in HEC-HMS were compared. Similarly, the capabilities of GSSHA and the lumped models in HEC-HMS in simulating land-use change scenarios were compared. The results of these studies were published in peer-reviewed journals. Similarly, the usability of the distributed models was studied, taking GSSHA-WMS modeling as a test case. Some of the major issues in GSSHA modeling using the WMS interface were investigated, and solutions were proposed to resolve them. Personal experience with GSSHA and feedback from the students in a graduate class (CE531) and from participants in the USACE GSSHA training course were used to identify such roadblocks. Because the project was funded partly by the USACE Engineering Research and Development Center (ERDC) and partly by Aquaveo LLC, the research was motivated by the goal of improving GSSHA modeling using the WMS interface.
6

A GIS-Based Data Model and Tools for Analysis and Visualization of Levee Breaching Using the GSSHA Model

Tran, Hoang Luu 17 March 2011 (has links) (PDF)
Levee breaching is the most frequent and dangerous form of levee failure. A levee breach occurs when floodwater breaks through part of the levee, creating an opening for water to flood the protected area. According to the National Committee on Levee Safety (NCLS), a reasonable upper limit for damage resulting from levee breaching was around $10 billion per year between 1998 and 2007. This figure excludes hurricanes Katrina and Rita in 2005, which resulted in economic damages estimated at more than $200 billion and a loss of more than 1,800 lives. In response to these catastrophic failures, the U.S. Army Corps of Engineers (USACE) began developing the National Levee Database (NLD) in May 2006. The NLD has a critical role in evaluating the safety of the national levee system. It contains information regarding the attributes of the national levee system. The Levee Analyst Data Model was developed by Dr. Norm Jones, Jeff Handy, and Thomas Griffiths to supplement the NLD. Levee Analyst is a data model and suite of tools for managing levee information in ArcGIS and exporting the information to Google Earth for enhanced visualization. The current Levee Analyst has a concise and expandable structure for managing, archiving, and analyzing large amounts of levee seepage and slope stability data (Thomas 2009). The new set of tools developed in this research extends the ability of the Levee Analyst Data Model to analyze and manage levee breach simulations and store them in the NLD geodatabase. The capabilities of the new geoprocessing tools, and their compatibility with the NLD, are demonstrated in a case study. The feasibility of using the GSSHA model to simulate flooding is also demonstrated in this research.
7

Shallow Groundwater Modeling of the Historical Irwin Wet Prairie in the Oak Openings of Northwest Ohio

Wijayarathne, Dayal Buddika 27 July 2015 (has links)
No description available.
8

Stochastic Spatio-Temporal Uncertainty in GIS-Based Water Quality Modeling of the Land Water Interface

Salah, Ahmad Mohamad 27 February 2009 (has links)
Integrated water resources management has been used for decades in various formats. Limited resources and an ever-growing population keep imposing pressure on decision makers to better and more reliably manage the available waters. On the other hand, continuous development in computing and modeling power has helped modelers and decision makers considerably. To use these models, assumptions have to be made to fill in the gaps of missing data and to approximate current conditions. The type and amount of information available can also be used to help select the best model from the currently available models. Advances in data collection have not kept pace with advances in model development and the need for more, and more reliable, input parameter values. Hence, uncertainty in model input parameters also needs to be quantified and addressed. This research effort develops a spatially based modeling framework to model watersheds from both water quantity and quality standpoints. In this research, the Gridded Surface Sub-Surface Hydrologic Analysis (GSSHA) and CE-QUAL-W2 models are linked within the Watershed Modeling System (WMS), a GIS interface for hydrologic and hydraulic models, to better handle pre- and post-processing for both models. In addition, stochastic analysis routines are developed and used to examine and address the uncertainty inherent in the modeling process of the interface between land and water in the designated watershed. The linkage routines are developed in WMS using C++. The two models are linked spatially and temporally, with the general direction of data flow from GSSHA to CE-QUAL-W2. Pre-processing of the CE-QUAL-W2 model is performed first. Then stochastic parameters and their associated distributions are defined for stochastic analysis in GSSHA before a batch run is performed. GSSHA output is then aggregated by CE-QUAL-W2 segments to generate multiple CE-QUAL-W2 runs.
WMS then reads the stochastic CE-QUAL-W2 runs upon successful completion for data analysis. Modelers need to generate a WMS Gage for each location where they want to examine the stochastic output. A Gage is defined by a segment and a layer in the CE-QUAL-W2 model. Once defined, modelers are able to view a computed credible interval with lower and upper bounds, in addition to the mean time series of a pre-selected constituent. Decision makers can utilize this output to better manage watersheds by understanding and incorporating the spatio-temporal uncertainty for the land-water interface.
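The stochastic batch-run workflow described in this abstract (draw parameter sets from defined distributions, run the model once per realization, then summarize the ensemble output as a credible interval around the mean) can be sketched generically. The function names and the distribution format below are assumptions for illustration, not the WMS stochastic-analysis interface.

```python
import random
import statistics

def sample_parameters(distributions, n_runs, seed=0):
    """Draw one parameter set per model run.

    distributions: name -> (kind, a, b), where kind is "uniform"
    (a = min, b = max) or "normal" (a = mean, b = stdev).
    """
    rng = random.Random(seed)
    runs = []
    for _ in range(n_runs):
        params = {}
        for name, (kind, a, b) in distributions.items():
            if kind == "uniform":
                params[name] = rng.uniform(a, b)
            else:
                params[name] = rng.gauss(a, b)
        runs.append(params)
    return runs

def credible_interval(ensemble, lower=2.5, upper=97.5):
    """Percentile bounds and mean across an ensemble of model outputs."""
    values = sorted(ensemble)
    def pct(p):
        k = (len(values) - 1) * p / 100.0
        i, frac = int(k), k - int(k)
        if i + 1 >= len(values):
            return values[i]
        return values[i] * (1 - frac) + values[i + 1] * frac
    return pct(lower), statistics.mean(values), pct(upper)
```

In the workflow above, each sampled parameter set would drive one GSSHA batch run, and the interval would be computed per Gage from the resulting CE-QUAL-W2 ensemble.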
9

An Enhanced Data Model and Tools for Analysis and Visualization of Levee Simulations

Griffiths, Thomas Richard 15 March 2010 (has links) (PDF)
The devastating levee failures associated with hurricanes Katrina and Rita, and the more recent Midwest flooding, placed a spotlight on the importance of levees and our dependence on them to protect life and property. In response to levee failures associated with the hurricanes, Congress passed the Water Resources Development Act of 2007, which established a National Committee on Levee Safety. The committee was charged with developing recommendations for a National Levee Safety Program. The Secretary of the Army was charged with the establishment and maintenance of a National Levee Database. The National Levee Database is a critical tool in assessing and improving the safety of the nation's levees. However, the NLD data model, established in 2007, lacked a structure to store seepage and slope stability analyses, vital information for assessing the safety of a levee. In response, the Levee Analyst was developed in 2008 by Dr. Norm Jones and Jeffrey Handy. The Levee Analyst Data Model was designed to provide a central location, compatible with the National Levee Database, for storing large amounts of levee seepage and slope stability analytical data. The original Levee Analyst geoprocessing tools were created to assist users in populating, managing, and analyzing Levee Analyst geodatabase data. In an effort to enhance the Levee Analyst and provide greater accessibility to levee data, this research expanded the Levee Analyst to include modifications to the data model and additional geoprocessing tools that archive GeoStudio SEEP/W and SLOPE/W simulations as well as export the entire Levee Analyst database to Google Earth. Case studies were performed to demonstrate the new geoprocessing tools' capabilities and the compatibility between the National Levee Database and the Levee Analyst database. A number of levee breaches were simulated to prototype the enhancement of the Levee Analyst to include additional feature classes, tables, and geoprocessing tools.
This enhancement would allow Levee Analyst to manage, edit, and export two-dimensional levee breach scenarios.
