  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

How Collaborative Technology Supports Cognitive Processes in Collaborative Process Modeling: A Capabilities-Gains-Outcome Model

Recker, Jan; Mendling, Jan; Hahn, Christopher (PDF)
We examine which capabilities technologies provide to support collaborative process modeling. We develop a model that explains how technology capabilities impact cognitive group processes, and how they lead to improved modeling outcomes and positive technology beliefs. We test this model through a free simulation experiment of collaborative process modelers structured around a set of modeling tasks. With our study, we provide an understanding of the process of collaborative process modeling, and detail implications for research and guidelines for the practical design of collaborative process modeling.
42

Foundations for Multi-level Ontology-based Conceptual Modeling

CARVALHO, V. A. 16 December 2016
Considering that conceptual models are produced with the aim of representing certain aspects of the physical and social world according to a specific conceptualization, and that ontologies seek to describe conceptualizations, there has been growing interest in the use of ontologies to provide a sound theoretical foundation for the discipline of conceptual modeling. This interest has given rise to a research area called ontology-based conceptual modeling, which has produced significant advances in conceptual modeling over recent decades. Despite these advances, ontology-based modeling does not adequately support the modeling of domains that require the representation of categories of individuals as well as categories of categories (or types of types). The representation of entities at multiple "levels" of classification has been the focus of a separate research area called multi-level modeling. Initiatives in multi-level modeling aim to overcome the limitations imposed by the conventional two-level modeling paradigm. Despite the relevant contributions of the multi-level modeling and ontology-based conceptual modeling areas, the combination of these two areas has not yet received due attention. This work addresses that gap by proposing the combined use of formal theories for multi-level modeling and foundational ontologies to support what we call ontology-based multi-level conceptual modeling. To provide a well-founded approach to multi-level conceptual modeling, we develop a theory called MLT. MLT formally characterizes the nature of classification levels and precisely defines the relations that may occur between elements of different classification levels.
In order to reap the benefits of using foundational ontologies when modeling domains that span multiple classification levels, we combine MLT with a foundational ontology. This combination results in a modeling approach that supports the construction of multi-level conceptual models across a spectrum of levels of specificity, from foundational ontologies to domain-specific conceptual models. To demonstrate the applicability of our ontology-based multi-level conceptual modeling approach, we employ it to develop a core ontology for organizational structures, a domain that spans multiple classification levels. Furthermore, we show how MLT can serve as a reference theory to clarify the semantics and enhance the expressiveness of UML with respect to the representation of multi-level models. The resulting UML profile enables the practical application of MLT by the conceptual modeling community.
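The cross-level instantiation pattern at the heart of this work (categories of categories, or "types of types") can be illustrated with a small sketch. This is only an illustration of the general idea, not the MLT formalization itself, and all names (ModelElement, DogBreed, Collie) are assumptions made for the example:

```python
class ModelElement:
    """An element at some classification level (level 0 = individuals)."""
    def __init__(self, name, level, instance_of=None):
        self.name = name
        self.level = level
        self.instance_of = instance_of
        if instance_of is not None:
            # Well-formedness rule for this sketch: instantiation
            # crosses exactly one classification level
            assert instance_of.level == level + 1, "must instantiate one level up"

# Level 2: a type whose instances are themselves types
dog_breed = ModelElement("DogBreed", level=2)
# Level 1: a type of individuals, itself an instance of DogBreed
collie = ModelElement("Collie", level=1, instance_of=dog_breed)
# Level 0: an individual, classified by Collie
lassie = ModelElement("Lassie", level=0, instance_of=collie)
```

A conventional two-level language can express only the Collie/Lassie pair; making DogBreed a first-class model element is what requires multi-level support.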
43

Development and Evaluation of a GIS-Based Spatially Distributed Unit Hydrograph Model

Kilgore, Jennifer Leigh 23 December 1997
Synthetic unit hydrographs, which assume uniform rainfall excess distribution and static watershed conditions, are frequently used to estimate hydrograph characteristics when observed data are unavailable. The objective of this research was to develop a spatially distributed unit hydrograph (SDUH) model that directly reflects spatial variation in the watershed when generating runoff hydrographs. The SDUH model is a time-area unit hydrograph technique that uses a geographic information system (GIS) to develop a cumulative travel time map of the watershed based on cell-by-cell estimates of overland and channel flow velocities. The model considers slope, land use, watershed position, channel characteristics, and rainfall excess intensity in determining flow velocities. The cumulative travel time map is divided into isochrones, which are used to generate a time-area curve and the resulting unit hydrograph. Predictions of the SDUH model, along with the Snyder, SCS, and Clark synthetic unit hydrographs, were compared with forty observed storm events from an 1153-ha Virginia Piedmont watershed. The SDUH model predictions were comparable to or slightly better than those from the other models, with the lowest relative error in the peak flow rate prediction for 12 of the 40 storms and a model efficiency of at least 0.90 for 21 of the storms. Despite the good predictions of the hydrograph peak flow rate and shape, the time to peak was underpredicted for 34 of the 40 storms. Runoff from the 40 storms was also generated for two subwatersheds (C: 462 ha; D: 328 ha) in Owl Run to assess the effect of scale on the SDUH model. Peak flow rate predictions were more accurate for the entire watershed than for either subwatershed. The time to peak prediction and model efficiency statistics were comparable for the entire watershed and subwatershed D. Subwatershed C had poorer predictions, which were attributed to a large pond in the main channel rather than to scale effects.
The SDUH model provides a framework for predicting runoff hydrographs for ungauged watersheds that can reflect the spatially distributed nature of the rainfall-runoff process. Predictions were comparable to the other synthetic unit hydrograph techniques. Because the time to peak and model efficiency statistics were similar for the 1153-ha watershed and a 328-ha subwatershed, scale does not have a major impact on the accuracy of the SDUH model. / Master of Science
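The time-area construction described above can be sketched in a few lines, assuming a small grid of per-cell cumulative travel times; the grid values, cell size, and isochrone interval below are illustrative, not taken from the thesis:

```python
import numpy as np

# Per-cell cumulative travel time to the outlet (hours); a real
# application derives this grid in a GIS from the velocity estimates
travel_time = np.array([
    [0.2, 0.5, 1.1],
    [0.4, 0.9, 1.6],
    [0.8, 1.4, 2.3],
])
cell_area_ha = 1.0   # area represented by each grid cell
dt = 0.5             # isochrone interval (hours)

# Divide the travel-time map into isochrone bands and tally the area
# contributing within each interval (the time-area curve)
n_bins = int(np.ceil(travel_time.max() / dt))
edges = np.arange(0, (n_bins + 1) * dt, dt)
counts, _ = np.histogram(travel_time, bins=edges)
time_area = counts * cell_area_ha

# Normalize: unit hydrograph ordinates are proportional to the
# incremental contributing area in each isochrone band
uh = time_area / time_area.sum()
```

Convolving `uh` with a rainfall-excess hyetograph would then yield the runoff hydrograph.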
44

Hydrodynamic and Water Quality Simulation of Fecal Coliforms in the Lower Appomattox River, Virginia

Hammond, Andrew Jesse 29 September 2004
The Virginia Department of Environmental Quality (VADEQ), under the direction of the United States Environmental Protection Agency (USEPA), has listed the lower Appomattox River as impaired because it violates current water quality standards for fecal coliforms. To advance the analytical process by which various scenarios for improving water quality within the estuary are examined, an array of computer-based hydrodynamic and water quality models was investigated. The Dynamic Estuary Model (DYNHYD5), developed by USEPA, was used to simulate hydrodynamics within the lower Appomattox River. The Water Quality Analysis Simulation Program (WASP6.1), also developed by USEPA, was employed to perform water quality simulations of fecal coliforms. In addition, a detailed literature review examined DYNHYD5 and WASP6.1 model theory, computer-based model solution techniques, and background hydrodynamic theory. DYNHYD5 sensitivity analysis showed that the model was most responsive to tidal heights (seaward boundary conditions) both upstream and downstream within the model network. Specific model parameters were varied during calibration until modeled water surface elevations converged on observed water surface elevations. A goodness-of-fit value of 0.749 was determined with linear regression analysis for model calibration. DYNHYD5 input parameter validation was performed with additional observations, and a goodness-of-fit value of 0.829 was calculated. Through sensitivity analysis, WASP6.1 proved to be most responsive to coliform loading rates in the downstream direction and boundary concentrations in the upstream direction. With these results, WASP6.1 input parameters were calibrated against observed fecal coliform concentrations. A goodness-of-fit value of 0.573 was determined with linear regression analysis for model calibration. WASP6.1 input parameter validation was performed with additional observations, and a goodness-of-fit value of 0.0002 was calculated.
Model results suggest that hydrodynamic model calibration and validation can be improved with additional tidal height observations at the downstream seaward boundary. Similarly, water quality model calibration and validation can possibly be improved with the aid of detailed, time-variable coliform concentrations at the downstream seaward boundary. Therefore, it is recommended that a water quality sampling station and tidal stage recorder be installed at the confluence of the Appomattox and James Rivers to provide for further testing of estuary hydrodynamic and water quality models. / Master of Science
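The goodness-of-fit values above come from "linear regression analysis"; one common form of such a score is the R² of the modeled-versus-observed relationship. The sketch below uses that interpretation, which may differ from the exact formulation in the thesis:

```python
import numpy as np

def goodness_of_fit(observed, modeled):
    """R^2 of the least-squares linear relationship between the
    modeled and observed series (squared Pearson correlation)."""
    observed = np.asarray(observed, dtype=float)
    modeled = np.asarray(modeled, dtype=float)
    r = np.corrcoef(modeled, observed)[0, 1]
    return r ** 2

# Perfect linear agreement scores 1.0; scatter lowers the score
print(goodness_of_fit([1, 2, 3, 4], [2, 4, 6, 8]))
```

Note that a high R² alone does not guarantee an unbiased model, which is why visual hydrograph comparison usually accompanies it.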
45

The Use of Stormwater Modeling for Design and Performance Evaluation of Best Management Practices at the Watershed Scale

Houston, Edward Brian 02 October 2006
The use of best management practices, or BMPs, to treat urban stormwater runoff has been pervasive for many years. Extensive research has been conducted to evaluate the performance of individual BMPs at specific locations; however, little research has been published that seeks to evaluate the impacts of small, distributed BMPs throughout a watershed at the regional level. To address this, a model is developed using EPA SWMM 5.0 for the Duck Pond watershed, which is located in Blacksburg, Virginia and encompasses much of the Virginia Polytechnic Institute and State University campus and much of the town of Blacksburg. A variety of BMPs are designed and placed within the model. Several variations of the model are created in order to test different aspects of BMP design and to test the BMP modeling abilities of EPA SWMM 5.0. Simulations are performed using one-hour design storms and yearlong hourly rainfall traces. From these simulations, small water quality benefits are observed at the system level. This is seen as encouraging, given that a relatively small amount of the total drainage area is controlled by BMPs and that the BMPs are not sited in optimal locations. As expected, increasing the number of BMPs in the watershed generally increases the level of treatment. The use of the half-inch rule in determining the required water quality volume is examined and found to provide reasonable results. The design storm approach to designing detention structures is also examined for a two-pond system located within the model. The pond performances are examined under continuous simulation and found to be generally adequate for the simulated rainfall conditions, although they under-perform somewhat in comparison to the original design criteria. The usefulness of EPA SWMM 5.0 as a BMP modeling tool is called into question. Many useful features are identified, but so are many limitations.
Key abilities such as infiltration from nodes or treatment in conduit flow are found to be lacking. Pollutant mass continuity issues are also encountered, making specific removal rates difficult to define. / Master of Science
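As a rough illustration of the half-inch rule examined above: the required water quality volume is commonly computed as a fixed capture depth applied over the contributing impervious area. Definitions vary across stormwater manuals, so the form below is an assumption for illustration, not necessarily the one evaluated in the thesis:

```python
IN_TO_M = 0.0254  # inches to meters

def water_quality_volume_m3(impervious_area_m2, capture_depth_in=0.5):
    """Required treatment volume: capture the first half inch of
    runoff from the contributing impervious area."""
    return impervious_area_m2 * capture_depth_in * IN_TO_M

# One hectare (10,000 m^2) of impervious cover:
print(water_quality_volume_m3(10_000))  # ~127 m^3
```

A detention BMP sized below this volume would bypass part of the treatable first flush, which is the behavior continuous simulation can reveal.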
46

Approximations for Nonlinear Differential Algebraic Equations to Increase Real-time Simulation Efficiency

Kwong, Gordon Houng 07 June 2010
Full-motion driving simulators require efficient real-time, high-fidelity vehicle models in order to provide a more realistic vehicle response. Typically, multi-body models are used to represent the vehicle dynamics, but these have the unfortunate drawback of requiring the solution of a set of coupled differential algebraic equations (DAEs). DAEs are not conducive to real-time implementation, such as in a driving simulator, without very expensive processing capability. The primary objective of this thesis is to show that multi-body models constructed from DAEs can be reasonably approximated with linear models using suspension elements that have nonlinear constitutive relationships. Three models were compared in this research: an experimental quarter-car test rig, a multi-body dynamics differential algebraic equation model, and a linear model with nonlinear suspension elements. Models constructed from differential algebraic equations are computationally expensive and difficult to realize for real-time simulations. Instead, a linear model with nonlinear elements was proposed as a more computationally efficient solution that would retain the nonlinearities of the suspension. Simplifications were made to the linear model with nonlinear elements to further reduce computation time for real-time simulation. The development process of each model is fully described in this thesis. Each model was excited with the same input and their outputs were compared. It was found that the linear model with nonlinear elements provides a reasonably good approximation of the model with the differential algebraic equations. / Master of Science
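The "linear model with nonlinear suspension elements" idea can be sketched as a quarter-car whose equations of motion are linear except for a nonlinear spring constitutive relation. All parameter values and the cubic-hardening spring law below are illustrative assumptions, not the thesis's identified parameters:

```python
import numpy as np

ms, mu = 300.0, 40.0   # sprung / unsprung mass (kg)
kt = 190e3             # tire stiffness (N/m)
c = 1500.0             # suspension damping (N*s/m)

def spring_force(deflection):
    # Nonlinear constitutive relation: linear term plus cubic hardening
    return 16e3 * deflection + 1.5e6 * deflection ** 3

def deriv(state, road):
    zs, vs, zu, vu = state   # sprung pos/vel, unsprung pos/vel
    fs = spring_force(zs - zu) + c * (vs - vu)   # suspension force
    ft = kt * (zu - road)                        # tire force
    return np.array([vs, -fs / ms, vu, (fs - ft) / mu])

# Explicit (forward-Euler) integration of a 1 cm road step at t = 0.1 s;
# explicit fixed-step schemes like this are what make the formulation
# attractive for real-time use, unlike an implicit DAE solve
state = np.zeros(4)
dt = 1e-4
for i in range(5000):   # simulate 0.5 s
    road = 0.01 if i * dt > 0.1 else 0.0
    state = state + dt * deriv(state, road)
```

Because the state derivative is an explicit function of the state, each time step is a fixed, cheap computation, whereas a DAE requires solving algebraic constraints at every step.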
47

A modeling framework for analyzing the education system as a complex system

Mital, Pratik 08 June 2015
In this thesis, the Education System Intervention Modeling Framework (ESIM Framework) is introduced for analyzing interventions in the K-12 education system. This framework is the first of its kind to model interventions in the K-12 school system in the United States. Techniques from systems engineering and operations research, such as agent-based modeling and social network analysis, are used to model the bottom-up mechanisms of intervention implementation in schools. By applying the ESIM framework, an intervention can be better analyzed in terms of the barriers and enablers to its implementation and sustainability. The risk of failure of future interventions is thereby reduced through improved allocation of resources toward the system agents and attributes that play key roles in the sustainability of the intervention. Increasing the sustainability of interventions in the school system improves educational outcomes and increases the benefits gained from the millions of dollars being invested in such interventions. In the first part of this thesis, a case study of an Engineers Without Borders chapter is modeled, which helped in the development of a more generalized framework applicable across a broad range of education system interventions. In the second part, the ESIM framework is developed. The framework is divided into four phases: model definition, model design, model analysis, and model validation. Each of these phases has detailed steps for building the agent-based model of the particular intervention. In the third part, the ESIM framework is applied to a case study of a curriculum intervention, Science Learning: Integrating Design, Engineering and Robotics, involving the design and implementation of an 8th-grade, inquiry-based physical science curriculum across three demographically varying schools.
This case study provides a good comparison of the implementation of the intervention across different school settings because of the varied outcomes at the three schools.
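The bottom-up, agent-based mechanism described above can be illustrated with a toy adoption model; this is not the ESIM framework itself, and the network topology, threshold, and seeding below are assumptions chosen for the example:

```python
n_agents = 30
threshold = 0.3   # fraction of neighbors that must already have adopted
# Ring network: each agent interacts with its two immediate neighbors
neighbors = {i: [(i - 1) % n_agents, (i + 1) % n_agents]
             for i in range(n_agents)}

adopted = [False] * n_agents
adopted[0] = True   # a single "champion" agent seeds the intervention

for step in range(n_agents):
    # Synchronous update: this step's decisions use last step's state
    newly = [i for i in range(n_agents)
             if not adopted[i]
             and sum(adopted[j] for j in neighbors[i]) / 2 >= threshold]
    for i in newly:
        adopted[i] = True

print(sum(adopted))  # -> 30: adoption cascades around the whole ring
```

Raising the threshold or removing the champion stalls the cascade, which is the kind of barrier/enabler sensitivity an agent-based analysis is meant to expose.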
48

Fluctuations of a Greenlandic tidewater glacier from the Little Ice Age to present : reconstruction and modelling of Kangiata Nunaata Sermia, SW Greenland

Lea, James M. January 2014
Significant uncertainty surrounds the influence of atmospheric and oceanic forcing on the fluctuations of tidewater glacier outlets of the Greenland ice sheet (GrIS), with the majority of studies focussing on dynamics over the last two decades. Although numerical model based projections exist anticipating the future dynamics of major GrIS outlets, these have been made using temporally limited model calibration periods (<5 years) compared to the centennial timescales that they seek to predict over. The ability of these numerical models to simulate the centennial timescale dynamics of GrIS tidewater glaciers has therefore not been explicitly tested. This thesis seeks to calibrate a well-established one-dimensional tidewater glacier numerical model against post-Little Ice Age maximum (LIAmax) observations of a major tidewater glacier outlet of GrIS. The study site chosen is Kangiata Nunaata Sermia (KNS); the largest tidewater outlet in SW Greenland south of Jakobshavn Isbræ. This glacier is known to have undergone retreat of >20 km since its LIAmax, though the timing of this retreat and response to climate forcing is currently poorly constrained. Utilising a range of source material, it is demonstrated that KNS is likely to have achieved its LIAmax by 1761, experiencing either one, or two multi-kilometre retreats by 1859, and retreats of a similar scale between 1921-1968, and 1997-2012. Terminus fluctuations of KNS were in phase with climate anomalies, where data were available for comparison (1871-2012). To allow accurate comparison to numerical model output, the accuracy of different methods of quantifying glacier terminus change was also evaluated. Two new methods were devised so observations could be matched with greater accuracy than existing methods allowed. Glacier sensitivity to climate forcing was evaluated using the numerical model.
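On the measurement-accuracy point above: one widely used way to quantify terminus change divides the area between two digitized terminus positions by a representative fjord width, avoiding the bias of measuring along a single centerline. The sketch below shows that general "box method" idea, not the new methods the thesis devises:

```python
def width_averaged_change_m(area_between_termini_m2, fjord_width_m):
    """Width-averaged terminus advance/retreat between two surveys:
    area swept between the two digitized terminus positions, divided
    by a fixed reference fjord width."""
    return area_between_termini_m2 / fjord_width_m

# 4.5 km^2 of area lost across a 4.5 km wide fjord:
print(width_averaged_change_m(4.5e6, 4500.0))  # -> 1000.0 (m of retreat)
```

Averaging over the full width matters for comparison with one-dimensional flowline models, whose output is a single terminus position rather than an irregular calving front.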
49

Modeling of microstructural evolutions in machining of dual phase alloys

Tabei, Seyed Ali 27 May 2016
Depending on the material system and machining conditions, the localized strain, strain rate, and temperature fields induced in the material during the machining process can be intense. Therefore, a wide variety of microstructural evolutions are likely to occur below the machined surface. These microstructural changes take place at various scales. First, due to the severe plastic deformation below the machined surface, the crystallographic orientation of grains can change dramatically. In addition, if the induced temperature and strain are high enough, recrystallization may occur: new grains may form and subsequently grow. Additionally, depending on the duration of the machining process, partial grain growth might also occur. Finally, if the material consists of more than one phase, the microstructural characteristics of the secondary phases will also evolve. Together, these evolutions produce remarkable changes in the mechanical, thermal, and almost all other properties of the material, which consequently affect the response of the material during service. A comprehensive modeling framework that reliably captures all aspects of these microstructural evolutions in machining is absent from the open literature. This work coalesces concrete and all-inclusive modeling toolsets into a unified scheme to follow these phenomena in machining of aluminum alloy 7075. The modeling outcomes are verified by experimental results to ensure reliability. Finite element analyses were applied to obtain the stress, temperature, strain, and strain rate fields developed in the material during machining at different parameters. Kinetic-based models were exploited to determine the possible recrystallization or grain growth. A viscoplastic self-consistent crystal plasticity model was utilized to investigate texture evolution below the machined surface.
Also, for multi-phase materials, the first steps were taken toward a new constitutive model that yields the extent of possible refinement in the second-phase precipitates. The main goal of the work was to link the above-mentioned microstructural evolutions to the process parameters of machining through mathematical derivation of process path functions. This makes it possible to predict microstructural changes as a result of changing the process parameters, which has significant industrial potential and importance. Additionally, such a direct and complete linkage between machining and microstructure is new to the manufacturing and design research communities.
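Kinetic models of recrystallization are commonly of the JMAK (Avrami) form; the sketch below uses that classical equation as an illustrative stand-in, since the abstract does not give the thesis's exact formulation, and the parameter values are assumptions:

```python
import math

def jmak_recrystallized_fraction(t, k, n):
    """Recrystallized volume fraction X(t) = 1 - exp(-k * t**n),
    where k lumps nucleation/growth rates (typically Arrhenius in
    temperature) and n is the Avrami exponent."""
    return 1.0 - math.exp(-k * t ** n)

# With k * t**n = 1, the transformed fraction is 1 - 1/e ~ 0.632
print(round(jmak_recrystallized_fraction(t=10.0, k=0.01, n=2.0), 3))  # -> 0.632
```

Driving such a model with the FE-computed temperature and strain histories is one way the transition from fields to microstructure can be made, per the scheme described above.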
50

Dynamic thermal modeling and simulation framework: design of modeling techniques and external integration tools

Pierce, Michael Stephen 24 August 2010
In looking to the future of naval warfare, the US Navy has committed itself to development of future classes of an All-Electric Ship (AES) that will incorporate significant technological advancements in the areas of power management, advanced sensor equipment and weaponry, reconfigurability, and survivability systems while simultaneously increasing overall system efficiencies and decreasing the operational costs of the future naval fleet. As part of the consortium responsible for investigating the viability of numerous next-generation technologies, the University of Texas at Austin is dedicated to providing the capabilities and tools to better address thermal management issues aboard the future AES. Research efforts at the University of Texas at Austin have focused on the development of physics-based, dynamic models of components and subsystems that simulate notional future AES, system-level, thermal architectures. This research has resulted in the development of an in-house thermal management tool, known as the Dynamic Thermal Modeling and Simulation (DTMS) Framework. The work presented herein has sought to increase the modeling capabilities of the DTMS Framework and provide valuable tools to aid both developers and users of this simulation environment. Using numerical approximations of complex physical behaviors, the scope of the DTMS Framework has been expanded beyond elements of thermal-fluid behaviors to capture the dynamic, transient nature of far broader, more complex architectures containing interconnected thermal-mechanical-electrical components. Sophisticated interfacial systems have also been developed that allow integration of the DTMS Framework with external software products that improve and enhance the user experience.
Developmental tools addressing customizable presentation of simulation data, debugging systems that aid in the introduction of new features into the existing framework, and error-reporting mechanisms to ease the process of utilizing the power of the simulation environment have been added to improve the applicability and accessibility of the DTMS Framework. Finally, initial efforts in collaboration with Mississippi State University are presented that provide a graphical user interface for the DTMS Framework and thus provide far more insight into the complex interactions of numerous shipboard systems than would ever be possible using raw numerical data.
