21. Estimating a three-level latent variable regression model with cross-classified multiple membership data
Leroux, Audrey Josée, 28 October 2014
The current study proposed a new model, termed the cross-classified multiple membership latent variable regression (CCMM-LVR) model, for multiple membership data structures (for example, student mobility across schools); it extends the three-level latent variable regression model (HM3-LVR). The HM3-LVR model is beneficial for testing more flexible, directional hypotheses about growth trajectory parameters and handles pure clustering of participants within higher-level units. However, the HM3-LVR model assumes that students remain in the same cluster (school) throughout the time period of interest. The CCMM-LVR model, on the other hand, appropriately models the participants’ changing clusters over time. The first purpose of this study was to demonstrate the use and interpretation of the CCMM-LVR model and its parameters with a large-scale longitudinal dataset that had a multiple membership data structure (i.e., student mobility). The impact of ignoring mobility in the real data was investigated by comparing parameter estimates, standard error estimates, and model fit indices for the two estimating models (CCMM-LVR and HM3-LVR). The second purpose of the dissertation was to conduct a simulation study to identify the source of potential differences between the two estimating models and to determine which model's estimates were closer to the truth under the conditions investigated. The manipulated conditions in the simulation study included the mobility rate, the number of clustering units, the number of individuals (i.e., students) per cluster (here, school), and the number of measurement occasions per individual. The outcomes investigated in the simulation study included relative parameter bias, relative standard error bias, root mean square error, and coverage rates of the 95% credible intervals. Substantial bias was found across conditions for both models, but the CCMM-LVR model resulted in the least relative parameter bias and more efficient estimates of the parameters, especially for larger numbers of clustering units. The results of the real data and simulation studies are discussed, along with the implications for applied researchers in deciding when to use the CCMM-LVR model rather than the misspecified HM3-LVR model.
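For context, the four outcome measures named in this abstract have standard definitions; below is a minimal sketch of how they might be computed across simulation replications (function and variable names are illustrative assumptions, not taken from the dissertation):

```python
import numpy as np

def simulation_outcomes(estimates, ses, lowers, uppers, true_value):
    """Per-parameter summary over replications of a simulation study."""
    estimates = np.asarray(estimates)
    emp_sd = estimates.std(ddof=1)
    return {
        # relative parameter bias: (mean estimate - truth) / truth
        "rel_param_bias": (estimates.mean() - true_value) / true_value,
        # relative SE bias: mean reported SE vs. empirical SD of estimates
        "rel_se_bias": (np.mean(ses) - emp_sd) / emp_sd,
        # root mean square error around the true value
        "rmse": np.sqrt(np.mean((estimates - true_value) ** 2)),
        # coverage of the 95% intervals (nominal 0.95)
        "coverage": np.mean((np.asarray(lowers) <= true_value)
                            & (true_value <= np.asarray(uppers))),
    }

# Toy demonstration: 500 replications of a parameter whose true value is 0.5
rng = np.random.default_rng(0)
est = rng.normal(0.55, 0.10, 500)   # slightly biased estimates
se = np.full(500, 0.09)             # understated standard errors
print(simulation_outcomes(est, se, est - 1.96 * se, est + 1.96 * se, 0.5))
```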
22. Comparison of two aerial dispersion models for the prediction of chemical release associated with maritime accidents near coastal areas
Keong Kok, Teo, 11 March 2002
No description available.
23. Styles in business process modeling: an exploration and a model
Pinggera, Jakob; Soffer, Pnina; Fahland, Dirk; Weidlich, Matthias; Zugal, Stefan; Weber, Barbara; Reijers, Hajo A.; Mendling, Jan
Business process models are an important means to design, analyze, implement, and control business processes. As with every type of conceptual model, a business process model has to meet certain syntactic, semantic, and pragmatic quality requirements to be of value. For many years, such quality aspects were investigated by centering on the properties of the model artifact itself. Only recently has the process of model creation been considered a factor that influences the resulting model's quality. Our work contributes to this stream of research and presents an explorative analysis of the process of process modeling (PPM). We report on two large-scale modeling sessions involving 115 students, in which the act of model creation, i.e., the PPM, was automatically recorded. We conducted a cluster analysis on this data and identified three distinct styles of modeling. Further, we investigated how both task- and modeler-specific factors influence particular aspects of those modeling styles. Based on these findings, we propose a model that captures our insights. It lays the foundations for future research that may unveil how high-quality process models can be established through better modeling support and modeling instruction.
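A cluster analysis of this kind could, for instance, be run along the following lines; the features and method here are an assumed sketch, not the paper's actual procedure:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-session features distilled from the recorded PPM logs
# (the paper's actual feature set and method may differ).
# Columns: share of time adding content, share spent reconciling layout,
# number of long comprehension pauses.
features = np.array([
    [0.70, 0.10, 3],
    [0.40, 0.35, 9],
    [0.55, 0.20, 5],
    [0.65, 0.15, 4],
    [0.35, 0.40, 8],
    [0.50, 0.25, 6],
])

X = StandardScaler().fit_transform(features)
styles = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(styles)  # one modeling-style label per session
```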
24. A new formal and analytical process to product modeling (PPM) method and its application to the precast concrete industry
Lee, Ghang, 08 November 2004
The current standard product (data) modeling process relies on the experience and subjectivity of data modelers, who use their experience to eliminate redundancies and identify omissions. As a result, product modeling becomes a social activity that involves iterative review processes by committees. This study aims to develop a new, formal method for deriving product models from data collected in process models of companies within an industry sector. The theoretical goals of this study are to provide a scientific foundation to bridge the requirements collection phase and the logical modeling phase of product modeling, and to formalize the derivation and normalization of a product model from the processes it supports. To achieve these goals, a new and formal method, Georgia Tech Process to Product Modeling (GTPPM), has been proposed. GTPPM consists of two modules. The first module, called the Requirements Collection and Modeling (RCM) module, provides semantics and a mechanism to define a process model, the information items used by each activity, and the information flow between activities. The logic to dynamically check the consistency of information flow within a process has also been developed. The second module, called the Logical Product Modeling (LPM) module, integrates, decomposes, and normalizes information constructs collected from a process model into a preliminary product model. Nine design patterns are defined to resolve conflicts between information constructs (ICs) and to normalize the resultant model. These two modules have been implemented as a Microsoft Visio™ add-on; the tool has been registered and is also called GTPPM™. The method has been tested and evaluated in the precast concrete sector of the construction industry through several GTPPM modeling efforts. By using GTPPM, a complete set of information items required for product modeling for a medium or large industry can be collected without generalizing each company's unique process into one unified high-level model. However, the use of GTPPM is not limited to product modeling. It can also be deployed in several other areas, including workflow management system or MIS (Management Information System) development, software specification development, and business process re-engineering.
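As an illustration of the kind of information-flow consistency check the RCM module performs, the sketch below verifies that every item an activity consumes is produced by some upstream activity; the activity names and data structures are invented for the example:

```python
# Activities in execution (topological) order, each with the information
# items it produces and consumes. All names are illustrative.
activities = ["design", "detail", "fabricate"]
produces = {"design": {"layout"}, "detail": {"piece drawing"},
            "fabricate": set()}
consumes = {"design": set(), "detail": {"layout"},
            "fabricate": {"piece drawing", "erection plan"}}

available = set()
for act in activities:
    missing = consumes[act] - available
    if missing:
        print(f"{act}: consumes items never produced upstream: {missing}")
    available |= produces[act]
# -> fabricate: consumes items never produced upstream: {'erection plan'}
```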
25. Modeling high-genus surfaces
Srinivasan, Vinod, 30 September 2004
The goal of this research is to develop new methods for creating very high-genus 2-manifold meshes. The various approaches investigated in this research can be categorized into two groups: interactive methods, where the user primarily controls the creation of the high-genus mesh, and automatic methods, where there is minimal user interaction and the program automatically creates the high-genus mesh.
In the interactive category, two different methods have been developed. The first allows the creation of multi-segment, curved handles between two different faces, which can belong to the same mesh or to geometrically distinct meshes. The second method, which is referred to as "rind modeling", provides for easy creation of surfaces resembling peeled and punctured rinds.
The automatic category also includes two different methods. The first one automates the process of creating generalized Sierpinski polyhedra, while the second one allows the creation of Menger sponge-type meshes.
Efficient and robust algorithms for these approaches and user-friendly tools for these algorithms have been developed and implemented.
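For reference, the genus of a closed, connected, orientable mesh follows directly from its Euler characteristic, chi = V - E + F = 2 - 2g; a small sketch of this standard formula (not code from the thesis):

```python
def genus(num_vertices, num_edges, num_faces):
    """Genus of a closed, connected, orientable 2-manifold mesh from the
    Euler characteristic: chi = V - E + F = 2 - 2g."""
    chi = num_vertices - num_edges + num_faces
    assert chi % 2 == 0 and chi <= 2, "not a closed connected orientable surface"
    return (2 - chi) // 2

print(genus(8, 12, 6))    # cube: genus 0
print(genus(16, 32, 16))  # 4x4 quad mesh of a torus: genus 1
```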
26. Development of agent-based models for healthcare: applications and critique
Demianyk, Bryan C.P., January 2010
Agent-based modeling (ABM) is a modeling and simulation paradigm well-suited to social systems where agents interact and have some degree of autonomy. In their most basic sense, ABMs consist of agents (generally individuals) interacting in an environment according to a set of behavioural rules. The foundational premise of ABM, and the source of its conceptual depth, is that simple rules of individual behaviour aggregate to illuminate complex and/or emergent group-level phenomena that are not specifically encoded by the modeler and that cannot be predicted or explained by the agent-level rules. In essence, ABM has the potential to reveal a whole that is greater than the sum of its parts. In this thesis, ABMs have been utilized as a modeling framework for three specific healthcare applications:
• the development of an ABM of an emergency department within a hospital, allowing the modeling of contact-based infectious diseases such as influenza and the simulation of various mitigation strategies;
• the development of an ABM to model the effectiveness of a real-time location system (RTLS) using radio frequency identification (RFID) in an emergency department, used for patient tracking as one measure of hospital efficiency; and,
• the development of an ABM to test strategies for disaster preparedness (high-volume, high-risk patients) using a fictitious case of zombies in an emergency department.
Although each ABM was purposeful and meaningful for its custom application, each also represented an iteration toward the development of a generic ABM framework. Finally, a thorough critique of ABMs is provided, along with the modifications required to create a more robust framework.
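A deliberately minimal sketch of a contact-based infection ABM in the spirit of the emergency-department model described above is shown below; the real model's agents, environment, and behavioural rules are far richer, and all numbers here are invented:

```python
import random

random.seed(1)
N_AGENTS, STEPS, P_TRANSMIT = 50, 500, 0.05

infected = [False] * N_AGENTS
infected[0] = True  # one index case

for _ in range(STEPS):
    a, b = random.sample(range(N_AGENTS), 2)  # a random pairwise contact
    if infected[a] != infected[b] and random.random() < P_TRANSMIT:
        infected[a] = infected[b] = True      # transmission on contact

print(sum(infected), "of", N_AGENTS, "agents infected after", STEPS, "contacts")
```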
27. Multiscale Modeling of Multiphase Polymers
Lawrimore, William Brantley, 12 August 2016
Accurately simulating material systems in a virtual environment requires the synthesis and utilization of all relevant information regarding performance mechanisms occurring in the material over a range of length and time scales. Multiscale modeling is the basis of the Integrated Computational Materials Engineering (ICME) paradigm and is a powerful tool for accurate material simulations. However, while ICME has experienced adoption in the metals community, it has not gained traction in polymer research. This thesis seeks to establish a hierarchical multiscale modeling methodology for simulating polymers containing secondary phases. The investigation laid out in the chapters below uses mesoscopic Finite Element Analysis (FEA) as a foundation on which to build a multiscale modeling methodology for polymer material systems. At the mesoscale, a Design of Experiments (DOE) parametric study utilizing FEA of polymers containing defects compared the relative impacts of a selection of parameters on damage growth and coalescence in polymers. Of the parameters considered, the applied stress state proved to be the most crucial parameter affecting damage growth and coalescence. At the macroscale, the significant influence of the applied stress state on damage growth and coalescence in polymers (upscaled from the mesoscale) motivated an expansion of the stress state sensitivity of the Bouvard Internal State Variable (ISV) polymer model (Bouvard et al. 2013). Deviatoric stress invariants were utilized to modify the Bouvard ISV model to account for asymmetry in polymer mechanical performance across different stress states (tension, compression, torsion). Lastly, this work implements a hierarchical multiscale modeling methodology to examine the parametric effects of heterogeneities on the mechanical performance of Polymer/Clay Nanocomposites (PCNs) under uncertainty. A Virtual Composite Structure Generator (VCSG) built three-dimensional periodic Representative Volume Elements (RVEs) coupled to the Bouvard ISV model and a Cohesive Zone Model (CZM), which featured a Traction-Separation (T-S) rule calibrated to results upscaled from Molecular Dynamics (MD) simulations. A DOE parametric examination utilized the RVEs to determine the relative effects of a selection of parameters on the mechanical performance of PCNs. DOE results determined that nanoclay particle orientation was the most influential parameter affecting PCN elastic modulus, while intercalated interlamellar gallery strength had the greatest influence on PCN yield stress.
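The deviatoric stress invariants mentioned above are standard quantities in plasticity modeling; a minimal sketch of computing them for a given stress tensor follows (the exact form used in the modified Bouvard ISV model may differ):

```python
import numpy as np

def deviatoric_invariants(sigma):
    """J2 and J3 of the deviatoric part of a 3x3 Cauchy stress tensor;
    these invariants are commonly used to build stress-state
    (tension/compression/torsion) sensitivity into material models."""
    s = sigma - np.trace(sigma) / 3.0 * np.eye(3)  # deviatoric stress
    J2 = 0.5 * np.tensordot(s, s)                  # J2 = (1/2) s:s
    J3 = np.linalg.det(s)                          # J3 = det(s)
    return J2, J3

# Uniaxial tension and simple shear give different J3 values,
# which is what lets a model distinguish stress states.
print(deviatoric_invariants(np.diag([100.0, 0.0, 0.0])))
print(deviatoric_invariants(np.array([[0.0, 50.0, 0.0],
                                      [50.0, 0.0, 0.0],
                                      [0.0, 0.0, 0.0]])))
```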
28. Computational Software for Building Biochemical Reaction Network Models with Differential Equations
Allen, Nicholas A., 20 December 2005
The cell is a highly ordered and intricate machine within which a wide variety of chemical processes take place. The full scientific understanding of cellular physiology requires accurate mathematical models that depict the temporal dynamics of these chemical processes. Modelers build mathematical models of chemical processes primarily from systems of differential equations. Although developing new biological ideas is more of an art than a science, constructing a mathematical model from a biological idea is largely mechanical and automatable.
This dissertation describes the practices and processes that biological modelers use for modeling and simulation. Computational biologists struggle with existing tools for creating models of complex eukaryotic cells. This dissertation develops new processes for biological modeling that make model creation, verification, validation, and testing less of a struggle. This dissertation introduces computational software that automates parts of the biological modeling process, including model building, transformation, execution, analysis, and evaluation. User and methodological requirements heavily affect the suitability of software for biological modeling. This dissertation examines the modeling software in terms of these requirements.
Intelligent, automated model evaluation shows a tremendous potential to enable the rapid, repeatable, and cost-effective development of accurate models. This dissertation presents a case study indicating that automated model evaluation can reduce the evaluation time for a budding yeast model from several hours to a few seconds, representing a more than 1000-fold improvement. Although constructing an automated model evaluation procedure requires considerable domain expertise and skill in modeling and simulation, applying an existing automated model evaluation procedure does not. With this automated model evaluation procedure, the computer can then search for and potentially discover models superior to those that the biological modelers developed previously.
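To make the kind of transformation the software automates concrete, here is a minimal sketch of turning a mass-action reaction network into a system of ODEs and simulating it; the toy network and rate constants are invented, not taken from the dissertation:

```python
from scipy.integrate import solve_ivp

# Toy mass-action network (NOT the budding-yeast model):
#   A -> B        with rate constant k1
#   B + B -> C    with rate constant k2
k1, k2 = 0.5, 0.1

def rhs(t, y):
    A, B, C = y
    v1 = k1 * A      # flux of A -> B
    v2 = k2 * B * B  # flux of B + B -> C
    return [-v1, v1 - 2.0 * v2, v2]  # stoichiometry applied per species

sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0, 0.0])
print(sol.y[:, -1])  # species concentrations at t = 20
```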
29. Analysis-ready models of tortuous, tightly packed geometries
Edwards, John Martin, 22 September 2014
Complex networks of cells called neurons in the brain enable human learning and memory. The topology and electrophysiological function of these networks are affected by the nano- and microscale geometries of neurons. Understanding these structure-function relationships in neurons is an important component of neuroscience, in which simulation plays a fundamental role. This thesis addresses four specific geometric problems raised by modeling and simulation of intricate neuronal structure and behavior at the nanoscale. The first two problems deal with 3D surface reconstruction: neurons are geometrically complex structures that are tightly intertwined in the brain, presenting great challenges in reconstruction. We present the first algorithm that reconstructs surface meshes from polygonal contours and provably guarantees watertight, manifold, and intersection-free forests of densely packed structures. Many algorithms exist that produce surfaces from cross-sectional contours, but all either use heuristics in fitting the surface or fail when presented with tortuous objects in close proximity. Our algorithm reconstructs surfaces that are not only internally correct but also free of intersections with other reconstructed objects in the same region. We also present a novel surface remeshing algorithm suitable for models of neuronal dual space. The last two problems treated by this thesis deal with producing derivative models from surface meshes. A range of neuronal simulation methodologies exist, and we offer a framework to derive appropriate models for each from surface meshes. We present two specific algorithms that yield analysis-ready 1D cable models in one case and proposed "aligned cell" models in the other. In the creation of aligned cells we also present a novel adaptive distance transform. Finally, we present a software package called VolRoverN in which we have implemented many of our algorithms and which we expect will serve as a repository of important tools for the neuronal modeling community. Our algorithms are designed to meet the immediate needs of the neuroscience community, but as we show in this thesis, they are general and suitable for a variety of applications.
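For orientation, 1D cable models like those mentioned above ultimately feed the classical passive cable equation, C_m dV/dt = (a / (2 R_i)) d2V/dx2 - V / R_m; the sketch below uses an explicit finite-difference scheme with illustrative parameters and crude boundary handling (none of it is code from the thesis):

```python
import numpy as np

# Explicit finite-difference sketch of the passive cable equation.
# All parameter values are illustrative, not physiologically tuned.
a, R_i, R_m, C_m = 2e-4, 100.0, 1e4, 1.0  # radius (cm), ohm*cm, ohm*cm^2, uF/cm^2
n, dx, dt, steps = 100, 1e-3, 1e-6, 1000  # compartments, cm, s, iterations

V = np.zeros(n)
V[0] = 10.0  # depolarize one end by 10 mV
coef = a / (2.0 * R_i)
for _ in range(steps):
    d2V = np.roll(V, -1) - 2.0 * V + np.roll(V, 1)  # discrete d2V/dx2 * dx^2
    d2V[0] = d2V[-1] = 0.0                          # crude sealed-end boundaries
    V = V + (dt / C_m) * (coef * d2V / dx**2 - V / R_m)

print(V[:5])  # voltage near the stimulated end
```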
30. Sediment Transport and Pathogen Indicator Modeling in Lake Pontchartrain
Chilmakui, Chandra Sekhar, 20 January 2006
A nested three-dimensional numerical modeling application was developed to determine the fate of pathogen indicators discharged into Lake Pontchartrain from its tributaries. To accomplish this, the Estuarine, Coastal and Ocean Model with Sediment (ECOMSED) was implemented to simulate the processes that determine the fate and transport of fecal coliform bacteria in the lake: hydrodynamics, waves, sediment transport, and the decay and transport of the fecal coliforms. Wind and tidal effects were accounted for, along with the freshwater inflows. All components of the modeling application were calibrated and validated using measured data sets. Field measurements of conventional water quality parameters and fecal coliform levels were used to calibrate and validate the pathogen indicator transport; the decay of the fecal coliforms was based on the literature and laboratory tests. The sediment transport module was calibrated against satellite reflectance data for the lake. The north shore near-field model indicated that the fecal coliform plume can be highly dynamic and sporadic depending on wind and tide conditions. It also showed that the period of impact of a storm event on fecal coliform levels in the lake can range from 1.5 days for a typical summer event to 4 days for an extreme winter event. The model studies showed that the zone of impact of the stormwater from the river was limited to a few hundred meters from the river mouth. Finally, the modeling framework developed for the north shore was successfully applied to the south shore of Lake Pontchartrain to simulate the fate and transport of fecal coliforms discharged through the urban stormwater outfalls.
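Fecal coliform die-off is commonly represented as first-order decay, C(t) = C0 * exp(-k * t); a trivial sketch with an assumed decay rate (the dissertation's actual rates came from literature values and laboratory tests):

```python
import numpy as np

C0 = 1.0e4                    # initial fecal coliform count (CFU/100 mL)
k = 1.0                       # decay rate (1/day), illustrative only
t = np.linspace(0.0, 4.0, 5)  # days, spanning the 1.5-4 day impact periods
print(C0 * np.exp(-k * t))    # remaining concentration at each day
```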