341 |
Long-term Effects of Notch1 Signaling on Neural Stem Cells following Traumatic Brain Injury
Sevilla, Cruz, Jr 01 January 2019 (has links)
Traumatic brain injury (TBI) is a devastating problem that stands as a leading cause of death and disability. The elderly are significantly affected by TBI, typically as the result of falls, and their recovery is especially limited. This is associated, in part, with decreased tissue-specific stem cell regeneration and replacement of damaged cells in the aged brain. The diminished ability of the aged brain to recover is especially devastating after TBI, likely leading to permanent loss of sensory, motor, and cognitive functions. Studies have shown that the mature mammalian brain contains Neural Stem Cells (NSCs), found in specific regions of the brain, which can generate functional neurons under normal and pathological conditions. Two of those regions, the Dentate Gyrus (DG) of the hippocampus and the Subventricular Zone (SVZ) of the lateral ventricles, have proven to be niches for these multipotent NSCs. A key regulator in the maintenance of these NSCs is the Notch signaling pathway, shown to control proliferation, differentiation, and apoptosis of NSCs during development and throughout adulthood. In the current study, we assessed the regulatory mechanisms that drive the regenerative functions of NSCs in a neuropathological state following TBI. Using the Lateral Fluid Percussion Injury (FPI) model, we analyzed the diffuse effects of the injury response in 3-month-old male Sprague-Dawley rats. Immediately following TBI, a Notch agonist, antagonist, or vehicle was infused into the lateral ventricle for 7 days to assess the role of Notch signaling in neural stem cell proliferation/survival and neurogenesis at 30 days post-TBI. Cells dividing during the infusion period were labeled with BrdU via single daily intraperitoneal injections for 7 days. Animals were sacrificed at 30 days post-injury and brain tissues were processed and then immunolabeled for BrdU and Doublecortin (DCX). We found a higher number of BrdU-positive cells in the FPI+Notch1 agonist group when compared to the Sham and FPI+Jagged-1 Fc antagonist groups in the contralateral granular zone. A significant increase in proliferation/survival was also seen for FPI+Notch1 versus Sham/FPI+Jagged-1 Fc animals and for FPI+Vehicle versus Sham animals in both the ipsilateral and contralateral hilus. DCX immunolabeling did not establish a significant difference in FPI+Notch1 compared to Sham animals, nor across any other groups, which is consistent with what we know of activation of the Notch pathway. Our results demonstrate that Notch1 signaling is directly involved in the cellular proliferation/survival of NSCs in the DG following TBI at 30 days post-injury, but further work must be done to understand the fate of these cells. Thus, drugs targeting Notch1 signaling could serve as a potential therapeutic approach following TBI to preserve NSCs and limit long-term cognitive deficits.
|
342 |
High Dimensional Non-Linear Optimization of Molecular Models
Fogarty, Joseph C. 20 November 2014 (has links)
Molecular models allow computer simulations to predict the microscopic properties of macroscopic systems. Molecular modeling can also provide a fully understood test system for the application of theoretical methods. The power of a model lies in the accuracy of the parameter values that govern its mathematical behavior. In this work, a new software package for general high-dimensional non-linear optimization, called ParOpt, is presented. The software provides a very general framework for the optimization of a wide variety of parameter sets, and it is especially powerful when applied to the difficult task of molecular model parameter optimization. Three applications of the ParOpt software, and of the Nelder-Mead algorithm implemented within it, are presented: a coarse-grained (CG) water-ion model, a model for the determination of lipid bilayer structure via the interpretation of scattering data, and a reactive molecular dynamics (ReaxFF) model for oxygen and hydrogen. Each problem presents specific difficulties. The power and generality of the ParOpt software is illustrated by the successful optimization of such a diverse set of problems.
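As a rough, hedged sketch of the kind of derivative-free search that this sort of parameter fitting relies on, the fragment below drives SciPy's Nelder-Mead implementation against a toy least-squares objective. The two-parameter pair-potential fit and all reference data are invented for illustration and are not the objectives or parameter sets used in the thesis.

    # Hypothetical sketch: recover two pair-potential parameters (epsilon, sigma)
    # by minimizing the misfit to a set of reference energies with Nelder-Mead.
    import numpy as np
    from scipy.optimize import minimize

    r_ref = np.linspace(0.9, 2.0, 12)                            # reference separations (reduced units)
    e_ref = 4.0 * 1.0 * ((1.0 / r_ref)**12 - (1.0 / r_ref)**6)   # "target" energies (eps=1, sigma=1)

    def objective(params):
        eps, sig = params
        e_model = 4.0 * eps * ((sig / r_ref)**12 - (sig / r_ref)**6)
        return np.sum((e_model - e_ref)**2)                      # least-squares misfit

    result = minimize(objective, x0=[0.5, 1.3], method="Nelder-Mead",
                      options={"xatol": 1e-6, "fatol": 1e-8})
    print(result.x)                                              # should recover roughly (1.0, 1.0)

The same pattern, with a simulation-driven objective substituted for the analytic one, carries over to the higher-dimensional parameter sets described above.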
|
343 |
A study of the perception of the impact of modeling on the development of commitment to action in Decision Conferencing.
Wood, Margaret Ann January 2004 (has links)
Managers are increasingly faced with making complex decisions in turbulent organisational environments, which has led to greater information-processing demands. Organisations increasingly deal with this by making many of these decisions in a group environment. The growth in group decision making has generated a corresponding intensification of interest in the options available to support it. One such approach is a Group Decision Support System (GDSS) referred to as Decision Conferencing. However, Decision Conferencing rests on the unsupported key premise that the computer modeling which forms an intrinsic part of the process leads to shared understanding and commitment - the stated goals of the process. The application of Decision Conferencing to important organisational issues continues, yet prior to this study its fundamental premise was both empirically unsupported and potentially under-theorised. This theory-building research demonstrates that the interface between these concepts is more complex than the literature suggests and that the concepts themselves are problematic. Shared understanding is essentially a dependent variable, with factors such as comprehension of the modeling process affecting the degree to which it is developed. In addition, many aspects of commitment fall outside the domain of the Decision Conference workshop, e.g. the individual's sense of responsibility and degree of commitment to their profession. The idea of commitment appears to fall more into the arena of managerial responsibility and change management, and it is partly how the outcomes are managed after the Decision Conference that will be crucial to their implementation.

Within this study it appears that the most a Decision Conference can offer is the 'buy-in' or constructive involvement of the individual participant; the assurance of an unassailable case, to which all participants have contributed, for the adoption of the outcomes; and the confidence in the outcomes that this brings. All of this suggests that a higher-order goal which subsumes these factors should be considered when re-conceptualising the Decision Conferencing experience. It is suggested here that Decision Quality is a more appropriate goal for the Decision Conferencing process; in essence this is an expansion of the existing 'best bet' concept already endorsed in the Decision Conferencing literature. The thesis presents a number of conditions for assuring decision quality, e.g. a democratic environment for decision making, mutual respect, and an encouragement of diversity, and it is argued that it falls to the facilitator to encompass all of these factors. Given the above, it is also suggested that it is appropriate to consider an alternative conceptualisation of Decision Conferencing which facilitators of public sector groups might adopt. This revised conceptualisation is drawn from complexity theory. Incorporating the findings from this study, a more strongly theorised facilitation approach, entitled Quality Facilitation Practice (QFP), has been developed. Taking all of the above into account, a revised model for Decision Conferencing in the public sector is presented, incorporating both QFP and the higher-order goal of Decision Quality.
|
344 |
Integrative methods for gene data analysis and knowledge discovery on the case study of KEDRI's brain gene ontology
Wang, Yuepeng January 2008 (has links)
In 2003, Pomeroy et al. published a research study that described a gene expression-based prediction of central nervous system (CNS) embryonal tumour outcome. Over half a decade, many models and approaches have been developed based on experimental data consisting of 99 samples with 7,129 genes. How meaningful knowledge can be extracted from these models, and how this knowledge can be applied in further research, is still a hot topic. This thesis addresses these questions by developing an informatics method that includes modelling of interactive patterns, discovery of important genes, and visualisation of the obtained knowledge. The major goal of this thesis is to discover important genes responsible for CNS tumour and to import these genes into a well-structured knowledge framework system, called Brain-Gene-Ontology (BGO). In this thesis, we take the first step towards finding the most accurate model for analysing the CNS tumour by offering a comparative study of global, local and personalised modelling. Five traditional modelling approaches and a new personalised method – WWKNN (weighted distance, weighted variables K-nearest neighbours) – are investigated. To increase classification accuracy, a one-vs.-all based signal-to-noise ratio is also developed for pre-processing the experimental data. For knowledge discovery, a CNS-based ontology system is developed. Through ontology analysis, 21 discriminant genes are found to be relevant for the different CNS tumour classes, the medulloblastoma tumour subclass and medulloblastoma treatment outcome. All the findings in this thesis contribute to expanding the information space of the BGO framework.
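A minimal, hypothetical sketch of the one-vs.-all signal-to-noise ranking idea is shown below. The random arrays merely stand in for the 99-sample, 7,129-gene matrix, the class labels are placeholders, and the cut-off of 21 genes simply echoes the count reported above rather than any tuned threshold.

    # Sketch of one-vs.-all signal-to-noise (SNR) ranking for gene selection.
    # X is samples x genes, y holds class labels; the data here are random placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(99, 7129))        # 99 samples, 7,129 genes (shape as in the CNS data set)
    y = rng.integers(0, 5, size=99)        # multi-class labels (placeholder)

    def one_vs_all_snr(X, y, target_class):
        in_cls = X[y == target_class]
        rest   = X[y != target_class]
        mu1, mu2 = in_cls.mean(axis=0), rest.mean(axis=0)
        sd1, sd2 = in_cls.std(axis=0) + 1e-12, rest.std(axis=0) + 1e-12
        return (mu1 - mu2) / (sd1 + sd2)   # one SNR value per gene

    snr = one_vs_all_snr(X, y, target_class=0)
    top_genes = np.argsort(np.abs(snr))[::-1][:21]   # e.g. keep the 21 highest-ranked genes
    print(top_genes)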
|
345 |
Methodology to determine airport check-in counter arrangements
Ahyudanari, Ervina, Civil & Environmental Engineering, Faculty of Engineering, UNSW January 2003 (has links)
The check-in area is an important component of an airport. All passengers, except transit passengers and remote check-in passengers, have to pass through this area prior to their departure. Passenger convenience in this area is therefore essential if airports are to attract more customers. To estimate check-in counter arrangements, this thesis introduces a method based on spreadsheet software packages. Two programs are developed to assist the optimization computations. The programs provide the optimum number of servers required at the airport, which helps airport management select the number of counters to open at a given time. The results of running these two programs indicate that variables such as the earliness distribution, service time, queue system, and check-in counter sizes and configurations have a strong influence on overall cost. A number of applications have been attempted and the relevant distributions have been explored. The results also demonstrate that, under the conditions imposed, the multiple-queue system produces a shorter maximum queue length but a longer waiting time than the single-queue system.
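The thesis itself implements its computations in spreadsheet programs; purely as an illustration of the underlying queueing trade-off, the sketch below sizes a counter bank with the textbook M/M/c (Erlang-C) waiting-time formula. The arrival rate, service rate and waiting-time target are invented numbers, not figures from the study.

    # Textbook M/M/c (Erlang-C) sketch: smallest number of check-in counters that
    # keeps the mean queueing time under a target. Rates below are invented.
    import math

    def erlang_c_wait(lam, mu, c):
        """Mean waiting time in queue for an M/M/c system
        (lam = arrivals per minute, mu = service rate per counter, c = counters)."""
        a = lam / mu                      # offered load
        rho = a / c
        if rho >= 1.0:
            return float("inf")           # unstable: queue grows without bound
        summ = sum(a**k / math.factorial(k) for k in range(c))
        p_wait = (a**c / math.factorial(c)) / ((1 - rho) * summ + a**c / math.factorial(c))
        return p_wait / (c * mu - lam)    # Wq, in minutes

    lam, mu, target = 3.0, 0.5, 2.0       # 3 pax/min arriving, 2 min mean service, 2 min target wait
    c = 1
    while erlang_c_wait(lam, mu, c) > target:
        c += 1
    print(c, erlang_c_wait(lam, mu, c))   # e.g. 7 counters, ~1.2 min mean wait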
|
346 |
Some Concepts of Estuarine Modeling
Jönsson, Bror January 2005 (has links)
If an estuarine system is to be investigated using an oceanographic modeling approach, a decision must be made whether to use a simple and robust framework based on e.g. mass-balance considerations, or whether a more advanced process-resolving three-dimensional (3-D) numerical model is necessary. Although the former is straightforward to apply, certain fundamental constraints must be fulfilled. 3-D modeling, even though it requires significant effort to implement, generates an abundance of highly resolved data in time and space, which may lead to problems when attempting to specify the "representative state" of the system, a common goal in estuarine studies.

In this thesis, different types of models suitable for investigating estuarine systems have been utilized in various settings. A mass-balance model was applied to investigate potential changes of water fluxes and salinities due to the restoration of a mangrove estuary in northern Colombia. Seiches, i.e. standing waves, in the Baltic Sea were simulated using a 2-D shallow-water model, which showed that the dominating harmonic oscillation originates from a fjord seiche in the Gulf of Finland rather than being global. Another study pertaining to the Gulf of Finland used velocity fields from a 3-D numerical model together with Lagrangian-trajectory analyses to investigate the mixing dynamics; the results showed that water from the Baltic proper is mixed with that from the river Neva over a limited zone in the inner parts of the Gulf. Lagrangian-trajectory analysis was finally also used as a tool to compare mass-balance and 3-D model results from the Gulf of Riga and the Bay of Gdansk, highlighting when and where each method is applicable.

From the present thesis it can be concluded that the estuarine-modeling approaches described above not only require different levels of effort for their implementation, but also yield results of varying quality. If oceanographic aspects are to be taken into account within Integrated Coastal Zone Management, which most likely should be the case, it is therefore important to decide as early as possible in the planning process which model to use, since this choice ultimately determines how much information about the physical processes characterizing the system the model can be expected to provide.
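As a hedged illustration of the mass-balance end of this spectrum, the sketch below evaluates the classical Knudsen relations for a two-layer estuary. The freshwater discharge and layer salinities are invented numbers, not values from any of the studies summarized above.

    # Knudsen-relation sketch: steady-state water and salt budgets for a two-layer estuary.
    # Q_f is river (freshwater) inflow; S_in/S_out are salinities of the in- and outflowing layers.
    def knudsen_fluxes(Q_f, S_in, S_out):
        """Return (Q_in, Q_out) from volume conservation Q_out = Q_in + Q_f
        and salt conservation Q_out * S_out = Q_in * S_in."""
        Q_in = Q_f * S_out / (S_in - S_out)
        Q_out = Q_f * S_in / (S_in - S_out)
        return Q_in, Q_out

    Q_in, Q_out = knudsen_fluxes(Q_f=500.0, S_in=35.0, S_out=20.0)   # m^3/s and psu, illustrative
    print(Q_in, Q_out)   # deep inflow ~667 m^3/s, surface outflow ~1167 m^3/s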
|
347 |
Metallic Cluster Coalescence: Molecular Dynamics Simulations of Boundary Formation
Takahashi, A. R., Thompson, Carl V., Carter, W. Craig 01 1900 (has links)
During the evaporative deposition of polycrystalline thin films, the development of a tensile stress at small film thicknesses is associated with island coalescence. Several continuum models exist to describe the magnitude of this tensile stress but the coalescence stress becomes significant at small enough thicknesses to draw the continuum models into question. For nanometer-sized islands, we perform atomistic simulations of island coalescence to determine if the atomistic methods and continuum models are mutually consistent. The additional detail provided by the atomistic simulations allows for study of the kinetics of island coalescence and the treatment of different crystallographic orientations. We find that the atomistic simulations are consistent with the continuum models. We also note that the atomistic simulations predict extremely fast coalescence times and include the possibility of island rotations during coalescence. / Singapore-MIT Alliance (SMA)
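For readers unfamiliar with the simulation style, the fragment below is a toy molecular-dynamics sketch (reduced Lennard-Jones units, NVE ensemble, velocity-Verlet integration) in which two small clusters are allowed to approach and merge. It uses a generic Lennard-Jones potential and invented cluster geometries, not the metallic potentials, island sizes, or substrate conditions of the study.

    # Toy NVE molecular dynamics of two 4-atom Lennard-Jones clusters coalescing.
    import numpy as np

    def lj_forces(pos, eps=1.0, sig=1.0):
        """Pairwise Lennard-Jones forces and total potential energy (O(N^2) loop)."""
        n = len(pos)
        f = np.zeros_like(pos)
        pot = 0.0
        for i in range(n - 1):
            d = pos[i] - pos[i+1:]                           # vectors from later atoms to atom i
            r2 = np.sum(d * d, axis=1)
            inv6 = (sig * sig / r2)**3                       # (sigma/r)^6
            pot += np.sum(4.0 * eps * (inv6**2 - inv6))
            fmag = 24.0 * eps * (2.0 * inv6**2 - inv6) / r2  # -(1/r) dU/dr
            fij = fmag[:, None] * d                          # force on atom i from each j
            f[i] += fij.sum(axis=0)
            f[i+1:] -= fij
        return f, pot

    # two near-tetrahedral 4-atom clusters, separated along x (geometry is illustrative)
    cluster = np.array([[0, 0, 0], [1.1, 0, 0], [0.55, 0.95, 0], [0.55, 0.32, 0.9]], float)
    pos = np.vstack([cluster, cluster + [3.0, 0.0, 0.0]])
    vel = np.zeros_like(pos)
    dt, mass = 0.005, 1.0

    f, _ = lj_forces(pos)
    for step in range(2000):                                 # velocity-Verlet loop
        pos += vel * dt + 0.5 * f / mass * dt**2
        f_new, pot = lj_forces(pos)
        vel += 0.5 * (f + f_new) / mass * dt
        f = f_new
    print(pot)                                               # potential energy after the clusters merge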
|
348 |
Atomistic Simulations of Metallic Cluster Coalescence
Takahashi, A. R., Thompson, Carl V., Carter, W. Craig 01 1900 (has links)
A new computational method is introduced to investigate the stresses developed in the island-coalescence stage of polycrystalline film formation during deposition. The method uses molecular dynamics to examine the behavior of clusters of atoms both in free space and on substrates. Continuum treatments used in previous models may not be applicable at small length scales or low dimensionality. In atomistic simulations, the effects of surface diffusion, bond straining and defect formation can be directly studied. TEM experiments will be used to evaluate the validity of the simulation model. / Singapore-MIT Alliance (SMA)
|
349 |
A Two-level Prediction Model for Deep Reactive Ion Etch (DRIE)
Taylor, Hayden K., Sun, Hongwei, Hill, Tyrone F., Schmidt, Martin A., Boning, Duane S. 01 1900 (has links)
We contribute a quantitative and systematic model to capture etch non-uniformity in deep reactive ion etch of microelectromechanical systems (MEMS) devices. Deep reactive ion etch is commonly used in MEMS fabrication where high-aspect ratio features are to be produced in silicon. It is typical for many supposedly identical devices, perhaps of diameter 10 mm, to be etched simultaneously into one silicon wafer of diameter 150 mm. Etch non-uniformity depends on uneven distributions of ion and neutral species at the wafer level, and on local consumption of those species at the device, or die, level. An ion–neutral synergism model is constructed from data obtained from etching several layouts of differing pattern opening densities. Such a model is used to predict wafer-level variation with an r.m.s. error below 3%. This model is combined with a die-level model, which we have reported previously, on a MEMS layout. The two-level model is shown to enable prediction of both within-die and wafer-scale etch rate variation for arbitrary wafer loadings. / Singapore-MIT Alliance (SMA)
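A hedged sketch of the general ion-neutral synergism idea is given below: the etch rate follows a series combination of ion-driven and neutral-driven terms, and the neutral flux is depleted as the exposed pattern density grows. The functional forms, loading law and coefficients here are generic textbook-style choices for illustration, not the calibrated wafer- and die-level model of the paper.

    # Sketch of an ion-neutral synergism etch-rate expression with a toy loading term.
    def etch_rate(ion_flux, neutral_flux, k_ion=1.0, k_neutral=0.05):
        """Series (synergistic) combination: 1/R = 1/(k_i*F_i) + 1/(k_n*F_n),
        so whichever reactant is scarcer limits the rate."""
        return 1.0 / (1.0 / (k_ion * ion_flux) + 1.0 / (k_neutral * neutral_flux))

    def neutral_flux_with_loading(open_fraction, F0=100.0, load=5.0):
        """Toy wafer-level loading: neutrals are consumed in proportion to exposed area."""
        return F0 / (1.0 + load * open_fraction)

    for rho in [0.05, 0.2, 0.5, 0.8]:                        # pattern opening density
        R = etch_rate(ion_flux=10.0, neutral_flux=neutral_flux_with_loading(rho))
        print(f"open fraction {rho:.2f}: relative etch rate {R:.2f}")

Running the loop shows the qualitative behaviour described above: as the exposed fraction rises, the neutral supply per unit area drops and the predicted etch rate falls.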
|
350 |
Investigating the effects of transportation infrastructure development on energy consumption and emissions
Achtymichuk, Darren S. 11 1900 (has links)
This study outlines the development of an emissions modeling process in which tractive-power-based emissions functions are applied to microscopic traffic simulation data. The model enables transportation planners to evaluate the effects of transportation infrastructure projects on emissions and fuel consumption, helping them select the projects that provide the greatest environmental return on investment. (A hedged sketch of the tractive-power calculation follows this abstract.)
Using the developed model, the performance of a set of simplified macroscopic velocity profiles used in an existing emissions model was evaluated. The profiles were found to underpredict vehicle emissions because of the low acceleration rates they assume.
To illustrate the use of the model in evaluating transportation infrastructure projects, the benefits of two potential development scenarios in a major transportation corridor were evaluated. Weighing the benefits provided by each scenario against its associated costs revealed that greenhouse gas emissions would be reduced at a cost an order of magnitude greater than the value of a carbon credit, suggesting that neither option is economical solely as a greenhouse gas emissions reduction tool.
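The sketch below illustrates only the tractive-power step referred to above: instantaneous tractive power is computed from a second-by-second velocity trace and mapped to fuel use through a placeholder linear function. The vehicle parameters, the synthetic velocity trace, and the emission coefficients are invented for illustration and are not the emissions functions developed in the thesis.

    # Compute second-by-second tractive power from a simulated velocity trace and map it
    # to fuel use with a placeholder linear function (all numbers are illustrative).
    import numpy as np

    m, g, Cr = 1500.0, 9.81, 0.012            # vehicle mass (kg), gravity, rolling-resistance coeff.
    rho_air, Cd, A = 1.2, 0.32, 2.2           # air density, drag coefficient, frontal area (m^2)

    t = np.arange(0, 60, 1.0)                 # 60 s of simulated driving at 1 Hz
    v = np.clip(1.5 * t, 0, 16.7)             # accelerate at 1.5 m/s^2, then cruise near 60 km/h
    a = np.gradient(v, t)

    P_tract = (m * a + m * g * Cr + 0.5 * rho_air * Cd * A * v**2) * v   # watts
    P_tract = np.maximum(P_tract, 0.0)        # ignore braking (no fuel modeled for negative power)

    fuel_rate = 0.3 + 8.0e-5 * P_tract        # placeholder: idle burn plus a linear power term (g/s)
    print(fuel_rate.sum(), "g of fuel over the 60 s trace (toy numbers)")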
|