131 |
Computational Studies of HIV-1 Protease Inhibitors. Schaal, Wesley, January 2002.
Human Immunodeficiency Virus (HIV) is the causative agent of the pandemic disease Acquired Immune Deficiency Syndrome (AIDS). HIV disrupts the immune system, which makes the body susceptible to opportunistic infections. Untreated, AIDS is generally fatal. Twenty years of research by countless scientists around the world has led to the discovery and exploitation of several targets in the replication cycle of HIV. Many lives have been saved, prolonged and improved as a result of this massive effort. One particularly successful target has been the inhibition of HIV protease. In combination with the inhibition of HIV reverse transcriptase, protease inhibitors have helped to reduce viral loads and partially restore the immune system. Unfortunately, viral mutations leading to drug resistance and harmful side-effects of the current medicines have identified the need for new drugs to combat HIV.
This study presents computational efforts to understand the interaction of inhibitors with HIV protease. The first part of this study has used molecular modelling and Comparative Molecular Field Analysis (CoMFA) to help explain the structure-activity relationship of a novel series of protease inhibitors. The inhibitors are sulfamide derivatives structurally similar to the cyclic urea candidate drug mozenavir (DMP-450). The central ring of the sulfamides twists to adopt a nonsymmetrical binding mode distinct from that of the cyclic ureas. The energetics of this twist have been studied with ab initio calculations to develop improved empirical force field parameters for use in molecular modelling.
The second part of this study has focused on an analysis of the association and dissociation kinetics of a broad collection of HIV protease inhibitors. Quantitative models have been derived using CoMFA which relate the dissociation rate back to the chemical structures. Efforts have also been made to improve the models by systematically varying the parameters used to generate them.
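To make the statistical core of the CoMFA step concrete, the sketch below fits a partial least squares (PLS) model relating grid-sampled field descriptors to activities and scores it with leave-one-out cross-validation; the descriptors, activities and data-set size are random placeholders, not values from this work.

```python
# Minimal sketch of the statistical core of a CoMFA-style 3D-QSAR analysis:
# field values sampled on a 3D grid around each aligned inhibitor are regressed
# against activity with partial least squares (PLS). Placeholder data only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
n_inhibitors, n_grid_points = 30, 500                       # hypothetical data-set size
fields = rng.normal(size=(n_inhibitors, n_grid_points))     # steric/electrostatic grid columns
activity = fields[:, :5].sum(axis=1) + 0.3 * rng.normal(size=n_inhibitors)  # pIC50-like response

# Leave-one-out cross-validated q2, the usual CoMFA model-quality measure.
press, ss_total = 0.0, float(np.sum((activity - activity.mean()) ** 2))
for train, test in LeaveOneOut().split(fields):
    pls = PLSRegression(n_components=3).fit(fields[train], activity[train])
    press += float((pls.predict(fields[test]).ravel()[0] - activity[test][0]) ** 2)
q2 = 1.0 - press / ss_total
print(f"leave-one-out q2 = {q2:.2f}")
```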
|
132 |
Distribution Patterns and Metapopulation Dynamics of Epiphytic Mosses and Lichens. Snäll, Tord, January 2003.
This thesis examines the relative importance of local conditions, dispersal, and the dynamics of the trees for epiphyte distribution patterns and colonization-extinction dynamics. The study species are the mosses Orthotrichum speciosum and O. obtusifolium, and the red-listed Neckera pennata. The thesis also includes an attempt to parameterize a model for a lichen metapopulation (Lobaria pulmonaria) in a dynamic landscape, based only on presence/absence data of the epiphyte and its host trees.
The results show that epiphyte colonization of trees is affected both by local conditions and by connectivity to occupied trees. The positive effect of connectivity, implying a restricted dispersal range, was established by both demographic and genetic studies. The important local conditions were tree diameter, vitality, and shade. Local extinctions from trees occurred among small trees with low local epiphyte abundance, but were more often the result of tree fall.
The observed importance of connectivity for epiphyte colonization agrees with the assumptions of the classic metapopulation model. However, the classic metapopulation model assumes that the landscape is static and that local extinctions occur for stochastic reasons. The dynamics of epiphytes are different. A new conceptual model is therefore suggested, the patch-tracking metapopulation model. It differs from the classic metapopulation model in that it includes the dynamics of the patches, and in that local extinctions occur only as patches are destroyed.
Simulations of the dynamics of N. pennata showed that its future metapopulation size will be overestimated unless the dynamics of the trees are accounted for. The simulation results further suggest that the dynamics of N. pennata can be characterised by the patch-tracking metapopulation model.
The attempt to parameterize the L. pulmonaria metapopulation model showed that more information is required for rigorous parameterization, preferably on the historical fire regime.
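As a minimal illustration of the patch-tracking idea, the sketch below simulates trees whose colonization probability grows with connectivity to occupied trees and whose local extinctions occur only through tree fall; the rates, dispersal kernel and landscape are assumed for illustration and are not the parameter values estimated in the thesis.

```python
# Minimal sketch of a patch-tracking metapopulation: trees (patches) appear and
# fall, colonization probability grows with connectivity to occupied trees, and
# local extinction happens only through tree fall. All values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_trees, years = 200, 100
alpha = 1.0 / 50.0                      # inverse dispersal range, 1/m (assumed)
c0, fall_rate = 0.02, 0.01              # colonization scaling and annual tree-fall rate (assumed)
xy = rng.uniform(0.0, 1000.0, size=(n_trees, 2))   # tree coordinates within a 1 km square
occupied = rng.random(n_trees) < 0.05              # initial epiphyte occupancy

for _ in range(years):
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    kernel = np.exp(-alpha * d) * occupied              # contributions from occupied trees
    connectivity = kernel.sum(axis=1) - occupied        # remove each tree's own contribution
    colonized = rng.random(n_trees) < 1.0 - np.exp(-c0 * connectivity)
    occupied = occupied | colonized
    fallen = rng.random(n_trees) < fall_rate            # patch destruction: the only extinction route
    occupied[fallen] = False
    xy[fallen] = rng.uniform(0.0, 1000.0, size=(int(fallen.sum()), 2))  # new empty trees replace them

print(f"occupied trees after {years} years: {int(occupied.sum())} of {n_trees}")
```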
|
134 |
Range Parameterized Bearings-only Tracking Using Particle Filter. Arslan, Ali Erkin, 01 September 2012.
In this study, accurate target tracking for the bearings-only tracking problem is investigated. A new tracking filter for this nonlinear problem is designed in which both range parameterization and Rao-Blackwellized (marginalized) particle filtering techniques are used in a Gaussian mixture formulation to track both constant-velocity and maneuvering targets. The idea of using the target turn rate in the state equation in such a way that marginalization is possible is elaborated. In addition to its nonlinear nature, unobservability is a major problem of bearings-only tracking. Observer trajectory generation to increase the observability of the bearings-only tracking problem is therefore studied. Novel formulations of observability measures, based on the mutual information between the state and the measurement sequences, are derived for the problem. These measures are used as objective functions to improve observability. Based on the results obtained, a better understanding of the observer trajectory required for accurate bearings-only target tracking is developed.
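For orientation, a bare-bones bootstrap particle filter for two-dimensional bearings-only tracking is sketched below; it omits the range parameterization and Rao-Blackwellization refinements of the study, and the motion model, noise levels and observer trajectory are illustrative assumptions.

```python
# Minimal bootstrap (SIR) particle filter for 2D bearings-only tracking of a
# constant-velocity target from a moving observer. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(2)
dt, steps, n_particles = 1.0, 60, 5000
sigma_bearing, sigma_acc = np.deg2rad(1.0), 0.05

F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]])
truth = np.array([2000.0, 1000.0, -4.0, 2.0])            # [x, y, vx, vy]
observer = np.array([[30.0 * k, 10.0 * np.sin(0.1 * k)] for k in range(steps)])  # maneuvering observer

# Diffuse prior over position and velocity (range is initially unobservable).
particles = np.column_stack([
    rng.uniform(500, 4000, n_particles), rng.uniform(-500, 3000, n_particles),
    rng.normal(0, 5, n_particles), rng.normal(0, 5, n_particles)])

for k in range(steps):
    truth = F @ truth
    z = np.arctan2(truth[1] - observer[k, 1], truth[0] - observer[k, 0]) + rng.normal(0, sigma_bearing)

    particles = particles @ F.T                                           # predict
    particles[:, 2:] += rng.normal(0, sigma_acc * dt, (n_particles, 2))   # process noise on velocity
    pred = np.arctan2(particles[:, 1] - observer[k, 1], particles[:, 0] - observer[k, 0])
    innov = np.angle(np.exp(1j * (z - pred)))                             # wrapped bearing innovation
    w = np.exp(-0.5 * (innov / sigma_bearing) ** 2)
    w /= w.sum()
    particles = particles[rng.choice(n_particles, n_particles, p=w)]      # multinomial resampling

print("true position:", truth[:2], " estimate:", particles.mean(axis=0)[:2])
```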
|
135 |
Application of the Weather Research and Forecasting (WRF) Model to Simulate a Squall Line: Implications of Choosing Parameterization Scheme Combinations and Model Initialization Data Sets. Gaines, Mitchell, 01 August 2012.
On January 29-30, 2008, a squall line of thunderstorms moved through the Ohio Valley, resulting in four deaths and one injury. Such events highlight the importance of accurate forecasting for public safety. Mesoscale modeling plays an important role in any forecast of a potential squall line. The focus of this study was to examine the performance of several parameterization scheme combinations in the Weather Research and Forecasting Model version three (WRF) as they related to this event. These examinations included cloud microphysics (WRF Single-Moment 3-class, 6-class, and Goddard), cumulus parameterization (Kain-Fritsch and Betts-Miller-Janjic) and planetary boundary layer schemes (Yonsei University and Mellor-Yamada-Janjic). A total of 12 WRF simulations were conducted, covering all potential scheme combinations. Data from the WRF simulations for several locations in south central Kentucky were analyzed and compared against Kentucky Mesonet observations for four locations: Bowling Green, Russellville, Murray and Liberty, KY. A fine model resolution of 1 km was used over these locations. Coarser resolutions of 3 km and 9 km were used on the outer two domains, which encompassed the Ohio and Tennessee Valleys. The model simulation performance was assessed using established statistical measures for the above four locations and by visually comparing the model simulations with the North American Regional Reanalysis (NARR) dataset. The most satisfactory scheme combination was the WRF Single-Moment 3-class microphysics scheme, the Kain-Fritsch cumulus parameterization scheme and the Yonsei University planetary boundary layer scheme. The planetary boundary layer schemes were noted to have the greatest influence in determining the most satisfactory model simulations; there was limited influence from the different choices of microphysics and cumulus parameterization schemes. The preferred physics parameterizations from these simulations were then used in six additional simulations to analyze the effect different initialization data sets have on model output. Data sets used in these simulations were the Final Operational Analysis global data, the North American Regional Reanalysis (3 and 6 hour) and the North American Mesoscale Model at 1, 3 and 6 hour timesteps, for a total of six simulations. Neither additional timesteps nor an increase in model resolution materially improved the model performance.
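The ranking of scheme combinations rests on standard verification statistics; a sketch of that step is given below, computing bias, mean absolute error and root-mean-square error for each of the 12 combinations, with synthetic placeholder series standing in for the WRF output and Kentucky Mesonet observations.

```python
# Sketch of the verification step used to rank the 12 physics combinations:
# bias, MAE and RMSE of a simulated surface variable against station observations.
# The series here are random placeholders; in practice they would be read from
# WRF output and Kentucky Mesonet files.
import itertools
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
microphysics = ["WSM3", "WSM6", "Goddard"]
cumulus = ["Kain-Fritsch", "Betts-Miller-Janjic"]
pbl = ["YSU", "MYJ"]
obs = rng.normal(5.0, 3.0, size=48)                    # 48 hourly observations (placeholder)

rows = []
for mp, cu, bl in itertools.product(microphysics, cumulus, pbl):
    sim = obs + rng.normal(0.5, 1.5, size=obs.size)    # placeholder model series for this combination
    err = sim - obs
    rows.append({"microphysics": mp, "cumulus": cu, "pbl": bl,
                 "bias": err.mean(), "mae": np.abs(err).mean(),
                 "rmse": np.sqrt((err ** 2).mean())})

print(pd.DataFrame(rows).sort_values("rmse"))          # lowest RMSE = most satisfactory combination
```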
|
137 |
Regional Precipitation Study in Central America, Using the WRF Model. Maldonado, Tito, January 2012.
Using the regional climate model WRF, with NCEP-NCAR Reanalysis Project data as boundary and initial conditions, regional precipitation was estimated by means of the dynamical downscaling technique for two selected periods, January 2000 and September 2007. These months show very particular climatic characteristics of the precipitation regime in Central America, such as dry (wet) conditions on the Pacific (Caribbean) coast of the Central American isthmus in January, and wet (dry) conditions, respectively on each coast, during September. Four nested domains, with grid resolutions of 90 km (d01), 30 km (d02), 10 km (d03), and 3.3 km (d04), were configured over this region. The runs were reinitialized every 5 days, with 6 hours of spin-up time for adjustment of the model. A total of 8 experiments (4 per month) were run in order to study: a) two important cumulus parameterization schemes (CPS), Kain-Fritsch (KF) and Grell-Devenyi (GD); and b) the physical interaction between nested domains (one- and two-way nesting) during each simulated month.
The January 2000 results showed that the modeled precipitation is in agreement with observations and also captured the mean climatic features of rainfall with respect to magnitude and spatial distribution, such as the particular precipitation contrast between the Pacific and the Caribbean coasts.
Outputs from September 2007 revealed significant differences when the spatial distribution in each coarse domain (d01, d02, and d03) is visually compared with the respective domain in each experiment. However, the inner grids (d04) in all the experiments showed a similar spatial distribution and magnitude estimation, mainly in those runs using the one-way nesting configuration. Furthermore, the results for this month differ substantially from observations, which could be related to deficiencies in the boundary conditions, which do not reproduce well the transition periods from warm to cold El Niño episodes.
Moreover, in all the experiments the KF scheme produced more precipitation than the GD scheme. This is attributed to the ability of the GD scheme to reproduce spotty but intense rainfall: the scheme is apparently reluctant to activate, frequently yielding little or no rain, but when rainfall does develop it is very intense.
Also, the time series do not replicate specific precipitation events; thus, the 5-day integration period used in this study is not sufficient to reproduce short-period precipitation events.
Finally, physical interaction issues between the nested domains are reflected in discontinuities in the precipitation field, which have been associated with mass-field adjustment in the CPS. / Precipitation in Central America has been estimated with dynamical downscaling for two selected periods, January 2000 and September 2007. Global reanalysis data from the NCEP-NCAR reanalysis project were used as boundary and initial data for the regional climate model WRF. The studied months show large variations in precipitation patterns, for example little (much) precipitation during January and much (little) precipitation during September for the coastal areas along the Pacific (the Caribbean Sea). Four nested domains over Central America were used, with resolutions of 90 km (d01), 30 km (d02), 10 km (d03) and 3.3 km (d04). The simulations were initialized every fifth day, and the first 6 hours after each initialization were used for the model's adjustment to the initial state. A total of 8 experiments were carried out (4 for each month) to study: (a) two different ways of parameterizing convection in cumulus clouds (CPS), Kain-Fritsch (KF) and Grell-Devenyi (GD), and (b) the physical interaction between the nested domains (one- and two-way nesting schemes). For January 2000 there was good agreement between modeled and observed precipitation. The model describes well both the amount of precipitation and its spatial distribution, for example the large contrast between the coastal areas along the Pacific and the Caribbean Sea. For September 2007 the modeled precipitation shows large differences between the experiments for the outer domains (d01, d02, d03). For the inner domain (d04) the results from the different experiments are considerably more similar, particularly for the experiments with one-way nesting. Furthermore, the modeled precipitation differs substantially from observed precipitation during September 2007. This can be explained by erroneous boundary data, owing to the difficulty of the reanalysis data in reproducing periods of transition from warm to cold El Niño. In all experiments KF gave more precipitation than GD, which can be explained by GD better reproducing short-lived, intense precipitation. There is a certain inertia before precipitation in GD is activated, which implies a greater frequency of little or no precipitation; once precipitation does develop, however, it becomes intense. The WRF model cannot reproduce specific precipitation events in the experiments carried out, which means that 5 days is too long a simulation time to reproduce specific events. Finally, interaction between the nested domains creates discontinuities in the precipitation pattern.
|
138 |
Surface Topological Analysis for Image Synthesis. Zhang, Eugene, 09 July 2004.
Topology-related issues are becoming increasingly important in Computer Graphics. This research examines the use of topological analysis for solving two important problems in 3D Graphics: surface parameterization, and vector field design on surfaces. Many applications, such as high-quality and interactive image synthesis, benefit from the solutions to these problems.
Surface parameterization refers to segmenting a 3D surface into a number of patches and unfolding them onto a plane. A surface parameterization allows surface properties to be sampled and stored in a texture map for high-quality and interactive display. One of the most important quality measurements for surface parameterization is stretch, which causes an uneven sampling rate across the surface and needs to be avoided whenever possible. In this thesis, I present an automatic parameterization technique that segments the surface according to the handles and large protrusions in the surface. This results in a small number of large patches that can be unfolded with relatively little stretch. To locate the handles and large protrusions, I make use of topological analysis of a distance-based function on the surface.
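The stretch measure mentioned above can be stated concretely: for each triangle, the Jacobian of the map from the parameter plane to the surface has two singular values, and their root mean square gives the L2 stretch. The sketch below evaluates this for a single triangle with arbitrary illustrative coordinates; it is a generic formulation, not code from the thesis.

```python
# Per-triangle L2 stretch of a surface parameterization: singular values of the
# Jacobian of the map from the (s, t) parameter plane to the 3D surface.
# Triangle coordinates below are arbitrary illustrative values.
import numpy as np

def triangle_stretch(st, q):
    """st: (3, 2) parameter-plane coords, q: (3, 3) surface coords of one triangle."""
    (s1, t1), (s2, t2), (s3, t3) = st
    area2 = (s2 - s1) * (t3 - t1) - (s3 - s1) * (t2 - t1)   # twice the parametric area
    Ss = (q[0] * (t2 - t3) + q[1] * (t3 - t1) + q[2] * (t1 - t2)) / area2
    St = (q[0] * (s3 - s2) + q[1] * (s1 - s3) + q[2] * (s2 - s1)) / area2
    a, b, c = Ss @ Ss, Ss @ St, St @ St
    root = np.sqrt((a - c) ** 2 + 4.0 * b ** 2)
    gamma_max = np.sqrt(0.5 * (a + c + root))   # largest singular value
    gamma_min = np.sqrt(0.5 * (a + c - root))   # smallest singular value
    l2_stretch = np.sqrt(0.5 * (a + c))         # RMS of the two singular values
    return gamma_max, gamma_min, l2_stretch

st = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
q = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, 1.0, 0.5]])
print(triangle_stretch(st, q))
```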
Vector field design refers to creating continuous vector fields on 3D surfaces with control over vector field topology, such as the number and location of the singularities. Many graphics applications make use of an input vector field. The singularities in the input vector field often cause visual artifacts for these applications, such as texture synthesis and non-photorealistic rendering. In this thesis, I describe a vector field design system for both planar domains and 3D mesh surfaces. The system provides topological editing operations that allow the user to control the number and location of the singularities in the vector field. For the system to work on 3D mesh surfaces, I present a novel piecewise interpolating scheme that produces a continuous vector field based on the vector values defined at the vertices of the mesh. I demonstrate the effectiveness of the system through several graphics applications: painterly rendering of still images, pencil sketches of surfaces, and texture synthesis.
|
139 |
Physically Meaningful Harmonization of Tire/Pavement Friction Measurement Devices. Rajapakshe, Madhura Priyanga Nishshanke, 01 January 2011.
Accurate characterization and evaluation of tire/pavement friction is critical in assuring runway and highway safety. Historically, Pavement Friction Measurement Devices (PFMDs) employing different measuring mechanisms have been used to evaluate tire/pavement friction. They yield significantly disparate friction coefficients under the same contact conditions. Currently, an empirically developed data harmonization method based on a reference device, the Dynamic Friction Tester (DFT), is used in an attempt to overcome the disparities between measurements made with different PFMDs. However, this method, which has been standardized by the American Society for Testing and Materials (ASTM E1960), has been criticized for its inconsistency by researchers and runway/highway operations personnel.
The objective of this dissertation research was to develop a systematic and physically intuitive harmonization method for PFMDs that will improve the comparability of their data. As a foundation for such harmonization, the LuGre tire model, which employs physically meaningful parameters to represent the main attributes of tire/pavement friction, was evaluated and validated. Measurements of tire/pavement friction by three widely used PFMDs, the Locked Wheel Skid Trailer (LWST), the Runway Friction Tester (RFT) and the DFT, were accurately predicted using nonlinear optimization of the LuGre model parameters. The LuGre model was found to be superior to the model used in the current ASTM E1960 standardization procedure for predicting PFMD measurements.
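For reference, a minimal simulation of the standard point-contact LuGre friction model is sketched below; the parameter values are assumed for illustration, and the dissertation's tire model distributes this formulation over the contact patch rather than using the lumped form shown here.

```python
# Minimal point-contact LuGre friction simulation: the internal bristle state z
# evolves with slip velocity v, and the friction coefficient combines bristle
# stiffness, bristle damping and viscous terms. Parameter values are assumed.
import numpy as np

sigma0, sigma1, sigma2 = 100.0, 1.0, 0.01      # bristle stiffness, damping, viscous term (assumed)
mu_c, mu_s, v_s = 0.8, 1.1, 0.5                # Coulomb level, static level, Stribeck velocity (assumed)

def g(v):
    """Stribeck curve: friction level as a function of slip speed."""
    return mu_c + (mu_s - mu_c) * np.exp(-(v / v_s) ** 2)

dt, T = 1e-4, 2.0
t = np.arange(0.0, T, dt)
v = 2.0 * np.sin(2.0 * np.pi * t)              # imposed slip-velocity profile (assumed)

z, mu = 0.0, np.zeros_like(t)
for k, vk in enumerate(v):
    zdot = vk - sigma0 * abs(vk) * z / g(vk)   # bristle-state dynamics
    z += zdot * dt                             # explicit Euler step
    mu[k] = sigma0 * z + sigma1 * zdot + sigma2 * vk

print(f"peak friction coefficient: {mu.max():.2f}; steady sliding level near g(2.0) = {g(2.0):.2f}")
```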
A sensitivity analysis was performed to identify the relative significance of the LuGre model parameters in characterizing tire/pavement friction, and to study the effects of variation of those parameters on predicted frictional behavior. A set of laboratory tire experiments was designed and performed to validate the physical significance of LuGre tire model parameters and to study how they behave under typical load, inflation pressure, excitation frequency, and amplitude conditions. An empirical method was developed to accommodate the effects of water film thickness on tire/pavement friction in the LuGre model. The results of the sensitivity analysis and the experiments to directly estimate the model parameters were used to identify and quantify appropriate modifications to the measurement mechanisms of PFMDs that can be introduced to improve the comparability of their results. Friction experiments performed after introducing such modifications to the LWST showed an average reduction of 20% in the deviations between the results of LWST and RFT measurements.
The research carried out in this dissertation is significant because it: (i) identified the deficiencies in the current method for harmonizing PFMD measurements and the underlying reasons for these deficiencies, (ii) emphasized the importance of a standardization approach that regulates the physical condition of PFMDs in order to achieve universal comparability of tire/pavement friction measurements, (iii) validated that the LuGre tire model is a tire/pavement friction model capable of facilitating a better standardization approach, and (iv) initiated the development of a physically meaningful harmonization procedure for PFMDs.
|
140 |
Peakflow response of stream networks: implications of physical descriptions of streams and temporal change. Åkesson, Anna, January 2015.
Through distributed stream network routing, it has been shown quantitatively that the relationship between flow travel time and discharge varies strongly nonlinearly with stream stage and with catchment-specific properties. Physically derived distributions of water travel times through a stream network were successfully used to parameterise the streamflow response function of a compartmental hydrological model. Predictions were found to improve compared with conventional statistically based parameterisation schemes for most of the modelled scenarios, particularly for peakflow conditions. A Fourier spectral analysis of 55-110 years of daily discharge time series from 79 unregulated catchments in Sweden revealed that the discharge power spectral slope has gradually increased over time, with significant increases for 58 catchments. The results indicated that the catchment scaling function power spectrum had steepened in most of the catchments for which historical precipitation series were available. These results suggest that (local) land-use changes within the catchments may affect the discharge power spectra more significantly than changes in precipitation (climate change). A case study from an agriculturally intensive catchment, using historical (from the 1880s) and modern stream network maps, revealed that the average stream network flow distance as well as average water levels had diminished substantially over the past century, while average bottom slopes had increased. The study verifies the hypothesis that anthropogenic changes of stream network properties (determined through scenario modelling using a 1D distributed routing model) can have a substantial influence on the travel times through stream networks and thus on the discharge hydrographs. The findings stress the need for a more hydrodynamically based approach to adequately describe the variation of streamflow response, especially for predictions of higher discharges. An increased physical basis of response functions can be beneficial in improving discharge predictions during conditions in which conventional parameterisation based on historical flow patterns may not be possible, for example for extreme peak flows and during periods of nonstationary conditions, such as periods of climate and/or land-use change.
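As an illustration of the spectral-slope analysis, the sketch below estimates the power spectral slope of a daily discharge series by fitting a line to the log-log periodogram; a synthetic red-noise series stands in for the 55-110-year gauge records, which are not reproduced here.

```python
# Sketch of estimating the power spectral slope of a daily discharge series:
# compute the periodogram and fit a straight line to log(power) vs log(frequency).
# A synthetic AR(1) "discharge" series stands in for the actual gauge records.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(4)
n_days = 60 * 365                                   # roughly 60 years of daily data (synthetic)
q = np.zeros(n_days)
for k in range(1, n_days):
    q[k] = 0.97 * q[k - 1] + rng.normal()           # red noise with day-to-day memory

f, p = periodogram(q, fs=1.0)                       # frequency in cycles per day
keep = (f > 0) & (f < 0.5)                          # drop the zero and Nyquist frequencies
slope, intercept = np.polyfit(np.log10(f[keep]), np.log10(p[keep]), 1)
print(f"power spectral slope: {slope:.2f}")         # more negative = steeper spectrum
```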
|