111

PERFORMANCE AND ENDURANCE CONTROL IN EMERGING STORAGE TECHNOLOGIES

Roy, Tanaya, 0000-0003-4545-9299 January 2021 (has links)
The current diverse and wide range of computing is moving towards the cloud and demands high performance in the form of low latency and high throughput. Facebook reported that 3.3 billion people monthly and 2.6 billion people daily use their data centers over the network. Many emerging user-facing applications require strict control over the tail of storage latency to provide a quality user experience. The low-latency requirement is driving the ongoing replacement of hard drives (HDDs) by solid-state drives (SSDs) in the enterprise, enabling much higher performance and lower end-to-end storage latencies. It becomes more challenging to ensure low latency while maintaining the device's endurance ratings. We address this challenge in the following ways: 1. Enhance the overall storage system's performance and maintain SSD endurance using emerging non-volatile memory (ENVM) technology. 2. Implement deterministic latency in the storage path for latency-sensitive applications. 3. Provide low-latency and differentiated services when write-intensive workloads are present in a shared environment. We have proposed performance- and endurance-centric mechanisms to evaluate the tradeoffs between performance and endurance. In the first approach, our goal is to achieve low storage latency and a long SSD lifetime simultaneously, even for a write-heavy workload. Incorporating a significantly smaller amount of ENVM alongside the SSD as a cache helps to achieve this goal. SSDs using the NVMe (Non-Volatile Memory Express) interface can achieve low latency because the interface provides several advanced features. The second approach explores such features to control storage tail latency in a distributed environment. The "predictable latency mode (PLM)" advanced feature helps to achieve deterministic storage latency. SSDs need to perform many background management operations to deal with the traits of the underlying flash technology, the most time-consuming being garbage collection and wear leveling. The latency requirements of latency-sensitive applications are violated when I/O requests fall behind such management activities. PLM lets SSD controllers perform background operations during a window called the "non-deterministic window (NDWin)", whereas during the "deterministic window (DTWin)" applications experience no such operations. We have extended this feature to the distributed environment and shown how it helps achieve low storage latency when the proposed "PLM coordinator (PLMC)" is incorporated. In a shared environment, the presence of write-intensive workloads results in latency peaks for read I/O. Moreover, differentiated services must be provided when multiple QoS classes are present in the workload mixture. We have extended the PLM concept to hybrid storage to realize deterministic latency for applications with tight tail-latency requirements and to assure differentiated services among multiple QoS applications. Since nearly all storage access in a data center is over the network, the end-to-end path consists of three components: the host, the network, and the storage. For latency-sensitive applications, the overall tail latency needs to consider all of these components. In a NAS (Network Attached Storage) architecture, it is worth studying the QoS-class-aware services present at the different components to provide an overall low request-response latency. This work therefore helps future research address the gaps that have not yet been considered.
/ Computer and Information Science
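The abstract does not spell out how the proposed PLM coordinator schedules windows across drives. The following minimal Python sketch (all names and the round-robin phase-stagger policy are assumptions for illustration, not the thesis's design) shows the general idea: offset the deterministic/non-deterministic windows of replicated SSDs so that a read can usually be steered to a drive currently in its deterministic window.

```python
# Hypothetical sketch (not the thesis code): a round-robin "PLM coordinator"
# that staggers NVMe Predictable Latency Mode windows across replicas so a
# read can usually be served by a drive in its deterministic window (DTWin).

from dataclasses import dataclass

@dataclass
class Ssd:
    name: str
    dtwin_len: int   # deterministic-window length (time slots)
    ndwin_len: int   # non-deterministic-window length (time slots)
    offset: int      # phase offset assigned by the coordinator

    def in_dtwin(self, t: int) -> bool:
        period = self.dtwin_len + self.ndwin_len
        return (t + self.offset) % period < self.dtwin_len

class PlmCoordinator:
    """Stagger window phases so some replica is almost always deterministic."""
    def __init__(self, ssds):
        self.ssds = ssds
        stride = (ssds[0].dtwin_len + ssds[0].ndwin_len) // len(ssds)
        for i, ssd in enumerate(ssds):
            ssd.offset = i * stride

    def pick_replica(self, t: int) -> Ssd:
        # Prefer a replica in its deterministic window; fall back to the first.
        for ssd in self.ssds:
            if ssd.in_dtwin(t):
                return ssd
        return self.ssds[0]

if __name__ == "__main__":
    replicas = [Ssd("ssd0", 6, 2, 0), Ssd("ssd1", 6, 2, 0), Ssd("ssd2", 6, 2, 0)]
    coord = PlmCoordinator(replicas)
    for t in range(8):
        print(t, coord.pick_replica(t).name)
```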
112

Bayesian Uncertainty Modeling in Decomposed Multilevel Optimization

Dettwiller, Ian Daniel 06 May 2017 (has links)
Bayesian updating is used to approximate discontinuous multi-interval uncertainty representations (i.e., belief structures) of epistemic uncertainty. Several Bayesian-based approaches are examined for assessing the accuracy of approximating the mean and standard deviation of a belief structure and calculating reliability using posterior distributions. Moreover, a Bayesian-based belief structure approximation is integrated with a decomposed multilevel optimization solution strategy through analytical target cascading, where the ensuing reliability-based design optimization problem within each decomposed element is solved using a single loop single vector approach. The non-deterministic decomposed multilevel optimization approach is demonstrated through solutions to four analytical benchmark problems with mixed aleatory and epistemic uncertainties as well as a nano-enhanced composite sandwich plate problem. Consistent with the integrated computational materials engineering philosophy, the proposed solution strategy for the sandwich plate problem combines micro- and macro-level material modeling and design with structural level analysis and optimization. The orientation distribution of the carbon nanofibers in the micro-mechanical model is described through a belief structure and modeled using a Bayesian approach. Aleatory uncertainty in the ply thickness of the composite facesheets is also considered. This problem is used to demonstrate computationally efficient integration of epistemic uncertainty described through a belief structure for a complex design problem with mixed uncertainties. The results of this study show that the posterior distributions from some of the Bayesian-based approaches are suitable for direct calculation of reliability through joint probability density functions. Moreover, the Bayesian-based approach can provide a computationally efficient method for integrating epistemic and aleatory uncertainties in decomposed multilevel optimization of complex problems.
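As a rough illustration of the kind of quantity being approximated (not the specific Bayesian-based schemes the thesis compares), the following sketch estimates the mean and standard deviation of a multi-interval belief structure by Monte Carlo sampling, assuming focal intervals with basic probability masses and uniform sampling within each interval.

```python
# Illustrative stand-in only: approximate the mean and standard deviation of a
# multi-interval belief structure by sampling focal intervals in proportion to
# their basic probability masses, uniformly within each interval.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical focal intervals [low, high] and basic probability assignments.
intervals = np.array([[0.0, 2.0], [1.5, 4.0], [3.0, 5.0]])
masses = np.array([0.5, 0.3, 0.2])

n = 100_000
idx = rng.choice(len(masses), size=n, p=masses)
samples = rng.uniform(intervals[idx, 0], intervals[idx, 1])

print("approx. mean:", samples.mean())
print("approx. std :", samples.std(ddof=1))
```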
113

Deterministic Brownian Motion

Trefán, György 08 1900 (has links)
The goal of this thesis is to contribute to the ambitious program of founding statistical physics on chaos. We build a deterministic model of Brownian motion and provide a microscopic derivation of the Fokker-Planck equation. Since the Brownian motion of a particle is the result of the competing processes of diffusion and dissipation, we create a model where both diffusion and dissipation originate from the same deterministic mechanism - the deterministic interaction of the particle with its environment. We show that standard diffusion, which is the basis of the Fokker-Planck equation, rests on the Central Limit Theorem and, consequently, on the possibility of deriving it from a deterministic process with a quickly decaying correlation function. The sensitive dependence on initial conditions, one of the defining properties of chaos, ensures this rapid decay. We carefully address the problem of deriving dissipation from the interaction of a particle with a fully deterministic nonlinear bath, which we term the booster. We show that the solution of this problem essentially rests on the linear response of a booster to an external perturbation. This raises a long-standing problem concerning Kubo's Linear Response Theory and the strong criticism leveled against it by van Kampen. Kubo's theory is based on a perturbation treatment of the Liouville equation, which, in turn, is expected to be totally equivalent to a first-order perturbation treatment of single trajectories. Since the boosters are chaotic, and chaos is essential to generate diffusion, the single trajectories are highly unstable and do not respond linearly to weak external perturbation. We adopt chaotic maps as boosters of a Brownian particle, and therefore address the problem of the response of a chaotic booster to an external perturbation. We notice that a fully chaotic map is characterized by an invariant measure which is a continuous function of the control parameters of the map. Consequently, if the external perturbation is made to act on a control parameter of the map, we show that the booster distribution undergoes slight modifications as an effect of the weak external perturbation, thereby leading to a linear response of the mean value of the perturbed variable of the booster. This approach to linear response completely bypasses van Kampen's criticism. The joint use of these two phenomena, diffusion and friction stemming from the interaction of the Brownian particle with the same booster, makes the microscopic derivation of a Fokker-Planck equation and Brownian motion possible.
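A minimal numerical sketch of the idea, assuming a fully chaotic logistic map as the booster and a simple Euler-coupled damped particle (not the thesis's actual construction):

```python
# Sketch under stated assumptions: a deterministic chaotic variable (logistic
# map at r = 4) plays the role of the random force driving a damped particle.
import numpy as np

def logistic_booster(n, x0=0.123456, r=4.0):
    """Iterate the fully chaotic logistic map and return zero-mean increments."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x - 0.5  # centre so the driving force has (asymptotically) zero mean

def simulate_velocity(n_steps=200_000, gamma=0.05, dt=1.0):
    """Damped particle driven by the deterministic booster increments."""
    xi = logistic_booster(n_steps)
    v = np.zeros(n_steps)
    for i in range(1, n_steps):
        v[i] = v[i - 1] + dt * (-gamma * v[i - 1] + xi[i - 1])
    return v

v = simulate_velocity()
# Friction from the same deterministic mechanism keeps the stationary velocity
# variance finite, as required for Brownian-like behaviour.
print("stationary velocity variance:", v[10_000:].var())
```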
114

Labour Demand Composition and Wage Responses in a Transition to a Clean Economy : A Case Study on the EU ETS

Boksebeld, Jeroen January 2023 (has links)
This thesis studies the effect on labour demand of a transition from an unconstrained economy featuring a 'clean' and a 'dirty' production method towards a fully clean economy, using a simplified general equilibrium model. The model is calibrated to the European Union Emissions Trading System (EU ETS) and features a linear decline in a cap placed on the dirty production method. After simulating the trajectory set by the EU ETS, this study finds that wages are expected to increase by 11.68% over the transition period in the baseline scenario. This result holds qualitatively both for steeper transitions and for wide ranges of the elasticity of substitution and of consumer preferences in consumption, as long as clean productivity growth is sufficient and the consumer is either indifferent between the goods or favours the clean consumption good. The minimum level of clean productivity growth needed to achieve a long-run increase in wages is found to be slightly below 2% per year, even when the cap on dirty production faces a steeper decline.
115

Maternal Chlamydia trachomatis and Neisseria gonorrhoeae Infections and the Outcome of Preterm Birth: The Impact of Early Detection

Folger, Alonzo T., V January 2012 (has links)
No description available.
116

Image charge detection statistics relevant for deterministic ion implantation

Räcke, Paul, Staacke, Robert, Gerlach, Jürgen W., Meijer, Jan, Spemann, Daniel 27 April 2023 (has links)
Image charge detection is a non-perturbative pre-detection approach for deterministic ion implantation. Using low-energy ion bunches as a model system for highly charged single ions, we experimentally studied the error and detection rates of an image charge detector setup. The probability density functions of the signal amplitudes in the Fourier spectrum can be modelled with a generalised gamma distribution to predict error and detection rates. It is shown that the false positive error rate can be minimised at the cost of detection rate, but this does not impair the fidelity of a deterministic implantation process. Independent of the ion species, at a signal-to-noise ratio of 2, a false positive error rate of 0.1% is achieved, while the detection rate is about 22%.
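A hedged illustration of the threshold trade-off described above, using SciPy's generalised gamma distribution with invented parameters (the paper's fitted values are not reproduced here):

```python
# Illustration only: with generalised gamma models for the "noise only" and
# "ion present" amplitude distributions, the decision threshold trades the
# false-positive error rate against the detection rate.
from scipy.stats import gengamma

# Hypothetical fitted amplitude distributions (shape/scale values are assumed).
noise = gengamma(a=2.0, c=1.5, scale=1.0)
signal = gengamma(a=2.0, c=1.5, scale=2.0)   # stand-in for a higher-SNR signal

# Choose the threshold so that the false-positive rate is 0.1 %.
threshold = noise.isf(0.001)

false_positive_rate = noise.sf(threshold)    # noise amplitude above threshold
detection_rate = signal.sf(threshold)        # true signal above threshold

print(f"threshold           : {threshold:.3f}")
print(f"false positive rate : {false_positive_rate:.4%}")
print(f"detection rate      : {detection_rate:.2%}")
```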
117

A Production And Cost Modeling Methodology Of 2nd Generation Biofuel In The United States

Poole, David A 01 January 2012 (has links)
The use of biofuels in the United States has increased dramatically in the last few years. The largest source of feedstock for ethanol to date has been corn. However, corn is also a vitally important food crop and is commonly used as feed for cattle and other livestock. To prevent further diversion of an important food crop to the production of ethanol, there is great interest in developing commercial-scale technologies to make ethanol from non-food crops or other suitable plant material, commonly referred to as biomass. A review is made of the lignocellulosic sources being considered as feedstocks to produce ethanol. Current technologies for pretreatment and hydrolysis of the biomass material are examined and discussed. Production data and cost estimates are culled from the literature and used to assist in the development of mathematical models for evaluating production ramp-up profiles and estimating costs. These mathematical models are useful as a planning tool and provide a methodology to estimate monthly production output and costs for labor, capital, operations and maintenance, feedstock, raw materials, and total cost. Existing credits for ethanol production are also considered and modeled. The production output in liters is modeled as a negative exponential growth curve, with a rate coefficient providing the ability to evaluate slower, or faster, growth in production output and its corresponding effect on monthly cost. The capital and labor costs per unit of product are determined by dividing the monthly debt service and labor costs by that month's production value. The remaining cost components change at a constant rate in the simulation case studies. This methodology is used to calculate production levels and costs as a function of time for a 25-million-gallon-per-year cellulosic ethanol plant. The parameters of interest are calculated in MATLAB with a deterministic, continuous system simulation model. Simulation results for high, medium, and low cost case studies are included. Assumptions for the model and for each case study are included, and some comparisons are made to cost estimates in the literature. While the cost per unit of product decreases and production output increases over time, reasonable cost values are obtained by the end of the second year for both the low and medium cost case studies. By the end of Year 2, total costs for those case studies are $0.48 per liter and $0.88 per liter, respectively. These cost estimates are well within the reported range of values from the reviewed literature sources. Differing assumptions for calculations made by different sources make a direct cost comparison with the outputs of this modeling methodology extremely difficult. Proposals for reducing costs are introduced. Limitations and shortcomings of the research activity are discussed, along with recommendations for potential future work in improving the simulation model and in model verification activities. In summary, the author was not able to find evidence, within the public domain, of any similar modeling and simulation methodology that uses a deterministic, continuous simulation model to evaluate production and costs as a function of time. This methodology is also unique in highlighting the important effect of production ramp-up on monthly costs for capital (debt service) and labor. The resultant simulation model can be used for planning purposes and provides an independent, unbiased estimate of cost as a function of time.
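The following sketch (illustrative numbers only, and Python rather than the MATLAB model described above) reproduces the structure of the methodology: a negative-exponential production ramp-up, fixed monthly debt service and labour spread over each month's output, and constant variable costs per litre.

```python
# Sketch under stated assumptions; every figure below is invented for
# illustration and is not taken from the thesis's case studies.
import numpy as np

capacity_lpm = 25e6 * 3.78541 / 12     # ~25 Mgal/yr nameplate, in litres/month
ramp_rate = 0.15                        # assumed ramp-up rate coefficient (1/month)
debt_service = 1.8e6                    # assumed monthly capital cost ($)
labour = 0.4e6                          # assumed monthly labour cost ($)
variable_cost_per_l = 0.35              # assumed feedstock/O&M/raw materials ($/L)

months = np.arange(1, 25)
output_l = capacity_lpm * (1.0 - np.exp(-ramp_rate * months))   # ramp-up curve
cost_per_l = (debt_service + labour) / output_l + variable_cost_per_l

for m in (6, 12, 24):
    print(f"month {m:2d}: {output_l[m-1]/1e6:6.1f} ML produced, ${cost_per_l[m-1]:.2f}/L")
```

The fixed costs dominate early months when output is low, which is why the ramp-up coefficient has such a strong effect on per-litre cost in the first year or two.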
118

A knowledge based design methodology for manufacturing assembly lines

Khan, Asar, Day, Andrew J. January 2002 (has links)
No
119

Spatial Evolutionary Game Theory: Deterministic Approximations, Decompositions, and Hierarchical Multi-scale Models

Hwang, Sung-Ha 01 September 2011 (has links)
Evolutionary game theory has recently emerged as a key paradigm in various behavioral science disciplines. In particular, it provides powerful tools and a conceptual framework for the analysis of the time evolution of strategic interdependence among players and its consequences, especially when the players are spatially distributed and linked in a complex social network. We develop various evolutionary game models, analyze these models using appropriate techniques, and study their applications to complex phenomena. In the second chapter, we derive integro-differential equations as deterministic approximations of the microscopic updating stochastic processes. These generalize the known mean-field ordinary differential equations and provide powerful tools to investigate the spatial effects on the time evolution of the agents' strategy choices. The deterministic equations allow us to identify many interesting features of the evolution of strategy profiles in a population, such as standing and traveling waves and pattern formation, especially in replicator-type evolutions. We introduce several methods of decomposition of two-player normal form games in the third chapter. Viewing the set of all games as a vector space, we exhibit explicit orthonormal bases for the subspaces of potential games, zero-sum games, and their orthogonal complements, which we call anti-potential games and anti-zero-sum games, respectively. Perhaps surprisingly, every anti-potential game comes either from Rock-Paper-Scissors type games (in the case of symmetric games) or from Matching Pennies type games (in the case of asymmetric games). Using these decompositions, we prove old (and some new) cycle criteria for potential and zero-sum games (as orthogonality relations between subspaces). We illustrate the usefulness of our decompositions by (a) analyzing the generalized Rock-Paper-Scissors game, (b) completely characterizing the set of all null-stable games, (c) providing a large class of strictly stable games, (d) relating the game decomposition to the Hodge decomposition of vector fields for the replicator equations, (e) constructing Lyapunov functions for some replicator dynamics, and (f) constructing Zeeman games, i.e., games with an interior asymptotically stable Nash equilibrium and a pure strategy ESS. The hierarchical modeling of evolutionary games provides flexibility in addressing the complex nature of social interactions as well as systematic frameworks in which one can keep track of the interplay between within-group dynamics and between-group competitions. For example, it can model husbands' and wives' interactions, in which the two play an asymmetric game with each other while engaging in coordination problems with their counterparts in other families. In the fourth chapter, we provide hierarchical stochastic models of evolutionary games and approximations of these processes, and study their applications.
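As a small concrete example of the objects discussed above, the following sketch (assumed payoff values, not code from the thesis) runs discrete-time replicator dynamics on a generalised Rock-Paper-Scissors game, the prototypical anti-potential symmetric game, whose trajectories cycle around the interior equilibrium rather than converging:

```python
# Minimal sketch: replicator dynamics on generalised Rock-Paper-Scissors.
import numpy as np

# Payoff matrix: win = +1, loss = -a (a > 0), tie = 0 (a = 1 assumed here).
a = 1.0
A = np.array([[0.0, -a, 1.0],
              [1.0, 0.0, -a],
              [-a, 1.0, 0.0]])

def replicator_step(x, dt=0.01):
    """One Euler step of the replicator equation dx_i = x_i (f_i - f_avg)."""
    fitness = A @ x
    avg = x @ fitness
    return x + dt * x * (fitness - avg)

x = np.array([0.6, 0.3, 0.1])                # initial population mix
for step in range(5001):
    x = replicator_step(x)
    x = np.clip(x, 0, None); x /= x.sum()    # guard against round-off drift
    if step % 1000 == 0:
        print(step, np.round(x, 3))          # shares cycle around (1/3, 1/3, 1/3)
```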
120

A model for crop monitoring and yield prediction fusing remotely sensed data and prior information in a deterministic-probabilistic framework

Lovison-Golob, Lucia 31 January 2024 (has links)
This research focuses on the development of a deterministic-probabilistic framework for agricultural land use and management, specifically for both annual crops, such as wheat, barley and maize, and permanent crops, such as vineyards. The goal is to predict crop greening and peak crop development progressively through the growing season, based on accumulating information as the crop develops and matures, and to provide an accompanying uncertainty statement (credible interval) with each prediction. The integrated area underneath the phenology curve can be associated, although not explicitly in our example, with per-area crop yield. The prediction model relies on remotely sensed data, including science data products from the Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) spaceborne instruments, field data from agro-meteorological stations, and statistical data from prior years. The development of the deterministic-probabilistic model focuses on northeastern Italy, a region of small agricultural plots set in a diverse physical landscape, which is typical of many areas of old-world and developing-nation agriculture. The estimation process uses the phenological cycle of the MODIS Enhanced Vegetation Index (EVI), extracted from the satellite imagery at 500-m spatial resolution. Landsat data, at 30-m spatial resolution, are fused with MODIS data to provide fine-scale information better suited to small-field agriculture. By applying a piecewise logistic function to model the time trajectory of EVI values, crop development and peak greenness are estimated and characterized based on the main phenological stages determined from the remote imagery trained with ground station observations. The deterministic-probabilistic model is then validated against observations from reference testing stations and against statistical crop and yield data obtained independently from administrative districts such as regional and national organizations. A temporal filter of the main phenological stages, here called a crop calendar, plays a critical role. A Bayesian approach to stochastically integrating the parameters related to a given area provides a way to include the different datasets at their different dimensions and scales and to assess the probability of obtaining a vegetation index within a given uncertainty. The model therefore becomes a typical generalized linear model problem, deterministically described by a piecewise logistic function, with the parameters describing the peak phenological curve estimated probabilistically, with their own uncertainty. / 2026-01-31T00:00:00Z
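A minimal sketch of the deterministic core of such a model, using synthetic data and an assumed double-logistic form for the EVI trajectory (the thesis's piecewise logistic formulation and Bayesian machinery are not reproduced here):

```python
# Illustration only: fit a double-logistic greenness curve to a synthetic EVI
# time series to estimate green-up, senescence, and peak greenness.
import numpy as np
from scipy.optimize import curve_fit

def double_logistic(t, base, amp, k1, t1, k2, t2):
    """Background EVI plus a logistic green-up minus a logistic senescence."""
    return base + amp * (1 / (1 + np.exp(-k1 * (t - t1)))
                         - 1 / (1 + np.exp(-k2 * (t - t2))))

# Synthetic 8-day EVI composites over a growing season (day of year).
doy = np.arange(1, 366, 8)
true = double_logistic(doy, 0.15, 0.45, 0.08, 120, 0.06, 260)
rng = np.random.default_rng(1)
evi = true + rng.normal(0, 0.02, doy.size)

p0 = [0.1, 0.4, 0.05, 100, 0.05, 250]           # rough initial guess
params, cov = curve_fit(double_logistic, doy, evi, p0=p0, maxfev=10000)
base, amp, k1, t1, k2, t2 = params
print(f"green-up midpoint  ~ DOY {t1:.0f}")
print(f"senescence midpoint ~ DOY {t2:.0f}")
print(f"estimated peak EVI  ~ {base + amp:.2f}")
```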
