361

Adaptive multiscale modeling of polymeric materials using goal-oriented error estimation, Arlequin coupling, and goals algorithms

Bauman, Paul Thomas, 1980- 29 August 2008 (has links)
Scientific theories that explain how physical systems behave are described by mathematical models, which provide the basis for computer simulations of events that occur in the physical universe. These models, being only mathematical characterizations of actual phenomena, are subject to error because of the inherent limitations of all mathematical abstractions. In this work, new theory and methodologies are developed to quantify such modeling error in a way that resolves a fundamental and long-standing issue: multiscale modeling, the development of models of events that transcend many spatial and temporal scales. Specifically, we devise the machinery for a posteriori estimates of the relative modeling error between a model of fine scale and another of coarser scale, and we use this methodology as a general approach to multiscale problems. The target application is one of critical importance to nanomanufacturing: imprint lithography of semiconductor devices. The development of numerical methods for multiscale modeling has become one of the most important areas of computational science. Technological developments in the manufacturing of semiconductors hinge upon the ability to understand physical phenomena from the nanoscale to the microscale and beyond. Predictive simulation tools are critical to the advancement of nanomanufacturing semiconductor devices. In principle, they can displace expensive experiments and testing and optimize the design of the manufacturing process. The development of such tools rests at the edge of contemporary methods and high-performance computing capabilities and is a major open problem in computational science. In this dissertation, a molecular model is used to simulate the deformation of polymeric materials used in the fabrication of semiconductor devices.
Algorithms are described which lead to a complex molecular model of polymer materials designed to produce an etch barrier, a critical component in imprint lithography approaches to semiconductor manufacturing. Each application of this so-called polymerization process leads to one realization of a lattice-type model of the polymer, a molecular statics model of enormous size and complexity. This is referred to as the base model for analyzing the deformation of the etch barrier, a critical feature of the manufacturing process. To reduce the size and complexity of this model, a sequence of coarser surrogate models is generated. These surrogates are the multiscale models critical to the successful computer simulation of the entire manufacturing process. The surrogates involve a combination of particle models, the molecular model of the polymer, and a coarse-scale model of the polymer as a nonlinear hyperelastic material. Coefficients for the nonlinear elastic continuum model are determined using numerical experiments on representative volume elements of the polymer model. Furthermore, a simple model of initial strain is incorporated in the continuum equations to model the inherent shrinking of the polymer. A coupled particle and continuum model is constructed using a special algorithm designed to provide constraints on a region of overlap between the continuum and particle models. This coupled model is based on the so-called Arlequin method, which was introduced in the context of coupling two continuum models with differing levels of discretization. It is shown that the Arlequin problem for the particle-to-continuum model is well posed in a one-dimensional setting involving linear harmonic springs coupled with a linearly elastic continuum. Several numerical examples are presented. Numerical experiments in three dimensions are also discussed in which the polymer model is coupled to a nonlinear elastic continuum.
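The one-dimensional Arlequin setting this abstract describes, linear harmonic springs coupled to a linearly elastic bar through an overlap region, can be sketched as a weighted energy blend. Everything concrete below (the overlap interval, stiffnesses, weight function, and displacement data) is an illustrative assumption, not the dissertation's formulation.

```python
# Sketch of a 1D Arlequin-style energy blend: a chain of linear harmonic
# springs (particle model on [0, 0.6]) and a linearly elastic bar
# (continuum model on [0.4, 1.0]) overlap on [0.4, 0.6], where their
# energies are partitioned by complementary weights alpha and 1 - alpha
# so that no energy is counted twice.

def alpha(x, overlap=(0.4, 0.6)):
    """Linear Arlequin weight: 1 on the particle side, 0 on the continuum side."""
    a, b = overlap
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)

def blended_energy(spring_u, bar_strain, k=1.0, E=1.0, h=0.1):
    """Weighted sum of spring energies and continuum strain energy.

    spring_u: particle displacements at nodes x_i = i*h on [0, 0.6];
    bar_strain: constant strain of the continuum bar on [0.4, 1.0].
    """
    e_particle = 0.0
    for i in range(len(spring_u) - 1):
        xm = (i + 0.5) * h                       # weight evaluated at spring midpoint
        stretch = spring_u[i + 1] - spring_u[i]
        e_particle += alpha(xm) * 0.5 * k * stretch ** 2
    # continuum energy density weighted by (1 - alpha), integrated on [0.4, 1.0]
    n, a, b = 100, 0.4, 1.0
    dx = (b - a) / n
    e_continuum = sum((1.0 - alpha(a + (j + 0.5) * dx)) * 0.5 * E * bar_strain ** 2 * dx
                      for j in range(n))
    return e_particle + e_continuum

# uniform stretch applied to both models
u = [0.01 * i for i in range(7)]        # nodes at 0, 0.1, ..., 0.6
total = blended_energy(u, 0.1)
```

The weights form a partition of unity on the overlap, which is the essential Arlequin constraint; the dissertation's constrained coupling (Lagrange multipliers on the overlap) is not reproduced here.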
Error estimates in local quantities of interest are constructed in order to estimate the modeling error due to the approximation of the particle model by the coupled multiscale surrogate model. The estimates of the error are computed by solving an auxiliary adjoint, or dual, problem that incorporates as data the quantity of interest or its derivatives. The solution of the adjoint problem indicates how the error in the approximation of the polymer model influences the error in the quantity of interest. The error in the quantity of interest represents the relative error between the value of the quantity evaluated for the base model, a quantity typically unavailable or intractable, and the value of the quantity of interest provided by the multiscale surrogate model. To estimate the error in the quantity of interest, a theorem is employed that establishes that the error coincides with the value of the residual functional acting on the adjoint solution plus a higher-order remainder. For each surrogate in a sequence of surrogates generated, the residual functional acting on various approximations of the adjoint is computed. These error estimates are used to construct an adaptive algorithm whereby the model is adapted by supplying additional fine-scale data in certain subdomains in order to reduce the error in the quantity of interest. The adaptation algorithm involves partitioning the domain and selecting which subdomains are to use the particle model, the continuum model, and where the two overlap. When the algorithm identifies that a region contributes a relatively large amount to the error in the quantity of interest, it is scheduled for refinement by switching the model for that region to the particle model.
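The adjoint-based estimate described above reduces, in the linear case, to a short linear-algebra identity: for a base model A u = f, a surrogate solution u_h, and a quantity of interest q(u) = c · u, solving the adjoint problem Aᵀ z = c makes the residual acting on z equal to the error q(u) − q(u_h) (the higher-order remainder in the abstract's theorem vanishes when everything is linear). The 2×2 system and the surrogate values below are illustrative assumptions.

```python
# Goal-oriented error estimation sketch: error in the quantity of
# interest equals the residual functional acting on the adjoint solution.

def solve2(a, b, c, d, r1, r2):
    """Solve the 2x2 system [[a, b], [c, d]] x = [r1, r2] by Cramer's rule."""
    det = a * d - b * c
    return ((r1 * d - r2 * b) / det, (a * r2 - c * r1) / det)

A = ((4.0, 1.0), (2.0, 3.0))        # "base" (fine) model
f = (1.0, 2.0)
c = (1.0, 0.0)                      # quantity of interest: first component of u

u = solve2(A[0][0], A[0][1], A[1][0], A[1][1], f[0], f[1])   # intractable in practice
u_h = (0.0, 0.6)                    # assumed coarse surrogate solution

# adjoint problem uses the transpose of A, with the QoI vector as data
z = solve2(A[0][0], A[1][0], A[0][1], A[1][1], c[0], c[1])

# residual of the surrogate in the base model: r = f - A u_h
residual = (f[0] - (A[0][0] * u_h[0] + A[0][1] * u_h[1]),
            f[1] - (A[1][0] * u_h[0] + A[1][1] * u_h[1]))

estimate = z[0] * residual[0] + z[1] * residual[1]           # r acting on z
true_error = c[0] * (u[0] - u_h[0]) + c[1] * (u[1] - u_h[1])
```

In an adaptive loop of the kind the abstract describes, localized contributions to this estimate would flag the subdomains to switch back to the particle model.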
Numerical experiments on several configurations representative of nano-features in semiconductor device fabrication demonstrate the effectiveness of the error estimate in controlling the modeling error, as well as the ability of the adaptive algorithm to reduce the error in the quantity of interest. There are two major conclusions of this study: 1. an effective and well-posed multiscale model that couples particle and continuum models can be constructed as a surrogate to molecular statics models of polymer networks, and 2. the modeling error for such systems can be estimated with sufficient accuracy to provide the basis for very effective multiscale modeling procedures. The methodology developed in this study provides a general approach to multiscale modeling. The computational procedures, computer codes, and results could provide a powerful tool in understanding, designing, and optimizing an important class of semiconductor manufacturing processes. The study in this dissertation involves all three components of the CAM graduate program requirements: Area A, Applicable Mathematics; Area B, Numerical Analysis and Scientific Computation; and Area C, Mathematical Modeling and Applications. The multiscale modeling approach developed here is based on the construction of continuum surrogates and their coupling to molecular statics models of the polymer, as well as a posteriori estimates of error and their adaptive control. A detailed mathematical analysis is provided for the Arlequin method in the context of coupling particle and continuum models for a class of one-dimensional model problems. Algorithms are described and implemented that solve the adaptive, nonlinear problem posed in the multiscale surrogate problem. Large-scale, parallel computations for the base model are also shown. Finally, detailed studies of models relevant to applications in semiconductor manufacturing are presented.
362

Model enhancements for state estimation in electric power systems

Hansen, Charles William 12 1900 (has links)
No description available.
363

Robust state estimation and model validation techniques in computer vision

Al-Takrouri, Saleh Othman Saleh, Electrical Engineering & Telecommunications, Faculty of Engineering, UNSW January 2008 (has links)
The main objective of this thesis is to apply ideas and techniques from modern control theory, especially from robust state estimation and model validation, to various important problems in computer vision. Robust model validation is used in texture recognition, where new approaches for classifying texture samples and segmenting textured images are developed. Also, a new model validation approach to motion primitive recognition is demonstrated by considering the motion segmentation problem for a mobile wheeled robot. A new approach to image inpainting based on robust state estimation is proposed, where the implementation presented here is concerned with recovering corrupted frames in video sequences. Another application addressed in this thesis based on robust state estimation is video-based tracking. A new tracking system is proposed to follow connected regions in video frames representing the objects under consideration. The system accommodates tracking multiple objects and is designed to be robust to occlusions. To demonstrate the performance of the proposed solutions, examples are provided where the developed methods are applied to various gray-scale images, colored images, gray-scale videos and colored videos. In addition, a new algorithm is introduced for motion estimation via inverse polynomial interpolation. Motion estimation plays a primary role within the video-based tracking system proposed in this thesis. The proposed motion estimation algorithm is also applied to medical image sequences. Motion estimation results presented in this thesis include pairs of images from an echocardiography video and a robot-assisted surgery video.
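The abstract does not spell out its inverse-polynomial-interpolation algorithm, so the following is only a generic sketch of the idea such methods build on: matching costs are computed at integer displacements, and a polynomial fitted through the minimum and its neighbours is inverted to recover a sub-pixel displacement. The signals, the SAD cost, and the parabola fit are all assumptions for illustration, not the thesis's algorithm.

```python
# Generic sub-pixel motion estimation sketch: integer block matching
# followed by parabolic interpolation of the cost minimum.

def sad(ref, cur, shift):
    """Sum of absolute differences between ref and a shifted window of cur."""
    return sum(abs(ref[i] - cur[i + shift]) for i in range(len(ref)))

def subpixel_shift(costs, i):
    """Vertex of the parabola through (i-1, c[i-1]), (i, c[i]), (i+1, c[i+1])."""
    c0, c1, c2 = costs[i - 1], costs[i], costs[i + 1]
    denom = c0 - 2.0 * c1 + c2
    if denom == 0.0:
        return float(i)          # flat neighbourhood: keep the integer shift
    return i + 0.5 * (c0 - c2) / denom

# toy 1-D signals: cur is roughly ref displaced by two samples, plus noise
ref = [0.0, 1.0, 2.0, 3.0, 4.0]
cur = [9.0, 5.0, 0.2, 1.1, 2.0, 2.9, 4.1, 9.0]

costs = [sad(ref, cur, s) for s in range(4)]          # shifts 0..3
best = min(range(1, 3), key=lambda s: costs[s])       # interior integer minimum
refined = subpixel_shift(costs, best)                 # sub-sample displacement
```

In 2-D block matching the same refinement is applied separately along each axis around the best integer motion vector.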
364

Variation in project parameters as a measure of improvement in software process control

Woodings, Terence Leslie January 2006 (has links)
[Truncated abstract] The primary tool for software process control is the project plan, with divergence from the schedule usually being the first indication that there are difficulties. Thus the estimation of the schedule, particularly the effort parameter, is a central element of software engineering management. Regrettably, estimation methods are poorly used within the software industry and accuracy is lacking when compared with other engineering disciplines. There are many reasons for this. However, the need to predict project effort remains, particularly in situations of tendering for contracts. The broad objective of this research is the improvement of project control by means of better estimation. . . The error in the prediction of a project parameter is investigated as the result of the variation in two distinct (estimation and actual development) processes. Improvement depends upon the understanding, control and then reduction of that variation. A strategy for the systematic identification of the sources of greatest variation is developed - so that it may be reduced by appropriate software engineering practices. The key to the success of the approach is the statistical partitioning of the Mean Square Error (of the estimate) in order to identify the weakest area of project control. The concept is proven with a set of student projects, where the estimation error is significantly reduced. The conditions for its transfer to industry are discussed and a systematic reduction in error is demonstrated on five sets of commercial project data. The thesis concludes with a discussion of the linking of the approach to current estimation methods. It should also have implications for the statistical process control of other projects involving small sample sizes and multiple correlated parameters.
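The "statistical partitioning of the Mean Square Error" this abstract leans on has a classical core: MSE = bias² + variance. A minimal sketch on assumed effort data (estimated versus actual person-days; the numbers are invented) shows how the split points at the weaker aspect of project control, systematic over- or under-estimation (bias) versus inconsistency from project to project (variance).

```python
# Partition the mean square error of effort estimates into a systematic
# part (squared bias) and a scatter part (population variance of the error).

def mse_partition(estimates, actuals):
    """Return (mse, bias_squared, variance); mse == bias_squared + variance."""
    errors = [e - a for e, a in zip(estimates, actuals)]
    n = len(errors)
    mean_err = sum(errors) / n                          # bias of the estimates
    variance = sum((e - mean_err) ** 2 for e in errors) / n
    mse = sum(e * e for e in errors) / n
    return mse, mean_err ** 2, variance

# assumed data: four projects, effort in person-days
estimated = [10.0, 12.0, 9.0, 11.0]
actual = [8.0, 10.0, 8.0, 9.0]
mse, bias_sq, var = mse_partition(estimated, actual)
```

Here the squared bias dominates the variance, which in the abstract's terms would direct improvement effort at the estimation process (consistent over-estimation) rather than at development-process variability.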
365

Non-normal analysis of variance and regression procedures based on modified maximum likelihood estimators.

Milosevic-Hill, Sean Michael. Tiku, M.L. Unknown Date (has links)
Thesis (Ph.D.)--McMaster University (Canada), 1995. / Source: Dissertation Abstracts International, Volume: 56-12, Section: B, page: 6848. Adviser: M. L. Tiku.
366

An investigation of long-term dependence in time-series data /

Ellis, Craig. January 1998 (has links)
Thesis (Ph.D.)--University of Western Sydney, Macarthur, Faculty of Business and Technology, 1998.
367

Recursive residuals and estimation for mixed models /

Bani-Mustafa, Ahmed. January 2004 (has links)
Thesis (Ph.D.)--University of Western Sydney, 2004. / "A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy." Bibliography: leaves 171-186.
368

Performance of estimation and detection algorithms in wireless networks /

Leong, Alex Seak Chon. January 2007 (has links)
Thesis (Ph.D.)--University of Melbourne, Dept. of Electrical and Electronic Engineering, 2008. / Typescript. Includes bibliographical references (leaves 149-158).
369

Adaptive multiscale modeling of polymeric materials using goal-oriented error estimation, Arlequin coupling, and goals algorithms

Bauman, Paul Thomas, January 1900 (has links)
Thesis (Ph. D.)--University of Texas at Austin, 2008. / Vita. Includes bibliographical references.
370

The generalized MLE with the interval centered and masked competing risks data

Wang, Jiaping. January 2009 (has links)
Thesis (Ph. D.)--State University of New York at Binghamton, Department of Mathematical Sciences, 2009. / Includes bibliographical references.
