1

Data Requirements for a Look-Ahead System

Holma, Erik January 2007 (has links)
Look-ahead cruise control deals with the concept of using recorded topographic road data combined with GPS positioning to control vehicle speed. The purpose is to save fuel without changing the travel time for a given road. This thesis explores the sensitivity of look-ahead systems to different disturbances. Two systems are investigated: one uses a simple precalculated speed trajectory without feedback, and the other is based on a model predictive control scheme with dynamic programming as the optimization algorithm. Defective input data such as poor positioning, disturbed angle data, faults in the mass estimate, and an incorrect wheel radius are discussed, and errors in the systems' environmental model are also investigated. Simulations are run over real road profiles with two different quantizations of the road-slope data. The results on quantization of the angle data are important, since quantization will be unavoidable in any implementation of a topographic road map. The simulations show that disturbing the fictitious road profiles produces quite large deviations from the optimal case, whereas for the recorded real road sections the differences are close to zero. Finally, conclusions are drawn about how large deviations from real-world data a look-ahead system can tolerate.
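
As a concrete illustration of the look-ahead idea, the sketch below runs dynamic programming over a short, quantized slope profile to pick a fuel-efficient speed trajectory. It is a minimal sketch under invented assumptions: the truck parameters, the fuel proxy, and the time weight are placeholders, and the thesis's fixed-travel-time formulation is replaced here by a simple fuel/time trade-off.

```python
# Minimal sketch of the dynamic-programming step behind a look-ahead
# controller. Vehicle parameters, the fuel scaling, and the time weight
# are invented placeholders, not values from the thesis.
import numpy as np

MASS = 40000.0   # vehicle mass [kg] (assumed heavy truck)
G = 9.81         # gravity [m/s^2]
CR = 0.006       # rolling-resistance coefficient (assumed)
CDA = 8.0        # drag area Cd*A [m^2] (assumed)
RHO = 1.2        # air density [kg/m^3]
SEG = 100.0      # road-segment length [m]
TIME_W = 0.1     # weight trading travel time against fuel (assumed)

speeds = np.arange(60.0, 91.0) / 3.6  # speed grid [m/s], 60-90 km/h
slopes = np.radians([0.0, 0.5, 1.0, 0.2, -0.8, -1.2, 0.0, 0.3])  # quantized slopes

def seg_cost(v0, v1, slope):
    """Fuel proxy (positive traction work) plus a travel-time penalty."""
    v = 0.5 * (v0 + v1)
    resist = MASS * G * (CR + np.sin(slope)) + 0.5 * RHO * CDA * v ** 2
    force = MASS * (v1 ** 2 - v0 ** 2) / (2 * SEG) + resist
    return max(force * SEG, 0.0) * 1e-6 + TIME_W * SEG / v

# Backward dynamic programming over (segment, entry-speed) states.
n, m = len(slopes), len(speeds)
cost = np.zeros((n + 1, m))
policy = np.zeros((n, m), dtype=int)
for k in range(n - 1, -1, -1):
    for i, v0 in enumerate(speeds):
        c = [seg_cost(v0, v1, slopes[k]) + cost[k + 1, j]
             for j, v1 in enumerate(speeds)]
        policy[k, i] = int(np.argmin(c))
        cost[k, i] = min(c)

# Roll the optimal speed trajectory forward from 80 km/h.
i = int(np.argmin(np.abs(speeds - 80.0 / 3.6)))
traj = [speeds[i]]
for k in range(n):
    i = policy[k, i]
    traj.append(speeds[i])
print([round(v * 3.6, 1) for v in traj])  # speed in km/h at each segment boundary
```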
2

Reconstruction of Complete Head Models with Consistent Parameterization

Niloofar, Aghayan 16 April 2014 (has links)
This thesis introduces an efficient and robust approach for 3D reconstruction of complete head models with consistent parameterization and personalized shapes from several possible inputs. The system input consists of Cyberware laser-scanned data, for which we performed the scanning ourselves, as well as publicly available face data in which (i) a facial expression may or may not exist and (ii) only partial head information may exist, for instance the front of the face without the back of the head. Our method starts with a surface reconstruction step that either converts point clouds into a mesh structure or fills missing points on a triangular mesh, followed by a registration process that unifies the representation of all meshes. Afterward, a photo-cloning method extracts an adequate set of features semi-automatically from snapshots taken from the front and left views of the provided range data. We modify Radial Basis Function (RBF) deformation so that it is based not only on distance but also on regional information. Using the feature point sets and the modified RBF deformation, a generic mesh can be manipulated so that closed eyes and mouth movements, such as separating the upper and lower lips, are handled properly. In other words, this mesh modification method makes it possible to construct various facial expressions. Moreover, new functions are added with which a generic model can be manipulated based on the feature point sets to recover missing parts such as the ears, the back of the head, and the neck. After the feature-based deformation with modified radial basis functions, a fine mesh modification based on model points extracts the fine details from the available range data. Then, post-refinement procedures employing RBF deformation and the averaging of neighboring points make the surface of the reconstructed 3D head smoother and more uniform. Because flaws and defects such as flipped triangles, self-intersections, or degenerate faces may exist on the mesh surface, an automatic repair step cleans up the entire surface of the mesh. Experiments performed on various models show that the method reconstructs complete heads accurately, executes efficiently, and requires minimal user interaction.
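
To make the central deformation step concrete, the sketch below interpolates feature-point displacements over a whole point set with a standard Gaussian-kernel RBF. It is a minimal illustration under assumed parameters; the thesis's modification that adds regional information to the distance-based kernel is not reproduced here.

```python
# Sketch of feature-driven RBF deformation: displacements known at a few
# feature points are interpolated over all vertices of a generic mesh.
# A plain Gaussian kernel is used; the region-aware variant is omitted.
import numpy as np

def rbf_deform(vertices, feat_src, feat_dst, sigma=0.2):
    """Deform `vertices` so that feature points feat_src map to feat_dst."""
    disp = feat_dst - feat_src                         # known displacements
    d = np.linalg.norm(feat_src[:, None] - feat_src[None, :], axis=-1)
    phi = np.exp(-(d / sigma) ** 2)                    # kernel matrix at features
    weights = np.linalg.solve(phi + 1e-9 * np.eye(len(feat_src)), disp)
    d_all = np.linalg.norm(vertices[:, None] - feat_src[None, :], axis=-1)
    return vertices + np.exp(-(d_all / sigma) ** 2) @ weights

# Toy usage: pull one corner of a unit-cube point cloud toward a target
# while pinning the opposite corner in place.
verts = np.random.default_rng(0).random((500, 3))
src = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
dst = np.array([[0.1, 0.1, 0.0], [1.0, 1.0, 1.0]])
deformed = rbf_deform(verts, src, dst)
print(deformed.shape)  # (500, 3)
```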
3

An Approach To Automating Data Collection For Simulation

Portnaya, Irin 01 January 2004 (has links)
In past years, many industries have used simulation as a means for decision making, and that wave has established simulation as a powerful optimization and development tool in the manufacturing industry. Input data collection is a significant and complex part of a simulation study, and simulation professionals have grown to accept it as a strenuous but necessary task. Data collection problems are numerous and vary with the situation; they may involve time consumption, lack of data, lack of structure, and so on. This study concentrates on the challenges of input data collection for Discrete Event Simulation in the manufacturing industry, focusing particularly on speed, efficiency, data completeness, and data accuracy. Many companies have recently adopted commercial databases to store production data, and this study proposes that the key to faster and more efficient input data collection is to extract data directly from these sources in a flexible and efficient way. An approach is introduced for creating a custom software tool for a manufacturing setting that allows input data to be collected and formatted quickly and accurately. The methodology for developing such a custom tool, and its implementation, Part Data Collection, are laid out in this research. The Part Data Collection application was developed to assist the simulation efforts of Lockheed Martin Missiles and Fire Control in Orlando, Florida. It was implemented and tested as an aid in a large simulation project that included modeling a new factory. The implementation resulted in a 93% reduction in the labor time associated with data collection and significantly improved data accuracy.
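
The core idea, extracting simulation input directly from a production database rather than collating it by hand, can be sketched in a few lines. The table and column names below are invented for illustration and are not from the Part Data Collection tool.

```python
# Minimal sketch: pull raw process times straight from a production database
# and summarize them as simulation input parameters. Schema is hypothetical.
import sqlite3
import statistics

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE part_ops (part_id TEXT, station TEXT, minutes REAL)")
conn.executemany(
    "INSERT INTO part_ops VALUES (?, ?, ?)",
    [("P1", "drill", 4.2), ("P2", "drill", 5.1), ("P3", "drill", 3.8),
     ("P1", "paint", 9.0), ("P2", "paint", 8.4), ("P3", "paint", 9.9)],
)

# One query per station replaces hours of manual spreadsheet collation.
for (station,) in conn.execute("SELECT DISTINCT station FROM part_ops"):
    times = [t for (t,) in conn.execute(
        "SELECT minutes FROM part_ops WHERE station = ?", (station,))]
    mean, stdev = statistics.mean(times), statistics.stdev(times)
    # These summaries would feed a fitted distribution in the DES model.
    print(f"{station}: mean={mean:.2f} min, stdev={stdev:.2f} min, n={len(times)}")
```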
4

Efficient variable screening method and confidence-based method for reliability-based design optimization

Cho, Hyunkyoo 01 May 2014 (has links)
The objectives of this study are (1) to develop an efficient variable screening method for reliability-based design optimization (RBDO) and (2) to develop a new RBDO method that incorporates a confidence level for problems with limited input data. The research effort involves: (1) development of a partial output variance concept for variable screening; (2) development of an effective variable screening sequence; (3) development of an estimation method for the confidence level of a reliability output; and (4) development of a design sensitivity method for the confidence level. In the RBDO process, surrogate models are frequently used to reduce the number of simulations, because analyzing a simulation model takes a great deal of computational time. On the other hand, to obtain accurate surrogate models, the dimension of the RBDO problem has to be limited to mitigate the curse of dimensionality. It is therefore desirable to develop an efficient and effective variable screening method to reduce the dimension of the RBDO problem. In this study, output variance is found to be critical for identifying important variables in the RBDO process. A partial output variance, an efficient approximation based on the univariate dimension reduction method (DRM), is proposed to calculate the output variance efficiently. For screening, the variables that have larger partial output variances are selected as important. To determine the important variables, hypothesis testing is used so that possible errors are contained at a user-specified error level, and an appropriate number of samples for calculating the partial output variance is proposed. Moreover, a quadratic interpolation method is studied in detail for calculating the output variance efficiently. The performance of the proposed variable screening method is verified on numerical examples, which show that it finds important variables efficiently and effectively. Reliability analysis and RBDO require an exact input probabilistic model to obtain an accurate reliability output and RBDO optimum design. In practical engineering problems, however, often only limited input data are available for generating the input probabilistic model. The insufficient input data induce uncertainty in the input probabilistic model, and this uncertainty makes the RBDO optimum lose its confidence level. It is therefore necessary to treat the reliability output, defined as the probability of failure, as following a probability distribution. The probability distribution of the reliability output is obtained from consecutive conditional probabilities of the input distribution type and parameters using a Bayesian approach. The approximate conditional probabilities are obtained under reasonable assumptions, and Monte Carlo simulation is applied to calculate the probability of the reliability output in practice. A confidence-based RBDO (C-RBDO) problem is formulated using the derived probability of the reliability output; in the C-RBDO formulation, the probabilistic constraint is modified to include both the target reliability output and the target confidence level. Finally, the design sensitivity of the confidence level, the new probabilistic constraint, is derived to support an efficient optimization process. Numerical examples verify the accuracy of the developed design sensitivity and confirm that C-RBDO optimum designs are appropriately conservative for the given input data.
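
The screening idea can be illustrated with a small sketch: each input's partial output variance is estimated by varying that input alone, with the others frozen at their means, using Gauss-Hermite quadrature, and the variables are then ranked by that variance. The test function and the standard-normal inputs below are assumptions for illustration; the hypothesis-testing step of the thesis is omitted.

```python
# Sketch of univariate DRM-style variable screening: the contribution of
# each input to the output variance is estimated one input at a time via
# Gauss-Hermite quadrature, then used to rank variable importance.
# The test function g is a stand-in, not a problem from the thesis.
import numpy as np

def g(x):  # toy performance function: x1 matters most, x2 barely at all
    return x[0] + 3.0 * x[1] ** 2 + 0.1 * x[2]

mu = np.zeros(3)                       # input means (standard normal inputs assumed)
nodes, wts = np.polynomial.hermite.hermgauss(7)
z = np.sqrt(2.0) * nodes               # quadrature points for N(0, 1)
w = wts / np.sqrt(np.pi)               # normalized weights

partial_var = []
for i in range(len(mu)):
    vals = []
    for zk in z:
        x = mu.copy()
        x[i] = zk                      # vary one input, freeze the rest
        vals.append(g(x))
    vals = np.array(vals)
    m = w @ vals                       # univariate mean
    partial_var.append(w @ (vals - m) ** 2)

ranking = np.argsort(partial_var)[::-1]
print("partial variances:", np.round(partial_var, 4))
print("importance ranking:", ranking)  # expect variable 1 ranked first
```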
5

Influence of Input Data on Non-Functional Properties in Software Product Lines (Einfluss von Eingabedaten auf nicht-funktionale Eigenschaften in Software-Produktlinien)

Lillack, Max 13 December 2012 (has links) (PDF)
Non-functional properties describe quality aspects of a piece of software. A software product line (SPL) describes a set of related software products that are developed from shared building blocks and architectures in order to meet the requirements of different customer groups; software components are deliberately reused so that software can be developed more efficiently. This thesis investigates the influence of input data on the non-functional properties of SPLs. Based on measurements of selected non-functional properties of individual software products, a prediction model for arbitrary software products of the SPL is built. The prediction model can be used to support the configuration process. The approach is evaluated on an SPL of lossless compression algorithms. Taking input data into account can significantly improve the prediction of the non-functional properties of an SPL compared with simpler prediction models that ignore input data.
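
A minimal sketch of the prediction idea follows: a non-functional property is regressed on the product's feature flags plus a characteristic of the input data, and the fit is compared with a model that ignores the input data. All data below are synthetic, and plain least squares stands in for whatever model family the thesis uses.

```python
# Sketch of the prediction idea: a non-functional property (here, runtime)
# is modeled from product feature flags plus an input-data characteristic.
# Everything is synthetic; the thesis's SPL is lossless compressors.
import numpy as np

rng = np.random.default_rng(1)
n = 200
feat_a = rng.integers(0, 2, n)        # feature flag, e.g. "high compression"
feat_b = rng.integers(0, 2, n)        # feature flag, e.g. "checksumming"
entropy = rng.uniform(0.2, 1.0, n)    # input-data characteristic per file

# Synthetic ground truth: the input-data term dominates; noisy measurements.
runtime = 1.0 + 2.0 * feat_a + 0.5 * feat_b + 4.0 * entropy \
          + rng.normal(0, 0.1, n)

# Model WITH the input-data term vs. model WITHOUT it, both least squares.
X_with = np.column_stack([np.ones(n), feat_a, feat_b, entropy])
X_wo = np.column_stack([np.ones(n), feat_a, feat_b])
for name, X in [("with input data", X_with), ("without input data", X_wo)]:
    coef, *_ = np.linalg.lstsq(X, runtime, rcond=None)
    resid = runtime - X @ coef
    print(f"{name}: RMSE = {np.sqrt(np.mean(resid ** 2)):.3f}")
```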
6

An Integrated Framework for Automated Data Collection and Processing for Discrete Event Simulation Models

Rodriguez, Carlos 01 January 2015 (has links)
Discrete Event Simulation (DES) is a powerful modeling and analysis tool used in many disciplines. DES models require data in order to determine the parameters that drive the simulations. The literature on DES input data management indicates that preparing the necessary input data is often a highly manual process, which causes inefficiencies, significant time consumption, and a negative user experience. This research investigation addresses the manual data collection and processing (MDCAP) problem prevalent in DES projects. It presents an integrated framework that solves the MDCAP problem by classifying the data needed for DES projects into three generic classes. This classification permits automating and streamlining the preparation of the data, allowing DES modelers to collect, update, visualize, fit, validate, tally, and test data in real time through intuitive actions. In addition to the proposed theoretical framework, this project introduces an innovative user interface programmed around the ideas of the framework. The interface is called DESI, which stands for Discrete Event Simulation Inputs. The proposed framework for automating DES input data preparation was evaluated against benchmark measures from the literature to show its positive impact on DES input data management. This research demonstrates that the framework, instantiated by the DESI interface, addresses current gaps in the field, reduces the time devoted to input data management in DES projects, and advances the state of the art in DES input data management automation.
7

Discontinuous Galerkin Methods for Parabolic Partial Differential Equations with Random Input Data

Liu, Kun 16 September 2013 (has links)
This thesis discusses and develops an approach for solving parabolic partial differential equations with random input data. The stochastic problem is first transformed into a parametrized one using the finite-dimensional noise assumption and the truncated Karhunen-Loève expansion. The approach, the Monte Carlo discontinuous Galerkin (MCDG) method, randomly generates $M$ realizations of the uncertain coefficients and approximates the expected value of the solution by averaging the $M$ numerical solutions. The approach is applied to two numerical examples. The first is a two-dimensional parabolic partial differential equation with a random convection term, and the second is a benchmark problem coupling flow and transport equations. I first apply second-order polynomial kernel principal component analysis to generate $M$ realizations of random permeability fields, which are used to obtain $M$ realizations of the random convection term by solving the flow equation. The transport equation is then solved $M$ times, once for each velocity realization. The MCDG solution spreads from the initial location toward the whole domain, and the contaminant does not leave the initial location completely as time elapses. The results show that the MCDG solution is realistic, because it takes the uncertainty in the velocity fields into consideration. In addition, to correct overshoot and undershoot in the solutions caused by the high level of oscillation in the random velocity realizations, I solve the transport equation on meshes of finer resolution than that of the permeability and use a slope limiter together with lower- and upper-bound constraints. Finally, future work is proposed.
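
The Monte Carlo outer loop can be sketched compactly. In the snippet below, plain backward-Euler finite differences on a 1D heat equation stand in for the discontinuous Galerkin spatial discretization, and a single lognormal random variable stands in for the truncated Karhunen-Loève coefficient field; only the sample-and-average structure mirrors MCDG.

```python
# Sketch of the Monte Carlo outer loop in MCDG: solve the parabolic problem
# once per realization of the random coefficient and average the solutions.
# The spatial solver is simple implicit finite differences, not DG.
import numpy as np

M, nx, nt = 200, 50, 100              # realizations, space and time steps
dx, dt = 1.0 / nx, 0.01
x = np.linspace(0, 1, nx + 1)
rng = np.random.default_rng(0)

def solve_once(kappa):
    """Backward-Euler heat solve with diffusivity kappa, u = 0 on boundary."""
    u = np.exp(-100 * (x - 0.5) ** 2)  # initial condition
    u[0] = u[-1] = 0.0
    r = kappa * dt / dx ** 2
    A = np.diag((1 + 2 * r) * np.ones(nx - 1)) \
        + np.diag(-r * np.ones(nx - 2), 1) + np.diag(-r * np.ones(nx - 2), -1)
    for _ in range(nt):
        u[1:-1] = np.linalg.solve(A, u[1:-1])
    return u

# Lognormal random diffusivity: one random variable keeps the sketch short,
# where the thesis uses a truncated Karhunen-Loève field.
kappas = np.exp(rng.normal(np.log(0.1), 0.3, M))
mean_u = np.mean([solve_once(k) for k in kappas], axis=0)
print("E[u](x=0.5) ~", round(mean_u[nx // 2], 4))
```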
