71

A Retrieval Method (DFM Framework) for Automated Retrieval of Design for Additive Manufacturing Problems

Yim, Sungshik 08 March 2007 (has links)
Problem: The process planning task for a given design problem in additive manufacturing can be greatly enhanced by referencing previously developed process plans. However, identifying appropriate process plans for a given design problem requires an appropriate mapping between the design domain and the process planning domain. Hence, the objective of this research is to establish a mathematical mapping between the design domain and the process planning domain such that previously developed, appropriate process plans can be identified for the given design task. Furthermore, the identification of a mathematical theory that enables computational mapping between the two domains is of interest. Through such computational mapping, previously developed process plans are expected to be shared in a distributed environment using an open repository.

Approach: The design requirements and process plans are discretized using empirical models that compute exact values of process variables for the given design requirements. Through this discretization, subsumption relations among the discretized design requirements and process plans are identified. Appropriate process plans for a given design requirement are identified by subsumption relations among the design requirements, and the design requirements that can be satisfied by given process plans are identified by subsumption relations among the process plans. To computationally realize this mapping, a description logic (ALE) is identified and justified for representing and computing subsumption relations. Based on this investigation, a retrieval method (the DFM framework) is realized that enables the storage and retrieval of process plans.

Validation: Theoretical and empirical validation are performed using the validation square method. For the theoretical validation, an appropriate description logic (ALE) is identified and justified, and the use of subsumption for mapping the two domains and realizing the DFM framework is justified. For the empirical validation, the storage and retrieval performance of the DFM framework is tested to demonstrate its theoretical validity.

Contribution: Two areas of contribution are identified: DFM and engineering information management. In DFM, the major contribution is a retrieval method that relates a design problem to appropriate process plans through a mathematical mapping between the design and process planning domains. In engineering information management, the major contributions are the development of information models and the identification of their characteristics. Based on this investigation, an appropriate description logic (ALE) is selected and justified, and the corresponding computational complexity of subsumption (nondeterministic polynomial time) is identified.
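The abstract describes retrieval by subsumption over discretized design requirements. As a rough, hedged illustration only, the core idea can be sketched with a set-based stand-in for subsumption; this is not the thesis's ALE description-logic implementation, and all names and the feature encoding below are assumed.

```python
# Minimal sketch, not the DFM framework itself: a set-based approximation of
# description-logic subsumption over discretized design requirements.

from dataclasses import dataclass, field

@dataclass
class Requirement:
    """A design requirement discretized into admissible process-variable values."""
    name: str
    features: dict = field(default_factory=dict)   # e.g. {"layer_thickness_um": {100, 200}}

def subsumes(general: "Requirement", specific: "Requirement") -> bool:
    """general subsumes specific if every value restriction stated by general
    is also stated by specific, and specific's values fall within general's."""
    return all(
        feat in specific.features and specific.features[feat] <= values
        for feat, values in general.features.items()
    )

@dataclass
class ProcessPlan:
    name: str
    covers: Requirement   # the most general requirement this plan was developed for

def retrieve(plans: list, query: Requirement) -> list:
    """Return stored plans whose covered requirement subsumes the query,
    i.e. plans that handle at least everything the query demands."""
    return [plan for plan in plans if subsumes(plan.covers, query)]

# Hypothetical usage:
stored = [
    ProcessPlan("plan_A", Requirement("fdm_abs", {"material": {"ABS"},
                                                  "layer_thickness_um": {100, 200, 300}})),
]
query = Requirement("bracket", {"material": {"ABS"}, "layer_thickness_um": {200}})
print([p.name for p in retrieve(stored, query)])   # -> ['plan_A']
```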
72

An iterative representer-based scheme for data inversion in reservoir modeling

Iglesias-Hernandez, Marco Antonio, 1979- 25 September 2012 (has links)
With the recent development of smart-well technology, the reservoir community now faces the challenge of developing robust and efficient techniques for reservoir characterization by means of data inversion. Unfortunately, classical history-matching methodologies do not possess the computational efficiency and robustness needed to assimilate data measured almost in real time. The reservoir community has therefore started to explore techniques previously applied in other disciplines. Such is the case of the representer method, a variational data assimilation technique first applied in physical oceanography. The representer method is an efficient technique for solving linear inverse problems when a finite number of measurements is available. To the best of our knowledge, a general representer-based methodology for nonlinear inverse problems has not been fully developed. We fill this gap by presenting a novel implementation of the representer method applied to the nonlinear inverse problem of identifying petrophysical properties in reservoir models. Given production data from wells and prior knowledge of the petrophysical properties, the goal of our formulation is to find improved parameters so that the reservoir model prediction fits the data within some error given a priori.

We first define an abstract framework for parameter identification in nonlinear reservoir models. Then, we propose an iterative representer-based scheme (IRBS) to find a solution of the inverse problem. Sufficient conditions for convergence of the proposed algorithm are established. We apply the IRBS to the estimation of absolute permeability in single-phase Darcy flow through porous media.

Additionally, we study an extension of the IRBS with Karhunen-Loeve expansions (IRBS-KL) to address the identification of petrophysical properties subject to linear geological constraints. The IRBS-KL approach is compared with a standard variational technique for history matching. Furthermore, we apply the IRBS-KL to the identification of porosity and of absolute and relative permeabilities given production data from an oil-water reservoir. The general derivation of the IRBS-KL is provided for a reservoir whose dynamics are modeled by slightly compressible, immiscible two-phase displacement through porous media. Finally, we present an ad hoc sequential implementation of the IRBS-KL and compare its performance with the ensemble Kalman filter.
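The abstract does not spell out the representer expansion itself. Purely as a hedged orientation to what an iterative representer update typically looks like (all notation below is assumed, not the author's), the forward model is linearized about the current iterate and the correction is written as one representer per measurement.

```latex
% Generic sketch of an iterative representer update (notation assumed, not the thesis's).
\[
  K^{n+1}(x) = \overline{K}(x) + \sum_{j=1}^{M} \beta_j^{\,n+1}\, r_j^{\,n}(x),
  \qquad
  \bigl(R^{n} + C_d\bigr)\,\boldsymbol{\beta}^{\,n+1} = \mathbf{d} - \mathcal{M}\bigl[K^{n}\bigr],
\]
```

Here \(\overline{K}\) is the prior mean of the unknown field, \(r_j^{\,n}\) is the representer attached to the \(j\)-th measurement (obtained from an adjoint solve of the linearized reservoir model), \(R^{n}\) collects the representers evaluated at the measurement functionals, \(C_d\) is the data-error covariance, and \(\mathbf{d}\) is the production-data vector.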
73

Development of an XML-based distributed service architecture for product development in enterprise clusters

Xie, Tian, 謝天 January 2005 (has links)
Published or final version (abstract and table of contents available) / Mechanical Engineering / Master of Philosophy
74

Internet inter-domain traffic engineering and optimization

Lam, Fung, 林峰 January 2001 (has links)
Published or final version / Electrical and Electronic Engineering / Master of Philosophy
75

Hierarchical slice contours for layered manufacturing

Kwok, Kwok-tung, 郭國棟 January 2001 (has links)
Published or final version / Industrial and Manufacturing Systems Engineering / Master of Philosophy
76

A Mahalanobis-distance-based image segmentation error measure with applications in automated microscopy

Rogers, Wendy Laurel. January 1985 (has links)
No description available.
77

Using complexity, coupling, and cohesion metrics as early indicators of vulnerabilities

Chowdhury, Istehad 28 September 2009 (has links)
Software security failures are common and the problem is growing. A vulnerability is a weakness in the software that, when exploited, causes a security failure. It is difficult to detect vulnerabilities until they manifest themselves as security failures in the operational stage of the software, because security concerns are often not addressed, or not known, sufficiently early in the Software Development Life Cycle (SDLC). Complexity, coupling, and cohesion (CCC) related software metrics can be measured during the early phases of software development, such as design or coding. Although these metrics have been successfully employed to indicate software faults in general, the relationships between CCC metrics and vulnerabilities have not been extensively investigated. If empirical relationships can be discovered between CCC metrics and vulnerabilities, these metrics could help software developers take proactive action against potential vulnerabilities in software.

In this thesis, we investigate whether CCC metrics can be used as early indicators of software vulnerabilities. We conduct an extensive case study on several releases of Mozilla Firefox to provide empirical evidence on how vulnerabilities are related to complexity, coupling, and cohesion. We mine the vulnerability databases, bug databases, and version archives of Mozilla Firefox to map vulnerabilities to software entities. We find that some of the CCC metrics are correlated with vulnerabilities at a statistically significant level. Since different metrics are available at different development phases, we further examine the correlations to determine which level of CCC metrics (design or code) is the better indicator of vulnerabilities. We also observe that the correlation patterns are stable across multiple releases. These observations imply that the metrics can be dependably used as early indicators of vulnerabilities in software.

We then present a framework to automatically predict vulnerabilities based on CCC metrics. To build vulnerability predictors, we consider four alternative data mining and statistical techniques (C4.5 decision tree, random forests, logistic regression, and naïve Bayes) and compare their prediction performance. We are able to predict the majority of the vulnerability-prone files in Mozilla Firefox with tolerable false positive rates. Moreover, predictors built from past releases can reliably predict the likelihood of vulnerabilities in future releases. The experimental results indicate that structural information from the non-security realm, such as complexity, coupling, and cohesion, is useful in vulnerability prediction. / Thesis (Master, Electrical & Computer Engineering), Queen's University, 2009.
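As a hedged illustration of the kind of prediction setup the abstract describes, the sketch below trains the four classifier families it names on file-level CCC metrics using scikit-learn. The CSV file, column names, and metric choices are hypothetical, not the thesis's actual dataset or tooling.

```python
# Minimal sketch, assuming a hypothetical per-file metrics table; not the thesis's pipeline.

import pandas as pd
from sklearn.tree import DecisionTreeClassifier          # stand-in for a C4.5-style tree
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# One row per file: CCC metrics plus a label marking vulnerability-prone files.
data = pd.read_csv("firefox_ccc_metrics.csv")             # hypothetical file name
features = ["cyclomatic_complexity", "fan_in", "fan_out", "lack_of_cohesion"]
X, y = data[features], data["vulnerability_prone"]

models = {
    "decision tree": DecisionTreeClassifier(max_depth=8),
    "random forest": RandomForestClassifier(n_estimators=200),
    "logistic regression": LogisticRegression(max_iter=1000),
    "naive Bayes": GaussianNB(),
}

for name, model in models.items():
    # Recall matters most here: missing a vulnerability-prone file is costly.
    recall = cross_val_score(model, X, y, cv=5, scoring="recall").mean()
    print(f"{name:20s} mean 5-fold recall: {recall:.2f}")
```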
78

A framework for the design of simulation-based greenhouse control

Lacroix, René January 1994 (has links)
The main objectives were: (1) to develop tools to aid in the design of enclosed agro-ecosystems, and (2) to use these tools to develop a prototype simulation-based control system. Three tools were developed: (1) a conceptual framework, (2) a (simulated) greenhouse system, and (3) a simulation approach within OS/2.

Part of the conceptual framework was dedicated to "conscious control", defined as a form of control practised by an entity that uses models of itself in its decision-making processes. The greenhouse system was composed of six modules (a simulation manager, a weather generator, a greenhouse model, a crop model, a Pavlovian controller and a cognitive controller), which were implemented under OS/2 as separate processes.

The greenhouse system was used to develop a prototype simulation-based controller. Primarily, the role of the controller was to determine temperature setpoints that would minimize the heating load. The simulation model used by the controller was an artificial neural network. The controller adapted temperature setpoints to anticipated meteorological conditions and reduced greenhouse energy consumption in comparison with a more traditional controller.

Generally, the results showed the feasibility and illustrated some of the advantages of using simulation-based control. The research resulted in the definition of elements that will allow the creation of a methodological framework for the design of simulation-based control and, eventually, a theory of conscious control.
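As a hedged sketch of the simulation-based setpoint selection described above, the example below queries a neural-network surrogate of a greenhouse over candidate setpoints and picks the cheapest one. The training data, features, cost weights, and the crop-comfort penalty are all assumptions for illustration; this is not the thesis's controller.

```python
# Minimal sketch: a neural-network surrogate stands in for the greenhouse model,
# and the controller chooses the setpoint with the lowest predicted cost
# (heating load plus a penalty for straying from a crop-preferred temperature).

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: (outside temp [C], solar radiation [W/m2], setpoint [C]) -> heating load.
X_train = rng.uniform([-10.0, 0.0, 15.0], [15.0, 800.0, 25.0], size=(2000, 3))
y_train = np.maximum(X_train[:, 2] - X_train[:, 0], 0.0) * 5.0 - 0.01 * X_train[:, 1]  # toy targets

surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
surrogate.fit(X_train, y_train)

def choose_setpoint(outside_temp, solar_rad, crop_optimum=20.0, weight=2.0,
                    candidates=np.arange(15.0, 25.5, 0.5)):
    """Pick the setpoint minimizing predicted heating load plus a quadratic
    penalty for deviating from the crop's preferred temperature."""
    queries = np.column_stack([
        np.full_like(candidates, outside_temp),
        np.full_like(candidates, solar_rad),
        candidates,
    ])
    cost = surrogate.predict(queries) + weight * (candidates - crop_optimum) ** 2
    return float(candidates[int(np.argmin(cost))])

# Anticipated weather for the next control interval (hypothetical values).
print(choose_setpoint(outside_temp=2.0, solar_rad=300.0))
```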
79

Gradient free optimisation in selected engineering applications

Walton, Sean Peter January 2013 (has links)
No description available.
80

A unified rapid-prototyping development framework for the control, command, and monitoring of unmanned aerial vehicles

Claassens, Samuel David 31 July 2012 (has links)
M.Ing. / This investigation explores the applicability of an adapted formal computational model for the rapid synthesis of complete UAV (Unmanned Aerial Vehicle) systems in a single unified environment. The proposed framework, termed XPDS (Cross-Platform Data Server), incorporates principles from a variety of similar, successful languages such as Giotto and Esterel. The application of such models has been shown to be advantageous in the UAV control system domain. The proposed solution extends these principles to the complete generic craft/ground-station problem and provides a unified framework for the development of distributed, scalable, and predictable solutions.

The core of the framework is a hybrid FLET (Fixed Logical Execution Time) computational model which formalises the timing and operation of a number of concurrent processes or tasks. Three mechanisms are built upon the computational model: a design environment, simulation extensions, and code generation functionality. The design environment permits a user to operate through an intuitive interface. The simulation extensions provide tight integration with established software such as MathWorks' MATLAB and Austin Meyer's X-Plane. The code generation framework allows XPDS programs to be converted into source code for a variety of target systems. The combination of the three mechanisms and the formal computational model allows stakeholders to incrementally construct, test, and verify a complete UAV system.

An implementation of the proposed framework is constructed to verify the proposed design. Initially, the implementation is subjected to a number of experiments that show that it is a valid representation of the specification. A simplified helicopter stability control system, based upon the problem statement from the initial literature review, is then presented as a test case and the solution is developed in XPDS. The scenario is successfully constructed and tested through the framework, demonstrating the validity of the proposed solution.

The investigation demonstrates that it is both possible and beneficial to develop UAV systems in a single, unified environment. The incorporation of a formal computational model leads to the rapid development of predictable solutions. The numerous systems are also easily integrated and benefit from features such as modularity and reusability.
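The abstract centres on the FLET (fixed logical execution time) computational model. As a hedged, illustrative sketch only (this is not XPDS code; the task names, periods, and blackboard scheme are assumptions), the defining property of FLET/LET semantics, namely that inputs are sampled at the start of a task's period and outputs become visible only at its logical deadline, can be shown as follows:

```python
# Minimal sketch of fixed-logical-execution-time task semantics; not the XPDS implementation.

class FletTask:
    """A task under FLET semantics: inputs are sampled at the start of the period,
    and the result is published only at the end of that period (the logical deadline),
    regardless of how long the computation actually takes."""
    def __init__(self, name, period_ticks, compute):
        self.name = name
        self.period_ticks = period_ticks
        self.compute = compute          # pure function: blackboard snapshot -> output value
        self.pending = None             # result waiting for its logical release time

    def tick(self, tick, blackboard):
        if tick % self.period_ticks:
            return
        # Publish the result of the *previous* period at its logical deadline.
        if self.pending is not None:
            blackboard[self.name] = self.pending
        # Sample inputs now; the new result stays invisible until the next deadline.
        self.pending = self.compute(dict(blackboard))

def run(tasks, blackboard, ticks=6):
    for t in range(ticks):
        for task in tasks:
            task.tick(t, blackboard)
        print(t, blackboard)

# Hypothetical two-task example: a rate filter feeding a slower attitude-control task.
blackboard = {"gyro": 0.02}
tasks = [
    FletTask("filtered_rate", 1, lambda bb: bb["gyro"] * 0.9),
    FletTask("elevator_cmd", 2, lambda bb: -0.5 * bb.get("filtered_rate", 0.0)),
]
run(tasks, blackboard)
```

Because outputs are released only at logical deadlines, the dataflow between tasks is deterministic and independent of actual execution times, which is the property that makes code generation for different targets predictable.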
