361

Design optimization of a microelectromechanical electric field sensor using genetic algorithms

Roy, Mark 24 September 2012 (has links)
This thesis studies the application of a multi-objective niched Pareto genetic algorithm to the design optimization of an electric field mill sensor. The original sensor requires resonant operation. The objective of the algorithm presented is to optimize the geometry so as to eliminate the need for resonant operation, which can be difficult to maintain in an unpredictable, changing environment. The algorithm evaluates each design using finite element simulations. A population of sensor designs is evolved towards an optimal Pareto frontier of solutions. Several candidate solutions are selected that offer superior displacement, frequency, and stress concentrations. These designs were modified for fabrication using the PolyMUMPs fabrication process but failed to operate because of that process. To fabricate the sensors in-house with a silicon-on-glass process, an anodic bonding apparatus has been designed, built, and tested.
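As an illustration of the selection mechanism such an algorithm relies on, the sketch below implements a niched Pareto tournament over a toy two-parameter geometry, with an analytic stand-in for the finite element evaluation of displacement, frequency, and stress; the encoding, objectives, and sharing radius are assumptions for illustration, not the thesis's actual model.

```python
import random

def evaluate(design):
    """Toy stand-in for an FEM run: returns (displacement error, -frequency, stress),
    all treated as minimization objectives (frequency is negated so higher is better)."""
    a, b = design
    return (a ** 2 + b, -(a + b ** 2), abs(a - b))

def dominates(f1, f2):
    """True if objective vector f1 Pareto-dominates f2."""
    return all(x <= y for x, y in zip(f1, f2)) and any(x < y for x, y in zip(f1, f2))

def niche_count(candidate, population, sigma=0.5):
    """Sharing in design space: how many designs sit within radius sigma of the candidate."""
    return sum(1 for p in population
               if sum((c - q) ** 2 for c, q in zip(candidate, p)) ** 0.5 < sigma)

def niched_pareto_tournament(population, fitnesses, t_dom=10):
    """Pick two candidates, test each against a random comparison set for dominance,
    and break ties by preferring the design in the less crowded niche."""
    i, j = random.sample(range(len(population)), 2)
    comparison = random.sample(range(len(population)), t_dom)
    i_dominated = any(dominates(fitnesses[k], fitnesses[i]) for k in comparison)
    j_dominated = any(dominates(fitnesses[k], fitnesses[j]) for k in comparison)
    if i_dominated and not j_dominated:
        return population[j]
    if j_dominated and not i_dominated:
        return population[i]
    return min((population[i], population[j]),
               key=lambda d: niche_count(d, population))

population = [(random.uniform(0.0, 2.0), random.uniform(0.0, 2.0)) for _ in range(40)]
fitnesses = [evaluate(d) for d in population]
print("selected parent:", niched_pareto_tournament(population, fitnesses))
```

In the full algorithm this selection step would sit inside the usual crossover-and-mutation loop, with each evaluation replaced by a finite element run.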
362

Experimental High Cycle Fatigue Testing and Shape Optimization of Turbine Blades

Ahmadi Tafti, Mohamad 20 November 2013 (has links)
An accelerated high cycle fatigue testing approach is presented to determine the fatigue endurance limit of materials at high frequencies. Base excitation of a tapered plaque driven into a high-frequency resonance mode allows the test to be completed in a significantly shorter time. This high cycle fatigue testing is performed using a tracked sine resonance search and dwell strategy. The controller monitors the structural health during the test; any change in the dynamic response indicates crack initiation in the material. In addition, a finite element shape optimization model is developed for the design of the tapered plaques. An integrated neural-network and genetic-algorithm (NSGA-II) optimization technique is implemented to carry out the shape optimization of this component, resulting in a significant reduction in computational cost. A Pareto set is then produced that meets the designer's requirements and provides the decision maker with several alternatives to choose from.
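The core of NSGA-II referred to above is non-dominated sorting of candidate designs by their objective values. The sketch below shows only that step, on a synthetic two-objective trade-off standing in for the neural-network surrogate's predictions; the objectives and designs are illustrative assumptions, not the thesis's model of the tapered plaque.

```python
def non_dominated_sort(objectives):
    """Return fronts as lists of indices; front 0 is the current Pareto-optimal set."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    n = len(objectives)
    dominated = [[] for _ in range(n)]   # indices that point i dominates
    count = [0] * n                      # how many points dominate point i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objectives[i], objectives[j]):
                dominated[i].append(j)
            elif dominates(objectives[j], objectives[i]):
                count[i] += 1
        if count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated[i]:
                count[j] -= 1
                if count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

# Toy (mass, peak stress) predictions standing in for the neural-network surrogate:
designs = [(0.1 * i, 1.0 / (0.1 * i + 0.05)) for i in range(1, 20)]
print("first Pareto front:", non_dominated_sort(designs)[0])
```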
363

Design Optimization of Soft Real-Time Applications on FlexRay Platforms

Malekzadeh, Mahnaz January 2013 (has links)
FlexRay is a deterministic communication bus in the automotive context that provides a fault-tolerant, high-speed bus system. It operates based on a time-division multiple-access (TDMA) scheme and allows transmission of event-driven and time-driven messages between nodes in a system. A FlexRay bus cycle consists of two periodic segments: a static segment and a dynamic segment. Such a bus system can be used in a wide range of real-time automotive applications with soft and hard timing constraints. Recent research has focused on the FlexRay static segment. The dynamic segment, in contrast, is based on an event-triggered scheme, whose temporal behavior is more difficult to predict. Nevertheless, the event-triggered paradigm provides more flexibility for further incremental design, and the dynamic segment is also suitable for applications with erratic data sizes. These advantages motivate further research on the dynamic segment. In a real-time system, the results of computations have to be ready by a specific instant of time called the deadline. In a soft real-time application, the result can still be used, with a degraded quality of service, even after the deadline has passed, whereas in a hard real-time system missing a deadline leads to a catastrophe. This thesis aims at optimizing some of the parameters of the FlexRay bus for soft real-time applications. The cost function used to assess a solution to the optimization problem is the deadline miss ratio, and a solution consists of two parts: (1) the assignment of frame identifiers to the messages produced at each node, and (2) the size of each individual minislot, which is one of the FlexRay bus parameters. The optimization is based on genetic algorithms. To evaluate the proposed approach, several experiments have been conducted using the FlexRay bus simulator implemented in this thesis. The results show that a suitable choice of the parameters generated by our optimization engine improves the timing behavior of the simulated communicating nodes.
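A minimal sketch of how such a search could be set up is given below: each candidate solution pairs an assignment of frame identifiers to messages with a single minislot size, and a placeholder cost function stands in for the deadline miss ratio that the thesis measures with its FlexRay bus simulator. All constants and the cost model are illustrative assumptions.

```python
import random

N_MESSAGES = 8
FRAME_IDS = list(range(1, 17))      # candidate dynamic-segment frame identifiers (illustrative)
MINISLOT_RANGE = (2, 10)            # candidate minislot sizes, in macroticks (illustrative)

def random_solution():
    ids = random.sample(FRAME_IDS, N_MESSAGES)    # one frame identifier per message
    minislot = random.randint(*MINISLOT_RANGE)    # the single minislot-size parameter
    return ids, minislot

def deadline_miss_ratio(solution):
    """Placeholder cost: the thesis measures this with its FlexRay bus simulator;
    here a synthetic proxy simply penalizes high identifiers and large minislots."""
    ids, minislot = solution
    return (sum(ids) / (len(ids) * max(FRAME_IDS))) * (minislot / MINISLOT_RANGE[1])

def mutate(solution):
    ids, minislot = solution
    ids = ids[:]
    ids[random.randrange(len(ids))] = random.choice(
        [f for f in FRAME_IDS if f not in ids])   # reassign one message's frame identifier
    if random.random() < 0.3:
        minislot = random.randint(*MINISLOT_RANGE)
    return ids, minislot

population = [random_solution() for _ in range(20)]
for _ in range(50):
    population.sort(key=deadline_miss_ratio)
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best = min(population, key=deadline_miss_ratio)
print("frame IDs:", best[0], "minislot size:", best[1])
```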
364

Genomų palyginimo algoritmų tyrimas / Research of algorithms for genome comparison

Kovaliovas, Viktoras 23 May 2005 (has links)
To understand evolution, and to discover how different species are related, gene order analysis is a useful tool. Problems in this area can usually be formulated in a combinatorial language. We regard genomes as signed or unsigned permutations, so evolutionary operations such as inversions (reversing the order of a segment of genes) are easy to describe combinatorially. A commonly studied problem is to determine the evolutionary distance between two species, which is estimated by several combinatorial distances between gene order permutations, for instance the inversion distance. The main objective of this work was to survey the existing algorithms for genome comparison and to present a new approach for solving this problem. The work led to these results:
- We surveyed existing approaches to genome comparison, namely comparison by inversion distance in the signed and unsigned cases. It turns out that sorting signed genomes by inversions can be done in quadratic time, whereas sorting unsigned genomes by inversions is NP-hard.
- We proposed a method for applying heuristic algorithms to sorting unsigned genomes by inversions.
- We applied tabu search and a genetic algorithm to the problem of sorting unsigned genomes by inversions.
- We showed experimentally that the worst-case solutions to sorting unsigned genomes by inversions found by the heuristics (tabu search and genetic algorithm) are better than the ones expected from the best known approximation algorithm used for... [to full text]
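To make the underlying operation concrete, the sketch below shows a reversal acting on an unsigned permutation and the breakpoint count commonly used as a heuristic cost for this problem; the greedy loop is only an illustration and is not the tabu search or genetic algorithm developed in the work.

```python
def reverse(perm, i, j):
    """Return perm with the segment perm[i..j] reversed (an inversion)."""
    return perm[:i] + perm[i:j + 1][::-1] + perm[j + 1:]

def breakpoints(perm):
    """Count adjacent pairs that are not consecutive integers, with 0 and n+1
    framing the permutation; 0 breakpoints means the permutation is sorted."""
    ext = [0] + list(perm) + [len(perm) + 1]
    return sum(1 for a, b in zip(ext, ext[1:]) if abs(a - b) != 1)

def greedy_sort_by_reversals(perm):
    """Repeatedly apply the reversal that removes the most breakpoints."""
    steps = 0
    while breakpoints(perm) > 0:
        n = len(perm)
        best = min(((i, j) for i in range(n - 1) for j in range(i + 1, n)),
                   key=lambda ij: breakpoints(reverse(perm, *ij)))
        if breakpoints(reverse(perm, *best)) >= breakpoints(perm):
            break  # greedy stalls here; this is where a tabu search or GA would take over
        perm = reverse(perm, *best)
        steps += 1
    return perm, steps

print(greedy_sort_by_reversals([3, 1, 2, 5, 4]))
```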
365

Simulations-Guided Design of Process Analytical Sensor Using Molecular Factor Computing

Dai, Bin 01 January 2007 (has links)
Many areas of science now generate huge volumes of data that present visualization, modeling, and interpretation challenges. Methods for effectively representing the original data in a reduced coordinate space are therefore receiving much attention. The purpose of this research is to test the hypothesis that molecular computing of vectors for transformation matrices enables spectra to be represented in any arbitrary coordinate system. New coordinate systems are selected to reduce the dimensionality of the spectral hyperspace and simplify the mechanical/electrical/computational construction of a spectrometer. A novel integrated sensing and processing system, termed a Molecular Factor Computing (MFC) based near-infrared (NIR) spectrometer, is proposed in this dissertation. In an MFC-based NIR spectrometer, spectral features are encoded by the transmission spectra of MFC filters, which effectively compute the calibration function or the discriminant functions by weighting the signals received from a broad wavelength band. Compared with conventional spectrometers, the novel NIR analyzer proposed in this work is orders of magnitude faster and more rugged, without sacrificing accuracy, which makes it an ideal analytical tool for process analysis. Two different MFC filter-generating algorithms are developed and tested for searching a near-infrared spectral library to select molecular filters for MFC-based spectroscopy. The first uses genetic algorithms coupled with predictive modeling methods to select MFC filters from a spectral library for quantitative prediction. The second is designed to select MFC filters for qualitative classification. The concept of MFC-based predictive spectroscopy is demonstrated with quantitative analysis of ethanol-in-water mixtures in an MFC-based prototype instrument.
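The sketch below illustrates the flavour of the first filter-generating algorithm: a genetic-style search over which library spectra serve as MFC filters, scored by the least-squares prediction error of a linear model built on the filtered broadband signals. The spectral library, sample spectra, and search settings are synthetic assumptions, not the dissertation's data or exact method.

```python
import numpy as np

rng = np.random.default_rng(0)
n_wavelengths, n_library, n_samples, n_filters = 100, 30, 40, 3

library = rng.random((n_library, n_wavelengths))      # candidate filter transmission spectra
concentrations = rng.random(n_samples)                # e.g. ethanol fraction in water
pure_component = rng.random(n_wavelengths)
spectra = (np.outer(concentrations, pure_component)
           + 0.01 * rng.standard_normal((n_samples, n_wavelengths)))

def fitness(filter_idx):
    """Smaller is better: RMS error of a linear model built on the MFC channel signals."""
    channels = spectra @ library[list(filter_idx)].T  # broadband signals weighted by each filter
    X = np.column_stack([channels, np.ones(n_samples)])
    coef, *_ = np.linalg.lstsq(X, concentrations, rcond=None)
    return float(np.sqrt(np.mean((X @ coef - concentrations) ** 2)))

population = [tuple(int(i) for i in rng.choice(n_library, n_filters, replace=False))
              for _ in range(20)]
for _ in range(30):
    population.sort(key=fitness)
    parents = population[:10]
    children = []
    for p in parents:
        child = list(p)
        new = int(rng.integers(n_library))
        if new not in child:                          # point mutation: swap one filter
            child[int(rng.integers(n_filters))] = new
        children.append(tuple(child))
    population = parents + children

print("best filter set (library indices):", min(population, key=fitness))
```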
366

Digital Image Elasto-Tomography: Mechanical Property Reconstruction from Surface Measured Displacement Data

Peters, Ashton January 2007 (has links)
Interest in elastographic techniques for soft tissue imaging has grown as relevant research continues to indicate a correlation between tissue histology and mechanical stiffness. Digital Image Elasto-Tomography (DIET) presents a novel method for identifying cancerous lesions via a three-dimensional image of elastic properties. Stiffness reconstruction with DIET takes steady-state motion captured with a digital camera array as the input to an elastic property reconstruction algorithm, in which finite element methods simulate phantom motion over a range of internal stiffness distributions. The low cost and high image contrast achievable with a DIET system may be particularly suited to breast cancer screening, where traditional modalities such as mammography suffer from limited sensitivity and patient discomfort. Proof-of-concept studies performed on simulated data sets confirmed the potential of the DIET technique, leading to the development of an experimental apparatus for surface motion capture from a range of soft-tissue-approximating phantoms. Error studies performed on experimental data from these phantoms, using a limited number of shape and modulus parameters, indicated that accurate measurements of surface motion provide sufficient information to identify a stiffness distribution in both homogeneous and heterogeneous cases. The elastic reconstruction performed on simulated and experimental data considered both deterministic and stochastic algorithms, with a combination of the two approaches found to give the most accurate results at a realistic increase in computational cost. The reconstruction algorithm developed is able to resolve a hard spherical inclusion within a soft phantom, and also showed promise in reconstructing the correct stiffness distribution when no inclusion is present.
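As a toy illustration of the inverse problem being solved, the sketch below fits a single stiffness parameter by minimising the mismatch between "measured" and simulated surface motion, using a coarse stochastic sampling stage followed by a deterministic local refinement. The one-parameter forward model is a placeholder for the finite element simulation, and the two-stage search only loosely echoes the deterministic/stochastic combination reported in the thesis.

```python
import random

TRUE_STIFFNESS = 12.0
FORCE = 3.0
measured = FORCE / TRUE_STIFFNESS + 0.001   # "measured" surface amplitude with a small offset

def simulate(stiffness):
    """Placeholder forward model: a static spring response instead of an FE simulation."""
    return FORCE / stiffness

def mismatch(stiffness):
    """Objective: squared difference between simulated and measured surface motion."""
    return (simulate(stiffness) - measured) ** 2

# Stage 1 (stochastic): coarse random sampling of the admissible stiffness range.
candidates = [random.uniform(1.0, 50.0) for _ in range(200)]
best = min(candidates, key=mismatch)

# Stage 2 (deterministic): ternary search in a bracket around the best sample.
lo, hi = 0.5 * best, 1.5 * best
for _ in range(60):
    m1, m2 = lo + (hi - lo) / 3.0, hi - (hi - lo) / 3.0
    if mismatch(m1) < mismatch(m2):
        hi = m2
    else:
        lo = m1

print(f"reconstructed stiffness: {0.5 * (lo + hi):.2f} (true value {TRUE_STIFFNESS})")
```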
367

Determination of impulse generator setup for transient testing of power transformers using optimization-enabled electromagnetic transient simulation

Samarawickrama, Kasun Chamara 02 September 2014 (has links)
Natural lightning strikes induce impulsive overvoltages on transmission lines and their terminal equipment. These overvoltages may cause failures in the insulation of electrical devices in the power system, so it is important to test the insulation strength of a device against them. Usually, Marx generators are used to generate impulse waveforms for testing purposes. A novel approach is proposed to obtain the resistor settings of a Marx generator for impulse testing of power transformers. This approach overcomes most of the major challenges of the commonly used trial-and-error method, including excessive time consumption and potential damage to the transformer. The proposed approach uses the frequency response of the transformer to synthesize a circuit model; a genetic-algorithm-based, optimization-enabled electromagnetic transient simulation is then used to obtain the resistor settings. The proposed approach is validated by a real impulse test conducted on a three-phase power transformer.
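A rough sketch of the idea is given below: a simple genetic search adjusts front and tail resistor values so that a double-exponential-style approximation of the impulse meets the standard 1.2/50 µs shape. The capacitances, time-constant formulas, and conversion factors are textbook-style simplifications assumed for illustration, not the detailed electromagnetic transient model or transformer circuit synthesis used in the thesis.

```python
import random

CG, CL = 0.5e-6, 2e-9                           # generator and load capacitance, F (assumed)
T_FRONT_TARGET, T_TAIL_TARGET = 1.2e-6, 50e-6   # standard lightning-impulse times, s

def impulse_times(r_front, r_tail):
    """Approximate front/tail times from the two RC time constants (rough conversion factors)."""
    c_eq = CG * CL / (CG + CL)
    return 3.0 * r_front * c_eq, 0.7 * r_tail * (CG + CL)

def cost(resistors):
    """Relative deviation of the simulated impulse shape from the 1.2/50 us target."""
    t_front, t_tail = impulse_times(*resistors)
    return (abs(t_front - T_FRONT_TARGET) / T_FRONT_TARGET
            + abs(t_tail - T_TAIL_TARGET) / T_TAIL_TARGET)

def mutate(resistors, scale=0.2):
    """Perturb each resistor multiplicatively, keeping values positive."""
    return tuple(max(1.0, r * (1.0 + random.uniform(-scale, scale))) for r in resistors)

population = [(random.uniform(10, 1000), random.uniform(10, 1000)) for _ in range(30)]
for _ in range(100):
    population.sort(key=cost)
    parents = population[:10]
    population = parents + [mutate(random.choice(parents)) for _ in range(20)]

best = min(population, key=cost)
print(f"front resistor ~ {best[0]:.0f} ohm, tail resistor ~ {best[1]:.0f} ohm")
```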
368

A Study on Aggregation of Objective Functions in MaOPs Based on Evaluation Criteria

Furuhashi, Takeshi, Yoshikawa, Tomohiro, Otake, Shun January 2010 (has links)
Session ID: TH-E1-4 / SCIS & ISIS 2010, Joint 5th International Conference on Soft Computing and Intelligent Systems and 11th International Symposium on Advanced Intelligent Systems. December 8-12, 2010, Okayama Convention Center, Okayama, Japan
369

A Study on Analysis of Design Variables in Pareto Solutions for Conceptual Design Optimization Problem of Hybrid Rocket Engine

Furuhashi, Takeshi, Yoshikawa, Tomohiro, Kudo, Fumiya 06 1900 (has links)
2011 IEEE Congress on Evolutionary Computation (CEC). June 5-8, 2011, Ritz-Carlton, New Orleans, LA, USA
370

Investigating the empirical relationship between oceanic properties observable by satellite and the oceanic pCO₂ / Marizelle van der Walt

Van der Walt, Marizelle January 2011 (has links)
The aim of this dissertation is to investigate the empirical relationship between the partial pressure of CO2 (pCO2) and other ocean variables in the Southern Ocean, using a small percentage of the available data. CO2 is one of the main greenhouse gases that contribute to global warming and climate change. The concentration of anthropogenic CO2 in the atmosphere, however, would have been much higher if some of it were not absorbed by oceanic and terrestrial sinks. The oceans absorb and release CO2 from and to the atmosphere, and large regions of the Southern Ocean are expected to be a CO2 sink. However, measurements of CO2 concentrations in the Southern Ocean are sparse, so accurate values for the sinks and sources cannot be determined. It is also difficult to develop accurate oceanic and ocean-atmosphere models of the Southern Ocean with such sparse observations of CO2 concentrations. In this dissertation, classical techniques are investigated to determine the empirical relationship between pCO2 and other oceanic variables using in situ measurements. Sampling techniques are also investigated in order to make a judicious selection of a small percentage of the total available data points for developing an accurate empirical relationship. Data from the SANAE49 cruise, stretching between Antarctica and Cape Town, are used. The complete data set contains 6103 data points; the maximum pCO2 value in this stretch is 436.0 μatm, the minimum is 251.2 μatm and the mean is 360.2 μatm. An empirical relationship is investigated between pCO2 and the variables temperature (T), chlorophyll-a concentration (Chl), mixed layer depth (MLD) and latitude (Lat). The methods are repeated with latitude included and excluded as a variable, respectively. D-optimal sampling is used to select a small percentage of the available data for determining the empirical relationship. Least squares optimization is one method used to determine the empirical relationship: for 200 D-optimally sampled points, the pCO2 prediction with a fourth-order equation yields a root mean square (RMS) error of 15.39 μatm (on the estimation of pCO2) with latitude excluded as a variable and an RMS error of 8.797 μatm with latitude included. Radial basis function (RBF) interpolation is another method used to determine the empirical relationship between the variables: with 200 D-optimally sampled points it yields an RMS error of 9.617 μatm with latitude excluded and an RMS error of 6.716 μatm with latitude included. Applying optimal scaling to the variables in the RBF interpolation yields, for 200 D-optimally sampled points, an RMS error of 9.012 μatm with latitude excluded and an RMS error of 4.065 μatm with latitude included. / Thesis (MSc (Applied Mathematics))--North-West University, Potchefstroom Campus, 2012
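The sketch below shows plain Gaussian-kernel RBF interpolation of a synthetic pCO2-like target from four predictors, fitted on a 200-point subset and evaluated by RMS error on held-out points; the kernel, its width, and the data are assumptions for illustration, not the SANAE49 measurements, the dissertation's optimal scaling, or its D-optimal sampling.

```python
import numpy as np

rng = np.random.default_rng(1)
n_train, n_test, n_vars = 200, 500, 4              # e.g. 200 selected points, 4 predictors

def toy_pco2(X):
    """Synthetic target standing in for measured pCO2 (uatm)."""
    return 360 + 20 * np.sin(X[:, 0]) - 15 * X[:, 1] + 5 * X[:, 2] * X[:, 3]

X_train = rng.random((n_train, n_vars))
X_test = rng.random((n_test, n_vars))
y_train, y_test = toy_pco2(X_train), toy_pco2(X_test)

def rbf_matrix(A, B, eps=2.0):
    """Gaussian kernel matrix between two point sets."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-(eps ** 2) * d2)

# Fit: solve for weights so the interpolant passes through the training data.
weights = np.linalg.solve(rbf_matrix(X_train, X_train) + 1e-10 * np.eye(n_train), y_train)

# Predict at unseen points and report the RMS error, as in the dissertation's comparisons.
y_pred = rbf_matrix(X_test, X_train) @ weights
rms = np.sqrt(np.mean((y_pred - y_test) ** 2))
print(f"RMS error on held-out points: {rms:.3f} uatm")
```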
