
Adapting Evolutionary Approaches for Optimization in Dynamic Environments

Younes, Abdunnaser (January 2006)
Many important real-world applications that can be modelled as combinatorial optimization problems are actually dynamic in nature. However, research on dynamic optimization focuses on continuous problems and rarely targets combinatorial ones. Moreover, dynamic combinatorial problems, when addressed, are typically tackled within a specific application context.

In this thesis, dynamic combinatorial problems are addressed collectively by adopting an evolutionary algorithmic approach. On the plus side, their ability to manipulate several solutions at a time, their robustness, and their potential for adaptability make evolutionary algorithms a good choice for solving dynamic problems. However, their tendency to converge prematurely, the difficulty of fine-tuning their search, and their lack of diversity when tracking optima that shift in dynamic environments are drawbacks in this regard.

Developing general methodologies to tackle these conflicting issues constitutes the main theme of this thesis. First, definitions and measures of algorithm performance are reviewed. Second, methods of benchmark generation are developed under a generalized framework. Finally, methods to improve the ability of evolutionary algorithms to efficiently track optima that shift due to environmental changes are investigated. These methods include adapting genetic parameters to population diversity and environmental changes, using multiple populations as an additional means of controlling diversity, and incorporating local search heuristics to fine-tune the search process efficiently.

The methodologies developed for algorithm enhancement and benchmark generation are used to build and test evolutionary models for dynamic versions of the travelling salesman problem and the flexible manufacturing system. Experimental results demonstrate that the methods are effective on both problems and hence hold great potential for other dynamic combinatorial problems as well.
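One of the enhancement methods the abstract names, adapting genetic parameters to population diversity, can be sketched in a few lines. The following is a hypothetical, minimal illustration on a bit-string problem, not the algorithms developed in the thesis; the diversity measure, thresholds, and mutation rates are all assumptions:

```python
import random

def diversity(pop):
    # Mean pairwise Hamming distance, normalized to [0, 1].
    n, length = len(pop), len(pop[0])
    total = sum(sum(a != b for a, b in zip(p, q))
                for i, p in enumerate(pop) for q in pop[i + 1:])
    return total / (n * (n - 1) / 2 * length)

def evolve(fitness, length=20, pop_size=30, generations=60,
           base_mut=0.01, hyper_mut=0.2, div_threshold=0.1, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Adapt the mutation rate: when diversity collapses, switch to
        # hypermutation so the population can keep tracking a moving optimum.
        rate = hyper_mut if diversity(pop) < div_threshold else base_mut
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)     # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([g ^ (rng.random() < rate) for g in child])
        pop = parents + children
    return max(pop, key=fitness)
```

Because the elite individuals are carried over unmutated, the best fitness never regresses, while the diversity-triggered hypermutation re-injects variation after convergence, which is one common way to keep an evolutionary algorithm responsive when the environment changes.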

INFORMATION THEORETIC CRITERIA FOR IMAGE QUALITY ASSESSMENT BASED ON NATURAL SCENE STATISTICS

Zhang, Di (January 2006)
Measurement of visual quality is crucial for various image and video processing applications.

The goal of objective image quality assessment is to introduce a computational quality metric that can predict image or video quality. Many methods have been proposed in past decades. Traditionally, measurements convert the spatial data into some other feature domain, such as the Fourier domain, and compute a similarity, such as the mean-square or Minkowski distance, between the test data and the reference or perfect data; however, only limited success has been achieved, and none of the more complicated metrics shows any great advantage over existing ones.

The common idea shared among many proposed objective quality metrics is that human visual error sensitivity varies across spatial and temporal frequency and directional channels. In this thesis, image quality assessment is approached through a novel framework that computes the information lost in each channel rather than the similarities used in previous methods. Based on natural scene statistics and several image models, an information-theoretic framework is designed to compute the perceptual information contained in images and to evaluate image quality in the form of entropy.

The thesis is organized as follows. Chapter I gives a general introduction to previous work in this research area and a brief description of the human visual system. Chapter II reviews statistical models for natural scenes. Chapter III proposes the core ideas behind computing the perceptual information contained in images. Chapter IV defines information-theoretic criteria for image quality assessment. Chapter V presents the simulation results in detail. The last chapter discusses future directions and improvements of this research.
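The idea of scoring quality by the information lost in a channel, rather than by a similarity measure, can be illustrated with a toy Gaussian-channel sketch. This is an assumption-laden simplification of such information-theoretic criteria, not the thesis's framework: the distorted signal is modelled as a gain applied to the reference plus independent Gaussian noise, and the "lost information" is the reference entropy not conveyed through that channel.

```python
import math

def gaussian_entropy(var):
    # Differential entropy (bits) of a zero-mean Gaussian with variance var.
    return 0.5 * math.log2(2 * math.pi * math.e * var)

def channel_information_loss(ref, dist):
    # Fit dist ~ g * ref + noise by least squares, then measure the mutual
    # information I(ref; dist) under the Gaussian channel assumption.
    n = len(ref)
    mr, md = sum(ref) / n, sum(dist) / n
    ref_c = [x - mr for x in ref]
    dist_c = [y - md for y in dist]
    var_r = sum(x * x for x in ref_c) / n
    cov = sum(x * y for x, y in zip(ref_c, dist_c)) / n
    g = cov / var_r                              # channel gain estimate
    var_noise = sum((y - g * x) ** 2 for x, y in zip(ref_c, dist_c)) / n
    # I(ref; dist) = 0.5 * log2(1 + g^2 * var_r / var_noise)
    mi = 0.5 * math.log2(1 + g * g * var_r / max(var_noise, 1e-12))
    # Lost information: reference entropy not conveyed by the distorted image.
    return gaussian_entropy(var_r) - mi
```

Under this sketch a mildly distorted signal loses less information than a severely distorted one, which is the ordering an entropy-based quality criterion is meant to capture.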

University Intellectual Property Policies And University-Industry Technology Transfer In Canada

Hoye, Katherine (January 2006)
This research investigates the relationship between those incentives for faculty support of university-industry technology transfer that are governed by university intellectual property (IP) policies and technology transfer outcomes at Canadian universities.

Empirical research, chiefly conducted in the United States, has explored the link between the incentives governed by IP policies and various outcomes, and found that financial incentives are correlated with a variety of outcomes. This research extends the literature by exploring the same underlying relationship in Canada, where IP policies also determine ownership and control of the development of the IP: some universities retain control over development, while at others ownership and control vest with the inventors.

The research question was pursued through three studies, each providing a different perspective. The first study seeks to explain cross-institutional patterns in the numbers of patents held by Canadian universities using variables that represent the financial incentives and control offered to faculty inventors by the universities' policies. The second study investigated the impact of a policy change at the University of Toronto, using interrupted time series analysis. The third study investigated the experiences of faculty inventors at the University of Waterloo through in-depth interviews and thematic analysis of the resulting qualitative data.

The first, cross-sectional study failed to generate statistically significant results. In the second, longitudinal study, the change from a "university-owns" to an "inventor-owns" policy appeared to have significantly and substantially increased the number of invention disclosures submitted to the University of Toronto by its faculty members. The third, qualitative study suggests that faculty members interpret the incentives governed by intellectual property policies, and that this interpretation is shaped by group norms, academic leadership, university culture, and the inventors' experiences with technology transfer support organizations. Together, Studies 2 and 3 indicate that university intellectual property policies are effective levers with which to stimulate university-industry technology transfer and thus deserve further study. The importance of university factors in Study 3 implies that intellectual property policies must fit their organizational contexts in order to be productive.

This research also has important policy implications. Many governments have been attempting to emulate the American Bayh-Dole Act by introducing or changing national regulations affecting university IP policies. This research suggests that such national regulations may actually depress researcher support for technology transfer, and thus the amount of activity, at institutions that would benefit from an alternate policy. In effect, standardizing university IP policies through national regulations may deprive university administrators of an effective lever for encouraging technology transfer on their campuses. This inference will be the focus of further research that will broaden the work documented in this dissertation by exploring the relationship between university IP policies, university-industry technology transfer, and university factors, including culture, across a wider range of universities.
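The second study's interrupted time series design can be illustrated with a minimal segmented-regression sketch: fit separate trend lines before and after the policy change, then report the level and slope changes at the intervention point. This is a generic illustration of the technique, not the analysis performed in the dissertation:

```python
def fit_line(ts, ys):
    # Ordinary least-squares line fit; returns (intercept, slope).
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
             / sum((t - mt) ** 2 for t in ts))
    return my - slope * mt, slope

def interrupted_ts_effect(ys, change_point):
    # Fit separate trend lines before/after the intervention and report the
    # immediate level change and the slope change at the change point.
    pre_t = list(range(change_point))
    post_t = list(range(change_point, len(ys)))
    b0, b1 = fit_line(pre_t, ys[:change_point])
    a0, a1 = fit_line(post_t, ys[change_point:])
    level_change = (a0 + a1 * change_point) - (b0 + b1 * change_point)
    slope_change = a1 - b1
    return level_change, slope_change
```

Applied to, say, yearly disclosure counts, a positive level change with an unchanged slope would correspond to the kind of immediate jump the study attributes to the policy switch.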

Effects of Decision Support Tools on Cardiac Telephone Consultation Process

Enomoto, Yukari (January 2006)
The Nursing Coordinators (NCs) at the University of Ottawa Heart Institute (UOHI) field phone calls on a daily basis from patients who have been discharged and are undergoing home care procedures. The project described in this thesis aims to provide tools for a Personal Digital Assistant (PDA) that the NCs can use during these phone calls. Cognitive Work Analysis (CWA) and the Ecological Interface Design (EID) approach are used to identify the information requirements for designing the system. Major challenges of the telephone consultation process, identified through a literature review and interviews with the NCs, include limited visibility of patients, individual differences, and a lack of standardized procedures. A combination of decision trees and visualization techniques is proposed to aid the process. Implementing decision trees would help reduce mental workload, especially reliance on "knowledge in the head", and facilitate the transfer of expert knowledge to less experienced nurses. Visualization tools display an abstract integration of multiple cues from patients and can be accessed by users at any point in the decision process.

A preliminary experiment with static images showed that visualization tools helped decision makers more when the judgement tasks were more complex. The effects of different types of decision support on cardiac nurses in simulated telephone consultations were then examined. The system improved the performance of the decision makers and induced different types of strategic behaviour: a standardized checklist (OLDCAR) induced more thorough assessment, the decision algorithms induced efficient and more detailed recommendations, and the semantic-network symptom map induced information gathering more relevant to diagnosis.

The research also explored methodologies for examining a multi-layered decision process, in which many decision makers with varying expertise are involved, when modelling strategic behaviours. This type of process is applicable when the primary decision makers do not monitor the work domain but can be alerted when something goes wrong.
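A consultation decision tree of the kind proposed can be represented very simply as nested question/branch pairs that the call walks through. The questions and recommendations below are entirely hypothetical illustrations, not the UOHI protocol or the thesis's decision algorithms:

```python
# Hypothetical, simplified triage tree. Each internal node is a
# (question, {answer: subtree}) pair; each leaf is a recommendation string.
TRIAGE_TREE = (
    "chest pain present?", {
        "yes": ("pain relieved by rest or medication?", {
            "no": "advise emergency services",
            "yes": "schedule same-day clinic assessment",
        }),
        "no": ("incision site red or swollen?", {
            "yes": "arrange wound check within 24 hours",
            "no": "continue routine home care",
        }),
    })

def consult(tree, answers):
    # Walk the tree using an answer function (e.g. wrapping the nurse's
    # input on the PDA) until a leaf recommendation is reached.
    node = tree
    while isinstance(node, tuple):
        question, branches = node
        node = branches[answers(question)]
    return node
```

Externalizing the questions this way is what moves the protocol from "knowledge in the head" to "knowledge in the world", which is the workload argument the abstract makes for decision trees.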

Parylene Microcolumn for Miniature Gas Chromatograph

Noh, Hongseok "Moses" (14 May 2004)
This research contributes to worldwide efforts to miniaturize one of the most powerful and versatile analytical tools, gas chromatography (GC). If a rapid, sensitive, and selective hand-held GC system were realized, it would have a wide range of applications in many industries and research areas. As part of developing a hand-held GC system, this research focuses on the separation column, the most important component of a GC system. This thesis describes the development of a miniature separation column that has low thermal mass and an embedded heating element for rapid thermal cycling. The world's first thin polymer film (parylene) GC column has been successfully developed. The thesis includes: first, a study of the theoretical column performance of a rectangular GC column; second, design optimization of the parylene column and embedded heating element; third, the development of new processes such as parylene micromolding and a stationary-phase coating technique for the parylene column; fourth, the fabrication of a parylene GC column with an embedded heating element; and lastly, the testing and evaluation of the parylene GC column through GC analysis.

Dynamic Memory Management for Embedded Real-Time Multiprocessor System-on-a-Chip

Shalan, Mohamed A. (25 November 2003)
The aggressive evolution of the semiconductor industry (smaller process geometries, higher densities, and greater chip complexity) has given design engineers the means to create complex, high-performance System-on-a-Chip (SoC) designs. Such SoC designs typically have more than one processor and large amounts (tens of megabytes) of memory, all on the same chip. Handling global on-chip memory allocation and deallocation in a dynamic yet deterministic way is an important issue for upcoming billion-transistor multiprocessor SoC designs. To achieve this, we propose a memory management hierarchy we call Two-Level Memory Management. To implement this scheme, which represents a shift in the way designers look at on-chip dynamic memory allocation, we present the System-on-a-Chip Dynamic Memory Management Unit (SoCDMMU) for allocation of the global on-chip memory, which we refer to as Level Two memory management (Level One is the management of memory allocated to a particular on-chip Processing Element, e.g., an operating system's management of memory allocated to a particular processor). In this way, processing elements (heterogeneous or homogeneous, hardware or software) in an SoC can request and be granted portions of the global memory in a fast and deterministic time. A new tool is introduced to generate a custom optimized version of the SoCDMMU hardware, and a real-time operating system is modified to support the proposed SoCDMMU. We show an example in which a shared-memory multiprocessor SoC that employs Two-Level Memory Management and utilizes the SoCDMMU achieves an overall average speedup in application transition time as well as in normal execution time.
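The key property claimed for the SoCDMMU, granting global memory in fast, deterministic time, can be modelled in a few lines: if the global memory is divided into fixed-size blocks tracked by a free list, both grant and release complete in time bounded by the request size, never by fragmentation. This is a software toy model of the concept only, not the hardware unit described in the dissertation:

```python
class SoCDMMUSketch:
    """Toy model of deterministic global-memory allocation: on-chip memory
    is divided into fixed-size blocks, and a free list makes allocate and
    deallocate run in time proportional only to the number of blocks moved.
    Illustrative sketch; the real SoCDMMU is a hardware unit."""

    def __init__(self, total_blocks):
        self.free = list(range(total_blocks))  # indices of unallocated blocks
        self.owner = {}                        # block index -> requesting PE

    def allocate(self, pe_id, n_blocks):
        # Either grant the full request or deny it, in bounded time.
        if n_blocks > len(self.free):
            return None
        grant = [self.free.pop() for _ in range(n_blocks)]
        for b in grant:
            self.owner[b] = pe_id
        return grant

    def deallocate(self, blocks):
        # Return blocks to the free list; also bounded time.
        for b in blocks:
            del self.owner[b]
            self.free.append(b)
```

In the Two-Level scheme this global (Level Two) grant would then be managed internally by the requesting processing element's own (Level One) allocator.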

Piezoelectrically-Transduced Silicon Micromechanical Resonators

Sivapurapu, Abhishek (26 August 2005)
This thesis reports on the design and fabrication of micro-electro-mechanical (MEM) resonators on silicon that are piezoelectrically transduced for operation in the very high frequency (VHF) range. The devices have a block-type or beam-type design and are designed to resonate in their in-plane and out-of-plane bulk extensional modes. Two piezoelectric materials were considered: zinc oxide (ZnO) and lead zirconate titanate (PZT). The resonators are fabricated on silicon-on-insulator (SOI) wafers, and the metal/piezo/metal stack of layers forming the device is built and patterned on the device-layer silicon via photolithography, RF sputtering (for the piezo layer), and electron-beam evaporation (for the metal layers). The design stage involved ANSYS simulations of the mode shapes and estimation of frequencies, which correlated well with experimental results. Devices with RF-sputtered ZnO were successfully fabricated and tested, yielding high quality factors at reasonably high frequencies. A gold ground plane was implemented to reduce the feed-through level and increase the signal-to-noise ratio. Extensive characterization of PZT was also done as a replacement for ZnO, as PZT has a much higher piezoelectric coefficient (~20x that of ZnO) and can therefore extend the operation of these MEM resonators into the UHF range. Although the basic design of the device remains the same, incorporating PZT complicates the process flow considerably with respect to the chemistry involved in patterning the different layers. The frequency response of the ZnO-based resonators, as well as all the characterization data for PZT, is reported.
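For a bar resonating in a length-extensional bulk mode of the kind described, the mode frequencies follow the standard relation f_n = (n / 2L) * sqrt(E / rho). The sketch below evaluates it for assumed, illustrative values (a 100 um silicon bar with E = 169 GPa, rho = 2330 kg/m^3), which lands in the VHF range; these dimensions and constants are not taken from the thesis:

```python
import math

def extensional_mode_freq(length_m, youngs_modulus_pa, density_kg_m3, mode=1):
    # f_n = (n / 2L) * sqrt(E / rho): the acoustic velocity sqrt(E / rho)
    # divided by the mode wavelength 2L / n of a free-free extensional bar.
    return mode / (2 * length_m) * math.sqrt(youngs_modulus_pa / density_kg_m3)

# Illustrative numbers: 100 um bar, silicon-like stiffness and density.
f1 = extensional_mode_freq(100e-6, 169e9, 2330)   # ~43 MHz, i.e. VHF
```

The relation also shows why shrinking L (or moving to a stiffer, lighter material) pushes such resonators toward UHF operation.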

Designing for Privacy in Interactive Systems

Jensen, Carlos (29 November 2005)
People are increasingly concerned about online privacy and how computers collect, process, share, and store their personal information. Such concerns are understandable given the growing number of privacy invasions and the pervasiveness of information capture and sharing between IT systems. This situation has led to an increasingly regulated environment, limiting what systems may do and what safeguards they must offer users. Privacy is an especially important concern in the fields of computer-supported collaborative work (CSCW), ubiquitous computing, and e-commerce, where the nature of the applications often requires some information collection and sharing. To minimize risks to users it is essential to identify privacy problems early in the design process. Several methods and frameworks for accomplishing this have been proposed in recent decades. These frameworks, though based on hard-earned experience and great insight, have not seen widespread adoption despite the high level of interest in the topic, likely in part because they have received little evaluation and study. In our research we examine the key design and analysis frameworks and their elements, and compare these to the kinds of privacy problems users face and are concerned with. Based on this analysis of the relative strengths and weaknesses of existing design frameworks we derive a new one: STRAP (STRuctured Analysis of Privacy). STRAP combines lightweight goal-oriented analysis with heuristics to provide a simple yet effective design framework. We validate our analysis by demonstrating, in a series of design experiments, that STRAP is more efficient and effective than any one of the existing design frameworks, and we provide quantitative and qualitative evidence of the value of using such frameworks in the design process.

Geometry Estimation and Adaptive Actuation for Centering Preprocessing and Precision Measurement

Mears, Michael Laine (06 April 2006)
Precise machining of bearing rings is integral to finished bearing assembly quality. The output accuracy of center-based machining systems such as lathes or magnetic chuck grinders relates directly to the accuracy of part centering before machining. Traditional tooling and methods for centering on such machines are subject to wear, dimensional inaccuracy, setup time (hard tooling), and human error (manual centering).

A flexible system for initial part centering is developed based on a single measurement system and actuator, whereby the part is placed by hand onto the machine table, rotated and measured to identify the offset of the center of geometry from the center of rotation, then moved by a series of controlled impacts or pushes to align the centers. The prototype centering system is developed as a demonstration platform for research in a number of mechanical engineering areas, particularly: characterization of optimal state estimators through analysis of accuracy and computational efficiency; distributed communication and control, efficient transfer of information in a real-time environment, and information sharing between processes; modeling of sliding dynamics and the interaction of friction with compliant-body dynamic models; and motion path planning through both deterministic geometric transforms and frequency-domain command manipulation.

A vision is created for future work not only in the described areas, but also in advanced controller design incorporating multiple variables, derived machine diagnostic information, and application of the distributed communication architecture to information flow throughout the manufacturing organization. The guiding motivation for this research is the reduction of manufacturing processing costs in the face of global competition. The technologies researched, the developments made, and the directions prescribed for future research all help enable this goal.
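The centering measurement step can be illustrated with a standard runout model: for a small center-of-geometry offset (ex, ey) relative to the center of rotation, the measured radius varies as r(theta) ~ R + ex*cos(theta) + ey*sin(theta), so with uniformly spaced measurements over one revolution the offset components reduce to first-harmonic Fourier coefficients. This is a generic sketch of that geometry estimation idea, not the estimator developed in the thesis:

```python
import math

def estimate_center_offset(angles, radii):
    # Model r(theta) ~ R + ex*cos(theta) + ey*sin(theta) and recover the
    # eccentricity components (ex, ey). With angles uniformly spaced over a
    # full revolution the basis is orthogonal, so the least-squares fit
    # reduces to the discrete first-harmonic Fourier coefficients.
    n = len(angles)
    ex = 2 / n * sum(r * math.cos(t) for t, r in zip(angles, radii))
    ey = 2 / n * sum(r * math.sin(t) for t, r in zip(angles, radii))
    return ex, ey
```

The recovered (ex, ey) vector is then exactly what the actuator must remove, by impacts or pushes, to align the center of geometry with the center of rotation.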

Adaptive Power Amplifier Linearization by Digital Pre-Distortion with Narrowband Feedback using Genetic Algorithms

Sperlich, Roland (19 July 2005)
This dissertation presents a study of linearization techniques that have been applied to power amplifiers in the cellular communication industry. The objective of this work is to understand the limitations of power amplifiers, specifically those introduced by the use of spectrally efficient modulation schemes. The digitization of communication systems has favored the use of new techniques and technologies capable of increasing the efficiency of costly power amplifiers. The work explores traditional and digital linearization systems; an algorithm based on the principles of natural recombination is proposed to directly address the limitations of previous embodiments. Previous techniques, although effective, have significant implementation costs that increase exponentially with signal bandwidth. The proposed software-hardware architecture significantly reduces implementation costs and the overall complexity of the design without sacrificing performance. To fulfill the requirements of this study, multiple systems are implemented in simulation and in closed-loop hardware. Both the simulation and hardware embodiments meet the expected performance metrics, validating the proposed algorithm. Applying the algorithm to the linearization of power amplifiers with memory effects is a new approach to adaptive digital pre-distortion using narrowband feedback. The work shows performance improvements on an amplifier with memory effects, suggesting that this technique can be employed as a lower-cost alternative to typical system implementations for meeting requirements.
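A genetic algorithm of the general kind described ("based on the principles of natural recombination") can be sketched on a toy memoryless pre-distortion problem: evolve the coefficients of an odd-polynomial pre-distorter so that the cascade pre-distorter then amplifier is as close to linear as possible. The amplifier model, polynomial order, and GA settings below are all illustrative assumptions, not the dissertation's system:

```python
import random

def amplifier(x):
    # Memoryless compressive amplifier model (illustrative only):
    # gain compression represented by an odd cubic term.
    return x - 0.1 * x ** 3

def ga_predistorter(generations=150, pop_size=40, seed=2):
    rng = random.Random(seed)
    xs = [i / 10.0 for i in range(-10, 11)]  # input drive levels

    def predistort(c, x):
        return c[0] * x + c[1] * x ** 3      # odd-polynomial pre-distorter

    def cost(c):
        # Deviation of the cascade pre-distorter -> amplifier from linearity.
        return sum((amplifier(predistort(c, x)) - x) ** 2 for x in xs)

    # Initial population scattered around the identity pre-distorter [1, 0].
    pop = [[1 + rng.gauss(0, 0.2), rng.gauss(0, 0.2)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]        # elitist selection
        offspring = [[p + rng.gauss(0, 0.05) for p in rng.choice(survivors)]
                     for _ in range(pop_size - len(survivors))]
        pop = survivors + offspring
    best = min(pop, key=cost)
    return best, cost(best)
```

In the dissertation's setting the cost would come from narrowband feedback measurements of the real amplifier rather than from an analytic model, but the evolutionary search over pre-distorter parameters is the same shape of loop.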
