571

Characterization of Preliminary Breast Tomosynthesis Data: Noise and Power Spectra Analysis

Behera, Madhusmita 06 July 2004 (has links)
Early detection, diagnosis, and suitable treatment are known to significantly improve the chance of survival for breast cancer (BC) patients. To date, the most cost-effective method for screening and early detection is screen-film mammography, which is also the only tool that has demonstrated its ability to reduce BC mortality. Full-field digital mammography (FFDM) is an extension of screen-film mammography that eliminates the need for film processing because the images are detected electronically from their inception. Tomosynthesis is an emerging technology in digital mammography built on the FFDM framework, which offers an alternative to conventional two-dimensional mammography. Tomosynthesis produces three-dimensional (volumetric) images of the breast that may be superior to planar imaging due to improved visualization. In this work, preliminary tomosynthesis data derived from cadaver breasts are analyzed, including volume data acquired with various reconstruction techniques as well as the planar projection data. The noise and power spectrum characteristics of these data are the focus of this study. Understanding the noise characteristics is important in the study of radiological images and in the evaluation of the imaging system, so that its degrading effect on the image can be minimized where possible, leading to better diagnosis and optimal computer-aided diagnosis schemes. Likewise, the power spectral behavior of the data is analyzed so that statistical methods developed for digitized film images or FFDM images may be applied directly, or modified accordingly, for tomosynthesis applications. The work shows that, in general, the power spectra for three of the reconstruction techniques are very similar to the spectra of planar FFDM data as well as digitized film; the projection data analysis follows the same trend. To a good approximation the Fourier power spectra obey an inverse power law, which indicates a degree of self-similarity. The noise analysis indicates that the noise and signal are dependent and that the dependency is a function of the reconstruction technique. New approaches for the analysis of signal-dependent noise were developed specifically for this work, based both on the linear wavelet expansion and on nonlinear order statistics. These methods were tested on simulated data that closely follow the statistics of mammograms prior to the real-data applications. The noise analysis methods are general and have applications beyond mammography.
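The inverse-power-law behavior mentioned above can be checked on any image region with a short script. The sketch below is an illustration, not the thesis's implementation; the region size and the fitted frequency range are assumptions. It radially averages the 2-D Fourier power spectrum and fits the exponent beta in P(f) ∝ 1/f^beta on a log-log scale.

```python
# Hedged sketch: estimate the exponent beta of an assumed inverse power law
# P(f) ~ 1/f^beta from the radially averaged 2-D power spectrum of an image.
import numpy as np

def power_law_exponent(image):
    img = image - image.mean()                            # remove the DC term
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    ny, nx = spectrum.shape
    y, x = np.indices(spectrum.shape)
    r = np.hypot(x - nx // 2, y - ny // 2).astype(int)    # radial frequency index

    # Radial average: mean power in each integer-frequency ring
    radial_sum = np.bincount(r.ravel(), weights=spectrum.ravel())
    counts = np.bincount(r.ravel())
    radial_power = radial_sum / np.maximum(counts, 1)

    # Fit log P = -beta * log f + c, avoiding DC and the highest frequencies
    f = np.arange(1, min(nx, ny) // 2)
    slope, _ = np.polyfit(np.log(f), np.log(radial_power[1:len(f) + 1]), 1)
    return -slope

beta = power_law_exponent(np.random.rand(256, 256))       # toy input, not breast data
```

For the white-noise toy input in the example call, the fitted exponent should be near zero, since white noise has a flat spectrum; self-similar image content yields a clearly positive exponent.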
572

Joint Preprocesser-Based Detectors for One-Way and Two-Way Cooperative Communication Networks

Abuzaid, Abdulrahman I. 05 1900 (has links)
Efficient receiver designs for cooperative communication networks are becoming increasingly important. In previous work, cooperative networks communicated with the use of L relays. As the receiver is constrained, channel shortening and reduced-rank techniques were employed to design the preprocessing matrix that reduces the length of the received vector from L to U. In the first part of the work, a receiver structure is proposed which combines our proposed threshold selection criteria with the joint iterative optimization (JIO) algorithm that is based on the mean square error (MSE). Our receiver assists in determining the optimal U. Furthermore, this receiver provides the freedom to choose U for each frame depending on the tolerable difference allowed for MSE. Our study and simulation results show that by choosing an appropriate threshold, it is possible to gain in terms of complexity savings while having no or minimal effect on the BER performance of the system. Furthermore, the effect of channel estimation on the performance of the cooperative system is investigated. In the second part of the work, a joint preprocessor-based detector for cooperative communication networks is proposed for one-way and two-way relaying. This joint preprocessor-based detector operates on the principles of minimizing the symbol error rate (SER) instead of minimizing MSE. For a realistic assessment, pilot symbols are used to estimate the channel. From our simulations, it can be observed that our proposed detector achieves the same SER performance as that of the maximum likelihood (ML) detector with all participating relays. Additionally, our detector outperforms selection combining (SC), channel shortening (CS) scheme and reduced-rank techniques when using the same U. Finally, our proposed scheme has the lowest computational complexity.
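As a rough illustration of the frame-by-frame choice of U described above (this is not the thesis's JIO design; a principal-components preprocessor is used purely as a stand-in for the reduced-rank stage), the sketch below picks the smallest U whose MMSE stays within a tolerable difference of the full-length (U = L) MMSE.

```python
# Hedged sketch of the MSE-threshold idea: choose the smallest rank U whose
# reduced-rank MMSE is within 'delta' of the full-rank MMSE. Assumed model:
# unit-power symbol s, received vector r = H*s + n with noise variance sigma^2.
import numpy as np

def select_rank(H, noise_var, delta):
    """H: L x 1 effective channel of the relay links; noise_var: sigma^2."""
    L = H.shape[0]
    R = H @ H.conj().T + noise_var * np.eye(L)             # received covariance
    full_mse = 1.0 - np.real(H.conj().T @ np.linalg.solve(R, H)).item()

    eigval, eigvec = np.linalg.eigh(R)                     # Hermitian eigendecomposition
    order = np.argsort(eigval)[::-1]                       # strongest modes first
    for U in range(1, L + 1):
        T = eigvec[:, order[:U]]                           # L x U preprocessing matrix
        Rr, Hr = T.conj().T @ R @ T, T.conj().T @ H
        mse_U = 1.0 - np.real(Hr.conj().T @ np.linalg.solve(Rr, Hr)).item()
        if mse_U - full_mse <= delta:                      # tolerable MSE difference
            return U, T
    return L, eigvec[:, order]

U, T = select_rank(np.random.randn(6, 1) + 1j * np.random.randn(6, 1), 0.1, 1e-3)
```

In this toy model the reduced-rank MMSE for a preprocessor T is 1 − (T^H h)^H (T^H R T)^{-1} (T^H h), so the loop simply trades rank against the MSE tolerance, mirroring the complexity-versus-BER trade-off discussed above.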
573

Comparison and improvement of time aware collaborative filtering techniques : Recommender systems / Jämförelsestudie och förbättring av tidsmedvetna kollaborativa filtreringstekniker : Rekommendationssystem

Grönberg, David, Denesfay, Otto January 2019 (has links)
Recommender systems emerged in the mid '90s with the objective of helping users select the items or products most suited to them. Whether it is Facebook recommending people you might know, Spotify recommending songs you might like or YouTube recommending videos you might want to watch, recommender systems can now be found in every corner of the internet. In order to handle the immense increase of data online, the development of sophisticated recommender systems is crucial for filtering out information and enhancing web services by tailoring them to the preferences of the user. This thesis aims to improve the accuracy of recommendations produced by a classical collaborative filtering recommender system by utilizing temporal properties, more precisely the date on which an item was rated by a user. Three different time-weighted implementations are presented and evaluated: a time-weighted prediction approach, a time-weighted similarity approach and our proposed approach, which weights a user's mean rating by time. The different approaches are evaluated using the well-known MovieLens 100k dataset. Results show that it is possible to slightly increase the accuracy of recommendations by utilizing temporal properties.
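A minimal sketch of the time-weighting idea is shown below; the exponential decay and its half-life are assumptions for illustration, and only the similarity-weighting variant is sketched (the thesis also weights the prediction and the user mean).

```python
# Illustrative time-weighted neighborhood prediction: older ratings contribute
# less via an assumed exponential decay applied to the similarity weight.
import numpy as np

def decay_weight(rating_time, now, half_life_days=180.0):
    """Weight halves every half_life_days; times are UNIX seconds."""
    age_days = (now - rating_time) / 86400.0
    return 0.5 ** (age_days / half_life_days)

def predict(user_mean, neighbors, now):
    """neighbors: list of (similarity, rating, neighbor_mean, rating_time)."""
    num, den = 0.0, 0.0
    for sim, rating, n_mean, t in neighbors:
        w = sim * decay_weight(t, now)            # time-weighted similarity
        num += w * (rating - n_mean)              # mean-centered deviation
        den += abs(w)
    return user_mean + num / den if den > 0 else user_mean

# A 180-day-old neighbor rating contributes with half its similarity weight
pred = predict(user_mean=3.6, neighbors=[(0.8, 4.0, 3.2, 0.0)], now=180 * 86400)
```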
574

Precise Velocity and Acceleration Determination Using a Standalone GPS Receiver in Real Time

Zhang, Jianjun, j3029709.zhang@gmail.com January 2006 (has links)
Precise velocity and acceleration information is required for many real time applications. A standalone GPS receiver can be used to derive such information; however, there are many unsolved problems in this regard. This thesis establishes the theoretical basis for precise velocity and acceleration determination using a standalone GPS receiver in real time. An intensive investigation has been conducted into the Doppler effect in GPS. A highly accurate Doppler shift one-way observation equation is developed based on a comprehensive error analysis of each contributing factor including relativistic effects. Various error mitigation/elimination methods have been developed to improve the measurement accuracy of both the Doppler and Doppler-rate. Algorithms and formulae are presented to obtain real-time satellite velocity and acceleration in the ECEF system from the broadcast ephemeris. Low order IIR differentiators are designed to derive Doppler and Doppler-rate measurements from the raw GPS data for real-time applications. Abnormalities and their corresponding treatments in real-time operations are also discussed. In addition to the velocity and acceleration determination, this thesis offers a good tool for GPS measurement modelling and for design of interpolators, differentiators, as well as Kalman filters. The relativistic terms presented by this thesis suggest that it is possible to measure the geopotential directly using Doppler shift measurements. This may lead to a foundation for the development of a next generation satellite system for geodesy in the future.
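The core of the velocity step can be pictured with the standard Doppler (range-rate) observation model, which is linear in the unknown receiver velocity and clock drift once the satellite states are known from the broadcast ephemeris. The sketch below is a plain least-squares illustration under assumed notation; the error mitigation, relativistic terms, and IIR differentiators developed in the thesis are not reproduced.

```python
# Sketch: receiver velocity and clock drift from range-rate (Doppler-derived)
# measurements, with satellite positions/velocities given in ECEF.
import numpy as np

def velocity_from_doppler(rate_meas, sat_pos, sat_vel, rx_pos):
    """rate_meas: (N,) range rates [m/s]; sat_pos, sat_vel: (N,3); rx_pos: (3,)."""
    los = sat_pos - rx_pos
    u = los / np.linalg.norm(los, axis=1, keepdims=True)   # unit line-of-sight vectors
    # Model: rate_i = u_i . (v_sat_i - v_rx) + clock_drift, linear in (v_rx, drift)
    A = np.hstack([-u, np.ones((len(rate_meas), 1))])
    b = rate_meas - np.sum(u * sat_vel, axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3]                                      # receiver velocity, clock drift

v_rx, drift = velocity_from_doppler(
    np.zeros(5), np.random.rand(5, 3) * 2.6e7, np.random.rand(5, 3) * 3e3, np.zeros(3))
```

With at least four satellites the system is determined; additional satellites improve the estimate in the least-squares sense.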
575

Modelling intelligent agents for web-based information gathering.

Li, Yuefeng, mikewood@deakin.edu.au January 2000 (has links)
The recent emergence of intelligent agent technology and advances in information gathering have been important steps forward in efficiently managing and using the vast amount of information now available on the Web to make informed decisions. There are, however, still many problems that need to be overcome in the information gathering research arena to enable the delivery of relevant information required by end users. Good decisions cannot be made without sufficient, timely, and correct information. Traditionally it is said that knowledge is power; nowadays, however, sufficient, timely, and correct information is power. Gathering relevant information to meet user information needs is therefore the crucial step for making good decisions. The ideal goal of information gathering is to obtain only the information that users need (no more and no less). However, the volume of available information, the diverse formats of information, uncertainties in information, and the distributed locations of information (e.g. the World Wide Web) hinder the process of gathering the right information to meet user needs. Specifically, two fundamental issues regarding the efficiency of information gathering are mismatch and overload. Mismatch means that some information that meets user needs has not been gathered (it is missed), whereas overload means that some gathered information is not what users need.

Traditional information retrieval has developed considerably over the past twenty years, and the introduction of the Web has changed people's perceptions of it. Usually, the task of information retrieval is considered to be that of leading the user to those documents that are relevant to his or her information needs. A related function in information retrieval is to filter out irrelevant documents (information filtering). Research into traditional information retrieval has provided many retrieval models and techniques to represent documents and queries. Nowadays, information is becoming highly distributed and increasingly difficult to gather, and user information needs often contain many uncertainties. These observations motivate research in agent-based information gathering, and agent-based information systems have emerged in response. In such systems, intelligent agents obtain commitments from their users and act on the users' behalf to gather the required information. They can retrieve relevant information from highly distributed, uncertain environments because of their intelligence, autonomy, and distribution. Current research into agent-based information gathering systems is divided into single-agent gathering systems and multi-agent gathering systems. In both areas, there are still open problems to be solved so that agent-based information gathering systems can retrieve uncertain information more effectively from highly distributed environments.

The aim of this thesis is to develop a theoretical framework for intelligent agents to gather information from the Web. This research integrates the areas of information retrieval and intelligent agents. The specific research areas in this thesis are the development of an information filtering model for single-agent systems, and the development of a dynamic belief model for information fusion in multi-agent systems. The research results are also supported by the construction of real information gathering agents (e.g., a Job Agent) for the Internet, to help users gather useful information stored in Web sites. In such a framework, information gathering agents have the ability to describe (or learn) the user's information needs and to act like the user to retrieve, filter, and/or fuse information.

A rough set based information filtering model is developed to address the problem of overload. The new approach allows users to describe their information needs on user concept spaces rather than on document spaces, and it views a user information need as a rough set over the document space. Rough set decision theory is used to classify new documents into three regions: a positive region, a boundary region, and a negative region. Two experiments are presented to verify this model, and they show that the rough set based model provides an efficient approach to the overload problem.

In this research, a dynamic belief model for information fusion in multi-agent environments is also developed. This model has polynomial time complexity, and it has been proven that the fusion results are belief (mass) functions. Using this model, a collection fusion algorithm for information gathering agents is presented. A difficult problem for this research is the case where collections may be used by more than one agent; the algorithm uses cooperation between agents and provides a solution for this problem in distributed information retrieval systems.

This thesis presents solutions to theoretical problems in agent-based information gathering systems, including information filtering models, agent belief modelling, and collection fusion. It also presents solutions to some of the technical problems in agent-based information systems, such as document classification, the architecture of agent-based information gathering systems, and decision making in multi-agent environments. Such information gathering agents will gather relevant information from highly distributed, uncertain environments.
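The three-region classification can be pictured with a small sketch; the thresholds and relevance scores below are illustrative assumptions, not the thesis's rough set formulation. Documents whose estimated relevance clears an upper threshold are accepted, those below a lower threshold are rejected, and the rest are deferred to a boundary region.

```python
# Toy three-way document filter in the spirit of the rough set regions:
# positive (accept), boundary (defer), negative (reject).
def three_way_filter(doc_scores, alpha=0.7, beta=0.3):
    positive, boundary, negative = [], [], []
    for doc, score in doc_scores.items():
        if score >= alpha:
            positive.append(doc)       # accept: meets the information need
        elif score <= beta:
            negative.append(doc)       # reject: overload candidate
        else:
            boundary.append(doc)       # defer: needs further evidence
    return positive, boundary, negative

regions = three_way_filter({"d1": 0.9, "d2": 0.5, "d3": 0.1})
```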
576

Studie av integration mellan rategyron och magnetkompass / Study of sensor fusion of rategyros and magnetometers

Nilsson, Sara January 2004 (has links)
This master's thesis is a study of how a rategyro triad, an accelerometer triad, and a magnetometer triad can be integrated into a navigation system that estimates a vehicle's attitude, i.e. its roll, pitch, and heading angles. When only a rategyro triad is used to estimate a vehicle's attitude, a drift in the attitude occurs due to sensor errors.

When an accelerometer triad and a magnetometer triad are used, an error in the vehicle's heading occurs that appears as a sine curve depending on the heading. By integrating these sensor triads, the sensor errors have been estimated with a filter to improve the accuracy of the estimated attitude.

To investigate and evaluate the navigation system, a simulation model has been developed in Simulink/Matlab. The implementation has been made using a Kalman filter, in which the sensor fusion takes place. Simulations for different scenarios have been made, and the results show that the drift in the vehicle's attitude is avoided.
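The fusion principle can be illustrated on a single axis: a Kalman filter blends the integrated rategyro angle with an absolute angle reference derived from the accelerometer and magnetometer triads, and estimates the gyro bias that otherwise causes the drift. This is a toy sketch with assumed dimensions and tuning, not the thesis's Simulink model.

```python
# Single-axis Kalman filter: state = [angle, gyro bias]; the gyro rate drives
# the prediction and an absolute angle measurement corrects angle and bias.
import numpy as np

def kalman_attitude(gyro_rates, ref_angles, dt, q_angle=1e-4, q_bias=1e-6, r_ref=1e-2):
    x = np.zeros(2)                                  # [angle, gyro bias]
    P = np.eye(2)
    F = np.array([[1.0, -dt], [0.0, 1.0]])           # angle += dt*(rate - bias)
    Q = np.diag([q_angle, q_bias])
    H = np.array([[1.0, 0.0]])                       # we observe the angle only
    out = []
    for w, z in zip(gyro_rates, ref_angles):
        # Predict: integrate the bias-corrected gyro rate
        x = F @ x + np.array([dt * w, 0.0])
        P = F @ P @ F.T + Q
        # Update with the accelerometer/magnetometer-derived angle
        S = H @ P @ H.T + r_ref
        K = (P @ H.T) / S
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

angles = kalman_attitude(np.full(100, 0.01), np.zeros(100), dt=0.01)
```

With a constant gyro rate offset, as in the example call, the estimated bias absorbs the offset and the angle stops drifting.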
577

Robust Automotive Positioning: Integration of GPS and Relative Motion Sensors / Robust fordonspositionering: Integration av GPS och sensorer för relativ rörelse

Kronander, Jon January 2004 (has links)
Automotive positioning systems relying exclusively on the input from a GPS receiver, which is a line-of-sight sensor, tend to be sensitive to situations with limited sky visibility. Such situations include urban environments with tall buildings, parking structures, tree cover, tunnels and bridges. In these situations, the system has to rely on integration of relative motion sensors to estimate vehicle position. However, these sensor measurements are generally affected by errors, such as offsets and scale factors, which cause the resulting position accuracy to deteriorate rapidly once GPS input is lost.

The approach in this thesis is to use a GPS receiver in combination with low-cost sensor equipment to produce a robust positioning module. The module should be capable of handling situations where GPS input is corrupted or unavailable. The working principle is to calibrate the relative motion sensors when GPS is available to improve the accuracy during GPS intermission. To fuse the GPS information with the sensor outputs, different models have been proposed and evaluated on real data sets. These models tend to be nonlinear, and have therefore been processed in an Extended Kalman Filter structure.

Experiments show that the proposed solutions can compensate for most of the errors associated with the relative motion sensors, and that the resulting positioning accuracy is improved accordingly.
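A compressed sketch of the working principle follows; the state choice, noise levels, and measurement model are illustrative assumptions, not the thesis's models. An extended Kalman filter dead-reckons on wheel speed and yaw rate while estimating an odometer scale factor and a gyro bias, and corrects the state with GPS positions whenever they are available, so the calibrated parameters carry the solution through GPS intermission.

```python
# Toy EKF step: state = [x, y, heading, odometer scale, gyro bias].
import numpy as np

def ekf_step(x, P, v_meas, w_meas, dt, gps_xy=None, q=1e-3, r_gps=4.0):
    px, py, psi, k, b = x
    v, w = k * v_meas, w_meas - b                   # calibrated speed and yaw rate
    # Nonlinear prediction (dead reckoning)
    x_pred = np.array([px + dt * v * np.cos(psi),
                       py + dt * v * np.sin(psi),
                       psi + dt * w, k, b])
    F = np.eye(5)                                   # Jacobian of the prediction
    F[0, 2], F[0, 3] = -dt * v * np.sin(psi), dt * v_meas * np.cos(psi)
    F[1, 2], F[1, 3] = dt * v * np.cos(psi), dt * v_meas * np.sin(psi)
    F[2, 4] = -dt
    P = F @ P @ F.T + q * np.eye(5)
    if gps_xy is not None:                          # GPS update also calibrates k and b
        H = np.zeros((2, 5)); H[0, 0] = H[1, 1] = 1.0
        S = H @ P @ H.T + r_gps * np.eye(2)
        K = P @ H.T @ np.linalg.inv(S)
        x_pred = x_pred + K @ (np.asarray(gps_xy) - x_pred[:2])
        P = (np.eye(5) - K @ H) @ P
    return x_pred, P

x, P = np.array([0, 0, 0, 1.0, 0.0]), np.eye(5)
x, P = ekf_step(x, P, v_meas=10.0, w_meas=0.02, dt=0.1, gps_xy=(1.0, 0.0))
```

When `gps_xy` is `None` (GPS intermission), the filter simply propagates the prediction using the most recently calibrated scale factor and bias.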
578

Design of Fast Multidimensional Filters by Genetic Algorithms

Langer, Max January 2004 (has links)
The need for fast multidimensional signal processing arises in many areas. One of the more demanding applications is real-time visualization of medical data acquired with, e.g., magnetic resonance imaging, where large amounts of data can be generated. This data has to be reduced to relevant clinical information, either by image reconstruction and enhancement or by automatic feature extraction. The design of fast multidimensional filters has been a subject of research during the last three decades. Usually, methods for fast filtering are based on applying a sequence of filters of lower dimensionality obtained by, e.g., weighted low-rank approximation. Filter networks are a method for designing fast multidimensional filters by decomposing multiple filters into simpler filter components in which coefficients are allowed to be sparsely scattered. Up until now, coefficient placement has been done by hand, a procedure which is time-consuming and difficult. The aim of this thesis is to investigate whether genetic algorithms can be used to place coefficients in filter networks. A method is developed and tested on 2-D filters, and the resulting filters have lower distortion values while maintaining the same or a lower number of coefficients than filters designed with previously known methods.
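To make the coefficient-placement idea concrete, the toy sketch below lets a genetic algorithm search for a sparse placement of FIR coefficients (a binary mask of allowed taps) that best approximates a target frequency response. The encoding, fitness, and operators are assumptions for illustration, not the thesis's filter network design.

```python
# Toy GA: evolve a binary mask of allowed FIR taps; fitness is the least-squares
# frequency-response error of the best coefficients on those taps, plus a
# small penalty on the number of coefficients.
import numpy as np

rng = np.random.default_rng(0)
N_TAPS, N_ACTIVE, POP, GENS = 31, 8, 40, 100
target = np.fft.rfft(np.sinc(np.arange(N_TAPS) - N_TAPS // 2) * np.hamming(N_TAPS))
freqs = np.fft.rfftfreq(N_TAPS)

def fitness(mask):
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return np.inf
    E = np.exp(-2j * np.pi * np.outer(freqs, idx))        # DTFT of allowed taps
    coeffs, *_ = np.linalg.lstsq(E, target, rcond=None)   # best coefficients for this mask
    return np.linalg.norm(E @ coeffs - target) + 0.01 * idx.size

def random_mask():
    m = np.zeros(N_TAPS, dtype=bool)
    m[rng.choice(N_TAPS, N_ACTIVE, replace=False)] = True
    return m

pop = [random_mask() for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    parents = pop[:POP // 2]                               # truncation selection
    children = []
    while len(children) < POP - len(parents):
        i, j = rng.choice(len(parents), 2, replace=False)
        child = np.where(rng.random(N_TAPS) < 0.5, parents[i], parents[j])  # uniform crossover
        child = np.logical_xor(child, rng.random(N_TAPS) < 0.05)            # bit-flip mutation
        children.append(child)
    pop = parents + children

best = min(pop, key=fitness)
```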
579

Implementation and Performance Analysis of Filternets

Einarsson, Henrik January 2006 (has links)
No description available.
580

Enhancement of X-ray Fluoroscopy Image Sequences using Temporal Recursive Filtering and Motion Compensation

Forsberg, Anni January 2006 (has links)
This thesis considers enhancement of X-ray fluoroscopy image sequences. The purpose is to investigate the possibilities of improving the image enhancement in Biplanar 500, a fluoroscopy system developed by Swemac Medical Appliances for use in orthopedic surgery.

An algorithm based on recursive filtering, for temporal noise suppression, and motion compensation, for avoidance of motion artifacts, is developed and tested on image sequences from the system. The motion compensation is done both globally, by using the shift theorem, and locally, by subtracting consecutive frames. A new type of contrast adjustment, obtained with a nonlinear mapping function, is also presented.

The result is a noise-reduced image sequence that shows no blurring effects upon motion. A brief study of the results shows that both the image sequences with this algorithm applied and the contrast-adjusted images are preferred by orthopedists over the present images in the system.
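The two ingredients named above can be sketched as follows; parameters are illustrative, and the local frame-difference handling and the contrast mapping of the thesis are not reproduced. The global translation between the previous filtered frame and the new frame is estimated by phase correlation (an application of the shift theorem), and the temporal recursive filter y_n = alpha*x_n + (1 - alpha)*y_{n-1} is then applied along the motion-compensated frames.

```python
# Sketch: global motion compensation via phase correlation followed by a
# first-order temporal recursive filter.
import numpy as np

def global_shift(prev, curr):
    """Estimate the integer (dy, dx) translation of curr relative to prev."""
    F1, F2 = np.fft.fft2(prev), np.fft.fft2(curr)
    cross = F2 * np.conj(F1)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    dy = dy - prev.shape[0] if dy > prev.shape[0] // 2 else dy   # wrap to signed shift
    dx = dx - prev.shape[1] if dx > prev.shape[1] // 2 else dx
    return dy, dx

def recursive_filter(frames, alpha=0.3):
    """y_n = alpha * x_n + (1 - alpha) * y_{n-1}, with y_{n-1} realigned to x_n."""
    y = frames[0].astype(float)
    out = [y]
    for x in frames[1:]:
        dy, dx = global_shift(y, x)
        y_aligned = np.roll(y, (dy, dx), axis=(0, 1))   # compensate global motion
        y = alpha * x + (1 - alpha) * y_aligned
        out.append(y)
    return out

seq = recursive_filter([np.random.rand(64, 64) for _ in range(5)])
```

A smaller alpha gives stronger temporal noise suppression; without the realignment step, the same filter would smear moving structures, which is the motion-blur problem the compensation is meant to avoid.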
