211

Examination of selected passive tracking schemes using adaptive Kalman filtering

Dailey, Timothy E. January 1982 (has links)
In the past, passive SONAR range tracking systems have used Extended Kalman filters to process nonlinear time-delay measurements. This approach has several flaws due to the inherent divergence problems of Extended Kalman filters. This paper discusses a new approach which uses a prefilter to linearize the measurements so that they can be processed by a standard Kalman filter. The approach is subsequently expanded for use with an adaptive Kalman filter which allows source maneuvers to be tracked. A new approach to passive Doppler velocity tracking is also proposed which uses a dedicated Kalman filter to track random fluctuations in the source's center frequency. This dedicated tracker simplifies the problem so that it can be handled by a basic adaptive Kalman filter. / Master of Science
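As background to the contrast the abstract draws, here is a minimal sketch of the standard (linear) Kalman predict/update cycle that a linearizing prefilter would feed. It is generic textbook code, not the thesis's implementation, and all names are illustrative; an adaptive variant would additionally monitor the innovation y to inflate the process noise Q when the source maneuvers.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a standard (linear) Kalman filter.

    x, P : prior state estimate and covariance
    z    : measurement already linearized by a prefilter
    F, H : state-transition and measurement matrices
    Q, R : process- and measurement-noise covariances
    """
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    y = z - H @ x_pred                    # innovation (residual)
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```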
212

Jumping Connections: A Graph-Theoretic Model for Recommender Systems

Mirza, Batul J. 14 March 2001 (has links)
Recommender systems have become paramount to customize information access and reduce information overload. They serve multiple uses, ranging from suggesting products and artifacts (to consumers), to bringing people together by the connections induced by (similar) reactions to products and services. This thesis presents a graph-theoretic model that casts recommendation as a process of 'jumping connections' in a graph. In addition to emphasizing the social network aspect, this viewpoint provides a novel evaluation criterion for recommender systems. Algorithms for recommender systems are distinguished not in terms of predicted ratings of services/artifacts, but in terms of the combinations of people and artifacts that they bring together. We present an algorithmic framework drawn from random graph theory and outline an analysis for one particular form of jump called a 'hammock.' Experimental results on two datasets collected over the Internet demonstrate the validity of this approach. / Master of Science
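To make the 'hammock' jump concrete, here is a small illustrative sketch (not from the thesis): in a bipartite user-artifact ratings graph, a hammock of width w connects two users who have rated at least w common artifacts. The thesis's exact definition and algorithms may differ; the names below are invented.

```python
from itertools import combinations

def hammock_jumps(ratings, w):
    """Connect two users iff they have rated at least w common artifacts.

    ratings: dict mapping user -> set of artifact ids (a bipartite graph)
    Returns the edge set of the induced 'jumped' social network.
    """
    edges = set()
    for u, v in combinations(ratings, 2):
        if len(ratings[u] & ratings[v]) >= w:
            edges.add((u, v))
    return edges

# Toy example: with hammock width w=2, only alice and bob are joined.
ratings = {"alice": {1, 2, 3}, "bob": {2, 3}, "carol": {3}}
print(hammock_jumps(ratings, w=2))   # {('alice', 'bob')}
```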
213

Recommender Systems for the Conference Paper Assignment Problem

Conry, Donald C. 29 June 2009 (has links)
Conference paper assignment, the task of assigning paper submissions to reviewers, is a key step in the management and smooth functioning of conferences. We study this problem as an application of recommender systems research. Besides the traditional goal of predicting 'who likes what?', a conference management system must take into account reviewer capacity constraints, adequate numbers of reviews for papers, expertise modeling, conflicts of interest, and an overall distribution of assignments that balances reviewer preferences with conference objectives. Issues of modeling preferences and tastes in reviewing have traditionally been studied separately from the optimization of assignments. In this thesis, we present an integrated study of both aspects. First, due to the sparsity of data (relative to other recommender systems applications), we integrate multiple sources of information to learn reviewer/paper preference models, using methods commonly associated with merging content-based and collaborative filtering in the study of large recommender systems. Second, our models are evaluated not just in terms of prediction accuracy, but also in terms of end-assignment quality, and considering multiple evaluation criteria. Using a linear programming-based assignment optimization formulation, we show how our approach better explores the space of potential assignments to maximize the overall affinities of papers assigned to reviewers. Finally, we demonstrate encouraging results on real reviewer preference data gathered during the IEEE ICDM 2007 conference, a premier international data mining conference. Our research demonstrates that there are significant advantages to applying recommender system concepts to the conference paper assignment problem. / Master of Science
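The core of the assignment step can be sketched as a linear program, assuming SciPy's linprog is available: maximize total reviewer-paper affinity subject to reviewer capacity and per-paper coverage constraints. The thesis's actual formulation also encodes conflicts of interest and other conference objectives; everything below is an illustrative reduction.

```python
import numpy as np
from scipy.optimize import linprog

def assign_papers(affinity, reviewer_cap, reviews_per_paper):
    """LP relaxation of the paper-assignment problem.

    affinity: (R, P) matrix of reviewer-paper affinity scores.
    Maximize total affinity subject to reviewer capacities and a fixed
    number of reviews per paper; variables x[r, p] live in [0, 1].
    """
    R, P = affinity.shape
    c = -affinity.ravel()                 # linprog minimizes, so negate
    # Reviewer capacity: sum_p x[r, p] <= reviewer_cap for each r
    A_ub = np.zeros((R, R * P))
    for r in range(R):
        A_ub[r, r * P:(r + 1) * P] = 1.0
    b_ub = np.full(R, reviewer_cap)
    # Paper coverage: sum_r x[r, p] == reviews_per_paper for each p
    A_eq = np.zeros((P, R * P))
    for p in range(P):
        A_eq[p, p::P] = 1.0
    b_eq = np.full(P, reviews_per_paper)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, 1), method="highs")
    return res.x.reshape(R, P)
```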
214

Reduced-order adaptive control

Hutchinson, James H. 02 May 2009 (has links)
The method of Pseudo-Linear Identification (PLID) is developed for application in an adaptive control loop. The effects of noise are investigated for the case of full-order system identification, and the results are applied to the use of PLID as a reduced-order system estimator. A self-tuning regulator (STR) is constructed using PLID and the effects of reducing the expected order of the system are demonstrated. A second adaptive control algorithm is presented wherein the STR controller is varied to achieve some degree of closeness to a given model (model-reference adaptive control). / Master of Science
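PLID itself is developed in the thesis; as an illustrative stand-in for the kind of online identifier a self-tuning regulator wraps, the sketch below implements ordinary recursive least squares (RLS) for an ARX model. This is plain RLS, not PLID, and all names are invented.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive least-squares step for an ARX model y = phi^T theta + e.

    theta : current parameter estimate
    P     : parameter covariance
    phi   : regressor vector (past outputs and inputs)
    y     : newly measured output
    lam   : forgetting factor (<1 tracks slowly varying plants)
    """
    k = P @ phi / (lam + phi @ P @ phi)    # gain vector
    theta = theta + k * (y - phi @ theta)  # correct with prediction error
    P = (P - np.outer(k, phi @ P)) / lam   # covariance update
    return theta, P
```

In a self-tuning regulator loop, each sample would pass through an update like this, after which the controller is re-derived from the latest parameter estimate.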
215

State estimation using a multiple model likelihood weighted filter array

Wood, Eric F. 01 April 2001 (has links)
No description available.
216

Tackling the problems of diversity in recommender systems

Karanam, Manikanta Babu January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / William H. Hsu / A recommender system is a computational mechanism for information filtering, where users provide recommendations (in the form of ratings or item selections) as inputs, which the system then aggregates and directs to appropriate recipients. With the advent of web-based media and publicity, the age of standardized publicity, sales, production, and marketing strategies has passed. In many markets users now face a wide range of products and information from which to choose. Recommender systems offer a way through this abundance that mirrors the live social scenario: just as a user seeks reviews from friends before opting for a product, a recommender system tries to be a friend who recommends the options. Most recommender systems currently developed are solely accuracy driven, i.e., they aim to reduce the Mean Absolute Error (MAE) between the system's predictions and the user's actual ratings. This leads to problems such as lack of diversity and lack of freshness. Lack of diversity arises when the recommender system, overly focused on accuracy, recommends a set of items that are all too similar to one another because each is predicted to be liked by the user. Lack of freshness likewise arises from over-focusing on accuracy, which limits the set of recommended items and makes it overly predictable. This thesis work addresses the issue of diversity by developing an approach that maintains a threshold of accuracy (in terms of Mean Absolute Error in prediction) while diversifying the set of item recommendations. The diversification combines attribute-based and user-preference-based diversification. The approach is then evaluated using non-classical methods, alongside an evaluation of the base recommender algorithm, to show that diversification is indeed possible with a mixture of collaborative and content-based approaches.
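As a concrete (if simplified) illustration of accuracy-constrained diversification, the sketch below greedily re-ranks candidates with an MMR-style score that discounts items too similar to those already chosen. This is a common baseline heuristic, not the thesis's combined attribute- and preference-based method; all names are invented.

```python
def diversify(candidates, predicted, sim, n, alpha=0.7):
    """Greedy re-ranking trading predicted rating against redundancy.

    candidates : list of item ids
    predicted  : dict item -> predicted rating
    sim(a, b)  : item-item similarity in [0, 1]
    n          : length of the returned list
    alpha      : weight on accuracy vs. diversity
    """
    chosen = []
    pool = list(candidates)
    while pool and len(chosen) < n:
        def score(i):
            # Penalize similarity to the most similar item already chosen.
            redundancy = max((sim(i, j) for j in chosen), default=0.0)
            return alpha * predicted[i] - (1 - alpha) * redundancy
        best = max(pool, key=score)
        chosen.append(best)
        pool.remove(best)
    return chosen
```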
217

Shilling attack detection in recommender systems

Bhebe, Wilander. January 2015 (has links)
M. Tech. Information Networks / The growth of the internet has made it easy for people to exchange information, resulting in an abundance of information commonly referred to as information overload. It causes retailers to fail to make adequate sales, since customers are swamped with a lot of options and choices. To lessen this problem, retailers have begun to find it useful to make use of algorithmic approaches to determine which content to show consumers. These algorithmic approaches are known as recommender systems. Collaborative Filtering recommender systems suggest items to users based on other users' reported prior experience with those items. These systems are, however, vulnerable to shilling attacks, since they are highly dependent on outside sources of information. Shilling is a process in which colluding users connive to promote or demote a certain item, with malicious users benefiting from the biased ratings they introduce. It is therefore critical that shilling detection systems are implemented to detect, warn of, and shut down shilling attacks within minutes. Modern patented shilling detection systems employ: (a) classification methods, (b) statistical methods, and (c) rules and threshold values defined by shilling detection analysts, using their knowledge of valid shilling cases and the false alarm rate as guidance. The goal of this dissertation is to determine a context for, and assess the performance of, Meta-Learning techniques that can be integrated in the shilling detection process.
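As an example of the statistical features such detectors feed to a classifier or meta-learner, the sketch below computes Rating Deviation from Mean Agreement (RDMA), a feature widely cited in the shilling-detection literature: attack profiles tend to deviate strongly from consensus on sparsely rated items. The dissertation's actual feature set is not specified here; this is illustrative only.

```python
from statistics import mean

def rdma(user_ratings, all_ratings):
    """Rating Deviation from Mean Agreement for one user profile.

    user_ratings : dict item -> this user's rating
    all_ratings  : dict item -> list of every rating given to the item
    Deviations are weighted by 1 / (number of ratings), so disagreement
    on rarely rated items counts more, as attackers often exploit them.
    """
    devs = []
    for item, r in user_ratings.items():
        votes = all_ratings[item]
        devs.append(abs(r - mean(votes)) / len(votes))
    return sum(devs) / len(devs)
```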
218

An Examination of Internet Filtering and Safety Policy Trends and Issues in South Carolina's K-12 Public Schools

Vicks, Mary E. 01 January 2013 (has links)
School districts have implemented filtering and safety policies in response to legislative and social mandates to protect students from the proliferation of objectionable online content. Subject-related literature suggests these policies are more restrictive than legal mandates require and are adversely affecting information access and instruction. There is limited understanding of how filtering and safety policies affect teaching and learning because no comprehensive studies have investigated the issues and trends surrounding filtering and safety policy implementation. In order to improve existing safety policies, policymakers need research-based data identifying end-user access issues that limit technology integration in the kindergarten-12th grade (K-12) educational setting. This study sought to examine Internet filtering and safety policy implementation issues in South Carolina's K-12 public schools to determine their influence on information access and instruction. A mixed-methods research design, which includes both quantitative and qualitative approaches, was used to investigate the research problem. Quantitative data were collected from information technology (IT) administrators, who were surveyed regarding filtering and safety policy implementation, and from school library media specialists (SLMS), who were surveyed concerning the issues they encounter while facilitating information access in a filtered environment. Qualitative data were collected through interviews with a subset of the SLMS population, thereby providing further insight into Internet access issues and their influence on teaching and learning. School districts' Acceptable Use Policies (AUPs) were analyzed to determine how they addressed recent legislative mandates to educate minors about specific Web 2.0 safety issues. The research results support the conclusions of previous anecdotal studies, which show that K-12 Internet access policies are overly restrictive, resulting in inhibited access to online educational resources. The major implication of this study is that existing Internet access policies need to be fine-tuned to permit greater access to educational content. The study recommends Internet safety practices that will empower teachers and students to access the Internet's vast educational resources safely and securely while realizing the Internet's potential to enrich teaching and learning.
219

ASW fusion on a PC

Mann, Joelle J. 06 1900 (has links)
Approved for public release; distribution is unlimited / LosCon, the software program developed for the author's thesis and tested at sea, is designed to help the ASW commander regain tactical control in a loss-of-submarine-contact situation. Persistent detection and cueing in the battlespace depend on utilizing contact reports from a network of combatant-platform and offboard sensors. LosCon, an extended Kalman filter-based program modeled after MTST (Maneuvering Target Statistical Tracker), can integrate the sensor network very efficiently. Kalman filtering is a method of recursively updating the position of an evading target, and the accuracy of that position, using imperfect measurements. Lines of bearing to the contact with associated standard-deviation bearing errors, and positions with their standard-deviation range errors, are the measurements LosCon uses to generate an ellipse of the submarine's likely position, or AOU (Area Of Uncertainty). LosCon will also generate an expanded AOU for any future time, allowing commanders to correctly estimate the size of the search area. The effectiveness of the sea shield concept depends on the ability of organic forces to deny the enemy tactical control of the battlespace area. Incorporating the information generated by LosCon would assist ASW commanders in maintaining undersea superiority. / Ensign, United States Navy
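The covariance-to-AOU step the abstract describes is standard Kalman filter practice. The sketch below (illustrative, not LosCon's code) converts the 2x2 position block of a filter covariance into an uncertainty ellipse at a chosen containment probability, assuming SciPy for the chi-square quantile; all names are invented.

```python
import numpy as np
from scipy.stats import chi2

def aou_ellipse(P_xy, prob=0.95):
    """Semi-axes and orientation of the Area Of Uncertainty ellipse
    implied by the 2x2 position block of a Kalman filter covariance.

    P_xy : 2x2 position covariance
    prob : desired containment probability
    Returns (semi_major, semi_minor, heading_radians).
    """
    s = chi2.ppf(prob, df=2)           # scale for the chosen probability
    vals, vecs = np.linalg.eigh(P_xy)  # eigenvalues in ascending order
    semi_minor, semi_major = np.sqrt(s * vals)
    heading = np.arctan2(vecs[1, 1], vecs[0, 1])  # major-axis direction
    return semi_major, semi_minor, heading
```

Propagating the covariance forward in time before calling this routine yields the expanded future-time AOU the abstract mentions.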
220

Re-growing a tropical dry forest: functional plant trait composition and community assembly during succession

Buzzard, Vanessa, Hulshof, Catherine M., Birt, Trevor, Violle, Cyrille, Enquist, Brian J. 06 1900 (has links)
1. A longstanding goal of ecology and conservation biology is to understand the environmental and biological controls of forest succession. However, the patterns and mechanisms that guide successional trajectories, especially within tropical forests, remain unclear.
2. We collected leaf functional trait and abiotic data across a 110-year chronosequence within a tropical dry forest in Costa Rica. Focusing on six key leaf functional traits related to resource acquisition and competition, along with measures of forest stand structure, we propose a mechanistic framework to link species composition, community trait distributions and forest structure. We quantified the community-weighted trait distributions for specific leaf area, leaf dry matter concentration, leaf phosphorus concentration, leaf carbon to nitrogen ratio and leaf stable isotopic carbon and nitrogen. We assessed several prominent hypotheses for how these functional measures shift in response to changing environmental variables (soil water content, bulk density and pH) across the chronosequence.
3. Increasingly older forests differed significantly from younger forests in species composition and above-ground biomass, and showed shifted trait distributions. Early stages of succession were uniformly characterized by lower values of community-weighted mean specific leaf area, leaf stable nitrogen isotope ratio and leaf phosphorus concentration. Leaf dry matter concentration and leaf carbon to nitrogen ratio were lower during earlier stages of succession, and each trait reached an optimum during intermediate stages of succession. The leaf carbon isotope ratio was the only trait to decrease linearly with increasing stand age, indicating reduced water-use efficiency in older forests. However, in contrast with expectations, community-weighted trait variances did not generally change through succession and, when compared to null expectations, were lower than expected.
4. The observed directional shift in community-weighted mean trait values is consistent with the 'productivity filtering' hypothesis, under which a directional shift in water and light availability moves physiological strategies from 'slow' to 'fast'. In contrast with expectations arising from niche-based ecology, none of the community trait distributions were over-dispersed. Instead, patterns of trait dispersion are consistent with the abiotic filtering and/or competitive hierarchy hypotheses.
