  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
131

Application of artificial neural networks and the wavelet transform for pattern recognition, noise reduction and data compression

Choudhry, Muhammad Din January 2000 (has links)
The theory of Artificial Neural Networks (ANNs) has not provided an exact method for training weights. Training is mostly done by iterative trial-and-error minimisation methods, which do not allow ANNs to learn incrementally over time. In this thesis it is shown that the weights produced by a successful error-minimisation method are nothing more than scaled versions of the corresponding components of the sample pattern, and that such training methods leave a chance for a neuron to be deceived. An exact method of weight construction is developed in the form of a system of linear equations. A new linear classifier ANN and a number of thresholding procedures are developed. It is argued that the Hopfield network and the Boltzmann machine do not qualify as reasonable networks. A generalised multiclass linear classifier ANN is developed, combining a newly developed multiclass linear ANN with a newly developed multiclass XOR classifier ANN. A biological neuromuscular system is interpreted as a multiclass linear classifier ANN. A new technique for pattern recognition, especially for images, is presented with a software check. The technique minimises the design topology of ANNs and enables them to classify scaled, mirror-image, and noisy versions of the sample pattern. The Continuous Wavelet Transform (CWT), the Discrete Wavelet Transform, and wavelet decomposition are connected by developing an extendable and intensifiable system of six particular Gaussian wavelets. A binary transform applicable to every real function is developed. The confusing automatic nature of the CWT is explained, and a new style of defining wavelets is presented. The application of wavelet transforms to noise reduction and data compression/expansion is explained, and their performance is checked with self-developed software. A modification of the CWT is made to ease its application through ANNs. The corresponding ANNs are developed and their performance is checked against the self-developed software. A new multiresolution zoom-out wavelet transform is developed which expands data without smoothing it. A new wavelet is deduced from the smoothing average filter. Two-dimensional wavelets for noise reduction and data compression/expansion are developed in the same style and their performance is checked with the self-developed software. An ANN for the CWT using a newly developed two-dimensional wavelet is developed and its activation is explained. Data compression by locating the peaks and troughs of the data and setting the other elements to zero is performed with a guarantee of reconstruction. The new wavelet transform is modified to reconstruct the data between peaks and troughs. Peak- and trough-detecting ANNs are developed and their performance is checked against the self-developed software. Procedures for classification are presented with a self-developed software check. The theory of ANNs requires bit-wise parallel adders and multiplexors; a parallel adder circuit is developed for this purpose by combining newly developed basic units.
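The exact weight-construction idea above can be sketched as a system of linear equations: requiring a linear neuron to reproduce each sample pattern's target output gives one equation per pattern, solvable in closed form with no iterative training. The patterns, targets, and use of NumPy below are illustrative assumptions, not the thesis's own formulation.

```python
import numpy as np

# Minimal sketch of exact weight construction: instead of iterative
# error minimisation, the weights of a linear neuron are obtained by
# solving the linear system  patterns @ w = targets  over the sample
# patterns. The patterns and targets here are invented for illustration.
patterns = np.array([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 0.0]])   # one sample pattern per row
targets = np.array([1.0, -1.0, 1.0])     # desired neuron outputs

weights = np.linalg.solve(patterns, targets)  # exact, no iteration

# Each sample pattern now reproduces its target exactly.
print(np.allclose(patterns @ weights, targets))  # True
```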
132

Combining domain expert knowledge with neural networks for predicting corporate bankruptcies

Nasir, M. L. January 2000 (has links)
No description available.
133

Incremental activity and plan recognition for human teams

Masato, Daniele January 2012 (has links)
Anticipating human subjects' intentions and information needs is considered one of the ultimate goals of Artificial Intelligence. Activity and plan recognition contribute to this goal by studying how low-level observations about subjects and the environment in which they act can be linked to a high-level plan representation. This task is challenging in a dynamic and uncertain environment; the environment may change while the subjects are reasoning about it, and the effects of the subjects' interactions cannot be predicted with certainty. Humans generally struggle to enact plans and maintain situation awareness in such circumstances, even when they work in teams towards a common objective. Intelligent software assistants can support human teams by monitoring their activities and plan progress, thus relieving them of some of the cognitive burden they experience. The assistants' design needs to take into account that teams can form and disband quickly in response to environmental changes, and that the course of action may change during plan execution. It is also crucial to process a stream of observations efficiently and incrementally in order to enable online prediction of those intentions and information needs. In this thesis we propose an incremental approach for team composition and activity recognition based on probabilistic graphical models. We show that this model can successfully learn team formations and behaviours in highly dynamic domains, and that classification can be performed in polynomial time. We evaluate our model within a simulated scenario provided by an open-source computer game. In addition, we discuss an incremental approach to plan recognition that exploits the results yielded by activity recognition to assess a team's course of action. We show how this model can account for incomplete or inconsistent knowledge about recognised activities, and how it can be integrated into an existing mechanism for plan recognition.
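The claim that recognition can be performed incrementally in polynomial time can be illustrated with the simplest probabilistic graphical model for an observation stream: forward filtering in a hidden Markov model, where each new observation updates the belief over activities in time quadratic in the number of activities. The thesis's model is richer than this; the activities, transition matrix, and observation model below are invented for illustration.

```python
import numpy as np

# Minimal sketch of incremental activity recognition as HMM forward
# filtering: each observation triggers one predict-and-weight update,
# so T observations cost O(T * |activities|^2) overall.
activities = ["move", "defend", "attack"]
T = np.array([[0.8, 0.1, 0.1],    # transition probabilities
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
E = np.array([[0.7, 0.2, 0.1],    # P(observation | activity)
              [0.2, 0.6, 0.2],
              [0.1, 0.2, 0.7]])

belief = np.full(3, 1/3)           # uniform prior over activities

def update(belief, obs):
    """One incremental step: predict, weight by evidence, normalise."""
    b = (T.T @ belief) * E[:, obs]
    return b / b.sum()

for obs in [0, 0, 2, 2, 2]:        # stream of observation indices
    belief = update(belief, obs)

print(activities[int(np.argmax(belief))])  # attack
```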
134

Integrating Exponential Dispersion Models to Latent Structures

Basbug, Mehmet Emin 08 February 2017 (has links)
Latent variable models have two basic components: a latent structure encoding a hypothesized complex pattern and an observation model capturing the data distribution. With the advancements in machine learning and increasing availability of resources, we are able to perform inference in deeper and more sophisticated latent variable models. In most cases, these models are designed with a particular application in mind; hence, they tend to have restrictive observation models. The challenge, surfaced with the increasing diversity of data sets, is to generalize these latent models to work with different data types. We aim to address this problem by utilizing exponential dispersion models (EDMs) and proposing mechanisms for incorporating them into latent structures. (Abstract shortened by ProQuest.)
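As background, one standard parameterisation of an exponential dispersion model (following Jørgensen) is the following; the thesis's exact notation may differ:

```latex
f(x \mid \theta, \lambda)
  = h(\lambda, x)\, \exp\!\big(\lambda\,[\theta x - A(\theta)]\big),
\qquad
\mathbb{E}[X] = A'(\theta),
\quad
\operatorname{Var}(X) = \frac{A''(\theta)}{\lambda}.
```

Different choices of the log-partition function $A(\theta)$ recover familiar observation models such as the Gaussian and the Poisson, which is what makes EDMs a natural generic observation layer to plug into latent structures.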
135

Intelligent Maintenance Aid (IMA)

Shockley, Keith J. 06 1900 (has links)
Technological complexities of current ground combat systems require advanced maintenance methods to keep the fleet in a state of operational readiness. Currently, maintenance personnel use paper Technical Manuals (TM) that are cumbersome and not easily transportable or updated in the field. This thesis proposes using the latest technology to support maintainers in the field or depot by integrating the TMs with the onboard diagnostics Built-In-Test (BIT) and Fault Isolation Test (FIT) of the vehicle, to provide the maintainer with an improved diagnostics tool to expedite troubleshooting analysis. This will be accomplished by connecting the vehicle, using the vehicle's 1553 multiplex bus, with the Graphical User Interface (GUI) of an Intelligent Maintenance Aid (IMA). The IMA will use Troubleshooting Procedure (TP) codes generated during BIT and FIT testing. Using the information provided by these TP codes, through the IMA GUI, information from the technical manuals will be displayed to aid the maintainers in their diagnostic work. The results of this thesis will serve as a baseline for further research and will be presented to the program management office for combat systems (PM-CS) for further consideration and development. / US Army RDECOM-TACOM author (civilian).
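The TP-code-to-manual mapping described above can be sketched as a simple lookup: a Troubleshooting Procedure code reported over the 1553 bus selects the Technical Manual content the GUI should display. The TP codes and TM section references below are hypothetical placeholders, not drawn from any real technical manual.

```python
# Minimal sketch of the IMA lookup idea: a Troubleshooting Procedure
# (TP) code generated during BIT/FIT testing is mapped to the matching
# Technical Manual (TM) section, which the GUI then displays to the
# maintainer. Codes and references here are made up for illustration.
TM_INDEX = {
    "TP-0041": "TM 9-XXXX, Section 4.2: Hydraulic pressure fault isolation",
    "TP-0107": "TM 9-XXXX, Section 6.1: Fire-control power supply checks",
}

def lookup_procedure(tp_code: str) -> str:
    """Return the TM reference for a TP code, or a fallback message."""
    return TM_INDEX.get(tp_code, "No TM entry found; consult depot support")

print(lookup_procedure("TP-0041"))
```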
136

Human behavior representation of military teamwork

Martin, Michael W. 06 1900 (has links)
This work presents a conceptual structure for the behaviors of artificial intelligence agents, with emphasis on creating teamwork through individual behaviors. The goal is to set up a framework which enables teams of simulation agents to behave more realistically. Better team behavior can lend a higher fidelity of human behavior representation in a simulation, as well as provide opportunities to experiment with the factors that create teamwork. The framework divides agent behaviors into three categories: leadership, individual, and team-enabling. Leadership behaviors consist of planning, decision-making, and delegating. Individual behaviors consist of moving, shooting, environment-monitoring, and self-monitoring. Team-enabling behaviors consist of communicating, synchronizing actions, and team member monitoring. These team-enabling behaviors augment the leadership and individual behaviors at all phases of an agent's thought process, and create aggregate team behavior that is a hybrid of emergent and hierarchical teamwork. The net effect creates, for each agent, options and courses of action which are sub-optimal from the individual agent's standpoint, but which leverage the power of the team to accomplish objectives. The individual behaviors synergistically combine to create teamwork, allowing a group of agents to act in such a manner that their overall effectiveness is greater than the sum of their individual contributions. / US Army (USA) author.
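The three behaviour categories above can be sketched as data, with the team-enabling set augmenting whatever an agent already does. The behaviour names and the `Agent` class below are illustrative, not the thesis's implementation.

```python
# Minimal sketch of the three-part behaviour framework: leadership,
# individual, and team-enabling behaviours, where the team-enabling
# set augments the others rather than forming a separate phase.
LEADERSHIP = {"plan", "decide", "delegate"}
INDIVIDUAL = {"move", "shoot", "monitor_environment", "monitor_self"}
TEAM_ENABLING = {"communicate", "synchronize", "monitor_teammates"}

class Agent:
    """An agent whose options always include the team-enabling set."""
    def __init__(self, role: str):
        base = LEADERSHIP if role == "leader" else set()
        self.behaviors = base | INDIVIDUAL | TEAM_ENABLING

    def available_actions(self):
        return sorted(self.behaviors)

leader = Agent("leader")
print("delegate" in leader.available_actions())  # True
```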
137

Discovering credible events in near real time from social media streams

Buntain, Cody 26 January 2017 (has links)
Recent reliance on social media platforms as major sources of news and information, both for journalists and the larger population and especially during times of crisis, motivates the need for better methods of identifying and tracking high-impact events in these social media streams. Social media's volume, velocity, and democratization of information (leading to limited quality controls) complicate rapid discovery of these events and one's ability to trust the content posted about these events. This dissertation addresses these complications in four stages, using Twitter as a model social platform. The first stage analyzes Twitter's response to major crises, specifically terrorist attacks in Western countries, showing these high-impact events do not significantly impact message or user volume. Instead, these events drive changes in Twitter's topic distribution, with conversation, retweets, and hashtags relevant to these events experiencing significant, rapid, and short-lived bursts in frequency. Furthermore, conversation participants tend to prefer information from local authorities, organizations, and media over national or international sources, with accounts for local police or local newspapers often emerging as central in the networks of interaction. Building on these results, the second stage in this dissertation presents and evaluates a set of features that capture these topical bursts associated with crises by modeling bursts in frequency for individual tokens in the Twitter stream. The resulting streaming algorithm is capable of discovering notable moments across a series of major sports competitions using Twitter's public stream, without relying on domain- or language-specific information or models. Furthermore, results demonstrate that models trained on sporting competition data perform well when transferred to earthquake identification.

This streaming algorithm is then extended in the dissertation's third stage to support real-time event tracking and summarization. This real-time algorithm leverages new distributed processing technology to operate at scale and is evaluated against a collection of other community-developed information retrieval systems, against which it performs comparably. Further experiments also show this real-time burst detection algorithm can be integrated with these other information retrieval systems to increase overall performance. The final stage then investigates automated methods for evaluating credibility in social media streams by leveraging two existing data sets. These two data sets measure different types of credibility (veracity versus perception), and results show that veracity is negatively correlated with the amount of disagreement in, and the length of, a conversation, while perceptions of credibility are influenced by the number of links to other pages, the amount of shared media about the event, and the number of verified users participating in the discussion. Contributions made across these four stages are applicable in the relatively new fields of computational journalism and crisis informatics, which seek to improve news gathering and crisis response by leveraging new technologies and data sources such as machine learning and social media.
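The frequency-burst features at the heart of the second stage can be sketched with a sliding-window z-score over per-bin token counts: a bin is flagged when its count exceeds the recent mean by several standard deviations. The window size, threshold, and counts below are illustrative, and the dissertation's streaming features are more elaborate than this.

```python
from collections import deque

# Minimal sketch of per-token burst detection on a stream: keep a
# sliding window of recent counts per time bin and flag a bin whose
# count exceeds the window mean by `threshold` standard deviations.
def detect_bursts(counts, window=5, threshold=3.0):
    """Yield indices of bins whose count is a burst over recent history."""
    history = deque(maxlen=window)
    for i, c in enumerate(counts):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((h - mean) ** 2 for h in history) / window
            std = var ** 0.5 or 1.0   # guard against zero variance
            if (c - mean) / std > threshold:
                yield i
        history.append(c)

# Illustrative per-minute counts for one token, with a spike at index 7.
counts = [3, 4, 2, 3, 4, 3, 2, 40, 5, 3]
print(list(detect_bursts(counts)))          # [7]
```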
138

Adaptive estimation techniques for resident space object characterization

LaPointe, Jamie 26 January 2017 (has links)
This thesis investigates the use of adaptive estimation techniques to determine unknown model parameters, such as size and surface material reflectivity, while estimating the position, velocity, attitude, and attitude rates of a resident space object. The work focuses on the application of these methods to the space situational awareness problem.

The thesis proposes a unique method of implementing a top-level gating network in a dual-layer hierarchical mixture of experts, along with a decaying learning parameter for use in both the single-layer mixture of experts and the dual-layer hierarchical mixture of experts. Both architectures are compared to multiple model adaptive estimation in estimating resident space object parameters such as size and reflectivity. The hierarchical mixture of experts consists of macromodes, each of which can estimate a different parameter in parallel. Each macromode is a single-layer mixture of experts with unscented Kalman filters as the experts. A gating network in each macromode determines a gating weight that is used as a hypothesis tester; the macromode gating weights then feed a top-level gating weight that determines which macromode contains the most probable model. The measurements consist of astrometric and photometric data from non-resolved observations of the target, gathered via a telescope with a charge-coupled-device camera. Each filter receives the same measurement sequence. The apparent-magnitude measurement model uses the Ashikhmin-Shirley bidirectional reflectance distribution function. The measurements, process models, and the additional shape, mass, and inertia characteristics allow the algorithm to predict the state and select the most probable fit to the size and reflectance characteristics based on the statistics of the measurement residuals and the innovation covariance. A simulation code is developed to test these adaptive estimation techniques, and the feasibility of the methods is demonstrated in this thesis.
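The recursive gating-weight update shared by mixture-of-experts and multiple model adaptive estimation can be sketched as multiplicative likelihood weighting: each expert filter scores the latest measurement with the likelihood of its innovation, and the weights are scaled and renormalised. The decaying exponent `alpha`, the number of models, and the likelihood values below are illustrative stand-ins for the thesis's decaying learning parameter and real filter outputs.

```python
import numpy as np

# Minimal sketch of a recursive gating-weight update:
# w_i <- w_i * L_i^alpha, renormalised, where L_i is the innovation
# likelihood reported by expert filter i and alpha decays over time.
def update_gating_weights(weights, likelihoods, alpha=1.0):
    """One update step; the weights act as model probabilities."""
    w = weights * likelihoods ** alpha
    return w / w.sum()

weights = np.full(3, 1/3)                     # three candidate models
likelihood_stream = [np.array([0.2, 0.6, 0.1]),
                     np.array([0.3, 0.7, 0.2]),
                     np.array([0.1, 0.5, 0.1])]
for step, lik in enumerate(likelihood_stream):
    alpha = 1.0 / (1 + 0.2 * step)            # decaying learning parameter
    weights = update_gating_weights(weights, lik, alpha)

print(int(np.argmax(weights)))                # model 1 is most probable
```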
139

Data-driven computer vision for science and the humanities

Lee, Stefan 05 November 2016 (has links)
The rate at which humanity is producing visual data from both large-scale scientific imaging and consumer photography has been greatly accelerating in the past decade. This thesis is motivated by the hypothesis that this trend will necessarily change the face of observational science and the humanities, requiring the development of automated methods capable of distilling vast image collections to produce meaningful analyses. Such methods are needed to empower novel science both by improving throughput in traditionally quantitative disciplines and by developing new techniques to study culture through large-scale image datasets.

When computer vision or machine learning in general is leveraged to aid academic inquiry, it is important to consider the impact of erroneous solutions produced by implicit ambiguity or model approximations. To that end, we argue for the importance of algorithms that are capable of generating multiple solutions and producing measures of confidence. In addition to providing solutions to a number of multi-disciplinary problems, this thesis develops techniques to address these overarching themes of confidence estimation and solution diversity.

This thesis investigates a diverse set of problems across a broad range of studies including glaciology, developmental psychology, architectural history, and demography to develop and adapt computer vision algorithms to solve these domain-specific applications. We begin by proposing vision techniques for automatically analyzing aerial radar imagery of polar ice sheets while simultaneously providing glaciologists with point-wise estimates of solution confidence. We then move to psychology, introducing novel recognition techniques to produce robust hand localizations and segmentations in egocentric video to empower psychologists studying child development with automated annotations of grasping behaviors integral to learning. We then investigate novel large-scale analysis for architectural history, leveraging tens of thousands of publicly available images to identify and track distinctive architectural elements. Finally, we show how rich estimates of demographic and geographic properties can be predicted from a single photograph.
140

Design and implementation of an English to Arabic machine translation (MEANA MT)

Alneami, Ahmed H. January 2001 (has links)
A new system for Arabic Machine Translation (called MEANA MT) has been built. The system analyses English as a source language and converts the given sentences into Arabic. It contains three sets of grammar rules governing the parsing, transformation, and generation phases. In the system, word sense ambiguity and some pragmatic patterns were resolved. A new two-way (analysis/generation) computational lexicon dealing with the morphological analysis of the Arabic language has been created. The lexicon contains a set of rules governing the morphological inflection and derivation of Arabic nouns, verbs, the verb "to be", the verb "not to be", and pronouns. It generates Arabic word forms and their inflectional affixes, such as plural and gender morphemes as well as attached pronouns, each according to its rules, and it cannot parse or generate unacceptable word inflections. The system is capable of dealing with vowelized Arabic words by parsing the vowel marks attached to the letters. Semantic value pairs were developed to represent word sense and other issues in morphology, e.g. genders, numbers, and tenses. The system can parse and generate some pragmatic sentences and phrases such as proper names, titles, acknowledgements, dates, telephone numbers, and addresses. A Lexical Functional Grammar (LFG) formalism is used to combine the syntactic, morphological, and semantic features. The grammar rules of the system were implemented and compiled in Common Lisp, based on Tomita's Generalised LR parsing algorithm augmented by pseudo- and full-unification packages. After parsing, the constituents of the English sentence are represented as feature structures (F-structures), which take part in the transfer and generation process; transformation grammar rules change the English F-structure into an Arabic F-structure suitable for the Arabic generation grammar to build the required Arabic sentence. The system has been tested on three domains (sentences and phrases): the first is a selected children's story, the second consists of semantic sentences, and the third of pragmatic sentences. This research could be considered a complete solution for a personal MT system for small messages and sublanguage domains.
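The lexicon's rule-driven generation can be sketched for the simplest case, the Arabic sound plurals, where a stem plus gender/number features selects an inflectional suffix. The transliterations and the tiny rule table below are illustrative; real Arabic morphology is largely templatic, and the thesis's rules are far more extensive.

```python
# Minimal sketch of feature-driven generation in a two-way lexicon:
# a (gender, number) feature pair selects a suffix, as with the Arabic
# sound plurals (-uuna masculine, -aat feminine). Transliterated forms
# are used here for readability; the rules are illustrative only.
SUFFIXES = {
    ("masc", "plural"): "uuna",   # sound masculine plural
    ("fem", "plural"): "aat",     # sound feminine plural
    ("fem", "singular"): "a",     # feminine marker (ta marbuta)
}

def generate(stem: str, gender: str, number: str) -> str:
    """Generate an inflected surface form from a stem and features."""
    suffix = SUFFIXES.get((gender, number), "")
    if (gender, number) == ("fem", "plural") and stem.endswith("a"):
        stem = stem[:-1]          # drop the feminine marker before -aat
    return stem + suffix

print(generate("muhandis", "masc", "plural"))   # muhandisuuna
print(generate("muhandisa", "fem", "plural"))   # muhandisaat
```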
