41
Cognitive Computing for Decision Support. January 2020
abstract: The Cognitive Decision Support (CDS) model is proposed. The model is widely applicable and scales to realistic, complex decision problems based on adaptive learning. The utility of a decision is discussed, and four types of decisions associated with the CDS model are identified. The CDS model is designed to learn decision utilities. Data enrichment is introduced to promote the effectiveness of learning. Grouping is introduced for large-scale decision learning. Introspection and adjustment are presented for adaptive learning. Triage recommendation is incorporated to indicate the trustworthiness of suggested decisions.
The CDS model and methodologies are integrated into an architecture using concepts from cognitive computing. The proposed architecture is implemented with an example use case in inventory management.
Reinforcement learning (RL) is discussed as an alternative, generalized adaptive learning engine for the CDS system to handle the complexity of many problems with unknown environments. An adaptive state dimension, with context that can grow as new information becomes available, is discussed. Several enhanced RL components that are critical for complex use cases are integrated. Deep Q-networks are embedded with the adaptive learning methodologies and applied to an example supply chain management problem in capacity planning.
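To make the learning engine concrete, the following is a minimal sketch of a deep Q-network trained on a toy capacity-planning loop. The environment dynamics, state variables, network sizes, and reward are hypothetical stand-ins, not the dissertation's actual formulation.

```python
# Minimal DQN sketch for a toy capacity-planning problem (hypothetical setup).
import random
from collections import deque
import torch
import torch.nn as nn

class QNet(nn.Module):
    """Small fully connected Q-network: state -> Q-value per action."""
    def __init__(self, state_dim=2, n_actions=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_actions))
    def forward(self, x):
        return self.net(x)

def step(capacity, demand, action):
    """Toy dynamics: action in {0, 1, 2} lowers, keeps, or raises capacity;
    reward trades off served demand against over-capacity cost (illustrative)."""
    capacity = max(0.0, capacity + 0.1 * (action - 1))
    served = min(capacity, demand)
    reward = served - 0.3 * max(0.0, capacity - demand)
    demand = max(0.0, demand + random.gauss(0.0, 0.05))   # slowly drifting demand
    return capacity, demand, reward

qnet, target = QNet(), QNet()
target.load_state_dict(qnet.state_dict())
opt = torch.optim.Adam(qnet.parameters(), lr=1e-3)
buffer, gamma, eps = deque(maxlen=10000), 0.99, 0.1

capacity, demand = 1.0, 1.0
for t in range(5000):
    state = torch.tensor([capacity, demand])
    if random.random() < eps:                      # epsilon-greedy exploration
        action = random.randrange(3)
    else:
        action = int(qnet(state).argmax())
    capacity, demand, reward = step(capacity, demand, action)
    buffer.append((state, action, reward, torch.tensor([capacity, demand])))
    if len(buffer) >= 64:
        batch = random.sample(buffer, 64)
        s = torch.stack([b[0] for b in batch])
        a = torch.tensor([b[1] for b in batch])
        r = torch.tensor([b[2] for b in batch])
        s2 = torch.stack([b[3] for b in batch])
        with torch.no_grad():                      # bootstrapped TD target
            y = r + gamma * target(s2).max(dim=1).values
        q = qnet(s).gather(1, a.unsqueeze(1)).squeeze(1)
        loss = nn.functional.mse_loss(q, y)
        opt.zero_grad(); loss.backward(); opt.step()
    if t % 200 == 0:                               # periodic target-network sync
        target.load_state_dict(qnet.state_dict())
```

The replay buffer and periodically synchronized target network are standard DQN ingredients; the adaptive extensions described above would be layered on top of a skeleton like this.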
A new approach using Ito stochastic processes is proposed as a more general method for generating non-stationary demands in various patterns that can be used in decision problems. The proposed method generates demands with varying non-stationary behavior, including trend, cyclical, seasonal, and irregular patterns. Conventional approaches are identified as special cases of the proposed method. Demands are illustrated in realistic settings for various decision models. Various statistical criteria are applied to filter the generated demands. The method is applied to a real-world example. / Dissertation/Thesis / Doctoral Dissertation Industrial Engineering 2020
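As a rough sketch of the demand-generation idea in this abstract (the drift form and all parameter values below are illustrative assumptions, not the dissertation's formulation), an Ito-type process with trend and seasonal terms in the drift can be simulated with Euler-Maruyama steps:

```python
# Euler-Maruyama simulation of an Ito-type demand process with trend and
# seasonality in the drift (illustrative parameter choices only).
import math
import random

def simulate_demand(d0=100.0, trend=0.02, season_amp=10.0, season_len=52,
                    mean_revert=0.1, sigma=3.0, horizon=208, dt=1.0):
    """dD_t = [mean_revert * (target(t) - D_t)] dt + sigma dW_t,
    where target(t) carries the trend and the seasonal cycle."""
    path, d = [], d0
    for t in range(horizon):
        target = d0 + trend * t + season_amp * math.sin(2 * math.pi * t / season_len)
        drift = mean_revert * (target - d)
        d += drift * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        d = max(d, 0.0)            # demands cannot be negative
        path.append(d)
    return path

demand = simulate_demand()
print(f"mean={sum(demand)/len(demand):.1f}  max={max(demand):.1f}")
```

Setting sigma to zero leaves only the deterministic trend and seasonal components, which echoes the claim that conventional demand models arise as special cases.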
42
Guidelines for handling multidimensionality in a terminological knowledge base. Bowker, Lynne. January 1992
The goal of this thesis is to develop and apply a set of guidelines for handling multidimensionality in a terminological knowledge base (TKB). A dimension represents one way of classifying a group of objects; a classification with more than one dimension is said to be multidimensional. The recognition and representation of multidimensionality is a subject that has received very little attention in the terminology literature. One tool for dealing with multidimensionality is CODE (Conceptually Oriented Description Environment). This thesis is divided into four main parts. In Part I, we discuss the general principles of classification, and explain multidimensionality. In Part II, we develop an initial set of guidelines to help terminologists both recognize and represent multidimensionality in a TKB. In Part III, we develop a technical complement to the initial guidelines. We begin with a general description of the CODE system, and then we analyze those features that are particularly helpful for handling multidimensionality. Finally, in Part IV, we apply our guidelines by using the CODE system to construct a small TKB for concepts in a subfield of hypertext, namely hypertext links. (Abstract shortened by UMI.)
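To make the notion of multidimensionality concrete, here is a hypothetical sketch of how a terminological record might carry more than one dimension of classification for the same superordinate concept. The data structure and the example subdivisions of hypertext links are illustrative assumptions, not the thesis's CODE representation.

```python
# Hypothetical sketch of multidimensional classification in a TKB: the same
# superordinate concept ("link") is subdivided along two independent
# dimensions, so a concept can have one parent per dimension.
from dataclasses import dataclass, field

@dataclass
class Concept:
    name: str
    # dimension name -> superordinate concept it subdivides
    dimensions: dict = field(default_factory=dict)

tkb = {
    "link": Concept("link"),
    "unidirectional link": Concept("unidirectional link", {"directionality": "link"}),
    "bidirectional link": Concept("bidirectional link", {"directionality": "link"}),
    "referential link": Concept("referential link", {"function": "link"}),
    "organizational link": Concept("organizational link", {"function": "link"}),
}

def subtypes(tkb, parent, dimension):
    """List the subtypes of `parent` along one dimension of classification."""
    return [c.name for c in tkb.values() if c.dimensions.get(dimension) == parent]

print(subtypes(tkb, "link", "directionality"))  # one way of classifying links
print(subtypes(tkb, "link", "function"))        # another, independent way
```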
43
Reinforcement Learning Algorithms: Acceleration Design and Non-asymptotic Theory. Xiong, Huaqing. 19 November 2021
No description available.
44
Topics of deep learning in security and compression. Wang, Xiao. 22 January 2021
This thesis covers topics at the intersection of deep learning (DL), security and compression. These topics include the issues of security and compression of DL models themselves, as well as their applications in the fields of cyber security and data compression.
The first part of the thesis focuses on the security problems of DL. Recent studies have revealed the vulnerability of DL to several malicious attacks, such as adversarial attacks, where the output of a DL model is manipulated through an invisibly small perturbation of the model's input. We propose to defend against these threats by incorporating stochasticity into DL models. Multiple randomization schemes are introduced, including Defensive Dropout (DD), Hierarchical Random Switching (HRS) and Adversarially Trained Model Switching (AdvMS).
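The following is a minimal sketch of the general idea of injecting stochasticity at inference time, not the exact DD, HRS, or AdvMS constructions: dropout is left active at test time, and a switching layer picks one of several parallel sub-blocks at random on every forward pass.

```python
# Illustrative stochastic-inference defenses (not the thesis's exact designs):
# (1) dropout kept active at test time; (2) a layer that randomly switches
# among parallel sub-blocks on every forward pass.
import random
import torch
import torch.nn as nn

class RandomSwitchBlock(nn.Module):
    """Holds several parallel sub-blocks and picks one at random per forward."""
    def __init__(self, in_dim, out_dim, n_channels=4):
        super().__init__()
        self.channels = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
            for _ in range(n_channels))
    def forward(self, x):
        return self.channels[random.randrange(len(self.channels))](x)

model = nn.Sequential(
    RandomSwitchBlock(784, 256),
    nn.Dropout(p=0.3),           # stays stochastic because the model stays in train mode
    RandomSwitchBlock(256, 128),
    nn.Linear(128, 10))

model.train()                    # deliberately NOT model.eval(): keep randomness on
x = torch.randn(1, 784)
logits = [model(x) for _ in range(5)]
# An attacker's gradient estimate now targets a moving ensemble of sub-models.
print([int(l.argmax()) for l in logits])
```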
The next part of the thesis discusses the use of DL in the security domain. In particular, we consider anomaly detection problems in an unsupervised learning setting using auto-encoders and apply this method to both side-channel signals and proxy logs.
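A minimal sketch of that recipe, assuming a generic fixed-length feature vector per event; the architecture, stand-in data, and 3-sigma threshold rule are illustrative, not the thesis's.

```python
# Autoencoder anomaly detection sketch: train on "normal" traffic only, then
# flag inputs whose reconstruction error exceeds a threshold set on normal data.
import torch
import torch.nn as nn

features = 32                                    # e.g., per-request features from proxy logs
ae = nn.Sequential(
    nn.Linear(features, 16), nn.ReLU(),
    nn.Linear(16, 4), nn.ReLU(),                 # bottleneck
    nn.Linear(4, 16), nn.ReLU(),
    nn.Linear(16, features))
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)

normal = torch.randn(2048, features)             # stand-in for normal training data
for _ in range(200):
    loss = nn.functional.mse_loss(ae(normal), normal)
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():
    errs = ((ae(normal) - normal) ** 2).mean(dim=1)
    threshold = errs.mean() + 3 * errs.std()      # simple 3-sigma rule on normal data

def is_anomalous(x):
    with torch.no_grad():
        return float(((ae(x) - x) ** 2).mean()) > float(threshold)

print(is_anomalous(torch.randn(1, features) * 5))  # grossly out-of-distribution input
```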
In the third part we discuss the interaction between DL and Compressed Sensing (CS). In CS systems, the processing time is largely limited by the computational cost of sparse reconstruction. We show that full reconstruction can be bypassed by training deep networks that extract information directly from the compressed signals. From another perspective, CS also helps reduce the complexity of DL models by providing a more compact data representation.
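A hedged sketch of the bypass idea: measurements come from a fixed random sensing matrix, and a small classifier is trained directly on those measurements without ever reconstructing the signal. Dimensions and data below are placeholders.

```python
# Sketch: classify directly from compressed measurements y = A x, bypassing
# sparse reconstruction (random Gaussian sensing matrix; toy data and labels).
import torch
import torch.nn as nn

n, m, classes = 256, 64, 10                     # signal dim, measurement dim, classes
A = torch.randn(m, n) / m ** 0.5                # fixed random sensing matrix

clf = nn.Sequential(nn.Linear(m, 128), nn.ReLU(), nn.Linear(128, classes))
opt = torch.optim.Adam(clf.parameters(), lr=1e-3)

x = torch.randn(512, n)                         # stand-in signals
labels = torch.randint(0, classes, (512,))      # stand-in labels
y = x @ A.T                                     # compressed measurements only

for _ in range(100):
    loss = nn.functional.cross_entropy(clf(y), labels)
    opt.zero_grad(); loss.backward(); opt.step()
print(f"training loss on measurements alone: {loss.item():.3f}")
```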
The last topic is DL-based codecs for image compression. As an extension to the current framework, we propose Substitutional Neural Image Compression (SNIC), which finds the optimal input substitute for a specific compression target. SNIC leads to both an improved rate-distortion trade-off and easier bit-rate control.
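A loose sketch of the substitution idea, with a tiny untrained autoencoder standing in for a learned codec and an L1 penalty on the latent code standing in for the rate term; none of this reflects SNIC's actual codec or rate model.

```python
# Sketch: instead of feeding the original input to the codec, optimize a
# substitute input so the DECODED result better matches the original at a
# given rate-distortion trade-off (toy stand-in codec, illustrative only).
import torch
import torch.nn as nn

enc = nn.Sequential(nn.Linear(64, 8))             # toy "encoder" (stand-in codec)
dec = nn.Sequential(nn.Linear(8, 64))             # toy "decoder"
original = torch.rand(1, 64)

substitute = original.clone().requires_grad_(True)
opt = torch.optim.Adam([substitute], lr=1e-2)
lam = 0.01                                         # rate-distortion trade-off weight

for _ in range(300):
    code = enc(substitute)
    distortion = nn.functional.mse_loss(dec(code), original)   # compare to ORIGINAL
    rate_proxy = code.abs().mean()
    loss = distortion + lam * rate_proxy
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():
    baseline = nn.functional.mse_loss(dec(enc(original)), original)
print(f"distortion: baseline={baseline.item():.4f} vs substituted={distortion.item():.4f}")
```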
45
Convolutional Neural Networks for Hurricane Road Closure Probability and Tree Debris Estimation. Unknown Date
Hurricanes cause significant property loss every year. A substantial part of that loss is due to trees destroyed by the wind, which in turn block roads and produce a large amount of debris. The debris not only can damage nearby properties but also must be cleaned up after the hurricane. Neural networks have grown significantly as a field over the last years, finding applications in many disciplines such as computer science, medicine, banking, physics, and engineering. In this thesis, a new method is proposed to estimate tree debris due to high winds using Convolutional Neural Networks (CNNs). For the purposes of this thesis, a tree satellite image dataset was created and then used to train two networks, CNN-I and CNN-II, for tree recognition and tree species recognition, respectively. Satellite images were used as the input for the CNNs to recognize the locations and types of the trees that can produce debris. The tree images selected by the CNNs were used to approximate tree parameters, which were later used to calculate the tree failure density function, often called the fragility function (the probability of at least one failure in the time period), for each recognized tree. The tree failure density functions were used to compose the probability of road closure due to hurricane winds and the overall amount of tree debris. The proposed approach follows current trends in neural networks and is easily applicable, so it can help city and state authorities better plan for the adverse consequences of tree failures due to hurricane winds. / A Thesis submitted to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science. / 2019 / September 27, 2019. / Convolutional Neural Networks, Fragility, Hurricane, Tree debris, Wind / Includes bibliographical references. / Xiuwen Liu, Professor Co-Directing Thesis; Sungmoon Jung, Professor Co-Directing Thesis; Peixiang Zhao, Committee Member; Shayok Chakraborty, Committee Member.
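The composition step described in this abstract (a road segment closes if at least one nearby tree fails) can be sketched as follows; the lognormal fragility form and all parameter values are hypothetical illustrations, not the thesis's fitted model.

```python
# Sketch: per-tree failure probability from a lognormal fragility curve, then
# road closure probability as "at least one blocking tree fails".
import math

def tree_failure_prob(wind_speed, median_capacity, beta=0.3):
    """Lognormal fragility: P(fail | wind) = Phi(ln(wind / median) / beta)."""
    z = math.log(wind_speed / median_capacity) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def road_closure_prob(wind_speed, trees):
    """trees: list of (median_capacity, beta) for trees able to block the road."""
    p_no_failure = 1.0
    for median, beta in trees:
        p_no_failure *= 1.0 - tree_failure_prob(wind_speed, median, beta)
    return 1.0 - p_no_failure            # at least one blocking failure

segment = [(55.0, 0.30), (62.0, 0.25), (48.0, 0.35)]   # made-up capacities in m/s
for v in (30, 45, 60):
    print(f"wind {v} m/s -> closure probability {road_closure_prob(v, segment):.2f}")
```

The same per-tree probabilities could also be used to weight expected debris volumes, the second quantity the abstract composes from the fragility functions.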
46
Cognitive computing: algorithm design in the intersection of cognitive science and emerging computer architectures. Chandler, Benjamin. 22 January 2016
For the first time in decades, computers are evolving into a fundamentally new class of machine. Transistors are still getting smaller, more economical, and more power-efficient, but operating frequencies leveled off in the mid-2000s. Today, improving performance requires placing a larger number of slower processing cores on each of many chips. Software written for such machines must scale out over many cores rather than scaling up with a faster single core. Biological computation is an extreme manifestation of such a many-slow-core architecture and therefore offers a potential source of ideas for leveraging new hardware. This dissertation addresses several problems in the intersection of emerging computer architectures and biological computation, termed Cognitive Computing: What mechanisms are necessary to maintain stable representations in a large distributed learning system? How should complex biologically-inspired algorithms be tested? How do visual sensing limitations like occlusion influence the performance of classification algorithms?
Neurons have a limited dynamic output range, but must process real-world signals over a wide dynamic range without saturating or succumbing to endogenous noise. Many existing neural network models leverage spatial competition to address this issue, but require hand-tuning of several parameters for a specific, fixed distribution of inputs. Integrating spatial competition with a stabilizing learning process produces a neural network model capable of autonomously adapting to a non-stationary distribution of inputs.
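A hedged sketch of that combination: divisive (spatial) competition normalizes responses across a layer, while a slow homeostatic gain adjustment keeps each unit near a target activity level as the input distribution shifts. All constants and the specific update rule are illustrative, not the dissertation's model.

```python
# Sketch: divisive spatial competition plus a slow homeostatic gain adjustment,
# so responses stay in range as the input distribution changes (illustrative).
import numpy as np

rng = np.random.default_rng(0)
n_units, target_activity, eta = 16, 0.1, 0.01
gains = np.ones(n_units)
weights = rng.random((n_units, 8))

def respond(x):
    drive = gains * (weights @ x)
    return drive / (1.0 + drive.sum())        # divisive competition across units

for step in range(20000):
    scale = 1.0 if step < 10000 else 5.0      # input statistics shift midway through
    x = scale * rng.random(8)
    r = respond(x)
    # Homeostasis: nudge each gain so its unit's response drifts toward the target.
    gains += eta * (target_activity - r)
    gains = np.clip(gains, 1e-3, None)

print("mean response after adaptation:", respond(5.0 * rng.random(8)).mean())
```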
Human-engineered complex systems typically include a number of architectural features to curtail complexity and simplify testing. Biological systems do not obey these constraints. Biologically-inspired algorithms are thus dramatically more difficult to engineer. Augmenting standard tools from the software engineering community with features targeted towards biologically-inspired systems is an effective mitigation.
Natural visual environments contain objects that are occluded by other objects. Such occlusions are under-represented in the standard benchmark datasets for testing classification algorithms. This bias masks the negative effect of occlusion on performance. Correcting the bias with a new dataset demonstrates that occlusion is a dominant variable in classification performance. Modifying a state-of-the-art algorithm with mechanisms for occlusion resistance doubles classification performance in high-occlusion cases without penalty for unoccluded objects.
47
Hybrid Simulations: New Directions in Combining Machine Learning and Discrete Models. Wozniak, Maciej Kazimierz. 04 January 2022
No description available.
48
Mixture Weighted Policy Cover: Exploration in Multi-Agent Reinforcement Learning. Miller, Dylan. 13 July 2022
No description available.
49
The Simulation of Autonomous Racing Based on Reinforcement Learning. Li, Jiachen. 14 August 2018
No description available.
50
Experiments with Neural Network Libraries. Khazanova, Yekaterina. January 2013
No description available.