101

Development of a Management Guide for Concrete Bridge Decks in Utah

Emery, Tenli Waters 10 December 2020 (has links)
The objectives of this research were to 1) investigate bridge deck condition assessment methods used in the field and laboratory, methods of managing bridge decks, and methods for estimating remaining bridge deck service life using computer models through a comprehensive literature review on these subjects; 2) collect and analyze field data from representative concrete bridge decks in Utah; and 3) develop a decision tree for concrete bridge deck management in Utah. As a result of the literature review performed for objective 1, a synthesis of existing information about condition assessment, bridge deck preservation and rehabilitation, bridge deck reconstruction, and estimating remaining service life using computer models was compiled. For objective 2, 15 bridge decks were strategically selected for testing in this research. Five bridge decks had bare concrete surfaces, five bridge decks had asphalt overlays, and five bridge decks had polymer overlays. Bridge deck testing included site layout, cover depth measurement, chloride concentration testing, chain dragging, half-cell potential testing, Schmidt rebound hammer testing, impact-echo testing, and vertical electrical impedance testing. Two-sample t-tests were performed to investigate the effects of selected bridge deck features, including polymer overlay application, deck age at polymer overlay application, overlay age, asphalt overlay application with and without a membrane, stay-in-place metal forms (SIPMFs), SIPMF removal, internally cured concrete, and use of an automatic deck deicing system. For objective 3, condition assessment methods were described in terms of test type, factors evaluated, equipment cost, data collection speed, required expertise, and traffic control for each method. Unit costs, expected treatment service life estimates, and factors addressed for the preservation, rehabilitation, and reconstruction methods most commonly used by the Utah Department of Transportation (UDOT) were also summarized. Bridge deck testing results were supplemented with information about current bridge deck management practices and treatment costs obtained from UDOT, as well as information about condition assessment and expected treatment service life, to develop a decision tree for concrete bridge deck management. Based on the results of field work and statistical analyses, placing an overlay within a year after construction is recommended. Removing SIPMFs after a deck age greater than 18 years is not likely to be effective at reversing the adverse effects of the SIPMFs on bridge deck condition and is not recommended. Bridge deck construction using internally cured concrete is not recommended for protecting against rebar corrosion. To the extent that excluding an automatic deck deicing system does not compromise public safety, automatic deck deicing systems are not recommended. To supplement the typical corrosion initiation threshold of 2.0 lb Cl-/yd3 of concrete for black bar, a corrosion initiation threshold of 8.0 lb Cl-/yd3 of concrete is recommended in this research for bridge decks with intact epoxy-coated rebar. For chloride concentrations less than 20 lb Cl-/yd3 of concrete as measured between reinforcing bars, an increase of up to 70 percent should be applied to estimate the corresponding chloride concentration of the concrete in direct contact with the rebar. The decision tree developed in this research includes 10 junctions and seven recommended treatments. 
The junctions require the user to address questions about surface type, degree of protection against water and chloride ion ingress, degree of deterioration, and years of additional service life needed; the answers lead to selection of treatment options ranging from repairing an overlay to full-depth bridge deck reconstruction. Revisions to the decision tree should be incorporated as additional methods, data, treatments, or other relevant information become available.
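As a rough illustration of the statistical comparison described above, and not the thesis's actual analysis, the two-sample t-tests on deck features might look like the following sketch using SciPy's Welch test; the measurement values and the SIPMF grouping are hypothetical.

```python
# Hedged sketch: Welch's two-sample t-test comparing chloride concentrations
# (lb Cl-/yd3) between decks with and without stay-in-place metal forms.
# All values are hypothetical, not data from the thesis.
from scipy import stats

with_sipmf = [6.1, 7.4, 5.9, 8.2, 6.8]      # hypothetical measurements
without_sipmf = [3.2, 4.1, 2.9, 3.7, 4.4]   # hypothetical measurements

# equal_var=False selects Welch's test, which tolerates unequal group variance.
t_stat, p_value = stats.ttest_ind(with_sipmf, without_sipmf, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # small p: groups likely differ
```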
102

Shearlet-Based Descriptors and Deep Learning Approaches for Medical Image Classification

Al-Insaif, Sadiq 07 June 2021 (has links)
In this Ph.D. thesis, we develop effective techniques for medical image classification, particularly for histopathological and magnetic resonance images (MRI). Our techniques are capable of handling the high variability in the content of such images. Handcrafted techniques based on texture analysis are used for the classification task. We also use deep learning models; because training such models from scratch can be challenging, we employ deep features and transfer learning instead. First, we propose a combined texture-based feature representation, computed in the complex shearlet domain, for histopathological image classification. With complex coefficients, we examine both the magnitude and relative phase of shearlets to form the feature space. Our proposed techniques are successful for histopathological image classification. Furthermore, we investigate their ability to generalize to MRI datasets, which present an additional challenge, namely high dimensionality: an MRI sample consists of a large number of slices, so our proposed shearlet-based feature representation for histopathological images cannot be used without adjustment. Therefore, we consider the 3D shearlet transform, given the volumetric nature of MRI data; an advantage of the 3D shearlet transform is that it takes adjacent slices of MRI data into consideration. Second, we study the classification of histopathological images using pre-trained deep learning models. A pre-trained deep learning model can act as a starting point for datasets with a limited number of samples. Therefore, we use various models either as unsupervised feature extractors or as weight initializers to classify histopathological images. For MRI samples, fine-tuning a deep learning model is not straightforward: pre-trained models are trained on RGB images with three channels, but an MRI sample has a much larger number of slices. Fine-tuning a convolutional neural network (CNN) therefore requires adjusting the model to work with MRI data. We fine-tune pre-trained models and then use them as feature extractors. Thereafter, we demonstrate the effectiveness of the fine-tuned deep features with classical machine learning (ML) classifiers, namely a support vector machine and a decision tree bagger. Furthermore, instead of using a classical ML classifier for MRI samples, we build a custom CNN that takes both the 3D shearlet descriptors and the deep features as input. This custom network processes our feature representation end-to-end and then classifies an MRI sample. Our custom CNN is more effective than a classical ML classifier on a held-out MRI dataset, an indication that our CNN model is less susceptible to over-fitting.
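A minimal sketch of the transfer-learning step described above, assuming a ResNet-18 backbone from torchvision (the abstract does not name a specific architecture) used as a frozen feature extractor feeding a classical SVM; dataset loading and the variables X_train, y_train, and X_test are assumptions.

```python
# Hedged sketch: a pre-trained CNN as a frozen feature extractor feeding a
# classical SVM. Real inputs would be normalized 224x224 RGB image tensors.
import torch
import torchvision.models as models
from sklearn.svm import SVC

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the classification head
backbone.eval()                    # inference only; weights stay frozen

@torch.no_grad()
def extract_features(images):        # images: (N, 3, 224, 224) float tensor
    return backbone(images).numpy()  # (N, 512) deep feature vectors

# Hypothetical usage with training tensors X_train and labels y_train:
# clf = SVC(kernel="rbf").fit(extract_features(X_train), y_train)
# predictions = clf.predict(extract_features(X_test))
```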
103

Machine Learning Based Algorithmic Approaches for Network Traffic Classification

Jamil, Md Hasibul 01 December 2021 (has links)
Networking and distributed computing systems have long provided the computational resources for machine learning (ML) applications. Network systems themselves can also benefit from ML technologies. For example, high-performance packet classification is a key component in supporting scalable network applications such as firewalls, intrusion detection, and differentiated services. With ever-increasing line-rate demands in core networks, designing a scalable, high-performance packet classification solution with hand-tuned heuristics is a great challenge. By exploiting the sparsity present in a ruleset, this thesis proposes an algorithm that uses a few effective bits (EBs) to extract a large number of candidate rules with just a few memory accesses. These effective bits are learned with deep reinforcement learning and are used to create a bitmap that filters out the majority of rules that do not need to be fully matched, improving online system performance. Using reinforcement learning makes the proposed solution learning-based rather than heuristic: the proposed selection method is independent of the ruleset and can be applied to different rulesets without relying on heuristics. The proposed multi-bit trie classification engine improves lookup time in both the worst and average case by 55% and reduces the memory footprint compared to a traditional decision tree without EBs. Furthermore, many-field packet classification is required for OpenFlow-supported switches. With the proliferation of fields in the packet header, traditional 5-field classification techniques are not directly applicable to an efficient classification engine for such switches, although the algorithmic insights obtained from 5-field classification can still be applied. To decompose the fields of a given ruleset, different grouping metrics, such as the standard deviation of individual fields and a novel metric called the Diversity Index (DI), are considered for such many-field scenarios. This work presents a detailed discussion and evaluation of how to decompose rule fields/dimensions into subgroups, how decision tree construction can be cast as a reinforcement learning problem, and how to encode the state and action spaces and calculate rewards to effectively build trees for each subgroup with a global optimization objective. Finally, to identify the benign or malicious heterogeneous traffic types present in a modern home network, a deep neural network based approach is introduced. The proposed split-architecture traffic classifier, applied as a home network intrusion detection system (IDS), consists of multiple ML models trained on two separate datasets for heterogeneous traffic types. An analysis of the run-time performance of the proposed IDS models is also discussed.
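The effective-bits idea can be sketched as follows under simplifying assumptions: the bit positions are fixed by hand rather than learned with deep reinforcement learning, and the selected bits are assumed not to be wildcarded by any rule mask.

```python
# Hedged sketch of effective-bits (EB) filtering: pick a few bit positions
# from the rule keys, bucket rules by the values of those bits, and at lookup
# time match a packet only against the rules in its bucket.
from collections import defaultdict

def eb_key(value, bit_positions):
    """Concatenate the selected bits of `value` into a small integer key."""
    key = 0
    for pos in bit_positions:
        key = (key << 1) | ((value >> pos) & 1)
    return key

def build_index(rules, bit_positions):
    """rules: list of (prefix_value, mask) pairs. For simplicity we assume
    the chosen EBs are never wildcarded; a wildcarded EB would require the
    rule to appear in every bucket differing on that bit."""
    index = defaultdict(list)
    for rule_id, (value, mask) in enumerate(rules):
        index[eb_key(value, bit_positions)].append(rule_id)
    return index

def candidate_rules(index, packet_header, bit_positions):
    """Return the short list of rule ids worth fully matching."""
    return index.get(eb_key(packet_header, bit_positions), [])
```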
104

Inter-annual stability of land cover classification: explorations and improvements

Abercrombie, Stewart Parker 22 January 2016 (has links)
Land cover information is a key input to many earth system models, and thus accurate and consistent land cover maps are critically important to global change science. However, existing global land cover products show unrealistically high levels of year-to-year change. This thesis explores methods to improve accuracies for global land cover classifications, with a focus on reducing spurious year-to-year variation in results derived from MODIS data. In the first part of this thesis I use clustering to identify spectrally distinct sub-groupings within defined land cover classes, and assess the spectral separability of the resulting sub-classes. Many of the sub-classes are difficult to separate due to a high degree of overlap in spectral space. In the second part of this thesis, I examine two methods to reduce year-to-year variation in classification labels. First, I evaluate a technique that constructs training data for a per-pixel supervised classification algorithm by combining multiple years of spectral measurements. The resulting classifier achieves higher accuracy and lower levels of year-to-year change than a reference classifier trained using a single year of data. Second, I use a spatio-temporal Markov Random Field (MRF) model to post-process the predictions of a per-pixel classifier. The MRF framework reduces spurious label change to a level comparable to that achieved by a post-hoc heuristic stabilization technique. The timing of label change in the MRF-processed maps better matched disturbance events in reference data, whereas the heuristic stabilization produced label changes that lagged several years behind disturbance events.
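A minimal sketch of the first stabilization technique, pooling multiple years of per-pixel spectral measurements into one training set; the array shapes and the random forest classifier are assumptions, not necessarily the classifier used in the thesis.

```python
# Hedged sketch of multi-year training data: pool several years of per-pixel
# spectral features for the same pixels so the classifier is exposed to
# inter-annual variability at training time.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def build_multiyear_training(features_by_year, labels):
    """features_by_year: list of (n_pixels, n_bands) arrays, one per year,
    rows aligned to the same pixels; labels: (n_pixels,) class labels."""
    X = np.vstack(features_by_year)             # stack years row-wise
    y = np.tile(labels, len(features_by_year))  # repeat labels once per year
    return X, y

# Toy data: 4 pixels, 3 spectral bands, 2 years of measurements.
rng = np.random.default_rng(0)
years = [rng.normal(size=(4, 3)) for _ in range(2)]
labels = np.array([0, 0, 1, 1])
X, y = build_multiyear_training(years, labels)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
```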
105

Intelligent gravitational search random forest algorithm for fake news detection

Natarajan, Rathika, Mehbodniya, Abolfazl, Rane, Kantilal Pitambar, Jindal, Sonika, Hasan, Mohammed Faez, Vives, Luis, Bhatt, Abhishek 01 January 2022 (has links)
The full text of this work is not available in the UPC Academic Repository due to restrictions imposed by the publisher. / Online social media has made the process of disseminating news so quick that people have shifted their way of accessing news from traditional journalism and press to online social media sources. The rapid turnover of news on social media makes it challenging to evaluate its reliability. Fake news not only erodes public trust but also subverts public opinion. An intelligent automated system is required to detect fake news, as the difference between fake and real news is tenuous. This paper proposes an intelligent gravitational search random forest (IGSRF) algorithm for fake news detection. The IGSRF algorithm amalgamates the Intelligent Gravitational Search Algorithm (IGSA) and the Random Forest (RF) algorithm. The IGSA is an improved intelligent variant of the classical gravitational search algorithm (GSA) that adds information about the best and worst gravitational mass agents in order to retain the exploitation ability of agents at later iterations, thus preventing the classical GSA from becoming trapped in local optima. In the proposed IGSRF algorithm, all intelligent mass agents determine the solution by generating decision trees (DT) with a random subset of attributes, following the random forest hypothesis. The mass agents generate the collection of solutions from the solution space using random proportional rules. The comprehensive prediction of the news class (fake or real) is determined by all agents following the attributes of the random forest. The performance of the proposed algorithm is evaluated on the FakeNewsNet dataset, which has BuzzFeed and PolitiFact news sub-categories. To analyze the effectiveness of the proposed algorithm, the results are also compared with decision tree and random forest algorithms. The proposed IGSRF algorithm attains superior results compared to DT, RF and state-of-the-art techniques. / Peer reviewed
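For orientation, the random forest half of the pipeline can be sketched as a plain scikit-learn baseline on TF-IDF text features; the gravitational-search component that drives the mass agents is omitted, and the toy texts and settings are illustrative only, not the paper's configuration.

```python
# Hedged sketch: random forest baseline for fake news detection on TF-IDF
# features. Real work would use the FakeNewsNet articles and labels.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

texts = ["official report confirms budget figures",
         "shocking secret the government hides from you"]  # toy corpus
labels = [0, 1]  # 0 = real, 1 = fake

model = make_pipeline(
    TfidfVectorizer(stop_words="english"),
    RandomForestClassifier(n_estimators=100, random_state=0),
)
model.fit(texts, labels)
print(model.predict(["report confirms figures"]))  # expected: [0]
```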
106

Applying Different Wide-Area Response-Based Controls to Different Contingencies in Power Systems

Iranmanesh, Shahrzad 08 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Electrical disturbances threaten the stability of the power system. The first step is to detect these disturbances or events; the next is to apply a proper control to the system to reduce their consequences. One-shot control is an effective method for stabilizing events: an appropriate amount of load is added to or shed from the electrical system. Determining the amounts of load and the locations for shedding is crucial. Moreover, some control combinations are more effective for some events and less effective for others. Therefore, this project was completed in two parts: first, finding effective control combinations; second, finding an algorithm for applying different control combinations to different contingencies in real time. To find effective control combinations, sensitivity analysis is employed to locate the most effective loads in the system. Gradient descent and particle swarm optimization (PSO) are then used to find the control combination commands. Next, a pattern recognition method, in this case a decision tree, is used to apply the appropriate control combination for each event. The three most effective control combinations found by sensitivity analysis and the PSO method are used in the remainder of this study. A decision tree is trained for each of the three control combinations, and their outputs are combined into an algorithm for selecting the best control in real time. Finally, the algorithm is evaluated using a test set of contingencies. The final results reveal a 30% improvement over previous studies.
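A minimal sketch of the PSO search for control commands follows, with a toy objective standing in for the power system stability metric that the thesis would obtain from simulation; all parameter values are illustrative.

```python
# Hedged sketch of particle swarm optimization over load-shedding amounts.
# f(x) is a stand-in objective; lower is better (e.g., an instability score).
import random

def pso(f, dim, n_particles=20, iters=100, lo=0.0, hi=1.0,
        w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective: shed amounts for 3 loads, optimum at 0.4 per load.
best, score = pso(lambda x: sum((xi - 0.4) ** 2 for xi in x), dim=3)
print(best, score)
```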
107

Searching for Light Sterile Neutrinos with NOvA Through Neutral-Current Disappearance

Yang, Shaokai 19 November 2019 (has links)
No description available.
108

Cognitive Electronic Warfare System

McWhorter, Tanner Maxwell 27 July 2020 (has links)
No description available.
109

Protein Function Prediction Using Decision Tree Technique

Yedida, Venkata Rama Kumar Swamy 02 September 2008 (has links)
No description available.
110

Improving Psychotherapy Outcome: The Use of Immediate Electronic Feedback and Revised Clinical Support Tools

Slade, Karstin Lee 16 July 2008 (has links) (PDF)
While the beneficial effects of psychotherapy have been well documented, the fact remains that 5 to 10 percent of clients get worse while in treatment (Lambert & Ogles, 2004) and a large minority of patients show little response (Hansen, Lambert, & Forman, 2003). The effects of four interventions aimed at reducing deterioration and enhancing positive outcomes were examined in an Immediate Electronic Feedback sample of 1101 patients, whose outcomes were contrasted across experimental groups and with two archival groups: the Week-Delayed Feedback group, consisting of archival data from 1374 patients, and the treatment-as-usual control group, consisting of archival data from 1445 patients. Results indicate that feedback to therapists improved outcomes across clients, especially for signal-alarm cases. Therapist feedback effects were enhanced by the use of manually based Clinical Support Tools (CSTs), but not by providing direct feedback to clients about their progress. There were no significant differences in outcome between the Week-Delayed CST feedback and the 2-Week-Delayed CST feedback groups; however, clients in the Week-Delayed CST feedback condition attended three fewer sessions, on average, than their 2-Week-Delayed counterparts. Furthermore, a significantly greater number of people in the Week-Delayed CST Feedback group ended treatment in the Recovered/Improved classification of the Jacobson/Truax model.
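For context, the Jacobson/Truax classification rests on the reliable change index (RCI); a sketch of the computation follows, with illustrative reliability and standard deviation values rather than the outcome-measure norms used in the study.

```python
# Hedged sketch of the Jacobson-Truax reliable change index (RCI).
# The sd and reliability values below are illustrative only.
import math

def reliable_change_index(pre, post, sd, reliability):
    se = sd * math.sqrt(1 - reliability)  # standard error of measurement
    s_diff = math.sqrt(2 * se ** 2)       # SE of the difference score
    return (post - pre) / s_diff

# On measures where lower scores are better, RCI < -1.96 counts as
# reliable improvement (a change unlikely to be measurement error).
rci = reliable_change_index(pre=85, post=60, sd=18.0, reliability=0.93)
print(f"RCI = {rci:.2f}")  # about -3.71 here: reliable improvement
```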
