201

DATA COLLECTION FRAMEWORK AND MACHINE LEARNING ALGORITHMS FOR THE ANALYSIS OF CYBER SECURITY ATTACKS

Unknown Date (has links)
The integrity of network communications is constantly being challenged by increasingly sophisticated intrusion techniques. Attackers are shifting to stealthier and more complex forms of attack in an attempt to bypass known mitigation strategies. In addition, many detection methods for popular network attacks have been developed using outdated or non-representative attack data. To effectively develop modern detection methodologies, there is a need to acquire data that can fully encompass the behaviors of persistent and emerging threats. Modern network traffic captured for intrusion detection can be substantial in volume, yet it typically contains relatively few attack instances compared to normal traffic. This skewed distribution between normal and attack data can lead to high levels of class imbalance. Machine learning techniques can be used to aid in attack detection, but large levels of imbalance between normal (majority) and attack (minority) instances can lead to inaccurate detection results. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2019. / FAU Electronic Theses and Dissertations Collection
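The abstract does not name a specific algorithm, but the class-imbalance problem it raises is commonly handled with cost-sensitive learning or resampling. The sketch below is illustrative only, using synthetic data and an assumed scikit-learn-style classifier: the minority attack class is reweighted rather than resampled, and evaluation uses metrics that do not reward always predicting the majority class.

```python
# Minimal sketch (not from the dissertation): training a classifier on
# imbalanced attack/normal traffic using class weighting and evaluating
# with metrics that remain informative under skewed class distributions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, classification_report

# Hypothetical data: X holds flow-level features, y marks attack (1) vs normal (0).
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 20))
y = (rng.random(10_000) < 0.02).astype(int)   # ~2% attack instances (minority class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" reweights the minority class instead of resampling it.
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
clf.fit(X_tr, y_tr)

scores = clf.predict_proba(X_te)[:, 1]
print("AUC:", roc_auc_score(y_te, scores))            # threshold-independent metric
print(classification_report(y_te, clf.predict(X_te)))  # per-class precision/recall
```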
202

THE IMPACT OF THE MEDIEVAL CLIMATIC ANOMALY ON THE ARCHAEOLOGY AT EDWARDS AIR FORCE BASE

Porter-Rodriguez, Jessica Amanda 01 June 2017 (has links)
A series of severe and prolonged droughts occurred throughout the Northern Hemisphere between approximately 1150 BP and 600 BP. This phenomenon is referred to as the Medieval Climatic Anomaly and has been shown to have differentially impacted various regions of the world. Previous studies have suggested causal links between the Medieval Climatic Anomaly and observed culture change. The goal of this study was to examine the Antelope Valley region of the Mojave Desert for evidence of impacts on human populations related to the Medieval Climatic Anomaly. To achieve this goal, a sample of archaeological sites was selected from lands within Edwards Air Force Base. These sites represented occupations which occurred immediately before, during, and after the Medieval Climatic Anomaly. Site assemblages were analyzed and compared by cultural period, with cross-comparisons made of artefactual and ecofactual constituents. Site densities and areal extents were also examined and compared. These analyses showed the emergence of trends concurrent with the onset of the Medieval Climatic Anomaly. The data support the hypothesis that humans who populated the Antelope Valley region of the Mojave Desert during this period may have engaged in population aggregation, with a tethered nomadism subsistence strategy. The data also show that upon the amelioration of the environment after the Medieval Climatic Anomaly, site characteristics within the region saw a significant shift. While the evidence generated by this study does suggest a link between climatic change experienced during the Medieval Climatic Anomaly and change observed within the archaeology of the Antelope Valley, it does not suggest climate as a sole, or even primary, causal factor. Rather, the intent of this study was to identify one possible variable responsible for observed change that occurred in the region. With this in mind, the Medieval Climatic Anomaly was found to have been significant enough to have either directly or indirectly impacted the prehistoric occupants of the study region.
203

Molecular genetics of optic nerve disease using patients with cavitary optic disc anomaly

Hazlewood, Ralph Jeremiah, II 01 January 2015 (has links)
Glaucoma is the second leading cause of irreversible blindness in the United States and is the leading cause of blindness in African Americans. Cupping or excavation of the optic nerve, which sends the visual signal from the photoreceptors in the eye to the brain, is a chief feature of glaucoma. A similar excavated appearance of the optic nerve is also the primary clinical sign of other congenital malformations of the eye, including optic nerve head coloboma, optic pit, and morning glory disc anomaly, collectively termed cavitary optic disc anomaly (CODA). Clinical similarities between CODA and glaucoma have suggested that these conditions may have overlapping pathophysiology. Although risk factors are known, such as the elevated intraocular pressure (IOP) observed in some glaucoma subjects, the biological pathways and molecular events that lead to excavation of the optic disc in glaucoma and in CODA are incompletely understood, which has hindered efforts to improve diagnosis and treatment of these diseases. Consequently, there is a critical need to clarify the biological mechanisms that lead to excavation of the optic nerve, which will lead to improvements in our understanding of these important disease processes. Because of their similar clinical phenotypes and the fact that current therapy for glaucoma patients is limited to lowering IOP, our central hypothesis is that genes involved in Mendelian forms of CODA would also be involved in a subset of glaucoma cases and may provide insight into glaucomatous optic neuropathy. The purpose of my research project has been to identify and functionally characterize the gene that causes congenital autosomal dominant CODA in a multiplex family with 17 affected members. The gene that causes CODA was previously mapped to chromosome 12q14. After screening of candidate genes within the region yielded no plausible coding-sequence mutations, a triplication of a 6 kb segment of DNA upstream of the matrix metalloproteinase 19 (MMP19) gene was identified using comparative genomic hybridization arrays and qPCR. This copy number variation (CNV) was present in all affected family members but absent in unaffected family members, a panel of 78 normal control subjects, and the Database of Genomic Variants. In a case-control study of singleton CODA subjects, the same 6 kb triplication was detected in 1 of 24 subjects screened. This subject was part of another three-generation autosomal dominant CODA pedigree in which each affected member carries the same CNV identified in the larger CODA pedigree. A separate case-control study with 172 glaucoma cases (primary open angle glaucoma = 84, normal tension glaucoma = 88) was evaluated for MMP19 CNVs; however, none were detected. Although our cohort of CODA patients is small, limiting our ability to accurately determine the proportion of CODA caused by MMP19 mutations, our data indicate that the MMP19 CNV is not an isolated finding and that additional CODA subjects may have MMP19 defects. Because of the location of the CNV, we evaluated its effect on downstream gene expression with luciferase reporter gene assays. These assays revealed that the 6 kb sequence spanned by the CNV in CODA subjects functioned as a transcriptional enhancer; in particular, a 773 bp segment had a strong positive influence (8-fold higher) on downstream gene expression. MMP19, a largely understudied gene, was further characterized by expression studies in the optic nerve and retina.
Using frozen sections from normal donor eyes, we demonstrated that MMP19 is predominantly localized to the optic nerve head in the lamina cribrosa region, with moderate labeling in the postlaminar region and weak labeling in the prelaminar region and retina. We also evaluated MMP19 expression in relation to the cell types that populate the optic nerve, such as astrocytes and retinal ganglion cells. The pattern of expression is consistent with MMP19 being a secreted protein accumulating in the extracellular spaces and basement membranes of the optic nerve. Our studies have identified the first gene associated with CODA, and future research is focused on recapitulating CODA phenotypes in animal models and assessing the mechanism of MMP19 involvement during development.
204

Regression and boosting methods to inform precisionized treatment rules using data from crossover studies

Barnes, Janel Kay 15 December 2017 (has links)
The usual convention for assigning a treatment to an individual is a "one-size-fits-all" rule that is based on broad spectrum trends. Heterogeneity within and between subjects and improvements in scientific research convey the need for more effective treatment assignment strategies. Precisionized treatment (PT) offers an alternative to the traditional treatment assignment approach by making treatment decisions based on one or more covariates pertaining to an individual. We investigate two methods to inform PT rules: the Maximum Likelihood Estimation (MLE) method and the Boosting method. We apply these methods in the context of a crossover study design with a continuous outcome variable, one continuous covariate, and two intervention options. We explore the methods via extensive simulation studies and apply them to a data set from a study of safety warnings in passenger vehicles. We evaluate the performance of the estimated PT rules based on the improvement in mean response (RMD), the percent of correct treatment assignments (PCC), and the accuracy of estimating the location of the crossing point (MSE(x_c)). We also define a new metric that we call the percent of anomalies (PA). We characterize the potential benefit of using PT by relating it to the strength of interaction, the location of the crossing point, and the within-person intraclass correlation (ICC). We also explore the effects of sample size and overall variance along with the methods' robustness to violations of model assumptions. We investigate the performance of the Boosting method under the standard weight and two alternative weighting schemes. Our investigation indicated the largest potential benefit of implementing a PT approach was when the crossover point was near the median, the strength of interaction was large, and the ICC was high. When a PT rule is used to assign treatments instead of a one-size-fits-all rule, an approximate 10-30% improvement in mean outcome can be gained. The MLE and Boosting methods performed comparably across most of the simulation scenarios, yet in our data example, it appeared there may be an empirical benefit of the Boosting method over the MLE method. Under a distribution misspecification, the difference in performance between the methods was minor; however, when the functional form of the model was misspecified, we began to see improvement of the Boosting method over the MLE method. In the simulation conditions we considered, the weighting scheme used in the Boosting method did not markedly impact performance. Using data to develop PT rules can lead to an improvement in outcome over the standard approach of assigning treatments. We found that in a variety of scenarios, there was little added benefit to utilizing the more complex iterative Boosting procedure compared to the relatively straightforward MLE method when developing the PT rules. The results from our investigations could be used to optimize treatment recommendations for participants in future studies.
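The thesis itself defines the MLE and Boosting procedures; as a rough illustration of the underlying idea only, the sketch below uses simulated data, lets ordinary least squares stand in for the full MLE method, and ignores the within-subject correlation of the crossover design. It fits a treatment-by-covariate interaction, estimates the crossing point, and assigns each individual the treatment with the higher predicted outcome.

```python
# Illustrative sketch (not the thesis code): deriving a precisionized
# treatment rule from an estimated treatment-by-covariate interaction.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
x = rng.uniform(0, 10, n)                       # continuous covariate
# Crossover-style data: each subject has an outcome under treatment A and B.
y_a = 2.0 + 0.5 * x + rng.normal(0, 1, n)       # treatment A
y_b = 5.0 - 0.1 * x + rng.normal(0, 1, n)       # treatment B

# Stack into long format with a treatment indicator and interaction term.
treat = np.r_[np.zeros(n), np.ones(n)]          # 0 = A, 1 = B
xx = np.r_[x, x]
y = np.r_[y_a, y_b]
X = sm.add_constant(np.column_stack([treat, xx, treat * xx]))
fit = sm.OLS(y, X).fit()
b0, b_t, b_x, b_tx = fit.params

# The estimated effect of B vs A at covariate value x is b_t + b_tx * x;
# the crossing point is where that effect equals zero.
x_c = -b_t / b_tx
print(f"estimated crossing point: {x_c:.2f}")   # true crossing point is 5.0 here

def assign(x_new):
    """Assign the treatment with the higher predicted outcome at x_new."""
    effect_b = b_t + b_tx * x_new
    return "B" if effect_b > 0 else "A"

print(assign(2.0), assign(8.0))                 # expected: B then A
```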
205

Improving Service Level of Free-Floating Bike Sharing Systems

Pal, Aritra 13 November 2017 (has links)
Bike sharing is a sustainable mode of urban mobility, not only for regular commuters but also for casual users and tourists. Free-floating bike sharing (FFBS) is an innovative bike sharing model, which saves on start-up cost, prevents bike theft, and offers significant opportunities for smart management by tracking bikes in real time with built-in GPS. Efficient management of an FFBS requires: 1) analyzing its mobility patterns and the spatio-temporal imbalance of supply and demand of bikes, 2) developing strategies to mitigate such imbalances, and 3) understanding the causes of bike damage and developing strategies to minimize it. All of these operational management problems are addressed in this dissertation, using tools from operations research and statistical and machine learning, with the Share-A-Bull Bike FFBS and the Divvy station-based bike sharing system as case studies.
206

Adaptive Real-time Anomaly Detection for Safeguarding Critical Networks

Ring Burbeck, Kalle January 2006 (has links)
Critical networks require defence in depth incorporating many different security technologies, including intrusion detection. One important intrusion detection approach is called anomaly detection, where the normal (good) behaviour of users of the protected system is modelled, often using machine learning or data mining techniques. During detection, new data are matched against the normality model, and deviations are marked as anomalies. Since no knowledge of attacks is needed to train the normality model, anomaly detection may detect previously unknown attacks.

In this thesis we present ADWICE (Anomaly Detection With fast Incremental Clustering) and evaluate it in IP networks. ADWICE has the following properties:

(i) Adaptation - Rather than making use of extensive periodic retraining sessions on stored off-line data to handle changes, ADWICE is fully incremental, making very flexible on-line training of the model possible without destroying what is already learnt. When subsets of the model are no longer useful, those clusters can be forgotten.

(ii) Performance - ADWICE is linear in the number of input data points, thereby heavily reducing training time compared to alternative clustering algorithms. Training time as well as detection time is further reduced by the use of an integrated search index.

(iii) Scalability - Rather than keeping all data in memory, only compact cluster summaries are used. The linear time complexity also improves the scalability of training.

We have implemented ADWICE and integrated the algorithm in a software agent. The agent is a part of the Safeguard agent architecture, developed to perform network monitoring, intrusion detection and correlation as well as recovery. We have also applied ADWICE to publicly available network data to compare our approach to related work with similar approaches. The evaluation resulted in a high detection rate at a reasonable false positive rate. / Report code: LiU-Tek-Lic-2006:12.
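ADWICE additionally uses an integrated search index and cluster forgetting, which are omitted here. The simplified sketch below (not the thesis implementation) only illustrates the core idea of summary-based incremental clustering for anomaly detection: normal training data are absorbed into compact cluster summaries, and new observations far from every centroid are flagged as anomalies.

```python
# Simplified sketch in the spirit of summary-based incremental anomaly detection.
import numpy as np

class ClusterSummaryDetector:
    def __init__(self, merge_radius=1.0, anomaly_threshold=3.0):
        self.merge_radius = merge_radius          # max distance to absorb a point
        self.anomaly_threshold = anomaly_threshold
        self.counts, self.sums = [], []           # per-cluster summaries only

    def _centroids(self):
        return [s / n for s, n in zip(self.sums, self.counts)]

    def partial_fit(self, x):
        """Incrementally train on one normal observation."""
        x = np.asarray(x, dtype=float)
        cents = self._centroids()
        if cents:
            d = [np.linalg.norm(x - c) for c in cents]
            i = int(np.argmin(d))
            if d[i] <= self.merge_radius:         # absorb into the nearest cluster
                self.counts[i] += 1
                self.sums[i] += x
                return
        self.counts.append(1)                     # otherwise start a new cluster
        self.sums.append(x.copy())

    def is_anomaly(self, x):
        """Flag x if it is far from every cluster centroid."""
        x = np.asarray(x, dtype=float)
        d = min(np.linalg.norm(x - c) for c in self._centroids())
        return d > self.anomaly_threshold

det = ClusterSummaryDetector()
rng = np.random.default_rng(2)
for point in rng.normal(0, 0.5, size=(500, 2)):  # train on "normal" feature vectors
    det.partial_fit(point)
print(det.is_anomaly([0.2, -0.1]))   # expected: False (close to learnt normality)
print(det.is_anomaly([8.0, 8.0]))    # expected: True (far from all clusters)
```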
207

PE and EV/EBITDA Investment Strategies vs. the Market : A Study of Market Efficiency

Persson, Eva, Ståhlberg, Caroline January 2007 (has links)
Background: The efficient market hypothesis states that it is not possible to consistently outperform the overall stock market by stock picking and market timing. This is because, in an efficient market, all stock prices are at their correct level, and there are no over- or undervalued stocks. Nevertheless, deviations from the true price can occur according to the hypothesis, but when they do they are always random. Thus, the only way an investor can perform better than the overall stock market is by being lucky. However, the efficient market hypothesis is very controversial. It is often discussed within the area of modern financial theory and there are strong arguments both for and against it.

Purpose: The purpose of this study was to investigate whether it is possible to outperform the overall stock market by investing in stocks that are undervalued according to the enterprise multiple (EV/EBITDA) and the price-earnings ratio.

Realization of the Study: Portfolios were constructed based on information from five years, 2001 to 2005. Each year two portfolios were put together, one consisting of the six stocks with the lowest price-earnings ratio, and the other consisting of the six stocks with the lowest EV/EBITDA. Each portfolio was kept for one year, and the unadjusted returns as well as the risk-adjusted returns of the portfolios were compared to the returns on the two indexes OMXS30 and AFGX. The sample consisted of the 30 most traded stocks on the Nordic Stock Exchange in Stockholm in 2006.

Conclusion: The study shows that it is possible to outperform the overall stock market by investing in stocks that are undervalued according to the price-earnings ratio and the EV/EBITDA. This indicates that the market is not efficient, even in its weak form.
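As a small illustration of the screening rule the study applies each year, the sketch below uses hypothetical tickers and figures (not the thesis sample) to compute the two multiples and form the six-stock low-P/E and low-EV/EBITDA portfolios.

```python
# Illustrative screening sketch with made-up data.
import pandas as pd

stocks = pd.DataFrame({
    "ticker":     ["AAA", "BBB", "CCC", "DDD", "EEE", "FFF", "GGG", "HHH"],
    "price":      [100.0, 50.0, 80.0, 20.0, 150.0, 60.0, 30.0, 45.0],
    "eps":        [8.0, 2.5, 10.0, 1.0, 12.0, 3.0, 4.0, 2.0],
    "market_cap": [1000, 400, 900, 150, 2000, 500, 250, 350],   # same unit, e.g. MSEK
    "net_debt":   [200, 100, -50, 80, 300, 150, 20, 60],        # debt minus cash
    "ebitda":     [150, 60, 120, 20, 250, 70, 45, 40],
})

# Price-earnings ratio and enterprise multiple.
stocks["pe"] = stocks["price"] / stocks["eps"]
stocks["ev_ebitda"] = (stocks["market_cap"] + stocks["net_debt"]) / stocks["ebitda"]

# The yearly portfolios: the six stocks with the lowest value of each multiple.
low_pe_portfolio = stocks.nsmallest(6, "pe")["ticker"].tolist()
low_ev_portfolio = stocks.nsmallest(6, "ev_ebitda")["ticker"].tolist()
print("lowest P/E:", low_pe_portfolio)
print("lowest EV/EBITDA:", low_ev_portfolio)
```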
208

Anomaly detection in unknown environments using wireless sensor networks

Li, YuanYuan 01 May 2010 (has links)
This dissertation addresses the problem of distributed anomaly detection in Wireless Sensor Networks (WSN). A challenge of designing such systems is that the sensor nodes are battery powered, often have different capabilities and generally operate in dynamic environments. Programming such sensor nodes at a large scale can be a tedious job if the system is not carefully designed. Data modeling in distributed systems is important for determining the normal operation mode of the system. Being able to model the expected sensor signatures for typical operations greatly simplifies the human designer’s job by enabling the system to autonomously characterize the expected sensor data streams. This, in turn, allows the system to perform autonomous anomaly detection to recognize when unexpected sensor signals are detected. This type of distributed sensor modeling can be used in a wide variety of sensor networks, such as detecting the presence of intruders, detecting sensor failures, and so forth. The advantage of this approach is that the human designer does not have to characterize the anomalous signatures in advance. The contributions of this approach include: (1) providing a way for a WSN to autonomously model sensor data with no prior knowledge of the environment; (2) enabling a distributed system to detect anomalies in both sensor signals and temporal events online; (3) providing a way to automatically extract semantic labels from temporal sequences; (4) providing a way for WSNs to save communication power by transmitting compressed temporal sequences; (5) enabling the system to detect time-related anomalies without prior knowledge of abnormal events; and, (6) providing a novel missing data estimation method that utilizes temporal and spatial information to replace missing values. The algorithms have been designed, developed, evaluated, and validated experimentally in synthesized data, and in real-world sensor network applications.
209

Traffic Analysis, Modeling and Their Applications in Energy-Constrained Wireless Sensor Networks : On Network Optimization and Anomaly Detection

Wang, Qinghua January 2010 (has links)
The wireless sensor network (WSN) has emerged as a promising technology thanks to recent advances in electronics, networking, and information processing. A wide range of WSN applications have been proposed, such as habitat monitoring, environmental observation and forecasting systems, and health monitoring. In these applications, many low-power and inexpensive sensor nodes are deployed in a vast space to cooperate as a network. Although WSN is a promising technology, a great deal of additional research is still required before it becomes a mature technology. This dissertation concentrates on three factors which are holding back the development of WSNs. Firstly, there is a lack of traffic analysis & modeling for WSNs. Secondly, network optimization for WSNs needs more investigation. Thirdly, the development of anomaly detection techniques for WSNs remains a seldom-touched area. In the field of traffic analysis & modeling for WSNs, this dissertation presents several ways of modeling different aspects of WSN traffic, including the modeling of sequence relations among arriving packets, the modeling of a data traffic arrival process for an event-driven WSN, and the modeling of a traffic load distribution for a symmetric dense WSN. These research results enrich the current understanding of the traffic dynamics within WSNs and provide a basis for further work on network optimization and anomaly detection for WSNs. In the field of network optimization for WSNs, this dissertation presents network optimization models from which network performance bounds can be derived. The dissertation also investigates network performance constrained by the energy resources available in an identified bottleneck zone. For a symmetric dense WSN, an optimal energy allocation scheme is proposed to minimize the energy waste due to uneven energy drain among sensor nodes. By modeling the interrelationships among communication traffic, energy consumption and WSN performance, the presented results efficiently integrate knowledge of WSN traffic dynamics into the field of network optimization for WSNs. Finally, in the field of anomaly detection for WSNs, this dissertation uses two examples to demonstrate the feasibility and ease of detecting sensor network anomalies through the analysis of network traffic. The presented results will serve as an inspiration for the research community to develop more secure and more fault-tolerant WSNs. / STC
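The arrival-process models developed in the dissertation are not reproduced here; the toy sketch below is a generic illustration under assumed parameters of why event-driven WSN traffic is bursty: events occur at random, and each one triggers a clustered burst of packet reports from nearby nodes.

```python
# Generic illustration (not the thesis model) of an event-driven arrival process.
import numpy as np

rng = np.random.default_rng(3)
sim_time = 1_000.0        # simulated duration in seconds
event_rate = 0.05         # events per second (Poisson)
reports_per_event = 20    # nodes that observe and report each event
report_jitter = 2.0       # spread of reporting times around an event, in seconds

n_events = rng.poisson(event_rate * sim_time)
event_times = rng.uniform(0, sim_time, n_events)

# Each event produces a burst of packet arrivals clustered around the event time.
arrivals = np.concatenate([
    t + np.abs(rng.normal(0, report_jitter, reports_per_event))
    for t in event_times
]) if n_events else np.array([])
arrivals.sort()

# Aggregate load per 10-second bin highlights the burstiness of the process.
bins = np.arange(0, sim_time + 10, 10)
load, _ = np.histogram(arrivals, bins=bins)
print(f"{n_events} events, {arrivals.size} packets, peak load {load.max()} pkts/10s")
```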
210

Myten om Palme : En texttolkning av dokumentären Palme och dess skildring av det sociala minnet efter Olof Palme som norm eller anomali / The Myth of Palme : a textual analysis of the documentary Palme and its description of the social memory of Olof Palme as a norm or anomaly

Svanström, Emma January 2013 (has links)
"The Myth of Palme - a textual analysis of the documentary Palme and its description of the social memory of Olof Palme as a norm or anomaly" by Emma Svanström aims to analyze how the directors of the documentary Palme chose to present Olof Palme to future generations. A further goal is to determine whether their version presents Palme as a person who followed the norms or diverged from them. To this end, the thesis uses textual analysis combined with a quantitative method to identify which persons the directors allow to shape the myth of Palme and which keywords they use to describe him. To view the film as a social memory in its transformation into a myth, the thesis draws on Jan Assmann's theory of social memory, and to determine whether the documentary describes Palme as norm-following or divergent, it uses Mary Douglas's theory of anomaly. The results show that it is mainly the narrator and Olof Palme himself who shape the myth of him, but also his family, friends, colleagues, and other persons who met him or were affected by his actions. Palme is described as special, intelligent, interested in social politics, and willing to act as he saw fit even when it went against the norms. He is above all described as a complex person with many, sometimes contradictory, sides. Some of these actions and characteristics are viewed as following the norms, while others are shown as anomalies.
