1 |
Enabling the processing of bioinformatics workflows where data is located through the use of cloud and container technologies. de Beste, Eugene, January 2019 (has links)
Magister Scientiae - MSc / The growing size of raw data, and the inability of internet communication technology to
keep pace with that growth, is introducing unique challenges for academic researchers.
This is especially true for those residing in rural areas or countries with sub-par
telecommunication infrastructure. In this project I investigate the usefulness of cloud
computing technology, data analysis workflow languages and portable computation
for institutions that generate data. I introduce the concept of a software solution
that could be used to simplify the way that researchers execute their analysis on
data sets at remote sources, rather than having to move the data. The scope of this
project involved conceptualising and designing a software system to simplify the
use of a cloud environment as well as implementing a working prototype of said
software for the OpenStack cloud computing platform. I conclude that it is possible
to improve the performance of research pipelines by removing the need for
researchers to have operating system or cloud computing knowledge, and that utilising
technologies such as these can ease the burden of moving data.
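The core trade-off the abstract describes, moving the computation to where the data lives rather than moving the data to the researcher, can be sketched as a simple placement decision. The site names, bandwidth figure and helper functions below are my own illustration, not the prototype described in the thesis:

```python
def transfer_seconds(dataset_gb, bandwidth_mbps):
    # Time to move the dataset over the researcher's link (1 GB = 8000 Mb).
    return dataset_gb * 8000 / bandwidth_mbps

def choose_site(dataset, dataset_gb, sites, bandwidth_mbps):
    """Prefer a compute site that already hosts the dataset (zero transfer
    cost); otherwise run locally and pay the download time.
    `sites` maps site name -> set of dataset names hosted there."""
    for name, held in sites.items():
        if dataset in held:
            return name, 0.0
    return "local", transfer_seconds(dataset_gb, bandwidth_mbps)
```

On a sub-par link the transfer term dominates: a 500 GB dataset over a 10 Mbps connection costs over four days of transfer time, which is the burden co-located execution avoids.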
|
2 |
Information Technology Infrastructure: Global Economy and National Development in Haiti. Alcena, Reynolds, 01 January 2018 (has links)
Political and environmental chaos recently experienced in Haiti has damaged the economic sector and the telecommunication infrastructure. Developmental data from Haiti show three major trends: inadequate social and economic development, insufficient benefits from the global economy, and poorly planned information technology infrastructure (ITI). The specific problem addressed in this study is a knowledge gap in the views of stakeholders within Haiti's national culture on how the country's ITI can be developed to better engage Haiti in the 21st-century global and digital economy. The purpose of this qualitative case study was to explore the views of 48 expert participants on that question. To satisfy the goal of this exploratory research, a case study design was used, and data were collected from multiple sources, including in-depth interviews with the 48 participants, observational field notes, and archival documentation. Analysis of the archival data, online surveys, and semi-structured interviews of expert informants revealed that nationwide broadband internet availability has been achieved, with internet usage increasing from 2% in 2002 to 12% in 2009. The study participants noted that the lack of reliable access to electricity limits the implementation of ITI in the nation. Legislation and financial investment are needed to improve ITI in Haiti. The academic significance and social change implications of the study include filling the knowledge gap on the status of ITI in Haiti, helping the national development of a modernized ITI well connected to the global economy, and a better quality of life for Haiti's people.
|
3 |
Improved performance high speed network intrusion detection systems (NIDS). A high speed NIDS architectures to address limitations of Packet Loss and Low Detection Rate by adoption of Dynamic Cluster Architecture and Traffic Anomaly Filtration (IADF). Akhlaq, Monis, January 2011 (has links)
Intrusion Detection Systems (IDS) are a vital component of network security architecture, allowing an administrator to detect unauthorized use of, or attacks upon, a computer, network or telecommunication infrastructure. There is no question about the necessity of these systems; however, their performance remains a critical one.
This research has focused on designing a high performance Network Intrusion Detection System (NIDS) model. The work begins with an evaluation of Snort, an open source NIDS considered a de facto IDS standard. The motive behind the evaluation strategy is to analyze the performance of Snort and ascertain the causes of its limited performance. The design and implementation of high performance techniques is the final objective of this research.
Snort has been evaluated on a highly sophisticated test bench by employing evasive and avoidance strategies to simulate real-life normal and attack-like traffic. The test methodology is based on the concept of stressing the system and degrading its performance in terms of its packet handling capacity. This has been achieved by normal traffic generation; fuzzing; traffic saturation; parallel dissimilar attacks; and manipulation of background traffic, e.g. fragmentation, packet sequence disturbance and illegal packet insertion. The evaluation phase led us to two high performance designs: first, a distributed hardware architecture using cluster-based adoption, and second, a cascaded arrangement of anomaly-based filtration and signature-based detection.
The first high performance mechanism is based on Dynamic Cluster adoption using refined policy routing and Comparator Logic. The design is a two-tier mechanism in which the front end of the cluster is a load-balancer that distributes traffic according to pre-defined policy routing, ensuring maximum utilization of cluster resources. The traffic load sharing mechanism reduces packet drop by exchanging state information between the load-balancer and cluster nodes and by implementing switchovers between nodes when the traffic exceeds a pre-defined threshold. Finally, the recovery evaluation concept using Comparator Logic further enhances overall efficiency by recovering data lost during switchovers; the retrieved data is then analyzed by a recovery NIDS to identify any leftover threats.
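The threshold-driven switchover described above can be sketched as a toy front-end balancer. The class, node names and policy below are my own illustration of the idea, not the thesis implementation:

```python
from collections import deque

class ClusterBalancer:
    """Toy sketch of the two-tier design: a front-end balancer assigns
    packets to NIDS nodes by a fixed routing policy, switching over to the
    next node once a node reaches its pre-defined threshold; packets that
    no node can absorb are queued for a later recovery pass."""

    def __init__(self, nodes, threshold):
        self.nodes = list(nodes)
        self.threshold = threshold
        self.load = {n: 0 for n in self.nodes}
        self.recovery_queue = deque()  # analyzed later by the recovery NIDS

    def dispatch(self, packet):
        for node in self.nodes:  # policy routing: first node under threshold
            if self.load[node] < self.threshold:
                self.load[node] += 1
                return node
        self.recovery_queue.append(packet)  # cluster saturated: switchover failed
        return None
```

In the thesis design the recovery queue is not simply dropped; a dedicated recovery NIDS re-inspects it, which is what limits the packet loss that a single saturated sensor would otherwise incur.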
Intelligent Anomaly Detection Filtration (IADF), using a cascaded architecture of anomaly-based filtration and signature-based detection, is the second high performance design. The IADF design preserves NIDS resources by eliminating a large portion of the traffic according to well-defined logic. In addition, the filtration concept augments the detection process by eliminating the part of malicious traffic which would otherwise go undetected by most signature-based mechanisms. We have evaluated the mechanism's ability to detect Denial of Service (DoS) and probe attempts by analyzing its performance on the Defence Advanced Research Projects Agency (DARPA) dataset. The concept is also supported by time-based normalized sampling mechanisms that incorporate normal traffic variations to reduce false alarms. Finally, we have observed that the IADF augments the overall detection process by reducing false alarms, increasing the detection rate and incurring less data loss. / National University of Sciences & Technology (NUST), Pakistan
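The cascade idea, an inexpensive anomaly stage in front of an expensive signature stage, can be sketched in a few lines. The field names, rate baseline and example signatures are invented for this sketch and are not the thesis's actual logic:

```python
def iadf_alerts(packets, baseline_pps, signatures):
    """Toy two-stage cascade: stage 1 (anomaly filtration) forwards only
    traffic whose packet rate departs from the normal baseline, so the
    costly stage sees a fraction of the load; stage 2 (signature matching)
    raises alerts on that residue."""
    suspicious = [p for p in packets if p["pps"] > baseline_pps]   # stage 1
    return [p for p in suspicious                                  # stage 2
            if any(sig in p["payload"] for sig in signatures)]
```

The resource saving comes from stage 1 discarding baseline-consistent traffic before pattern matching ever runs, which is also why the thesis pairs it with time-based sampling: a baseline that ignores normal diurnal variation would inflate stage 1's output and the false-alarm rate.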
|
4 |
Ensaios em economia industrial e comportamental [Essays in industrial and behavioral economics]. Lucatelli, Hugo de Andrade, 23 May 2017 (has links)
Paper I – The paper shows that in bundle markets, when a monopolist faces sizeable constraints on supply capacity, implementing a two-part tariff is the optimal strategy for the firm. This contractual design allows the firm to set consumption at its desired level. In this scenario, the final per-unit tariff is expected to be lower than under a fixed-tariff contract, which leads more consumers to enter the market. This equilibrium improves the welfare of both producers and consumers. Paper II – The aim of this work is to study the optimal pricing strategy of a firm that introduces a new product and competes on quality and price. In this environment, prices not only signal quality but can also change the quality perceived by consumers. The work analyzes the problem theoretically, in an environment where firms are aware of their ability to shape the consumer experience through their pricing policy, and assesses the model's fit to the empirical literature. Paper III – The third essay of this thesis empirically analyzes the relationship between perceived quality and the elements that form consumer satisfaction: prices, market competition, and the intrinsic quality of the product or service. Using Brazilian data on mobile telecommunications, the study estimates these relationships. We find a robust connection between prices and satisfaction, endorsing results found for other markets in the literature. As expected, competition also appears to promote the supply of better services, which translates into better consumer evaluations. Finally, services with better operational quality receive substantially better consumer ratings. These results are especially important in markets where consumers evaluate the whole experience of consuming the service, as we verified in the robustness test.
The analysis also found evidence of important infrastructure bottlenecks in the sector. In an environment where telecommunication services tend to converge, with a high probability of demand growth, network sizing problems could become relevant. In light of the model of Paper I, the thesis also discusses the use of consumption caps in fixed-internet provision contracts in Brazil.
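Paper I's central claim, that a two-part tariff dominates a single per-unit tariff for the monopolist, can be illustrated with a textbook linear-demand calculation. This toy example is my own illustration of the standard result, not the thesis's model, and it ignores the capacity constraint that drives the paper's analysis:

```python
def linear_tariff_profit(a, b, c):
    """Best single per-unit price with demand q(p) = a - b*p and marginal
    cost c: p* = (a + b*c) / (2*b), giving profit (a - b*c)^2 / (4*b)."""
    return (a - b * c) ** 2 / (4 * b)

def two_part_tariff_profit(a, b, c):
    """Two-part tariff: price units at marginal cost (p = c) and set the
    fixed fee equal to the consumer surplus (a - b*c)^2 / (2*b), which the
    firm captures in full, twice the best linear-tariff profit."""
    return (a - b * c) ** 2 / (2 * b)
```

With demand q(p) = 10 - p and marginal cost 2, the best linear tariff earns 16 per consumer while the two-part tariff earns 32, and the per-unit price falls from 6 to 2, consistent with the abstract's claim that a lower final tariff draws more consumers into the market.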
|