31
Session-based Intrusion Detection System To Map Anomalous Network Traffic
Caulkins, Bruce, 01 January 2005
Computer crime is a large problem (CSI, 2004; Kabay, 2001a; Kabay, 2001b). Security managers have a variety of tools at their disposal -- firewalls, Intrusion Detection Systems (IDSs), encryption, authentication, and other hardware and software solutions -- to combat computer crime. Many IDS variants exist that allow security managers and engineers to identify attack packets primarily through signature detection; i.e., the IDS recognizes attack packets by their well-known "fingerprints" or signatures as those packets cross the network's gateway threshold. Anomaly-based ID systems, on the other hand, determine what traffic is normal within a network and report abnormal traffic behavior. This paper describes a methodology for developing a more robust Intrusion Detection System through the use of data-mining techniques and anomaly detection. These data-mining techniques dynamically model what a normal network should look like and, in the process, reduce the false positive and false negative alarm rates. We use classification-tree techniques to accurately predict probable attack sessions. Overall, our goal is to model network traffic into network sessions and to identify those sessions that have a high probability of being an attack, labeling them "suspect sessions." We then combine these techniques with signature detection methods, using them in concert with known signatures and patterns to present a better model for the detection and protection of networks and systems.
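The suspect-session idea can be sketched in a few lines of code. The session features (packet count, distinct ports, inter-arrival time) and the thresholds below are hypothetical illustrations only, not the classification tree actually fitted in the thesis:

```python
# A minimal sketch (not the thesis's fitted model) of flagging "suspect
# sessions" with a hand-written classification tree. Feature names and
# thresholds are invented for illustration.

def is_suspect_session(session: dict) -> bool:
    """Toy classification tree: True if the session looks anomalous."""
    if session["distinct_ports"] > 100:          # port-scan-like behavior
        return True
    if session["packet_count"] > 10_000:
        # Heavy sessions are suspect only if packets arrive in rapid bursts.
        return session["mean_interarrival_ms"] < 1.0
    return False

sessions = [
    {"packet_count": 120, "distinct_ports": 3, "mean_interarrival_ms": 40.0},
    {"packet_count": 50_000, "distinct_ports": 2, "mean_interarrival_ms": 0.2},
    {"packet_count": 800, "distinct_ports": 450, "mean_interarrival_ms": 5.0},
]
flags = [is_suspect_session(s) for s in sessions]
print(flags)  # [False, True, True]
```

In a real system such a tree would be learned from labeled session data rather than written by hand, and its output would be combined with signature matches as the abstract describes.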
32
Modeling Large-scale Peer-to-Peer Networks and a Case Study of Gnutella
Jovanovic, Mihajlo A., 11 October 2001
No description available.
33
A Framework to Protect Water Distribution Systems Against Potential Intrusions
Lindley, Trevor Ray, 11 October 2001
No description available.
34
Modeling online social networks using Quasi-clique communities
Botha, Leendert W., 12 1900
Thesis (MSc)--Stellenbosch University, 2011.
ENGLISH ABSTRACT: With billions of current internet users interacting through social networks, the need
has arisen to analyze the structure of these networks. Many authors have proposed
random graph models for social networks in an attempt to understand and reproduce
the dynamics that govern social network development.
This thesis proposes a random graph model that generates social networks using
a community-based approach, in which users’ affiliations to communities are explicitly
modeled and then translated into a social network. Our approach explicitly
models the tendency of communities to overlap, and also proposes a method for
determining the probability of two users being connected based on their levels of
commitment to the communities they both belong to. Previous community-based
models do not incorporate community overlap, and assume mutual members of
any community are automatically connected.
We provide a method for fitting our model to real-world social networks and demonstrate
the effectiveness of our approach in reproducing real-world social network
characteristics by investigating its fit on two data sets of current online social networks.
The results verify that our proposed model is promising: it is the first
community-based model that can accurately reproduce a variety of important social
network characteristics, namely average separation, clustering, degree distribution,
transitivity and network densification, simultaneously.
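The model's central ingredient, the probability of two users being connected given their commitment to mutual communities, can be illustrated with a small sketch. The thesis's exact formula is not reproduced here; the functional form below (each shared community independently offers a chance of a tie proportional to both members' commitments) is one plausible choice:

```python
# Sketch of a community-based connection probability. The formula is an
# illustrative assumption, not the thesis's exact model: each shared
# community contributes an independent chance of a tie equal to the
# product of the two users' commitment levels in it.

def connect_probability(commit_u, commit_v):
    """commit_u, commit_v: dicts mapping community id -> commitment in [0, 1]."""
    shared = set(commit_u) & set(commit_v)
    p_no_tie = 1.0
    for c in shared:
        p_no_tie *= 1.0 - commit_u[c] * commit_v[c]
    return 1.0 - p_no_tie

u = {"chess": 0.9, "cycling": 0.2}
v = {"chess": 0.5, "hiking": 0.8}
w = {"chess": 0.5, "cycling": 0.5}

print(connect_probability(u, v))  # ≈ 0.45 (one shared community)
print(connect_probability(u, w))  # higher: two shared communities overlap
```

Note how this differs from earlier community-based models criticized in the abstract: mutual members of a community are not automatically connected, and overlapping memberships raise the connection probability smoothly.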
35
Neural Network And Regression Models To Decide Whether Or Not To Bid For A Tender In Offshore Petroleum Platform Fabrication Industry
Sozgen, Burak, 01 August 2009
In this thesis, three methods are presented to model the decision process of whether
or not to bid for a tender in offshore petroleum platform fabrication. Sample data and the assessments based on these data are gathered from an offshore petroleum platform fabrication company, and this information is analyzed to understand the significant parameters in the industry.
The alternative methods, "Regression Analysis", "Neural Network Method" and "Fuzzy Neural Network Method", are used to model the bidding decision process. The regression analysis examines the data statistically, whereas the neural network method and fuzzy neural network method are based on artificial intelligence. The models are
developed using the bidding data compiled from the offshore petroleum platform
fabrication projects. In order to compare the prediction performance of these methods, the "Cross-Validation Method" is utilized.
The models developed in this study are compared with the bidding decision method
used by the company. The results of the analyses show that the regression analysis and the neural network method achieve a prediction performance of 80%, and the fuzzy neural network a prediction performance of 77.5%, whereas the method used by the company has a prediction performance of 47.5%. The results reveal that the
suggested models achieve significant improvement over the existing method for
making the correct bidding decision.
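The cross-validation used to compare the models can be sketched generically. The data, the toy "model", and the fold count below are invented for illustration; the thesis's actual bid features and fitted models differ:

```python
# A generic k-fold cross-validation skeleton of the kind used to compare
# bid/no-bid models. Data and the toy rule-based "model" are invented.
import random

def k_fold_score(data, labels, fit, predict, k=4, seed=0):
    idx = list(range(len(data)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    correct = 0
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        model = fit([data[j] for j in train], [labels[j] for j in train])
        correct += sum(predict(model, data[j]) == labels[j] for j in test)
    return correct / len(data)

# Toy rule: bid when the expected margin is positive.
fit = lambda X, y: None                      # nothing to learn for a fixed rule
predict = lambda model, x: x["margin"] > 0

data = [{"margin": m} for m in (-2, -1, 1, 2, 3, -3, 4, -4)]
labels = [d["margin"] > 0 for d in data]
print(k_fold_score(data, labels, fit, predict))  # 1.0 (rule matches labels)
```

Each model is scored only on tenders it was not trained on, which is what makes the 80% / 77.5% / 47.5% comparison in the abstract meaningful.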
36
Real-time analysis of aggregate network traffic for anomaly detection
Kim, Seong Soo, 29 August 2005
Frequent, large-scale network attacks have led to an increased need for techniques for analyzing network traffic. If efficient analysis tools were available, it would become possible to detect attacks and anomalies and to take appropriate action to contain them before they have had time to propagate across the network.
In this dissertation, we suggest a technique for traffic anomaly detection based on analyzing the correlation of destination IP addresses and the distribution of an image-based signal, both postmortem and in real time, by passively monitoring the packet headers of traffic. These address-correlation data are transformed using the discrete wavelet transform for
effective detection of anomalies through statistical analysis. Results from trace-driven
evaluation suggest that the proposed approach could provide an effective means of
detecting anomalies close to the source. We present a multidimensional indicator using
the correlation of port numbers as a means of detecting anomalies.
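The wavelet step can be illustrated with one level of a Haar transform, the simplest discrete wavelet. The signal below is invented, and the dissertation's actual wavelet choice and signal construction may differ; the point is only that abrupt changes concentrate in the detail band:

```python
# One level of a Haar discrete wavelet transform, sketching how an
# address-correlation signal could be split into a smooth trend (approx)
# and a detail band in which abrupt, possibly anomalous, changes stand out.
import math

def haar_dwt(signal):
    """One-level Haar DWT; signal length must be even."""
    s = 1 / math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return approx, detail

# Flat traffic with one sudden jump: the jump shows up in the detail band.
signal = [4.0, 4.0, 4.0, 4.0, 4.0, 12.0, 12.0, 12.0]
approx, detail = haar_dwt(signal)
print(detail)  # zero everywhere except at the position of the jump
```

Statistical thresholds on such detail coefficients are one way anomalies can then be flagged, as the abstract's "statistical analysis" step suggests.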
We also present a network measurement approach that can simultaneously detect,
identify and visualize attacks and anomalous traffic in real-time. We propose to
represent samples of network packet header data as frames or images. With such a formulation, a series of samples can be seen as a sequence of frames, or video. This enables techniques from image processing and video compression, such as the DCT, to be
applied to the packet header data to reveal interesting properties of traffic. We show that
"scene change analysis" can reveal sudden changes in traffic behavior or anomalies. We show that "motion prediction" techniques can be employed to understand the patterns of some of the attacks. We show that it may be feasible to represent multiple pieces of data
as different colors of an image enabling a uniform treatment of multidimensional packet
header data.
Measurement-based techniques for analyzing network traffic treat traffic volume
and traffic header data as signals or images in order to make the analysis feasible. In this
dissertation, we propose an approach based on the classical Neyman-Pearson Test
employed in signal detection theory to evaluate these different strategies. We use both analytical models and trace-driven experiments to compare the performance of
different strategies. Our evaluations on real traces reveal differences in the effectiveness
of different traffic header data as potential signals for traffic analysis in terms of their
detection rates and false alarm rates. Our results show that address distributions and
number of flows are better signals than traffic volume for anomaly detection. Our results
also show that sometimes statistical techniques can be more effective than the NP-test
when the attack patterns change over time.
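A Neyman-Pearson style detector reduces to thresholding a likelihood ratio. The sketch below uses Gaussian models with invented parameters, purely to illustrate the decision rule; the dissertation's actual traffic models are not reproduced here:

```python
# Sketch of a Neyman-Pearson style detector: declare an anomaly when the
# log-likelihood ratio of the "attack" model vs the "normal" model exceeds
# a threshold chosen for a target false-alarm rate. Gaussian models and
# all parameter values are invented for illustration.
import math

def log_likelihood(x, mean, std):
    return -0.5 * math.log(2 * math.pi * std**2) - (x - mean)**2 / (2 * std**2)

def np_detect(x, normal=(100.0, 10.0), attack=(160.0, 25.0), threshold=0.0):
    llr = log_likelihood(x, *attack) - log_likelihood(x, *normal)
    return llr > threshold

print(np_detect(98.0))   # False: consistent with the normal-traffic model
print(np_detect(170.0))  # True: far more likely under the attack model
```

Raising the threshold trades missed detections for a lower false-alarm rate, which is exactly the detection-rate versus false-alarm-rate comparison the evaluations above are about.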
37
Home therapist network modeling
Shao, Yufen, 03 February 2012
Home healthcare has been a growing sector of the economy over the last three decades, with roughly 23,000 companies now doing business in the U.S. producing over $56 billion in combined annual revenue. As a highly fragmented market, the profitability of individual companies depends on effective management and efficient operations. This dissertation aims at reducing costs and improving productivity for home healthcare companies.
The first part of the research involves the development of a new formulation for the therapist routing and scheduling problem as a mixed integer program. Given the time horizon, a set of therapists and a group of geographically dispersed patients, the objective of the model is to minimize the total cost of providing service by assigning patients to therapists while satisfying a host of constraints concerning time windows, labor regulations and contractual agreements. This problem is NP-hard and proved to be beyond the capability of commercial solvers like CPLEX. To obtain good solutions quickly, three approaches have been developed that include two heuristics and a decomposition algorithm.
The first approach is a parallel GRASP that assigns patients to multiple routes in a series of rounds. During the first round, the procedure optimizes the patient distribution among the available therapists, thus trying to reach a local optimum with respect to the combined cost of the routes. Computational results show that the parallel GRASP can reduce costs by 14.54% on average for real datasets, and works efficiently on randomly generated datasets.
The second approach is a sequential GRASP that constructs one route at a time. When building a route, the procedure tracks the amount of time used by the therapists each day, giving it tight control over the treatment time distribution within a route. Computational results show that the sequential GRASP provides a cost savings of 18.09% on average for the same real datasets, and obtains much better solutions with significantly less CPU time on the same randomly generated datasets.
The third approach is a branch-and-price algorithm, which is designed to find exact optima within an acceptable amount of time. By decomposing the full problem by therapist, we obtain a series of constrained shortest path problems, which, by comparison, are relatively easy to solve. Computational results show that this approach is not efficient here because: 1) the convergence of Dantzig-Wolfe decomposition is not fast enough; and 2) the subproblem is strongly NP-hard and cannot be solved efficiently.
The last part of this research studies a simpler case in which all patients have fixed appointment times. The model takes the form of a large-scale mixed-integer program, and its computational complexity differs depending on which features are considered. With the piecewise-linear cost structure, the problem is strongly NP-hard and not solvable with CPLEX for instances of realistic size. Subsequently, a rolling horizon algorithm, two relaxed mixed-integer models and a branch-and-price algorithm were developed. Computational results show that both the rolling horizon algorithm and the two relaxed mixed-integer models can solve the problem efficiently; the branch-and-price algorithm, however, is again not practical because the convergence of Dantzig-Wolfe decomposition is slow even when stabilization techniques are applied.
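The construction phase common to both GRASP variants can be sketched in miniature. The cost matrix and the restricted-candidate-list size below are invented, and the dissertation's parallel and sequential GRASPs add routing, time windows, labor rules and local search on top of this basic idea:

```python
# Bare-bones GRASP-style construction for assigning patients to
# therapists: at each step, pick at random from the cheapest few candidate
# assignments (the "restricted candidate list"). All numbers are invented.
import random

def grasp_assign(cost, n_therapists, rcl_size=2, seed=1):
    """cost[p][t] = cost of assigning patient p to therapist t."""
    rng = random.Random(seed)
    assignment = {}
    for p in range(len(cost)):
        ranked = sorted(range(n_therapists), key=lambda t: cost[p][t])
        assignment[p] = rng.choice(ranked[:rcl_size])  # randomized greedy pick
    return assignment, sum(cost[p][t] for p, t in assignment.items())

cost = [[4, 9, 7], [3, 3, 8], [6, 2, 5], [7, 8, 1]]
assignment, total = grasp_assign(cost, n_therapists=3)
print(assignment, total)
```

Running the construction many times with different random draws, and improving each result with local search, is what gives GRASP its diversity of good solutions.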
38
A system of systems flexibility framework: A method for evaluating designs that are subjected to disruptions
Warshawsky, David, 07 January 2016
As systems become more interconnected, the focus of engineering design must shift to include consideration for systems of systems (SoS) effects. As the focus shifts from singular systems to systems of systems, so too must the focus shift from performance-based analysis to an evaluation method that accounts for the tendency of such large-scale systems to far outlive their original operational environments and continually evolve in order to adapt to the changes. It is nearly impossible to predict the nature of these changes; therefore the first focus of this thesis is the measurement of the flexibility of the SoS and its ability to evolve and adapt. Flexibility is measured using a combination of network theory and a discrete event simulation; therefore, the second focus is the development of a simulation environment that can also measure the system's performance for baseline comparisons. The results indicate that simulated flexibility is related to the performance and cost of the SoS and is worth measuring during the design process. The third focus of this thesis is to reduce the computational costs of SoS design evaluation by developing heuristics for flexibility. This was done by developing a network model to correspond with the discrete event simulation and evaluating network properties using graph theory. It was shown that the network properties can correlate with simulated flexibility. In such cases it was shown that the heuristics could be used in connection with an evolutionary algorithm to rapidly search the design space for good solutions. The entire methodology was demonstrated on a multi-platform maintenance planning problem in connection with the Navy Hardware Open System Technologies initiative.
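The idea of a cheap graph-theoretic heuristic standing in for expensive simulation can be illustrated with a small sketch. The property chosen below (how well connectivity survives a single-node failure) is an example choice, not necessarily one of the thesis's heuristics, and the two toy networks are invented:

```python
# Illustrative sketch of scoring a SoS network with a cheap graph property
# as a stand-in for simulated flexibility. The heuristic here, the average
# fraction of surviving nodes still reachable after one node fails, is an
# assumed example, and the toy networks are invented.

def reachable(adj, start, removed=frozenset()):
    seen, stack = {start}, [start]
    while stack:
        for n in adj[stack.pop()]:
            if n not in seen and n not in removed:
                seen.add(n)
                stack.append(n)
    return seen

def robustness(adj):
    """Average fraction of surviving nodes still reachable from node 0
    after each possible single-node failure."""
    scores = []
    for v in (u for u in adj if u != 0):
        seen = reachable(adj, 0, removed=frozenset({v}))
        scores.append((len(seen) - 1) / (len(adj) - 2))
    return sum(scores) / len(scores)

ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # has a backup path
chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}        # single path only
print(robustness(ring), robustness(chain))  # ring scores higher
```

Because such a score needs only graph traversal rather than a full discrete event simulation, an evolutionary algorithm can evaluate many candidate SoS designs per second, which is the thesis's motivation for heuristics.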
39
A Computational Simulation Model for Predicting Infectious Disease Spread using the Evolving Contact Network Algorithm
Munkhbat, Buyannemekh, 02 July 2019
Commonly used simulation models for predicting outbreaks of re-emerging infectious diseases (EIDs) take an individual-level or a population-level approach to modeling contact dynamics. These approaches are a trade-off between the ability to incorporate individual-level dynamics and computational efficiency. Agent-based network models (ABNM) use an individual-level approach by simulating the entire population and its contact structure, which increases the ability of adding detailed individual-level characteristics. However, as this method is computationally expensive, ABNMs use scaled-down versions of the full population, which are unsuitable for low prevalence diseases as the number of infected cases would become negligible during scaling-down. Compartmental models use differential equations to simulate population-level features, which is computationally inexpensive and can model full-scale populations. However, as the compartmental model framework assumes random mixing between people, it is not suitable for diseases where the underlying contact structures are a significant feature of disease epidemiology. Therefore, current methods are unsuitable for simulating diseases that have low prevalence and where the contact structures are significant.
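The compartmental side of this trade-off can be made concrete with a minimal SIR model. The parameter values below are invented for illustration; the `s*i/n` contact term is exactly the random-mixing assumption the paragraph above criticizes:

```python
# A minimal SIR compartmental model integrated with Euler steps. The
# beta*S*I/N term encodes random mixing: every pair of people is equally
# likely to meet, which is why compartmental models cannot capture
# structured contact networks. All parameter values are illustrative.
def sir(n=10_000, i0=10, beta=0.3, gamma=0.1, days=100, dt=0.1):
    s, i, r = n - i0, i0, 0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt   # random-mixing contact term
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

s, i, r = sir()
print(round(s), round(i), round(r))  # most of the population ends up in R
```

The model is cheap enough to run at full population scale, but every individual inside a compartment is interchangeable, which is the gap the ECNA is designed to close.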
The conceptual framework for a new simulation method, Evolving Contact Network Algorithm (ECNA), was recently proposed to address the above gap. The ECNA combines the attributes of ABNM and compartmental modeling. It generates a contact network of only infected persons and their immediate contacts, and evolves the network as new persons become infected.
The conceptual framework of the ECNA is promising for application to diseases with low prevalence and where contact structures are significant. This thesis develops and tests different algorithms to advance the computational capabilities of the ECNA and its flexibility to model different network settings. These features are key components that determine the feasibility of applying the ECNA to disease prediction. Results indicate that the ECNA is nearly 20 times faster than an ABNM when simulating a population of size 150,000, and that it is flexible enough to model networks with two contact layers and communities. Considering the uncertainties in the epidemiological features and origins of future EIDs, there is a significant need for a computationally efficient method suitable for analyzing a range of potential EIDs at a global scale. This work holds promise towards the development of such a model.
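The core ECNA idea, materializing only infected persons and their immediate contacts and growing the network as infections occur, can be sketched conceptually. Contact counts and transmission odds below are invented, and the actual algorithm's network-generation machinery is far richer:

```python
# Conceptual sketch of the ECNA idea: the network contains only infected
# people and their immediate contacts, and it "evolves" (grows) as new
# infections occur. Contact counts and odds are invented for illustration.
import random

def ecna_step(infected, contacts, rng, max_contacts=8, p_transmit=0.3):
    """One generation: each infected person reveals contacts, and each
    contact may become infected in turn."""
    new_infected = set()
    for person in infected:
        for _ in range(rng.randint(1, max_contacts)):
            c = ("contact", rng.randrange(10_000))  # node created lazily
            contacts.setdefault(person, set()).add(c)
            if c not in infected and rng.random() < p_transmit:
                new_infected.add(c)
    return new_infected

rng = random.Random(42)
infected = {("seed", 0)}
contacts = {}
for _ in range(3):                      # three generations of spread
    infected |= ecna_step(infected, contacts, rng)

# Only infected people and their immediate contacts were ever materialized,
# never the full population, which is where the speedup over ABNM comes from.
print(len(infected), sum(len(v) for v in contacts.values()))
```

For a low-prevalence disease this evolving subgraph stays tiny relative to the full population, while still carrying the individual-level contact structure that compartmental models discard.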
40
Návrh nových laboratorních úloh pro prostředí GNS3 / Design of new laboratory exercises for GNS3 environment
Barniak, Martin, January 2015
This diploma thesis deals with four laboratory tasks in the GNS3 simulation environment. The designed tasks are primarily focused on a comparison of the IPv4 and IPv6 protocols. The first task covers the OSPFv2 and OSPFv3 routing protocols, followed by transition techniques such as NAT-PT and tunneling methods such as GRE and 6to4. The second task is focused on the configuration of routing protocols such as EIGRP and EIGRPv6, and then covers the DHCP and ICMP protocols within the IPv4 and IPv6 protocol suites. The third task is primarily focused on the security aspects of the IPv6 protocol suite: it covers OSPFv3 authentication, access lists and the Cisco stateful IOS firewall. The fourth task covers the MPLS protocol; its first part deals with the basic configuration of the protocol and its second part is focused on MPLS within an IPv6 environment. All tasks contain test questions and an individual sub-task.