1 |
Recognition of DDoS attacks in computer networks [Αναγνώριση επιθέσεων DDoS σε δίκτυα υπολογιστών]. Δαμπολιάς, Ιωάννης, 16 May 2014 (has links)
The aim of this work is the study of distributed denial of service (DDoS) attacks on computer networks, analysing the methods of DDoS attacks as well as how to recognise and counter them using a neural network.
|
2 |
Design and Analysis of Anomaly Detection and Mitigation Schemes for Distributed Denial of Service Attacks in Software Defined Network. An Investigation into the Security Vulnerabilities of Software Defined Network and the Design of Efficient Detection and Mitigation Techniques for DDoS Attack using Machine Learning Techniques. Sangodoyin, Abimbola O., January 2019 (has links)
Software Defined Networking (SDN) holds great promise for meeting the need
for the secure, reliable and well-managed next-generation networks required
to drive effective service delivery on the go and to satisfy user demand
for high data rates and seamless connectivity. It is thus a network
technology set to enhance our day-to-day activities.
As network usage and reliance on computer technology grow ever more
widespread, users with bad intentions exploit the inherent weaknesses of
this technology to render targeted services unavailable to legitimate users.
Among the security weaknesses of SDN are Distributed Denial of Service
(DDoS) attacks.
Even though the DDoS attack strategy is well known, the number of successful
DDoS attacks launched has risen at an alarming rate over the last decade.
Existing detection mechanisms depend on signatures of known attacks, which
have not been successful in detecting unknown or variant DDoS attacks.
Therefore, a novel detection mechanism is developed that relies on deviation
from a confidence interval obtained from the normal distribution of
throughput polled from the server in the absence of attack. Furthermore, a
sensitivity analysis is performed to determine which of the network metrics
(jitter, throughput and response time) is most sensitive to attack, by
introducing white Gaussian noise and evaluating the local sensitivity using
a feed-forward artificial neural network. All metrics are sensitive in
detecting DDoS attacks; however, jitter appears to be the most sensitive.
As a result, the developed framework provides an avenue to make SDN
technology more robust and secure against DDoS attacks.
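The confidence-interval detection idea described in this abstract can be sketched in a few lines of Python. The sample throughput values, the 1.96 z-score, and the function names below are illustrative assumptions, not the thesis's actual implementation:

```python
import math
from statistics import mean, stdev

def baseline_interval(samples, z=1.96):
    """Confidence interval for mean throughput from attack-free samples."""
    m = mean(samples)
    half = z * stdev(samples) / math.sqrt(len(samples))
    return m - half, m + half

def is_anomalous(observed, interval):
    """Flag a throughput reading that falls outside the normal interval."""
    lo, hi = interval
    return observed < lo or observed > hi

# Baseline throughput (Mbps) polled from the server without attack.
normal = [94.1, 95.3, 93.8, 94.9, 95.0, 94.4, 93.9, 95.2]
ci = baseline_interval(normal)

# A flooded link typically shows a sharp throughput drop.
print(is_anomalous(40.2, ci))  # large deviation -> True
print(is_anomalous(94.7, ci))  # within normal range -> False
```

In practice the baseline would be re-polled periodically so the interval tracks normal traffic drift rather than a single snapshot.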
|
3 |
Real-Time Network Simulations for ML/DL DDoS Detection Using Docker. Garcia, Luis D, 01 December 2024 (has links) (PDF)
As the integration of artificial intelligence (AI) within cybersecurity continues to
grow, machine learning (ML) and deep learning (DL) models are increasingly used to
detect cyber attacks. However, these models are rarely evaluated in real-time attack
scenarios to see how subtle changes from the real networking environment can affect
their predictions. To address this issue, we propose a scalable, platform-independent
Docker testbed specifically designed for simulating real-time Distributed Denial of
Service (DDoS) attack scenarios, allowing researchers to deploy and evaluate their
pre-trained ML and DL detection models. Our framework is simple to configure
and can run across Intel and ARM CPUs, as well as Windows, Linux, and macOS
operating systems. The testbed was validated with our six pre-trained models in
a 10-minute DDoS attack simulation, where performance metrics such as resource
consumption were actively monitored across different operating systems and CPUs.
This Dockerized environment offers researchers an accessible and flexible solution for
testing and improving DDoS detection models in a realistic, real-time context.
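As an illustration of the kind of per-container resource monitoring such a testbed performs, the sketch below aggregates CPU figures from `docker stats` CSV output. The container names, sample values, and the exact `--format` string are hypothetical assumptions, not the framework's actual tooling:

```python
def parse_stats_line(line):
    """Parse one 'name,cpu%,mem' CSV line from a docker stats sample."""
    name, cpu, mem = line.strip().split(",")
    return name, float(cpu.rstrip("%")), mem

def peak_cpu(lines):
    """Return the name of the container with the highest CPU usage."""
    parsed = [parse_stats_line(l) for l in lines]
    return max(parsed, key=lambda t: t[1])[0]

# Sample lines as might be produced by:
#   docker stats --no-stream --format "{{.Name}},{{.CPUPerc}},{{.MemUsage}}"
sample = [
    "ddos-attacker,182.4%,512MiB / 2GiB",
    "ddos-victim,96.1%,1.2GiB / 2GiB",
    "ml-detector,41.7%,800MiB / 2GiB",
]
print(peak_cpu(sample))  # the attacker container dominates CPU here
```

Polling such samples at a fixed interval over the 10-minute simulation window would yield the resource-consumption time series the abstract describes.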
|
4 |
E-crimes and e-authentication - a legal perspective. Njotini, Mzukisi Niven, 27 October 2016 (has links)
E-crimes continue to pose grave challenges to the ICT regulatory agenda. Because e-crimes involve the wrongful appropriation of information online, it is enquired whether information is property capable of being stolen. This requires an investigation of the law of property, the basis for which is to establish whether information is property for purposes of the law. Following a study of the Roman-Dutch law approach to property, it is argued that the emergence of an information society makes real rights in information possible, because information is one of the indispensable assets of an information society. Given that information can be the object of property, its position in the law of theft is investigated. This study is followed by an examination of the conventional risks that ICTs generate. For example, there is a risk that ICTs may be used as the object of e-crimes, and a further risk that ICTs may become a tool for appropriating information unlawfully. Accordingly, the scale and impact of e-crimes are greater than those of offline crimes such as theft or fraud.
The severe challenges that ICTs pose to an information society are likely to continue unless clarity is sought on two questions: can ICTs be regulated at all, and if so, how should an ICT regulatory framework be structured? A study of law and regulation reveals that ICTs are spheres where regulations apply or should apply. However, better regulations are needed to deal with the dynamics of these technologies. Smart-regulation, meta-regulation or reflexive regulation, self-regulation and co-regulation are concepts that support better regulation. Better regulation enjoins the regulatory industries, for example the state, businesses and computer users, to be involved in establishing ICT regulations. These ICT regulations should specifically be in keeping with existing e-authentication measures. Furthermore, the codes-based theory, the Danger or Artificial Immune Systems (AIS) theory, the Systems theory and the Good Regulator Theorem ought to inform ICT regulations.
The basis for all this should be to establish a holistic approach to e-authentication, one that conforms to the Precautionary Approach to E-Authentication (PAEA). PAEA accepts the importance of legal rules in the ICT regulatory agenda, but argues that flexible regulations could provide a suitable framework within which ICTs and their risks are controlled. In addition, PAEA submits that the state should not be the sole role-player in ICT regulation: social norms, the market, and the nature or architecture of the technology to be regulated are also fundamental to the ICT regulatory agenda. / Jurisprudence / LL. D.
|