41

Security risk prioritization for logical attack graphs

Almohri, Hussain January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / William H. Hsu / Xinming (Simon) Ou / To prevent large networks from potential security threats, network administrators need to know in advance which components of their networks are under high security risk. One way to obtain this knowledge is via attack graphs. Various types of attack graphs based on miscellaneous techniques have been proposed. However, attack graphs can only make assertions about the different paths an attacker can take to compromise the network. This information is only half the solution to securing a particular network: network administrators must analyze an attack graph to identify the associated risk. Given that attack graphs can grow very large, this is a difficult task to perform manually. In this thesis, I provide a security risk prioritization algorithm to rank logical attack graphs produced by MulVAL (a vulnerability analysis system). My proposed method, called StepRank, is based on a previously published algorithm called AssetRank, which generalizes Google's PageRank algorithm. StepRank considers a forward attack graph, which is a reversed version of the original MulVAL attack graph used by AssetRank. The result of the ranking algorithm is a rank value for each node that is relative to every other rank value and shows how difficult it is for an attacker to satisfy that node.
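The rank-propagation idea behind PageRank-style algorithms can be sketched in a few lines. The following is an illustrative sketch only, not StepRank or AssetRank themselves; the toy graph, node names, damping factor, and iteration count are all assumptions for demonstration.

```python
# Minimal PageRank-style rank propagation on a directed graph.
# Illustrative sketch only; not the StepRank algorithm from the thesis.

def rank_nodes(edges, damping=0.85, iterations=50):
    """edges: dict mapping each node to a list of its successor nodes."""
    nodes = set(edges) | {v for succs in edges.values() for v in succs}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        # Each node keeps a teleport share and receives a portion of
        # each predecessor's rank, split among that predecessor's successors.
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n, succs in edges.items():
            if succs:
                share = damping * rank[n] / len(succs)
                for v in succs:
                    new[v] += share
        rank = new
    return rank

# Toy forward attack graph: one entry point feeds two exploit steps,
# both of which reach the attacker's goal node.
graph = {"entry": ["exploit_a", "exploit_b"],
         "exploit_a": ["goal"], "exploit_b": ["goal"]}
ranks = rank_nodes(graph)
```

Nodes reachable along more attack paths accumulate more rank, which is the intuition the abstract describes: a node's rank is meaningful only relative to the other nodes' ranks.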
42

Scoreboard Tool

Srinivasan, Anush January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Daniel A. Andresen / The Scoreboard tool is a web application that provides a platform where administrators can conduct quizzes (typically within an organization) and employees can take a quiz, individually or in a team. These quizzes are generally conducted to improve the knowledge of the users (typically employees) taking them. The tool lets users take online quizzes in various categories, such as technology, science, and math, each with various sub-categories. The admin can also perform CRUD operations on teams. The admin creates quizzes from a database that stores the quiz questions. Quizzes are divided into quizzes with multiple-choice questions and timed quizzes, in which a user has a specified time to complete the quiz. Once users log in, they can view their quiz history, including the scores of the quizzes taken and the date and time each was taken. At the end of a quiz, users are given the option of rating the quiz and entering their opinion of it. Users can also view their scores graphically and compare them with other teams' scores in the form of reports and graphs. An important feature of this application is that scores are reported using reporting services; another is that, instead of manual testing, test cases are written for automated testing of the application.
43

A host-based security assessment architecture for effective leveraging of shared knowledge

Rakshit, Abhishek January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Xinming (Simon) Ou / Security scanning performed on computer systems is an important step to identify and assess potential vulnerabilities in an enterprise network before they are exploited by malicious intruders. An effective vulnerability assessment architecture should assimilate knowledge from multiple security knowledge sources to discover all the security problems present on a host. Legitimate concerns arise, since host-based security scanners typically need to run with administrative privileges and take input from external knowledge sources for the analysis. Intentionally or otherwise, ill-formed input may compromise the scanner, and the whole system, if the scanner is susceptible to, or itself carries, one or more vulnerabilities. It is also not easy to incorporate new security analysis tools or security knowledge bases in the conventional approach, since this would entail installing new agents on every host in the enterprise network. This report presents an architecture in which a host-based security scanner's code base can be minimized to an extent where its correctness can be verified by adequate vetting. At the same time, the architecture allows third-party security knowledge to be leveraged more efficiently and makes it easier to incorporate new security tools. In our work, we implemented the scanning architecture in the context of an enterprise-level security analyzer. The analyzer finds security vulnerabilities present on a host according to third-party security knowledge specified in the Open Vulnerability and Assessment Language (OVAL). We empirically show that the proposed architecture is potent in its ability to comprehensively leverage third-party security knowledge and is flexible enough to support various higher-level security analyses.
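The core architectural idea, a small vettable collector plus externally supplied declarative knowledge, can be sketched as follows. The definition format below is a toy stand-in for OVAL, and the fact names and vulnerability IDs are invented for illustration; none of this is the report's actual implementation.

```python
# Sketch of separating a small, vettable data collector from
# third-party vulnerability knowledge. The definition format is a toy
# stand-in for OVAL; fact names and IDs are illustrative assumptions.

# Privileged collector: the only code that must be trusted and vetted.
def collect_facts():
    return {"openssl_version": "0.9.8", "sshd_running": True}

# Third-party knowledge arrives as declarative data, not executable code,
# so a malformed definition cannot compromise the collector.
DEFINITIONS = [
    {"id": "VULN-1", "fact": "openssl_version", "bad_value": "0.9.8"},
    {"id": "VULN-2", "fact": "sshd_running", "bad_value": False},
]

def evaluate(facts, definitions):
    """Match each declarative definition against the collected facts."""
    return [d["id"] for d in definitions if facts.get(d["fact"]) == d["bad_value"]]

findings = evaluate(collect_facts(), DEFINITIONS)
```

Because the analysis logic only interprets data, new knowledge bases can be added without installing new agents on each host, which mirrors the flexibility claim in the abstract.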
44

Biosecurity risk and impact calculator

Chandwani, Somil January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Daniel A. Andresen / "BRIC" is a web survey application that provides feedback to feedyard managers regarding the different types of risk involved in their feedyards. Based on the managers' answers to a set of basic questions in the survey, the application generates three categories of reports that provide measures to improve the existing condition of the feedyard. These dynamically generated reports can help decrease the risk that a disease is introduced into a feedyard, or its impact once it is introduced. The survey is also useful for collecting data from various feedyards over the internet; this collected data can support interesting analyses and beneficial conclusions in this field of research.
45

Live video streaming for handheld devices over an ad hoc network

Mandowara, Piyush January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Gurdip Singh / A streaming video application allows a sequence of "moving images" to be sent over the Internet and displayed by a viewer as they arrive. This application is meant for viewing live video on handheld devices such as PDAs and iPAQs. It captures video data from a webcam installed on a tablet PC and sends it over a UDP socket, via an ad hoc network, to an iPAQ, where the live video can be viewed in real time. This is achieved by sending the video data frame by frame and displaying each frame on the iPAQ as it arrives. The application also allows the user to take a snapshot of the video, which can be saved for later viewing, and to dynamically change the resolution of the video being viewed. Two versions of the application have been developed: one using a TCP connection for video transfer between a tablet PC (server) and an iPAQ (client), and the other using a UDP connection. This report studies the trade-off between distance and time as each frame arrives at the client for both versions. The implementation also supports multiple clients connecting to the server, allowing video to be viewed simultaneously on more than one client, and thus studies the distance-time trade-off for multiple clients as well. The project is implemented in C#.NET on Microsoft Visual Studio 2005, using the Microsoft .NET Framework 2.0 for the server (tablet PC) and the Microsoft .NET Compact Framework 2.0 for the client (iPAQ). Video streaming is useful in several areas, such as entertainment media, live conferences, and surveillance and security. For entertainment media, streaming video avoids making a web user wait for a large file to download before viewing the video; instead, the media is sent in a continuous stream and played as it arrives. For surveillance purposes, the streamed video gives a real-time view of the field.
The primary application of this implementation is in the field of sensor networks.
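The frame-per-datagram idea at the heart of the UDP version can be sketched in a few lines. The actual project is written in C#.NET; this is a minimal Python sketch over loopback, and the tiny stand-in payload is an assumption, not data from the report.

```python
import socket

# Minimal sketch of frame-by-frame video transfer over UDP loopback.
# The report's implementation is in C#.NET; this payload is a stand-in
# for real JPEG frame bytes.

def send_frame(sock, addr, frame_bytes):
    # Each video frame is sent as one datagram; frames larger than the
    # datagram limit would need fragmentation, which this sketch omits.
    sock.sendto(frame_bytes, addr)

def receive_frame(sock, bufsize=65535):
    # The client displays each frame as soon as its datagram arrives.
    data, _ = sock.recvfrom(bufsize)
    return data

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))       # let the OS pick a free port
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_frame(sender, addr, b"\x89FRAME-001")
frame = receive_frame(receiver)

sender.close()
receiver.close()
```

Unlike the TCP version, UDP delivery is unordered and unreliable, which is exactly why the report can study the per-frame distance/time trade-off between the two transports.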
46

Real estate web application

Chopra, Rashi January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Daniel A. Andresen / The Real Estate Web Application is an interactive, effective, and revenue-generating website designed for the real estate industry. Its main objective is to help a real estate company display an unlimited number of property listings on the website. The primary focus was to become familiar with the .NET framework and to code in ASP.NET and C#.NET, providing a featured GUI with a sophisticated search engine that lets buyers search for property listings specific to their needs. The search engine not only provides an easy and convenient way to search for listings but also displays the full list of properties in a customized grid format. The buyer can then view the complete specification of each property listing, with its features, description, and photographs. The application also provides a drag-and-drop control to save a list of selected property listings while browsing other options on the website. There are hundreds of real estate websites on the World Wide Web, but the intention of this application is to develop something new, innovative, and efficient using technologies such as AJAX and JavaScript, enhancing the search features already available on the internet while removing their annoying and unessential features. The main emphasis lies in providing a user-friendly search engine that effectively shows the desired results in the GUI.
47

Automatic detection of significant features and event timeline construction from temporally tagged data

Erande, Abhijit January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / William H. Hsu / The goal of my project is to summarize large volumes of data and help users visualize how events have unfolded over time. I address the problem of extracting overview terms from a time-tagged corpus of data and discuss some previous work conducted in this area. I use a statistical approach to automatically extract key terms, form groupings of related terms, and display the resultant groups on a timeline. I use a static corpus composed of news stories, as opposed to an online setting where continual additions to the corpus are made. Terms are extracted using a named entity recognizer, and the importance of a term is determined using the χ² (chi-squared) measure. My approach does not address the problem of associating time and date stamps with data, and is restricted to corpora that have been explicitly tagged. The quality of the results is gauged subjectively and objectively by measuring the degree to which events known to exist in the corpus were identified by the system.
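The χ² measure for term importance can be computed from a 2x2 contingency table comparing a term's frequency inside one time slice against the rest of the corpus. The sketch below shows the standard statistic with invented counts; it is an illustration of the measure the abstract names, not the project's exact formulation.

```python
# Chi-squared score for a term's association with one time slice,
# computed from a 2x2 contingency table. The counts below are invented
# for illustration; this is the standard statistic, not the project's code.

def chi_squared(a, b, c, d):
    """a: term occurrences in the slice,   b: term occurrences elsewhere,
       c: other-term tokens in the slice,  d: other-term tokens elsewhere."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den if den else 0.0

# "election" appears 30 times in one week's stories and 10 times elsewhere,
# out of 1000 and 4000 tokens respectively.
score = chi_squared(30, 10, 970, 3990)
```

A high score marks a term that is disproportionately concentrated in one slice, which is why such terms make good overview terms for that point on the timeline; a term distributed evenly across slices scores near zero.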
48

A tool for implementing distributed algorithms written in PROMELA, using DAJ toolkit

Nuthi, Kranthi Kiran January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Gurdip Singh / PROMELA stands for Protocol Meta Language. It is a modeling language for developing distributed systems that allows the dynamic creation of concurrent processes which can communicate through message channels. DAJ stands for Distributed Algorithms in Java. It is a Java toolkit for designing, implementing, simulating, and visualizing distributed algorithms. The toolkit consists of a Java class library with a simple programming interface that allows the development of distributed algorithms based on a message-passing model. It also provides a visualization environment in which protocol execution can be paused, performed step by step, and restarted. This project is a Java application designed to translate a model written in Promela into a model using the Java class library provided by DAJ, and to simulate it using DAJ. Even though there are similarities between the programming constructs of Promela and DAJ, the programming interface supported by DAJ is smaller, so the input has been confined to a variant that is a subset of Promela. The implementation was performed in three steps. In the first step, an input domain was defined and an ANTLR grammar was written for the input structure. Java code was embedded in this ANTLR grammar so that it can parse the input and translate it into an intermediate XML format. In the second step, a StringTemplate consisting of templates of the output model is used, along with a Java program that traverses the intermediate XML file and generates the output model. In the third step, the resulting output model is compiled, then simulated and visualized using DAJ. The application has been tested on input models having different topologies, process nodes, messages, and variables, covering most of the input domain.
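The second translation step, walking an intermediate XML model and filling output templates, can be sketched briefly. The XML schema, the process/send/recv element names, and the Java-like output template below are all invented for illustration; the project's actual intermediate format and StringTemplate definitions are not shown in the abstract.

```python
import xml.etree.ElementTree as ET

# Sketch of the template-driven second step: traverse an intermediate
# XML model and fill a code template per process. The XML schema and
# the Java-like template are illustrative assumptions only.

INTERMEDIATE = """
<model>
  <process name="Sender"><send channel="c" value="1"/></process>
  <process name="Receiver"><recv channel="c" var="x"/></process>
</model>
"""

TEMPLATE = "class {name} extends Node {{ /* {body} */ }}"

def translate(xml_text):
    """Emit one templated class per <process> in the intermediate model."""
    root = ET.fromstring(xml_text)
    classes = []
    for proc in root.findall("process"):
        stmts = [child.tag for child in proc]   # statement kinds, in order
        classes.append(TEMPLATE.format(name=proc.get("name"),
                                       body=", ".join(stmts)))
    return classes

java_classes = translate(INTERMEDIATE)
```

Keeping the intermediate form as XML decouples the ANTLR front end from the template back end, so either side can change without touching the other, which matches the three-step structure the abstract describes.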
49

An empirical approach to modeling uncertainty in intrusion analysis

Sakthivelmurugan, Sakthiyuvaraja January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Xinming (Simon) Ou / A well-known problem with current intrusion detection tools is that they create too many low-level alerts, and system administrators find it hard to cope with the huge volume. When administrators have to combine multiple sources of information to confirm an attack, the complexity increases dramatically. Attackers use sophisticated techniques to evade detection, and current system monitoring tools can only observe the symptoms or effects of malicious activities. When these are mingled with similar effects from normal or non-malicious behavior, intrusion analysis is led to conclusions of varying confidence and high false-positive/negative rates. In this thesis work we present an empirical approach to the problem of modeling uncertainty, in which the inferred security implications of low-level observations are captured in a simple logical language augmented with uncertainty tags. We have designed an automated reasoning process that enables us to combine multiple sources of system monitoring data and extract highly confident attack traces from the numerous possible interpretations of low-level observations. We developed our model empirically: the starting point was a true intrusion that happened on a campus network, which we studied to capture the essence of the human reasoning process that led to conclusions about the attack. We then used a Datalog-like language to encode the model and a Prolog system to carry out the reasoning process. Our model and reasoning system reached the same conclusions as the human administrator on the question of which machines were certainly compromised. We then automatically generated the reasoning model needed for handling Snort alerts from the natural-language descriptions in the Snort rule repository, and developed a Snort add-on to analyze Snort alerts.
Keeping the reasoning model unchanged, we applied our reasoning system to two third-party data sets and one production network. Our results showed that the reasoning model is effective on these data sets as well. We believe such an empirical approach has the potential to codify the seemingly ad hoc human reasoning about uncertain events, and can yield useful tools for automated intrusion analysis.
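The flavor of reasoning with uncertainty tags can be sketched with a toy corroboration rule. The tag names ("possible", "likely", "certain") and the strengthening rule (two independent "likely" derivations upgrade to "certain") are illustrative assumptions, not the internalization rules from the thesis's Datalog model.

```python
# Toy combination of observations carrying qualitative uncertainty tags.
# The tag vocabulary and the corroboration rule are illustrative
# assumptions, not the thesis's actual reasoning model.

ORDER = {"possible": 0, "likely": 1, "certain": 2}

def combine(tags):
    """Strengthen a conclusion supported by multiple independent observations."""
    if not tags:
        return None
    best = max(tags, key=ORDER.get)
    # Corroboration: two or more independent "likely" supports from
    # different monitoring sources upgrade the conclusion to "certain".
    if tags.count("likely") >= 2 and ORDER[best] < ORDER["certain"]:
        return "certain"
    return best

# Two monitoring sources each make a compromise of host h1 "likely";
# together they yield a "certain" verdict.
verdict = combine(["likely", "likely"])
```

This captures the abstract's core point: no single low-level observation is conclusive, but combining independent sources lets the system extract the highly confident traces from the noise.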
50

A comparative performance analysis of GENI control framework aggregates

Tare, Nidhi January 1900 (has links)
Master of Science / Department of Electrical and Computer Engineering / Caterina M. Scoglio / Network researchers have long been investigating ways to improve network performance and reliability by devising new protocols, services, and network architectures. For the most part, these innovative ideas are tested through simulation and emulation techniques that, though they yield credible results, fail to account for realistic Internet measurement values such as traffic, capacity, noise, variable workload, and network failures. Overlay networks, on the other hand, have existed for a decade, but they assume the current Internet architecture, which makes them unsuitable for clean-slate network architecture research. Recently, the Global Environment for Network Innovations (GENI) project has aimed to address this issue by providing an open platform comprising a suite of highly programmable and shareable network facilities along with its control software. The aim of this report is to introduce GENI's key architectural concepts and its control frameworks, and to show how they are used for the dynamic allocation of computing and networking resources. We mainly discuss the architectural concepts and design goals of two aggregates, namely the BBN Open Resource Control Architecture (BBN ORCA) of the ORCA control framework and the Great Plains Environment for Network Innovations (GpENI), which belongs to the PlanetLab control framework. We then describe the procedure adopted for the hardware and software setup of each aggregate. After giving an overview of the two prototypes, we present an analysis of the simple experiments conducted on each aggregate. Based on this study and the experimental results, we present a comparative analysis of the control framework architectures: their relative merits and demerits, ease of experimentation, virtualization technology, and suitability for a future GENI prototype.
We use metrics such as scalability, leasing overhead, oversubscription of resources, and experiment isolation for the comparison.
