  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
141

Designing and analyzing an event service for sensor networks

Gujrati, Sumeet January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Gurdip Singh / This work is motivated by the OMG's CORBA Event Service Specification (CORBA is the acronym for Common Object Request Broker Architecture). In this research, we implemented and analyzed an event service, based on a model similar to the OMG model, for sensor network applications written in the nesC programming language, an extension of C. The implementation has been tested on a test bed created using Crossbow's TelosB motes, with Crossbow's Stargate Netbridge modules as gateways. The event service interface implementations, which reside on the motes, are written in nesC; the data routing part, which is done through the Stargate Netbridges, is written in C. This document contains experimental results obtained by deploying and running the implementation on the test bed.
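As a rough sketch of the supplier/consumer decoupling that an event service of this kind provides, the following minimal dispatcher illustrates the idea in Python. The class and method names are invented for illustration; the actual implementation runs in nesC on the motes.

```python
# Minimal publish/subscribe event channel, sketching the supplier/consumer
# model of the CORBA Event Service. All names here are illustrative and do
# not come from the thesis's nesC implementation.

class EventChannel:
    def __init__(self):
        self._consumers = {}  # event type -> list of consumer callbacks

    def subscribe(self, event_type, callback):
        """Register a consumer callback for a given event type."""
        self._consumers.setdefault(event_type, []).append(callback)

    def push(self, event_type, payload):
        """Supplier pushes an event; the channel decouples it from consumers."""
        for cb in self._consumers.get(event_type, []):
            cb(payload)

channel = EventChannel()
readings = []
channel.subscribe("temperature", readings.append)
channel.push("temperature", 21.5)   # delivered to the registered consumer
```

The point of the channel object is that suppliers and consumers never reference each other directly, which is the property that makes the model attractive for loosely coupled sensor nodes.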
142

Online shopping

Mittapelli, Chaitanya Reddy January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Daniel A. Andresen / Online Shopping is a web-based application intended for online retailers. The main objectives of this application are interactivity and ease of use: it makes searching for, viewing, and selecting a product easier. It contains a sophisticated search engine for users to find products specific to their needs. The search engine provides an easy and convenient way to search for products: as the user searches interactively, the engine refines the set of available products based on the user's input. The user can then view the complete specification of each product, read product reviews, and write their own reviews. The application also provides a drag-and-drop feature, so that a user can add a product to the shopping cart by dragging the item into the cart. The main emphasis lies in providing a user-friendly search engine that effectively shows the desired results, together with the drag-and-drop behavior.
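The interactive refinement described above can be sketched in a few lines: each additional character of input narrows the candidate product list. The product data and function name below are invented for illustration; the actual application implements this in a web GUI.

```python
# Hypothetical sketch of incremental search refinement: the visible product
# list shrinks as the user's partial query grows. Product data is invented.

PRODUCTS = ["laptop stand", "laptop sleeve", "desk lamp", "usb lamp"]

def refine(products, query):
    """Return only the products matching the user's partial input."""
    q = query.lower()
    return [p for p in products if q in p.lower()]

matches = refine(PRODUCTS, "lap")   # narrows to the two laptop accessories
```

In the real application this filtering would run server-side (or via AJAX) on every keystroke, so the user never submits a full query and waits for a page reload.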
143

Online job search

Deva, Swetha January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Daniel A. Andresen / The aim of this project is to help students find a job that suits their profile. It provides a common platform for job seekers to search for jobs on one website instead of across multiple websites, which greatly reduces the time spent searching for a suitable job. The website also provides a platform for recruiters to post a job and search for resumes suited to their requirements. Job seekers can build a resume using the resume builder (students can design their resume online), search for a job (based on selection criteria such as location, salary, job type, company, category, etc.), check their application history (the list of jobs they have applied to), and create a search agent according to their priorities, through which they are kept updated on the latest jobs posted on the website. The application also allows recruiters to post a new job available in their organization, search for resumes, and schedule an interview if a candidate's profile matches the posted job requirements. This website is developed using ASP.NET 2005 and MS SQL Server 2005. The main goal in designing this website was to become familiar with .NET technology.
144

Security risk prioritization for logical attack graphs

Almohri, Hussain January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / William H. Hsu / Xinming (Simon) Ou / To protect large networks from potential security threats, network administrators need to know in advance which components of their networks are at high security risk. One way to obtain this knowledge is via attack graphs. Various types of attack graphs based on diverse techniques have been proposed. However, an attack graph can only make assertions about the different paths an attacker can take to compromise the network. This information is only half the solution to securing a particular network: administrators must analyze the attack graph to identify the associated risk, and since attack graphs can grow very large, this task is difficult to perform by hand. In this thesis, I provide a security risk prioritization algorithm to rank logical attack graphs produced by MulVAL (a vulnerability analysis system). My proposed method, called StepRank, is based on a previously published algorithm called AssetRank, which generalizes Google's PageRank algorithm. StepRank considers a forward attack graph, a reversed version of the original MulVAL attack graph used by AssetRank. The result of the ranking algorithm is a rank value for each node, relative to every other rank value, that shows how difficult it is for an attacker to satisfy that node.
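As a rough illustration of the PageRank-style propagation that AssetRank generalizes, the sketch below iterates rank values over a small, invented forward attack graph. The graph, damping factor, and function name are all hypothetical and do not reproduce StepRank's actual computation on MulVAL graphs.

```python
# Simplified PageRank-style iteration over a toy attack graph: each node's
# rank is shared among its successors, with a damping factor and uniform
# redistribution from dangling (sink) nodes so total rank is conserved.

def rank(graph, damping=0.85, iters=50):
    """graph: node -> list of successor nodes (forward attack edges)."""
    nodes = list(graph)
    r = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n, succs in graph.items():
            if succs:
                share = damping * r[n] / len(succs)
                for s in succs:
                    new[s] += share
            else:
                # dangling node: redistribute its rank uniformly
                for m in nodes:
                    new[m] += damping * r[n] / len(nodes)
        r = new
    return r

# Two exploit paths converge on "root"; it accumulates rank from both.
g = {"entry": ["vuln1", "vuln2"], "vuln1": ["root"], "vuln2": ["root"], "root": []}
scores = rank(g)
```

The key property the thesis relies on is relative rank: the absolute values matter less than how each node's score compares to every other node's.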
145

Scoreboard Tool

Srinivasan, Anush January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Daniel A. Andresen / Scoreboard Tool is a web application that provides a platform where administrators can conduct quizzes (typically in an organization) and employees can take a quiz, individually or in a team. These quizzes are generally conducted to improve the knowledge of the users (typically employees) taking them. The tool offers online quizzes in various categories, such as technology, science, and math, each with various sub-categories. The admin can also perform CRUD operations on teams. The admin creates quizzes from a database holding the quiz questions. Quizzes are divided into multiple-choice quizzes and timed quizzes, in which a user has a specified time to complete the quiz. Once users log in, they can view their quiz history, including the scores of the quizzes taken and the time and date each was taken. At the end of a quiz, the user is given the option of rating the quiz and entering an opinion of it. Users can also view their scores graphically and compare them with other teams, both as reports and as graphs. An important feature of this application is that scores are reported using reporting services. Another is that, instead of testing the application manually, test cases are written for automated testing.
146

A host-based security assessment architecture for effective leveraging of shared knowledge

Rakshit, Abhishek January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Xinming (Simon) Ou / Security scanning performed on computer systems is an important step in identifying and assessing potential vulnerabilities in an enterprise network before they are exploited by malicious intruders. An effective vulnerability assessment architecture should assimilate knowledge from multiple security knowledge sources to discover all the security problems present on a host. Legitimate concerns arise, since host-based security scanners typically need to run with administrative privileges and take input from external knowledge sources for the analysis. Intentionally or otherwise, ill-formed input may compromise the scanner, and the whole system, if the scanner is susceptible to or itself carries one or more vulnerabilities. It is also not easy to incorporate new security analysis tools and security knowledge bases in the conventional approach, since this would entail installing new agents on every host in the enterprise network. This report presents an architecture in which a host-based security scanner's code base can be minimized to the point where its correctness can be verified by adequate vetting. At the same time, the architecture allows third-party security knowledge to be leveraged more efficiently and makes it easier to incorporate new security tools. In our work, we implemented the scanning architecture in the context of an enterprise-level security analyzer. The analyzer finds security vulnerabilities present on a host according to third-party security knowledge specified in the Open Vulnerability Assessment Language (OVAL). We empirically show that the proposed architecture is potent in its ability to comprehensively leverage third-party security knowledge and flexible enough to support various higher-level security analyses.
147

Biosecurity risk and impact calculator

Chandwani, Somil January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Daniel A. Andresen / "BRIC" (Biosecurity Risk and Impact Calculator) is a web survey application that provides feedback to feedyard managers regarding the different types of risk involved in their feedyards. Based on the answers to a set of basic questions in the survey, the application generates three categories of reports for managers, with measures to improve the existing condition of their feedyard. These dynamically generated reports can help decrease the risk of a disease being introduced into a feedyard, or its impact once it is introduced. The survey is also useful for collecting data from various feedyards over the internet; the collected data can support analysis and useful conclusions in this field of research.
148

Live video streaming for handheld devices over an ad hoc network

Mandowara, Piyush January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Gurdip Singh / A streaming video application allows a sequence of "moving images" to be sent over the Internet and displayed by a viewer as they arrive. This application is meant for viewing live video on handheld devices such as PDAs and iPAQs. It captures video data from a webcam installed on a tablet PC and sends it over a UDP socket to an iPAQ via an ad hoc network, where the live video can be viewed in real time. This is achieved by sending video data frame by frame and displaying each frame on the iPAQ as it arrives. The application also allows taking a snapshot of the video, which can be saved for later viewing, and allows the user to dynamically change the resolution of the video being viewed. Two versions of the application have been developed: one using a TCP connection for video transfer between a tablet PC (server) and an iPAQ (client), and the other using a UDP connection. This report studies the trade-off between distance and time as each frame arrives at the client for both versions. The implementation also supports multiple clients connecting to the server, allowing video to be viewed simultaneously on more than one client, and thus studies the distance/time trade-off for multiple clients as well. The project is implemented in C#.NET on Microsoft Visual Studio 2005, using the Microsoft .NET Framework 2.0 for the server (tablet PC) and the Microsoft .NET Compact Framework 2.0 for the client (iPAQ). Video streaming is useful in several areas, such as entertainment media, live conferences, and surveillance and security. For entertainment media, streaming video avoids having a web user wait for a large file to download before viewing the video; instead, the media is sent in a continuous stream and played as it arrives. For surveillance purposes, the streamed video gives a real-time view of the field. The primary application of this implementation is in the field of sensor networks.
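The frame-by-frame UDP delivery described above can be sketched in a few lines. The real implementation is in C#.NET, so the snippet below is only an illustrative Python analogue with invented frame data, using a loopback socket in place of the ad hoc network; the sequence-number header is an assumption, not a detail taken from the thesis.

```python
# Sketch of frame-by-frame UDP streaming: each frame is sent as one datagram
# with a 4-byte sequence number so the client can detect loss or reordering.
# Frames here are plain byte strings standing in for captured webcam images.
import socket

def send_frames(frames, addr, sock):
    for i, frame in enumerate(frames):
        sock.sendto(i.to_bytes(4, "big") + frame, addr)

# Loopback demo: a receiver socket standing in for the iPAQ client.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_frames([b"frame-a", b"frame-b"], recv.getsockname(), send)

packet, _ = recv.recvfrom(65535)
seq, payload = int.from_bytes(packet[:4], "big"), packet[4:]
```

Unlike the TCP version, UDP imposes no retransmission delay, which is why the report compares the two: a late frame in live video is usually better dropped than replayed.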
149

Real estate web application

Chopra, Rashi January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Daniel A. Andresen / The Real Estate Web Application is an interactive, effective, revenue-generating website designed for the real estate industry. The main objective of this application is to help a real estate company display an unlimited number of property listings on the website. The primary focus is to become familiar with the .NET framework and to code with ASP.NET and C#.NET, providing a featured GUI that contains a sophisticated search engine for buyers to find property listings specific to their needs. The search engine not only provides an easy and convenient way to search for listings but also displays the entire list of properties in a customized grid format. The buyer can then view the complete specification of each property listing, with its features, description, and photographs. The application also provides a drag-and-drop control to save a list of selected property listings while browsing other options on the website. There are hundreds of real estate websites on the World Wide Web, but the intention of this application is to develop something new, innovative, and efficient using recent technologies such as AJAX and JavaScript, which not only enhance the search features already available on the internet but also get rid of their annoying and unessential features. The main emphasis lies in providing a user-friendly search engine that effectively shows the desired results in the GUI.
150

Automatic detection of significant features and event timeline construction from temporally tagged data

Erande, Abhijit January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / William H. Hsu / The goal of my project is to summarize large volumes of data and help users visualize how events have unfolded over time. I address the problem of extracting overview terms from a time-tagged corpus of data and discuss some previous work conducted in this area. I use a statistical approach to automatically extract key terms, form groupings of related terms, and display the resultant groups on a timeline. I use a static corpus composed of news stories, as opposed to an online setting where continual additions to the corpus are being made. Terms are extracted using a Named Entity Recognizer, and the importance of a term is determined using the χ² (chi-squared) measure. My approach does not address the problem of associating time and date stamps with data, and is restricted to corpora that have been explicitly tagged. The quality of the results is gauged subjectively and objectively by measuring the degree to which events known to exist in the corpus were identified by the system.
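As a sketch of how a χ² measure can score term importance, the following computes the statistic from a 2x2 contingency table of term counts; the counts and the function name are invented for illustration, and this is only the standard 2x2 formula, not necessarily the exact variant used in the thesis.

```python
# Chi-squared score for a term's association with a time slice, from a 2x2
# contingency table. A term whose occurrences are concentrated in one slice
# deviates strongly from its expected (uniform) frequency and scores high.

def chi_squared(a, b, c, d):
    """2x2 table: a = term occurrences in the slice, b = term elsewhere,
    c = other terms in the slice, d = other terms elsewhere."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den if den else 0.0

# Invented counts: a "bursty" term concentrated in one slice vs. a flat one.
burst = chi_squared(a=30, b=5, c=70, d=400)
flat = chi_squared(a=10, b=25, c=90, d=380)
```

The bursty term scores far higher, which is exactly the behavior needed to surface slice-specific key terms for the timeline.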
