131. Deconstruction and Analysis of Email Messages
Phishing scams have grown in frequency and sophistication, and in recent years scammers have increasingly misused email to launch criminal attacks. With phishing emails, scammers can make money in a very short time and generally avoid prosecution. While fraudulent schemes are cheap and easy for them to carry out, they are hard for law enforcement to pursue, and victims often suffer severe financial loss or identity theft. Research on detecting and preventing phishing attacks has thus become a hot topic in computer and network security, and a variety of tools have been developed to address aspects of this problem. However, little software currently exists for detecting and analyzing phishing crimes efficiently: when investigating incidents of phishing and the related problem of identity theft, law enforcement investigators spend a great deal of time and effort yet often obtain only a few clues or results. We have developed the Undercover Multipurpose Anti-Spoofing Kit (UnMASK) to help solve this problem. This thesis presents the design of the deconstruction and analysis of email messages used in UnMASK to help law enforcement investigate and prosecute email-based crimes. It addresses the following questions: How can we parse a raw email message and find the information needed for an investigation? What kind of information can we gather from the Internet? And which UNIX tools can be used for the investigation? In contrast to other work in this area, this research comprehensively considers the exploits found in phishing emails and defines a comprehensive raw-email parser for law enforcement investigations. We also design and implement a new protocol for the UNIX tool system. UnMASK not only tries to identify suspicious emails, but also emphasizes the gathering of evidence of crime. To the best of our knowledge, it is the first system that can automatically deconstruct email messages and present the related forensic information to law enforcement in a convenient format. Test results show that the parser and the UNIX tool system of UnMASK are stable and useful: the system correctly extracts the information law enforcement officers want to check in raw emails and correctly gathers information from the Internet. It generally takes a couple of minutes to complete the report for one raw email message; compared to the hours investigators spend doing the same work by hand, the system greatly improves their efficiency. / A Thesis submitted to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science. / Degree Awarded: Fall Semester, 2007. / Date of Defense: October 10, 2007. / Keywords: Email Parser, Email Investigation, Anti-phishing. / Includes bibliographical references. / Sudhir Aggarwal, Professor Directing Thesis; Zhenhai Duan, Committee Member; Breno de Medeiros, Committee Member.
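
A short sketch can illustrate the first question above, parsing a raw email message for investigative leads. It is not UnMASK's parser, which this record does not reproduce; it only shows, using Python's standard email module, how the header fields an investigator checks (claimed sender, reply address, relay chain) can be pulled from a raw message. The filename suspect.eml is a hypothetical input.

```python
# Sketch only: illustrates header deconstruction, not UnMASK's actual parser.
from email import message_from_string
from email.utils import parseaddr

with open("suspect.eml") as f:          # hypothetical raw message on disk
    msg = message_from_string(f.read())

claimed_sender = parseaddr(msg.get("From", ""))[1]
reply_to = parseaddr(msg.get("Reply-To", ""))[1]
received_hops = msg.get_all("Received") or []   # relay chain, newest hop first

print("From:    ", claimed_sender)
print("Reply-To:", reply_to)
for i, hop in enumerate(received_hops):
    print(f"hop {i}: " + " ".join(hop.split()))  # unfold wrapped header lines

# Investigators typically compare From against Reply-To and walk the Received
# chain to see whether it leads back to the claimed sending domain.
```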

132. Mobile Agent Protection with Data Encapsulation and Execution Tracing
Mobile agent systems provide a new method for computer communication: a mobile agent can migrate from platform to platform, performing a task or computation for its originator. Mobile agents are a promising new technology, but many security issues must be addressed. These fall into two categories: protecting the agent platform and protecting the mobile agent itself. The harder task is protecting the mobile agent, which is subject to attacks from the very platform on which it executes. This thesis is concerned with protecting a mobile agent that collects data on behalf of its originator, and presents a new mobile agent protection protocol, the data encapsulation protocol. / A Thesis submitted to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science. / Degree Awarded: Spring Semester, 2003. / Date of Defense: April 30, 2003. / Keywords: New Protocols, Migration, Protection. / Includes bibliographical references. / Alec Yasinsac, Professor Directing Thesis; Mike Burmester, Committee Member; Lois Hawkes, Committee Member.
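
The abstract names the data encapsulation protocol without detailing it, so the following is only a generic illustration of the encapsulation idea, not the thesis's protocol: each visited platform binds its contribution into a hash chain seeded by the originator, who can later detect tampering with earlier entries. All platform names and data are made up, and a real protocol would also encrypt or sign each entry.

```python
import hashlib

def encapsulate(chain_hash: bytes, platform_id: str, data: bytes):
    """Bind this platform's data to everything collected before it."""
    entry = platform_id.encode() + b"|" + data
    new_hash = hashlib.sha256(chain_hash + entry).digest()
    return entry, new_hash

# Agent itinerary with made-up offers gathered at each platform.
chain = hashlib.sha256(b"originator-nonce").digest()   # seed set by originator
log = []
for pid, data in [("hostA", b"offer=17"), ("hostB", b"offer=12")]:
    entry, chain = encapsulate(chain, pid, data)
    log.append(entry)

# The originator recomputes the chain over the returned log; any modified or
# deleted entry changes the final hash and is detected.
check = hashlib.sha256(b"originator-nonce").digest()
for entry in log:
    check = hashlib.sha256(check + entry).digest()
assert check == chain
```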

133. Reducing the WCET of Applications on Low End Embedded Systems
Applications in embedded systems often need to meet specified timing constraints. It is advantageous not only to calculate the Worst-Case Execution Time (WCET) of an application, but also to perform transformations that attempt to reduce the WCET, since an application with a lower WCET is less likely to violate its timing constraints. A compiler has been integrated with a timing analyzer to obtain the WCET of a program on demand during compilation. This environment is used to investigate three types of compiler optimization techniques for reducing WCET. First, an interactive compilation system has been developed that lets a user interact with the compiler and receive feedback regarding the WCET; in addition, a genetic algorithm is used to automatically search for an effective optimization phase sequence that reduces the WCET. Second, a WCET code positioning optimization has been investigated that uses worst-case (WC) path information to reorder basic blocks so that branch penalties on the worst-case path are reduced. Third, WCET path optimizations, analogous to frequent path optimizations, are used to reduce the WCET. This work makes several contributions. To the best of our knowledge, this is the first compiler that interacts with a timing analyzer to use WCET predictions during the compilation of applications. The dissertation demonstrates that a genetic algorithm search can find an optimization sequence that simultaneously improves both WCET and code size. New compiler optimizations have been developed that use WC path information from a timing analyzer, and the results show that the WCET code positioning algorithms typically find the layout of basic blocks with the minimal WCET. It is also shown that frequent path optimizations can be applied to WC paths, using worst-case path information from the timing analyzer, to reduce WCET. The new compiler optimizations described in this dissertation not only significantly reduce WCET but are also completely automatic. / A Dissertation submitted to the Department of Computer Science in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Degree Awarded: Summer Semester, 2005. / Date of Defense: July 11, 2005. / Keywords: WCET, Performance, Embedded. / Includes bibliographical references. / David Whalley, Professor Directing Dissertation; Anuj Srivastava, Outside Committee Member; Theodore P. Baker, Committee Member; Robert A. van Engelen, Committee Member; Kyle Gallivan, Committee Member.
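
A minimal sketch of the genetic-algorithm phase-sequence search described above, not the dissertation's implementation: candidate optimization phase orderings evolve against a fitness function that, in the real system, would invoke the compiler and timing analyzer. Here estimated_wcet is a stand-in mock and the phase names are invented.

```python
import random

PHASES = ["cse", "licm", "strength_red", "fill_delay", "reg_alloc_prep"]

def estimated_wcet(seq):
    """Stand-in fitness: the real system compiles with this phase ordering
    and queries the timing analyzer for the WCET (in cycles)."""
    rnd = random.Random(hash(seq))           # deterministic mock per sequence
    return 1000 + rnd.random() * 100

def crossover(a, b):
    cut = random.randrange(1, len(a))        # single-point crossover
    return a[:cut] + b[cut:]

def mutate(seq):
    s = list(seq)
    s[random.randrange(len(s))] = random.choice(PHASES)
    return tuple(s)

pop = [tuple(random.choice(PHASES) for _ in range(6)) for _ in range(20)]
for _ in range(30):
    pop.sort(key=estimated_wcet)             # lower WCET is fitter
    survivors = pop[:10]                     # elitist selection
    children = [mutate(crossover(*random.sample(survivors, 2)))
                for _ in range(10)]
    pop = survivors + children
pop.sort(key=estimated_wcet)
print("best phase sequence:", pop[0], "estimated WCET:", estimated_wcet(pop[0]))
```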

134. Application Configurable Processors
As the complexity requirements of embedded applications increase, the performance demands on embedded compilers increase as well. Compiler optimizations such as software pipelining and recurrence elimination can significantly reduce an application's execution time, but these transformations require additional registers to hold data values across one or more loop iterations. Compilers for embedded systems have difficulty exploiting these optimizations because an embedded processor typically does not have enough registers to apply the transformations. In this thesis, we evaluate a new application configurable processor utilizing several different register structures that can enable these optimizations without increasing the architecturally addressable register storage requirements. This approach can lead to improved execution time through the enabled optimizations and to reduced register pressure for embedded architectures. / A Thesis submitted to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science. / Degree Awarded: Fall Semester, 2006. / Date of Defense: November 20, 2006. / Keywords: Compilers, Computer Architecture, Embedded Systems. / Includes bibliographical references. / David Whalley, Professor Co-Directing Thesis; Gary Tyson, Professor Co-Directing Thesis; Robert van Engelen, Committee Member.

135. Optimal Linear Features for Content Based Image Retrieval and Applications
Since the number of digital images is growing explosively, content based image retrieval (CBIR) has become an active research area for automatically indexing and retrieving images based on their semantic features and visual appearance. CBIR research largely concentrates on two fundamentally important topics: (1) similarity of images, which depends on the feature representation and the feature similarity function; and (2) machine learning algorithms that enhance retrieval results by adaptively improving classification results and similarity metrics. The color histogram is one of the most commonly used features because it captures the color distribution of an image and is easy to calculate; however, it ignores spatial information, which is also important for discriminating spatial patterns. We propose a new type of feature, spectral histogram (SH) features, that includes spatial information by combining local patterns, captured through filters, with global features, captured through histograms. Spectral histogram features are obtained by concatenating histograms of the image spectral components associated with a bank of filters; it has been shown that they provide a unified representation for modeling textures, faces, and other images. Through experiments on a benchmark dataset, we demonstrate their effectiveness for CBIR. To alleviate sensitivity to scaling, we propose using the "characteristic scale" to obtain intrinsic SH features that are invariant to changes in scale. To deal with domain-specific images, such as images containing cats, we propose a new shape feature called the gradient curve; combined with histograms of oriented gradients (HOG) along edge fragment patches, the gradient curve feature is shown to be effective for cat head detection. We develop a new machine learning algorithm, Optimal Factor Analysis (OFA), designed to learn low-dimensional representations that optimize discrimination under the nearest neighbor classifier with Euclidean distances. Applied to content-based image categorization and retrieval using SH features, the method achieves significantly better retrieval results on a benchmark dataset than some existing methods. We also explore improving classification and retrieval results by applying OFA with respect to metrics derived from the cross-correlation of spectral histograms. Considering the large amount of unlabeled data in real-world applications, we propose a new semi-supervised learning algorithm, Transductive Optimal Component Analysis (Transductive OCA), which utilizes unlabeled data to learn optimal linear representations by incorporating an additional term that prefers representations with large "margins" when classifying unlabeled data in the nearest neighbor classifier sense. Transductive OCA yields improvements on face recognition applications. / A Dissertation submitted to the Department of Computer Science in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Degree Awarded: Spring Semester, 2010. / Date of Defense: April 5, 2010. / Keywords: Spectral Histogram Feature, Machine Learning, Content Based Image Retrieval. / Includes bibliographical references. / Xiuwen Liu, Professor Directing Dissertation; Victor Patrangenaru, University Representative; Feifei Li, Committee Member; Michael Mascagni, Committee Member; Piyush Kumar, Committee Member; Washington Mio, Committee Member.
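
The construction of SH features, concatenating histograms of filter responses, is easy to sketch. The following is illustrative only: the dissertation's filter bank and binning are not reproduced here, and the tiny bank below (intensity plus two gradient filters) is an assumption.

```python
import numpy as np
from scipy.ndimage import convolve

def spectral_histogram(img, filters, bins=11):
    """Concatenate histograms of the image's responses to a filter bank."""
    feats = []
    for f in filters:
        resp = convolve(img.astype(float), f, mode="reflect")
        hist, _ = np.histogram(resp, bins=bins, density=True)
        feats.append(hist)
    return np.concatenate(feats)

# Tiny illustrative bank: intensity plus horizontal and vertical gradients.
bank = [np.array([[1.0]]),
        np.array([[-1.0, 0.0, 1.0]]),
        np.array([[-1.0], [0.0], [1.0]])]

img = np.random.rand(64, 64)            # stand-in for a real grayscale image
feature = spectral_histogram(img, bank)
print(feature.shape)                    # (3 * bins,) = (33,)
```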

136. PPDA: Privacy Preserving Data Aggregation in Wireless Sensor Networks
Wireless sensor networks are among the fastest-growing types of networks today. Much research has been done to make these networks operate more efficiently, including the application of data aggregation, and more recent work has addressed the security of sensor networks that use data aggregation. In this thesis, we discuss a method by which data aggregation can be performed securely: the sensor network aggregates encrypted data without first decrypting it. / A Thesis submitted to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science. / Degree Awarded: Spring Semester, 2004. / Date of Defense: April 12, 2004. / Keywords: Function Composition. / Includes bibliographical references. / Alec Yasinsac, Professor Directing Thesis; Mike Burmester, Committee Member; Greg Riccardi, Committee Member.
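
The abstract does not specify the encryption scheme, so the sketch below shows one well-known construction with the stated property, additive aggregation over ciphertexts in the style of Castelluccia, Mykletun, and Tsudik's additively homomorphic stream cipher; it is not necessarily the thesis's method, and all keys and readings are made up. Each sensor adds a keyed pseudorandom pad to its reading modulo M; intermediate nodes sum ciphertexts without decrypting; the sink, knowing every sensor's key, strips the pads.

```python
import hashlib

M = 2**32                                    # modulus; must exceed the true sum

def pad(key: bytes, epoch: int) -> int:
    """Keyed pseudorandom pad shared between one sensor and the sink."""
    d = hashlib.sha256(key + epoch.to_bytes(8, "big")).digest()
    return int.from_bytes(d[:4], "big")

keys = [b"key-sensor0", b"key-sensor1", b"key-sensor2"]   # made-up keys
readings = [17, 23, 5]                                    # made-up readings
epoch = 42

# Each sensor encrypts by adding its pad modulo M.
ciphertexts = [(r + pad(k, epoch)) % M for k, r in zip(keys, readings)]

# Any intermediate node aggregates ciphertexts without decrypting anything.
aggregate = sum(ciphertexts) % M

# The sink strips the summed pads to recover the plaintext sum.
total = (aggregate - sum(pad(k, epoch) for k in keys)) % M
assert total == sum(readings)
print(total)                                 # 45
```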

137. An automated system for symbolic approximate reasoning
This dissertation proposes an automated system for symbolic approximate reasoning, herein referred to as SAR, a further evolution of a system developed earlier by Schwartz. It is based on the concept of a linguistic variable, first introduced by Zadeh for approximate reasoning within a semantics based on fuzzy sets. The present approach differs from Zadeh's in that logical inference is defined as an operation applied directly to linguistic terms, rather than to their underlying fuzzy set interpretations. The earlier Schwartz system proposed two such kinds of symbolic inference; here three additional kinds are introduced, including one that accommodates reasoning with precise numerical information. Various appropriate modes of evidence combination are also developed. / The core of this dissertation is a full exploration of a resolution method for SAR. Termed SAR-resolution, it is an adaptation of the well-known SLD-resolution that underlies Prolog. The tasks carried out for this purpose are: (i) introduce a resolution principle for SAR, which differs from the traditional one in attaching a computation formula to every clause employed; (ii) discuss resolution with evidence combination, which differs from SLD-resolution in requiring that all paths leading to the empty clause be found; (iii) define the general notion of an SAR-derivation, which specifies how a resolution process successfully terminates; and (iv) fully detail the overall SAR-refutation procedure. Finally, a design for a future implementation is briefly outlined, including specification of the needed internal data objects and an algorithm for an inference engine written in terms of those objects. / Source: Dissertation Abstracts International, Volume: 55-09, Section: B, page: 3972. / Major Professor: Daniel G. Schwartz. / Thesis (Ph.D.)--The Florida State University, 1994.
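
Task (ii) above requires combining evidence from all derivation paths. As a generic illustration only, not Schwartz's actual combination modes, degrees of support obtained from independent derivations of the same conclusion can be merged with a t-conorm such as the probabilistic sum:

```python
from functools import reduce

def prob_sum(a: float, b: float) -> float:
    """Probabilistic sum t-conorm: support grows with each independent path."""
    return a + b - a * b

# Made-up degrees of support for one conclusion from three derivation paths.
paths = [0.6, 0.5, 0.3]
combined = reduce(prob_sum, paths)
print(round(combined, 3))   # 0.86
```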

138. Static cache simulation and its applications
This work takes a fresh look at the simulation of cache memories. It introduces static cache simulation, a technique that statically predicts a large portion of cache references. To utilize this technique efficiently, a method for fast on-the-fly analysis of programs in general is developed and proved correct, and is combined with static cache simulation in a number of applications. Applied to fast instruction cache analysis, it provides a new framework for evaluating instruction cache memories that outperforms even the fastest techniques published. Static cache simulation is shown to make cache behavior predictable, contrary to the belief that cache memories introduce unpredictability into real-time systems that cannot be efficiently analyzed: static cache simulation for instruction caches provides a large degree of predictability for real-time systems. In addition, an architectural modification through bit-encoding is introduced that provides fully predictable caching behavior. Even for regular instruction caches without architectural modifications, tight bounds on the execution time of real-time programs can be derived from the information provided by the static cache simulator. Finally, the debugging of real-time applications can be enhanced by displaying the timing information of the debugged program at breakpoints; this timing information is determined by simulating the instruction cache behavior during program execution and can be used, for example, to detect missed deadlines and locate time-consuming code portions. Overall, static cache simulation provides a novel approach to analyzing cache memories and has been shown to be very efficient in numerous applications. / Source: Dissertation Abstracts International, Volume: 55-09, Section: B, page: 3982. / Major Professor: David B. Whalley. / Thesis (Ph.D.)--The Florida State University, 1994.
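
A toy version of the underlying cache model is easy to sketch. Real static cache simulation classifies references over the whole control-flow graph; the fragment below only simulates a direct-mapped instruction cache over one straight-line address sequence, with invented cache parameters, to show how references split into hits and misses.

```python
LINE_SIZE = 16          # bytes per cache line (invented parameter)
NUM_LINES = 8           # direct-mapped: one tag per line (invented parameter)

def classify(addresses):
    """Classify each instruction reference against a direct-mapped cache.
    Single-path toy; static cache simulation extends this over all paths
    in the control-flow graph to prove always-hit/always-miss."""
    tags = [None] * NUM_LINES
    result = []
    for addr in addresses:
        line = (addr // LINE_SIZE) % NUM_LINES
        tag = addr // (LINE_SIZE * NUM_LINES)
        hit = tags[line] == tag
        tags[line] = tag
        result.append((hex(addr), "hit" if hit else "miss"))
    return result

# Two iterations of a small loop body at 0x100..0x11c (4-byte instructions).
loop = list(range(0x100, 0x120, 4))
for ref, outcome in classify(loop + loop):
    print(ref, outcome)   # pass 1: one miss per new line; pass 2: all hits
```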

139. A fuzzy logic approach for cognitive diagnosis
Computers have been applied to a wide range of tasks, and diagnosis is perhaps one of their most useful applications. This dissertation presents an approach to cognitive diagnosis, an area dealing with the intricacies of the higher mental processes. The method is developed to be embodied in an intelligent tutoring system (ITS) for classical Mendelian genetics. / The research investigates how to model cognitive diagnosis using fuzzy graph structures called fuzzy cognitive maps (FCMs). Crucial to this formalism is the task of comparing structures, a classification problem that entails a theory of similarity. Unlike standard approaches to similarity, which emphasize shared areas of agreement between structures, the method presented here extracts and considers the discrepancies as well. Several measures involving both similarity and discrepancy are developed and contrasted. / Three investigations are presented. The first is an initial inquiry into fuzzy relational overlaps and discrepancies. The second applies fuzzy techniques to comparing Hasse diagrams. Finally, a method for comparing FCM structures is formulated; examples for classical Mendelian genetics are provided and compared with results from a study in science education, and a mathematical model for chains of thought is given. / Source: Dissertation Abstracts International, Volume: 54-02, Section: B, page: 0937. / Major Professor: Wyllis Bandler. / Thesis (Ph.D.)--The Florida State University, 1993.
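
As an illustrative sketch, with measures chosen for simplicity rather than taken from the dissertation: two FCMs over the same concepts can be compared as fuzzy relation matrices, one score capturing shared structure and another the discrepancies.

```python
import numpy as np

def similarity(A, B):
    """Shared structure: mean elementwise min/max ratio of edge strengths."""
    lo, hi = np.minimum(A, B), np.maximum(A, B)
    ratio = np.ones_like(hi)       # edge absent in both maps counts as agreement
    mask = hi > 0
    ratio[mask] = lo[mask] / hi[mask]
    return float(ratio.mean())

def discrepancy(A, B):
    """Disagreement: mean absolute difference of edge strengths."""
    return float(np.abs(A - B).mean())

# Expert map vs. a student's map over 3 concepts (made-up strengths in [0, 1]).
expert  = np.array([[0.0, 0.8, 0.1],
                    [0.0, 0.0, 0.9],
                    [0.0, 0.0, 0.0]])
student = np.array([[0.0, 0.6, 0.0],
                    [0.3, 0.0, 0.9],
                    [0.0, 0.0, 0.0]])

print(similarity(expert, student), discrepancy(expert, student))
```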

140. A conceptual approach to reusability in object-oriented design
The growth in the size and complexity of software projects is usually accompanied by increased difficulty in project development and management, and there is a growing demand for methodologies and computer-aided software engineering (CASE) tools that improve project evolution and life-cycle management. Improvements in documentation techniques in general, and in the documentation of design rationale in particular, are predicted to contribute significantly to understanding a design and communicating about it, thus also improving the process of cooperative work. / An important element of the object-oriented approach to software development is that objects can be classified using inheritance hierarchies, which allows the properties of implemented objects to be inherited and reused. Standard software components correspond to frames, or scripts, which improve understanding. / In this work, we develop a method to store and retrieve reusable objects. To facilitate reusability during the early stages of project development, under conditions of uncertainty, objects are associated with concepts, and fuzzy relations are used to represent concept similarity and generalization; concept properties are represented using an aggregation relation. A possibilistic approach based on fuzzy sets is used to reach and retrieve reusable objects. / Interpretations of concepts, their interrelations, and their properties yield inheritance hierarchies with exceptions derived automatically. Default reasoning is used in the representation and interpretation of concepts and their properties. / Object classification and indexing issues are discussed, and a query language is developed to support information retrieval, allowing objects to be constructed from reusable resources. / Source: Dissertation Abstracts International, Volume: 53-11, Section: B, page: 5819. / Major Professor: Lois W. Hawkes. / Thesis (Ph.D.)--The Florida State University, 1992.
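
A small sketch of the retrieval idea, with all names and membership degrees invented: concept similarity is a fuzzy relation, and a query concept retrieves stored objects ranked by possibility. A fuller system would close the relation under max-min composition so that a query also reaches objects indexed under transitively similar concepts.

```python
# Fuzzy similarity between concepts, as degrees in [0, 1] (made-up values).
concept_sim = {
    ("queue", "queue"): 1.0,
    ("queue", "buffer"): 0.8,
    ("queue", "stack"): 0.6,
}

# Repository: each reusable object is indexed under one concept.
objects = {"RingBuffer": "buffer", "LinkedStack": "stack", "FifoQueue": "queue"}

def retrieve(query: str, threshold: float = 0.5):
    """Rank stored objects by the possibility that they match the query,
    i.e. the fuzzy similarity between the query and the indexing concept."""
    ranked = [(concept_sim.get((query, concept), 0.0), name)
              for name, concept in objects.items()]
    return sorted((r for r in ranked if r[0] >= threshold), reverse=True)

print(retrieve("queue"))
# [(1.0, 'FifoQueue'), (0.8, 'RingBuffer'), (0.6, 'LinkedStack')]
```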