91

Configuration and Implementation Issues for a Firewall System Running on a Mobile Handset

Martinsen, Pal-Erik January 2005 (has links)
Any device connected to the Internet needs to be protected. Using a firewall as a first line of defence is a very common way to provide protection. A firewall can be set up to protect an entire network or just a single host. As it is becoming more and more popular to connect mobile phones and other handheld devices to the Internet, the big question is: how can those devices be protected from the perils of the Internet? This work investigates issues with the implementation of a firewall system for protecting mobile devices. Firewall administration is an error-prone and difficult task, even for a network administrator configuring a firewall in a conventional network setting. To enable an ordinary mobile phone user to set up a firewall configuration that protects their phone, the system must be easy to understand and must warn the user of possible mistakes. Generic algorithms for firewall rule-set sorting and anomaly discovery are presented; these ensure that the rule-set is error-free and safe to use, which is a vital part of any firewall system. The prototype developed can be used to find errors in existing firewall rule-sets. A rule-set can be supplied either in a native firewall configuration format (currently only IPF is supported) or in a generic XML format developed as part of this research project. Further, a new graphical visualization concept is presented that allows the end user to build an advanced firewall configuration from a device with a small screen and limited input possibilities.
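
As a rough illustration of what rule-set anomaly discovery involves, the Python sketch below flags "shadowed" rules, i.e. rules that can never fire because an earlier rule with a different action already covers all of their traffic. The rule representation and matching logic are assumptions made for this sketch only; they are not the thesis prototype or its XML format.

# Minimal sketch of one firewall rule-set anomaly check (shadowing).
# The Rule fields and matching logic are illustrative assumptions, not the
# representation used by the thesis prototype.
from dataclasses import dataclass
from ipaddress import ip_network

@dataclass
class Rule:
    proto: str   # "tcp", "udp" or "any"
    src: str     # source network in CIDR notation
    dst: str     # destination network in CIDR notation
    action: str  # "allow" or "deny"

def covers(general: Rule, specific: Rule) -> bool:
    """True if `general` matches every packet that `specific` matches."""
    return (general.proto in ("any", specific.proto)
            and ip_network(specific.src).subnet_of(ip_network(general.src))
            and ip_network(specific.dst).subnet_of(ip_network(general.dst)))

def shadowed_rules(ruleset):
    """Indices of rules that can never fire: an earlier rule with a
    different action already covers all of their traffic."""
    hits = []
    for i, later in enumerate(ruleset):
        for earlier in ruleset[:i]:
            if covers(earlier, later) and earlier.action != later.action:
                hits.append(i)
                break
    return hits

rules = [
    Rule("tcp", "0.0.0.0/0", "10.0.0.0/8", "deny"),
    Rule("tcp", "192.168.1.0/24", "10.0.0.0/8", "allow"),  # shadowed by rule 0
]
print(shadowed_rules(rules))  # [1]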
92

Network event detection with entropy measures

Eimann, Raimund E. A. January 2008 (has links)
Information measures may be used to estimate the amount of information emitted by discrete information sources. Network streams are an example of such discrete information sources. This thesis investigates the use of information measures for the detection of events in network streams. Starting with the fundamental entropy and complexity measures proposed by Shannon and Kolmogorov, it reviews a range of candidate information measures for network event detection, including algorithms from the Lempel-Ziv family and a relative newcomer, the T-entropy. Using network trace data from the University of Auckland, the thesis demonstrates experimentally that these measures are in principle suitable for the detection of a wide range of network events. Several key parameters influence the detectability of network events with information measures, including the amount of data considered in each traffic sample and the choice of observables. Among other experiments, a study of the entropy behaviour of individual observables in event and non-event scenarios investigates how these parameters may be optimised. The thesis also examines the impact of some of the detected events on different information measures, which motivates a discussion of the sensitivity of the various measures. A set of experiments demonstrating multi-dimensional network event classification with multiple observables and multiple information measures concludes the thesis.
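
As a loose sketch of the windowed-entropy idea described above, the Python fragment below computes the Shannon entropy of one observable (here, packet source addresses) over fixed-size, non-overlapping windows; an abrupt change in the resulting series would mark a candidate event. The window size and the choice of observable are illustrative assumptions, not the settings used in the thesis experiments.

# Sketch: Shannon entropy of an observable over fixed-size traffic windows.
# A sudden rise or drop in the resulting series is a candidate network event.
# Window size and observable are illustrative assumptions only.
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits per symbol) of a sequence of observable values."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def entropy_series(observables, window=1000):
    """Entropy of each non-overlapping window of `window` consecutive values."""
    return [shannon_entropy(observables[i:i + window])
            for i in range(0, len(observables) - window + 1, window)]

# Example: source addresses extracted from a packet trace (synthetic here).
srcs = ["10.0.0.1"] * 900 + ["10.0.0.%d" % i for i in range(100)]
print(entropy_series(srcs, window=500))  # low entropy, then a jump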
96

Data processing for anomaly detection in web-based applications

Gaarudapuram Sriraghavan, Rajagopal 2008 (has links)
Thesis (M.S.)--Oregon State University, 2008. Printout. Includes bibliographical references (leaves 53-57). Also available on the World Wide Web.
97

Outlier detection by network flow

Liu, Ying. January 2007 (has links) (PDF)
Thesis (Ph. D.)--University of Alabama at Birmingham, 2007. Additional advisors: Elliot J. Lefkowitz, Kevin D. Reilly, Robert Thacker, Chengcui Zhang. Description based on contents viewed Feb. 7, 2008; title from title screen. Includes bibliographical references (p. 125-132).
98

GENERTIA: a system for vulnerability analysis, design and redesign of immunity-based anomaly detection system

Hou, Haiyu, Dozier, Gerry V. January 2006 (has links) (PDF)
Dissertation (Ph.D.)--Auburn University, 2006. Abstract. Vita. Includes bibliographic references (p. 149-156).
99

Industry-Specific Discretionary Accruals and Earnings Management

January 2011 (has links)
In this dissertation, I examine the source of some of the anomalous capital market outcomes that have been documented for firms with high accruals. Chapter 2 develops and implements a methodology that decomposes a firm's discretionary accruals into a firm-specific and an industry-specific component. I use this decomposition to investigate which component drives the subsequent negative returns associated with firms with high discretionary accruals. My results suggest that these abnormal returns are driven by the firm-specific component of discretionary accruals. Moreover, although industry-specific discretionary accruals do not directly contribute towards this anomaly, I find that it is precisely when industry-specific discretionary accruals are high that firms with high firm-specific discretionary accruals subsequently earn these negative returns. While consistent with irrational mispricing or a rational risk premium associated with high discretionary accruals, these findings also support a transactions-cost-based explanation for the accruals anomaly, whereby search costs associated with distinguishing between value-relevant and manipulative discretionary accruals can induce investors to overlook potential earnings manipulation. Chapter 3 extends the decomposition to examine the role of firm-specific and industry-specific discretionary accruals in explaining the subsequent market underperformance and negative analysts' forecast errors documented for firms issuing equity. I examine the post-issue market returns and analysts' forecast errors for a sample of seasoned equity issues between 1975 and 2004 and find that offering-year firm-specific discretionary accruals can partially explain these anomalous capital market outcomes. Nonetheless, I find this predictive power of firm-specific accruals to be more pronounced for issues that occurred during 1975-1989 than for issues taking place between 1990 and 2004. Additionally, I find no evidence that investors and analysts are more overoptimistic about the prospects of issuers that have both high firm-specific and high industry-specific discretionary accruals (compared to firms with high discretionary accruals in general). The results indicate no role for industry-specific discretionary accruals in explaining overoptimistic expectations from seasoned equity issues and suggest the importance of firm-specific factors in inducing earnings manipulation surrounding equity issues. Dissertation/Thesis. Ph.D. Business Administration, 2011.
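
For readers unfamiliar with the decomposition idea, the sketch below shows one plausible way such a split could be computed: take the industry-year median discretionary accrual as the industry-specific component and the residual as the firm-specific component. This is only an assumed illustration; the dissertation's actual estimator in Chapter 2 is not reproduced here and may differ.

# Assumed sketch of a firm/industry split of discretionary accruals:
# industry-specific component = industry-year median, firm-specific = residual.
# This is an illustration only, not the dissertation's estimator.
from collections import defaultdict
from statistics import median

def decompose(accruals):
    """accruals: list of dicts with keys 'firm', 'industry', 'year', 'da'.
    Returns the same records with 'industry_da' and 'firm_da' added."""
    by_group = defaultdict(list)
    for row in accruals:
        by_group[(row["industry"], row["year"])].append(row["da"])
    out = []
    for row in accruals:
        ind = median(by_group[(row["industry"], row["year"])])
        out.append({**row, "industry_da": ind, "firm_da": row["da"] - ind})
    return out

sample = [
    {"firm": "A", "industry": "tech", "year": 2004, "da": 0.08},
    {"firm": "B", "industry": "tech", "year": 2004, "da": 0.02},
    {"firm": "C", "industry": "tech", "year": 2004, "da": -0.01},
]
for r in decompose(sample):
    print(r["firm"], round(r["industry_da"], 3), round(r["firm_da"], 3))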
100

Ensuring a Valid Source and Destination for Internet Traffic

Ehrenkranz, Toby January 2012 (has links)
The Internet has become an indispensable resource for today's society. It is at the center of today's business, entertainment, and social world. However, the core of our identities on the Internet, the IP addresses used to send and receive data across the Internet, is insecure. Attackers today are able to send data purporting to be from nearly any location (IP spoofing) and to reroute data destined for victims to the attackers themselves (IP prefix hijacking). Victims of these attacks may experience denial of service, misplaced blame, and theft of their traffic. These attacks are of the utmost importance since they affect the core layer of the Internet. Although the mechanisms of the attacks are different, they are essentially two sides of the same coin: spoofing attacks forge the identity of the sender, while hijacking attacks forge the identity of the receiver. Both revolve around the same underlying lack of a secure identity on the Internet. This research reviews the existing state-of-the-art IP spoofing and IP prefix hijacking research and proposes new defenses to close the remaining gaps and provide a new level of security for our identities on the Internet. This material is based upon work supported by the National Science Foundation under Grants No. CNS-0520326 and CNS-1118101. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. This dissertation includes both previously published/unpublished and co-authored material.
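
As a loose illustration of one family of anti-spoofing defences that the dissertation reviews (ingress-style source-address validation, not the thesis's own proposal), the sketch below accepts a packet only if its source address falls within a prefix expected on the interface it arrived from. The interface names and prefixes are made up for the example.

# Sketch of ingress-style source-address validation: a packet arriving on an
# interface is accepted only if its source address lies within a prefix
# expected behind that interface. Illustrates the general anti-spoofing idea,
# not the specific defences proposed in the dissertation.
from ipaddress import ip_address, ip_network

EXPECTED_PREFIXES = {
    "eth0": [ip_network("192.0.2.0/24")],     # customer-facing interface (assumed)
    "eth1": [ip_network("198.51.100.0/24")],  # second customer interface (assumed)
}

def source_is_valid(interface, src):
    """True if `src` is an address we expect to originate behind `interface`."""
    addr = ip_address(src)
    return any(addr in prefix for prefix in EXPECTED_PREFIXES.get(interface, []))

print(source_is_valid("eth0", "192.0.2.55"))   # True: matches the expected prefix
print(source_is_valid("eth0", "203.0.113.9"))  # False: likely spoofed on eth0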
