201

Flexible authorizations in workflow management systems

Lui, W. C., 雷永祥. January 2002 (has links)
published_or_final_version / Computer Science and Information Systems / Master / Master of Philosophy
202

METHODOLOGY FOR THE OPTIMIZATION OF RESOURCES IN THE DETECTION OF COMPUTER FRAUD.

DUNN, THURMAN STANLEY. January 1982 (has links)
A methodology is proposed for optimizing the allocation of resources in the detection of computer fraud. The methodology consists of four major segments. First, a threat assessment is performed. A general threat assessment is provided which relies upon reported incidents of computer fraud. Then, recognizing the limitations of computer fraud reporting, a specific threat assessment technique is provided which is based entirely on the characteristics of a given computer system. Both the general and specific threat assessment techniques use a matrix approach which evaluates and assigns threat values by type of computer fraud and perpetrator. Second, a Detection Quotient is established which measures the effectiveness of computer fraud detection resource allocation for all of the possible combinations of computer fraud types and perpetrators. However, for many computer systems, the large number of possible resource allocation alternatives results in a Combinatorial Dilemma whereby the phenomenally large number of alternatives precludes comprehensive analysis. This leads to the third major segment of the dissertation, a General Solution to the Combinatorial Dilemma which ensures an alternative very near the optimum while evaluating only an extremely small percentage of possible alternatives. Fourth, a Resource Optimization Model is provided which, beginning with the results of the Threat Assessment, iteratively assigns varying levels of computer fraud detection resources to different fraud type and perpetrator combinations. Using the general solution to the Combinatorial Dilemma and the Detection Quotient as a measure of the effectiveness of each combination, the model produces a statistically defensible near optimum allocation of available resources to computer fraud detection. Also provided are the results of the research into reported cases of fraud in the form of a Typology. This Typology combines frequency of occurrence and dollar impact of reported cases of fraud into a measure of vulnerability for various types of fraud and perpetrator. Finally, an overview of investigative techniques and automated tools for evaluating the propriety of computer systems is provided.
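The iterative allocation idea can be illustrated with a minimal sketch (hypothetical threat values and a simple diminishing-returns detection function, not the dissertation's actual Detection Quotient):

```python
# Hypothetical threat matrix: threat value per (fraud type, perpetrator) cell.
threats = {
    ("input_manipulation", "clerk"): 9.0,
    ("program_alteration", "programmer"): 7.5,
    ("data_theft", "operator"): 4.0,
    ("output_diversion", "manager"): 2.5,
}

def detection_gain(threat, units):
    """Assumed diminishing-returns contribution of 'units' resources to detection."""
    return threat * (1 - 0.7 ** units)

def allocate(budget):
    """Greedily assign one resource unit at a time to the cell with the largest
    marginal gain -- a near-optimal heuristic that avoids enumerating every
    combination (the 'Combinatorial Dilemma')."""
    alloc = {cell: 0 for cell in threats}
    for _ in range(budget):
        best = max(threats, key=lambda c: detection_gain(threats[c], alloc[c] + 1)
                                          - detection_gain(threats[c], alloc[c]))
        alloc[best] += 1
    return alloc

print(allocate(10))
```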
203

PERSONAL PRIVACY IN A COMPUTER INFORMATION SOCIETY.

ESQUERRA, RONALD LEE. January 1982 (has links)
Americans live in a service-oriented, computer-based society whose collective marketplace is fueled by the collection, use, exchange, and storage of information about people by government and business institutions. Consequently, individuals have fewer face-to-face contacts in their relationships with these institutions while more decisions affecting their everyday lives are being made by strangers based upon information maintained in computer data systems. This being so, public concern about privacy, specifically the potential abuse and misuse of personal information by government and business, has increased substantially in recent years. There also exists the constant threat of information technology outstripping existing legal frameworks and outpacing the privacy expectations of citizens. More than ever, government and business policy makers will face the dilemma of balancing the legitimate needs of institutions for information about people with the privacy standing of the individual. Knowledge of public views is essential to this task. The purpose of this opinion research study is to learn the views of Arizona residents regarding their personal privacy and relationships with select privacy-intensive public and private institutions. The results provide empirical data for the privacy protection deliberations of the government and business policy makers who practice within Arizona. The results show personal privacy as an issue of serious public concern, with Arizona residents requesting further government laws and business policies and practices to protect their privacy. Arizona residents recognize the legitimate information needs of government and business institutions, but they expect protections against unwelcome, unfair, improper, and excessive collection and dissemination of personal information about them. Computers are perceived as threats to personal privacy, suggesting that if institutions expect to continue the widespread application of computers, measures must be taken to assure the public that the personal information stored in such systems is safeguarded from abuse and misuse. The results also show a direct relationship between the degree of alienation or estrangement which individuals feel from government and business institutions and their attitudes toward privacy issues and their perception of computer benefits and dangers. Consequently, changing such attitudes will require sound measures.
204

THE SOCIOLOGICAL IMPACT OF THE FAMILY EDUCATIONAL RIGHTS AND PRIVACY ACT ON AN INSTITUTION OF HIGHER EDUCATION.

Sparrow, Alice Pickett, 1939- January 1985 (has links)
No description available.
205

On Cross-Layer Design of Distributed MIMO Spatial Multiplexing Compliant Wireless Ad hoc Networks

LI, YIHU 18 October 2013 (has links)
IEEE 802.11n Wireless Local Area Networks (WLANs) employ Multiple-Input-Multiple-Output (MIMO), which significantly boosts the raw data rate at the Physical layer (PHY). However, using MIMO to enhance Medium Access Control (MAC) layer efficiency is still in its early stages and is the focus of the research in this thesis. Many existing works in this field employ distributed MIMO spatial multiplexing/Multi-User Detection (MUD) techniques and stream sharing to enable multiple simultaneous transmissions. Most of these works require synchronization among multiple transmissions, split the channel, and target single-hop networks. In this thesis, a novel Hybrid Carrier Sense (HCS) framework is proposed, mainly at the MAC layer, to exploit the power of MIMO. HCS senses channel availability jointly through virtual carrier sense and physical carrier sense. HCS does not require synchronization among nodes; each node independently and locally determines when to start its transmission. HCS not only shares the channel but also exploits the bi-directional handshakes of wireless transmissions to increase the number of simultaneous stream transmissions. For a network with M antennas in each node, HCS can accommodate 2x(M-1) streams instead of the M streams achieved by existing works. Moreover, HCS is aimed at multi-hop wireless ad hoc networks, in which the hidden terminal, exposed terminal, and deafness problems greatly degrade network performance; the HCS framework incorporates solutions to these problems. HCS is implemented in the NS2 network simulator, and the performance evaluation shows that it significantly outperforms MIMO-enabled IEEE 802.11 (in which MIMO is only used to enhance the raw data rate at the physical layer), resulting in higher aggregate throughput, packet delivery ratio, and fairness in multi-hop wireless ad hoc networks. The HCS framework is applicable to future generations of wireless networks and opens up further research possibilities. Some ideas in the HCS framework apply not only to MIMO but also to many other techniques surveyed in this thesis, or they may be combined with HCS to further boost network performance. / Thesis (Ph.D, Electrical & Computer Engineering) -- Queen's University, 2013-10-15 21:46:15.983
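As a rough illustration of the claimed capacity gain (a back-of-the-envelope comparison only, not a simulation of HCS):

```python
# Streams supported per collision domain for nodes with M antennas:
# conventional distributed spatial-multiplexing schemes reach M streams,
# while HCS is reported to reach 2*(M-1) by also sharing the
# bi-directional handshake.
for m in (2, 3, 4, 8):
    print(f"M={m}: conventional={m}, HCS={2 * (m - 1)}")
```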
206

Automatic reconstruction and analysis of security policies from deployed security components

Martinez, Salvador 30 June 2014 (has links) (PDF)
Security is a critical concern for any information system. Security properties such as confidentiality, integrity and availability need to be enforced in order to make systems safe. In complex environments, where information systems are composed of a number of heterogeneous subsystems, each subsystem plays a key role in the global system security. For the specific case of access control, access-control policies may be found in several components (databases, networks and applications), all, supposedly, working together. Nevertheless, since these policies are most often implemented manually and/or evolved separately, they easily become inconsistent. In this context, discovering and understanding which security policies are actually being enforced by the information system becomes a critical necessity. The main challenge is bridging the gap between the vendor-dependent security features and a higher-level representation that expresses these policies in a way that abstracts from the specificities of concrete system components, and is thus easier to understand and reason about. This high-level representation would also allow us to implement all evolution/refactoring/manipulation operations on the security policies in a reusable way. In this work we propose such a reverse engineering and integration mechanism for access-control policies. We rely on model-driven technologies to achieve this goal.
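A minimal sketch of the reverse-engineering idea, assuming hypothetical SQL GRANT statements as the vendor-specific input and a simple (subject, action, resource) triple as the higher-level representation (the thesis itself relies on model-driven technologies):

```python
import re

# Hypothetical input: vendor-specific artifacts (here, SQL GRANT statements)
# harvested from a deployed database.
grants = [
    "GRANT SELECT, UPDATE ON payroll TO alice;",
    "GRANT SELECT ON payroll TO bob;",
]

GRANT_RE = re.compile(r"GRANT\s+(.+?)\s+ON\s+(\w+)\s+TO\s+(\w+);", re.IGNORECASE)

def extract_rules(statements):
    """Lift low-level GRANTs into abstract (subject, action, resource) rules."""
    rules = set()
    for stmt in statements:
        m = GRANT_RE.match(stmt.strip())
        if not m:
            continue
        actions, resource, subject = m.groups()
        for action in (a.strip().lower() for a in actions.split(",")):
            rules.add((subject.lower(), action, resource.lower()))
    return rules

print(sorted(extract_rules(grants)))
```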
207

Towards a localisation of trust framework for pervasive environments

Li, Jun January 2008 (has links)
Pervasive computing envisions an environment in which we are surrounded by many embedded computer devices. The existence of those networked devices provides us with a mobile, spontaneous and dynamic way to access various resources provided by different (security policy) domains. In recent years, we have witnessed the evolutionary development of numerous multiple domain applications. One of the richest examples is pervasive environments. Typically, the conventional approach to secure access over multiple domains is to implement a unique trusted infrastructure, extending local identity or capability based security systems and combining them with cross-domain authentication mechanisms. However, this does not adequately meet the security requirements of communicating with unknown players in pervasive environments. Moreover, it is infeasible to define a global trust infrastructure and a set of assumptions that every player will trust in the multiple domain context. A powerful design technique to address those new security challenges posed by pervasive environments is to understand them from a domain perspective. This thesis presents Localisation of Trust (LoT), an architectural framework designed to address the security need of how to talk to the correct strangers in pervasive environments. Based on the localising trust security principle, LoT provides a generic platform for building access control over multiple domains from two ends: authentication and authorisation. Firstly, LoT proposes a two-channel authentication protocol to replace traditional (strong) identity-based authentication protocols by exploring desirable contextual information for different pervasive applications. Then, delegation and localised authentication are deployed to achieve authorisation in pervasive environments. The heart of this different semantics is to let the right domain get involved with its local players’ interactions by helping them to convert a “token” into a usable access capability, whilst keeping revocation in mind. This is done by introducing a domain-oriented Encryption-Based Access Control method, using ideas borrowed from Identity-based Encryption. The second part of this thesis describes several specific mechanisms and protocols, including a Dual Capabilities Model, to achieve the required anti-properties for LoT. Although novel, they are intended primarily as an existence proof rather than being claimed to be ideal. Depending upon the precise application and context, other mechanisms may be better. Most importantly, the architecture-focused LoT provides such flexibility by introducing multiple domains as a primary concern while leaving untouched the security protocols underlying each single domain and system implementation. Finally, a single domain scenario, guest access, is examined in the light of LoT. The purpose of doing so is to enhance the understanding of domain and other concepts described in LoT and to demonstrate the effectiveness and efficiency of LoT for the scenarios chosen.
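A minimal sketch of the domain-mediated token-to-capability conversion, with an HMAC standing in for the thesis's Identity-based Encryption construction and hypothetical names throughout:

```python
import hmac, hashlib, os

class Domain:
    """Sketch only: the domain converts a presented token into a usable
    capability and can later revoke it. LoT's actual method is a
    domain-oriented Encryption-Based Access Control scheme; an HMAC is
    used here purely for illustration."""
    def __init__(self):
        self._key = os.urandom(32)   # domain-local secret
        self._revoked = set()        # revocation kept in mind

    def issue_capability(self, token: str, resource: str) -> bytes:
        if token in self._revoked:
            raise PermissionError("token revoked")
        msg = f"{token}|{resource}".encode()
        return hmac.new(self._key, msg, hashlib.sha256).digest()

    def check(self, token: str, resource: str, capability: bytes) -> bool:
        if token in self._revoked:
            return False
        msg = f"{token}|{resource}".encode()
        expected = hmac.new(self._key, msg, hashlib.sha256).digest()
        return hmac.compare_digest(expected, capability)

d = Domain()
cap = d.issue_capability("guest-42", "printer")
print(d.check("guest-42", "printer", cap))  # True
```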
208

Border monitoring based on a novel PIR detection model

Dikmen, Iskender 03 1900 (has links)
Approved for public release, distribution unlimited / Improvements in technology have enabled the development of cost-effective, low-power, multifunctional wireless sensor nodes, which are used in various applications including surveillance and intrusion detection. We conducted experiments to determine the detection probability of the Crossbow MSP410 mote sensor nodes. Based on the observed probabilities, we developed a new PIR detection model for the MSP410 mote sensor nodes, which has a high-probability detection region and a low-probability detection region. The PIR model is used in a proposed sensor placement strategy for MSP410 mote sensor nodes intended for a border monitoring scenario. In the proposed placement strategy, the detection probability of the low-probability region of the new PIR detection model is increased by overlapping it with the low-probability regions of neighboring sensor nodes.
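Assuming independent detections, the benefit of overlapping low-probability regions can be sketched as follows (the probability values are hypothetical, not the measured MSP410 figures):

```python
# If a target crosses the overlap of several sensors' low-probability regions
# and the sensors detect independently, the miss probabilities multiply, so
# the overall detection probability rises.
def combined_detection(probabilities):
    miss = 1.0
    for p in probabilities:
        miss *= (1.0 - p)
    return 1.0 - miss

print(combined_detection([0.4]))        # single low-probability region: 0.4
print(combined_detection([0.4, 0.4]))   # two overlapping regions: 0.64
```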
209

Model kontekstno zavisne kontrole pristupa u poslovnim sistemima / Context Sensitive Access Control Model for Business Processes

Sladić Goran 07 April 2011 (has links)
Access control, or authorisation in the broader sense, is concerned with the way in which users can access resources in a computer system and how they may use them. This dissertation focuses on problems of access control in business systems. Its subject is a formal specification of an RBAC-based, context-sensitive access control model for business processes. By using context-sensitive access control it is possible to define more complex access control policies whose implementation in existing access control models for business processes is either not possible or very complicated. The model is applicable in different business systems and supports the definition of access control policies for both simple and complex business processes. The model was verified on two real business processes using a developed prototype. The presented prototype implementation, which fulfils the functional goals set for the system, confirms the practical value of the proposed model.
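A minimal sketch of RBAC extended with context conditions, using hypothetical roles, permissions and context attributes rather than the dissertation's formal model:

```python
from dataclasses import dataclass
from datetime import time

@dataclass(frozen=True)
class Permission:
    action: str
    resource: str

# Classic RBAC part: roles map to sets of permissions.
role_permissions = {
    "clerk": {Permission("read", "invoice")},
    "manager": {Permission("read", "invoice"), Permission("approve", "invoice")},
}

def context_allows(permission, context):
    """Context-sensitive part: example rule restricting approvals to
    on-site requests during business hours."""
    if permission.action == "approve":
        return context["on_site"] and time(8) <= context["time"] <= time(18)
    return True

def check_access(role, permission, context):
    return permission in role_permissions.get(role, set()) and \
           context_allows(permission, context)

ctx = {"on_site": True, "time": time(10, 30)}
print(check_access("manager", Permission("approve", "invoice"), ctx))  # True
print(check_access("clerk", Permission("approve", "invoice"), ctx))    # False
```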
210

The formalisation and transformation of access control policies

Slaymaker, Mark Arthur January 2011 (has links)
Increasing amounts of data are being collected and stored relating to every aspect of an individual's life, ranging from shopping habits to medical conditions. This data is increasingly being shared for a variety of reasons, from providing vast quantities of data to validate the latest medical hypothesis, to supporting companies in targeting advertising and promotions to individuals that fit a certain profile. In such cases, the data being used often comes from multiple sources --- with each of the contributing parties owning, and being legally responsible for, their own data. Within such models of collaboration, access control becomes important to each of the individual data owners. Although they wish to share data and benefit from information that others have provided, they do not wish to give away the entirety of their own data. Rather, they wish to use access control policies that give them control over which aspects of the data can be seen by particular individuals and groups. Each data owner will have access control policies that are carefully crafted and understood --- defined in terms of the access control representation that they use, which may be very different from the model of access control utilised by other data owners or by the technology facilitating the data sharing. Achieving interoperability in such circumstances would typically require the rewriting of the policies into a uniform or standard representation --- which may give rise to the need to embrace a new access control representation and/or the utilisation of a manual, error-prone, translation. In this thesis we propose an alternative approach, which embraces heterogeneity, and establishes a framework for automatic transformations of access control policies. This has the benefit of allowing data owners to continue to use their access control paradigm of choice. Of course, it is important that the data owners have some confidence in the fact that the new, transformed, access control policy representation accurately reflects their intentions. To this end, the use of tools for formal modelling and analysis allows us to reason about the translation, and demonstrate that the policies expressed in both representations are equivalent under access control requests; that is, for any given request both access control mechanisms will give an equivalent access decision. For the general case, we might propose a standard intermediate access control representation with transformations to and from each access control policy language of interest. However, for the purpose of this thesis, we have chosen to model the translation between role-based access control (RBAC) and the XML-based policy language, XACML, as a proof of concept of our approach. In addition to the formal models of the access control mechanisms and the translation, we provide, by way of a case study, an example of an implementation which performs the translation. The contributions of this thesis are as follows. First, we propose an approach to resolving issues of authorisation heterogeneity within distributed contexts, with the requirements being derived from nearly eight years of work in developing secure, distributed systems. Our second contribution is the formal description of two popular approaches to access control: RBAC and XACML. Our third contribution is the development of an Alloy model of our transformation process. 
Finally, we have developed an application that validates our approach, and supports the transformation process by allowing policy writers to state, with confidence, that two different representations of the same policy are equivalent.
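A minimal sketch of the transformation idea, mapping a hypothetical RBAC policy to simplified XACML-flavoured rules (not the thesis's formally verified mapping or the full XACML schema):

```python
from xml.sax.saxutils import escape

# Hypothetical RBAC policy: role -> set of (action, resource) permissions.
rbac = {
    "researcher": {("read", "trial-data")},
    "clinician": {("read", "trial-data"), ("write", "patient-record")},
}

def to_xacml_like(rbac_policy):
    """Emit one simplified, XACML-flavoured rule per RBAC permission so the
    same request yields the same decision under either representation."""
    rules = []
    for role, perms in sorted(rbac_policy.items()):
        for action, resource in sorted(perms):
            rules.append(
                f'<Rule Effect="Permit">'
                f'<Subject role="{escape(role)}"/>'
                f'<Resource id="{escape(resource)}"/>'
                f'<Action id="{escape(action)}"/>'
                f'</Rule>'
            )
    return "<Policy>" + "".join(rules) + "</Policy>"

print(to_xacml_like(rbac))
```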
