41

Database security in the cloud / Databassäkerhet i molnet

Sakhi, Imal January 2012 (has links)
The aim of this thesis is to get an overview of the database services available in the cloud computing environment, investigate the security risks associated with them, and propose possible countermeasures to minimize those risks. The thesis also analyzes two cloud database service providers, namely Amazon RDS and Xeround. These two providers were chosen because they are currently among the leading cloud database providers and both offer relational cloud databases, which makes the comparison meaningful. The focus of the analysis has been to provide an overview of their database services as well as the available security measures. A guide has been appended at the end of the report to help with the technical configuration of database migration and of connecting applications to the databases for the two cloud database providers mentioned. The thesis has been conducted on behalf of the Swedish Armed Forces, and after reviewing the security risks associated with cloud databases, it is recommended that the Armed Forces refrain from using public cloud database services. Security deficiencies such as vague physical security and access control procedures, the unavailability of preferred monitoring tools, and, most importantly, the absence of proper encryption and key management schemes make public cloud database services unsuitable for an authority such as the Armed Forces. The recommended solutions are therefore either to use a jointly owned community cloud database solution for less confidential data only, or to use an on-premise private cloud database solution for all but TOP SECRET classified data.     Keywords: Cloud computing, cloud database services, Swedish Armed Forces, security risks, Xeround, Amazon RDS
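As an illustration of the kind of application-connection configuration the appended guide covers, the following is a minimal Python sketch of connecting to a managed cloud MySQL instance with server-certificate verification enforced. The endpoint, credentials, and CA-bundle path are hypothetical placeholders, and the sketch is not taken from the thesis's guide.

```python
# Minimal sketch: connecting an application to a managed MySQL instance
# (e.g. Amazon RDS) with server-certificate verification enforced.
# Hostname, credentials, and CA-bundle path are placeholders.
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(
    host="mydb.example-id.eu-north-1.rds.amazonaws.com",  # hypothetical endpoint
    user="app_user",
    password="app_password",          # in practice, read from a secrets store
    database="inventory",
    ssl_ca="/etc/ssl/certs/rds-ca-bundle.pem",  # provider's CA bundle
    ssl_verify_cert=True,             # refuse unverified connections
)

cur = conn.cursor()
cur.execute("SELECT VERSION()")
print(cur.fetchone())
conn.close()
```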
42

Mapping out the Key Security Components in Relational Databases (MK-SCoRe) : Enhancing the Security of Relational Database Technology / Kartläggning av Nyckelkomponenter för Säkerhet i Relationsdatabaser (MK-SCoRe) : Förbättring av Säkerheten i Relationsdatabasteknik

Alobaidi, Murtadha, Trabulsiah, Abdullah January 2024 (has links)
Relational database security has become an increasingly important issue for organizations worldwide in the current era of data-driven operations. This thesis addresses the urgent need for comprehensive knowledge of the security components of relational databases. Database security is constantly improving, but there is still a lack of research that analyzes these important factors. Because of this gap, databases are not sufficiently secured against new cyber threats, which endangers their availability, confidentiality, and integrity. The problem that the thesis addresses is the lack of comprehensive research covering all key security components in relational databases, which presents a challenge for organizations seeking to comprehensively secure their database systems. The purpose of this thesis is to systematically map the key security components essential to relational databases. The goal is to assist organizations and database professionals in securing their relational databases against diverse cyber threats. Using a qualitative and exploratory methodology, the research analyzes a wide range of literature on database security. By integrating theoretical study with structured interviews, it offers a balanced and comprehensive perspective on the current security landscape in relational databases. This approach ensures that all essential security components are fully investigated. The results of this thesis comprise a detailed mapping of the key security components within relational databases, uniquely informed by a combination of academic research and empirical findings from structured interviews with database security experts. The thesis analyzes these security components based on how well they address current security threats, how well they secure databases, and how well they can adapt to different organizational needs.
43

A Robust Data Obfuscation Technique for Privacy Preserving Collaborative Filtering

Parameswaran, Rupa 10 May 2006 (has links)
Privacy is defined as freedom from unauthorized intrusion. The availability of personal information through online databases, such as government records, medical records, and voters' lists, poses a threat to personal privacy. The concern over individual privacy has led to the development of legal codes for safeguarding privacy in several countries. However, the ignorance of individuals, as well as loopholes in the systems, has led to information breaches even in the presence of such rules and regulations. Protecting data privacy requires modification of the data itself. The term "data obfuscation" is used to refer to the class of algorithms that modify the values of data items without distorting the usefulness of the data. The main goal of this thesis is the development of a data obfuscation technique that provides robust privacy protection with minimal loss in the usability of the data. Although medical and financial services are two of the major areas where information privacy is a concern, privacy breaches are not restricted to these domains. One area where concern over data privacy is of growing interest is collaborative filtering. Collaborative filtering systems are widely used in e-commerce applications to provide recommendations to users about products that might be of interest to them. The prediction accuracy of these systems depends on the size and accuracy of the data provided by users. However, the lack of sufficient guidelines governing the use and distribution of user data raises concerns over individual privacy. Users often provide only the minimal information required for accessing these e-commerce services, and the lack of rules governing the use and distribution of data prevents the sharing of data among different communities for collaborative filtering. The goals of this thesis are (a) the definition of a standard for classifying data obfuscation (DO) techniques, (b) the development of a robust cluster-preserving data obfuscation algorithm, and (c) the design and implementation of a privacy-preserving shared collaborative filtering framework using the data obfuscation algorithm.
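As a generic illustration of value-based data obfuscation for collaborative filtering (not the cluster-preserving algorithm developed in the thesis), the following is a minimal Python sketch that perturbs a user's ratings with zero-mean Gaussian noise before they are shared.

```python
# Toy illustration of value obfuscation for collaborative filtering:
# each user perturbs their ratings with zero-mean Gaussian noise before
# sharing them, so individual values are masked while aggregate patterns
# (means, similarities) are roughly preserved. Generic example only,
# not the cluster-preserving algorithm developed in the thesis.
import random

def obfuscate_ratings(ratings, sigma=0.5, lo=1.0, hi=5.0):
    """Return a noisy copy of a user's rating vector, clamped to the scale."""
    noisy = []
    for r in ratings:
        perturbed = r + random.gauss(0.0, sigma)
        noisy.append(min(hi, max(lo, perturbed)))
    return noisy

original = [4.0, 2.0, 5.0, 3.0]
print(obfuscate_ratings(original))   # e.g. [4.3, 1.6, 5.0, 3.2]
```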
44

Implementation Of Database Security Features Using Bit Matrices

Gopal, K 04 1900 (has links)
Information security is of utmost concern in a multiuser environment. The importance of security is felt much more with the widespread use of distributed databases. Information is by itself a critical resource of an enterprise, and thus the successful operation of an enterprise demands that data be made accessible only to authorized users and that the data reflect the state of the enterprise. Since many databases are online and accessed by multiple users concurrently, special mechanisms are needed to ensure the integrity and security of relevant information. This thesis describes a model for computer database security that supports a wide variety of security policies. The terms security policy and security mechanism are presented in Chapter 1. The interrelated topics of security and integrity are discussed in some detail, and the importance of, and means of ensuring, the security of information are also presented in this chapter. In Chapter 2, the work done in the field of computer security and related topics is presented. In general, computer security models can be classified broadly under two categories: (1) models based on the access control matrix and (2) models based on information flow control. The development of models based on these two schemes, as well as the policies supported by some of them, is presented in this chapter. A brief description of the work carried out in database security, along with definitions of related terms, is given in Chapter 3. The interrelationship between operating system security and database security is also presented in this chapter. In general, the database security mechanism depends on the existing operating system; database security mechanisms are thus only as strong as the underlying operating system on which they are built. The various schemes used for implementing database security, such as access controllers and capability lists, are described in this chapter. In Chapter 4, a model for database security is described. The model provides for (a) delegation of access rights by a user and (b) revocation of access rights previously granted by a user. In addition, algorithms for enforcing context-dependent and content-dependent rules are provided in this chapter. The context-dependent rules are stored as elements of a bit matrix; they can then be enforced by suitably manipulating the bit matrix and interpreting the values of its elements. The major advantage of representing the rules using bit matrices is that the matrix itself can be maintained in main memory, so the time taken to examine whether a user is authorized to access an object is drastically reduced. The method presented in this chapter, in addition to reducing the time required for enforcing security, also provides a method for decentralized authorization control, a facility that is useful in a distributed database environment. Chapter 5 describes a simulation method that is useful for comparing the various security schemes. The tasks involved in the simulation are: (1) creation of an arrival (job); (2) placing the incoming job either in the wait queue or in the run state, depending on the type of access needed for the object; (3) checking that the user on whose behalf the job is being executed is authorized to access the object in the requested mode; (4) checking for the successful completion and termination of the job; and (5) collection of important parameters such as the number of jobs processed and the average connect time. The simulation was carried out to time both the access controller scheme and the bit matrix scheme, and the results bear out the fact that the bit matrix scheme provides the faster method. Six types of access were assumed to be permissible, three of them requiring shared locks and the rest requiring exclusive locks on the objects concerned; in addition, the only type of operation allowed was assumed to be accessing the objects. It is to be noted that the time taken to check for security violations is but one of the factors for rating a security system. In general, other factors such as the cost of implementing the security system and the flexibility it offers in enforcing security policies also have to be taken into account when comparing security systems. Finally, in Chapter 6, a comparison of the security schemes is made. In conclusion, the bit matrix approach is seen to provide the following features: (a) the time required to check whether an access request should be honoured is very small; (b) the time required to find all users accessing an object (i.e., accountability) is quite small; (c) the time required to find all objects accessible by a user is also quite small; (d) the scheme supports both decentralized and centralized authorization control; (e) mechanisms for delegation and revocation of access rights can be built in easily; and (f) the scheme supports content-dependent and context-dependent controls and also provides a means for enforcing history-dependent control. Finally, some recommendations for further study in the field of computer database security are presented.
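As a generic illustration of the bit matrix idea (not the thesis's exact data structures), the following is a minimal Python sketch in which each user's access rights are held as one integer bitmask that fits comfortably in main memory.

```python
# Minimal sketch of bit-matrix access control: one integer bitmask per user,
# where bit j is set if the user may access object j in a given mode.
# This illustrates the general idea only, not the thesis's exact scheme.

class BitMatrix:
    def __init__(self, num_users, num_objects):
        self.num_objects = num_objects
        self.rows = [0] * num_users        # one bitmask per user

    def grant(self, user, obj):
        self.rows[user] |= (1 << obj)

    def revoke(self, user, obj):
        self.rows[user] &= ~(1 << obj)

    def is_authorized(self, user, obj):
        # Single AND + test: fast because the whole matrix sits in main memory.
        return bool(self.rows[user] & (1 << obj))

    def users_of(self, obj):
        # "Accountability": all users currently authorized for an object.
        return [u for u, row in enumerate(self.rows) if row & (1 << obj)]

    def objects_of(self, user):
        return [o for o in range(self.num_objects) if self.rows[user] & (1 << o)]

m = BitMatrix(num_users=3, num_objects=4)
m.grant(0, 2)
m.grant(1, 2)
print(m.is_authorized(0, 2), m.users_of(2), m.objects_of(1))  # True [0, 1] [2]
```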
45

Apsaugos nuo SQL injekcijų el.verslo svetainėse metodikos sudarymas ir tyrimas / Development and research of method of protection against SQL injections in e-commerce websites

Ramoška, Aidas 04 November 2013 (has links)
The target of an SQL injection attack is interactive web applications that use database servers. These applications allow users to enter information, and from that input they build SQL queries that are sent to the database server. In an SQL injection attack, the attacker uses the input fields to craft a malicious SQL fragment that modifies the intended query. By exploiting SQL injection, the attacker may access confidential information, modify it, or bypass the authorisation logic and log in to the system without knowing a password. The security module proposed in this thesis intercepts all user input, applies security rules, and thereby improves protection against SQL injections in e-commerce web applications, while also logging potential attempts to disrupt the normal operation of the system. Installing the proposed security module requires no configuration of the server or its software, because installation changes only the files of the web application. The work was carried out using the PHP programming language and a MySQL database. The test results obtained during the study show which configuration parameters of the security module should be used to ensure the maximum level of security.
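The thesis implements its module in PHP against MySQL; as a language-neutral illustration of the core defence involved (keeping user input out of the SQL text), the following is a minimal Python sketch using parameterized queries. It is not the thesis's module, and sqlite3 stands in for a MySQL driver.

```python
# Minimal illustration of the defence SQL injection exploits the absence of:
# parameterized queries keep user input out of the SQL text, so a crafted
# value cannot change the query's structure. Generic example only, not the
# PHP/MySQL security module developed in the thesis.
import sqlite3  # stand-in for a MySQL driver; the same placeholder idea applies

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

def login(conn, name, password):
    # Vulnerable version (never do this):
    #   conn.execute(f"SELECT * FROM users WHERE name='{name}' AND password='{password}'")
    # Parameterized version: the driver binds values, so input stays data.
    cur = conn.execute(
        "SELECT * FROM users WHERE name = ? AND password = ?",
        (name, password),
    )
    return cur.fetchone() is not None

print(login(conn, "alice", "secret"))            # True
print(login(conn, "alice", "' OR '1'='1"))       # False: injection attempt fails
```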
46

Symmetric schemes for efficient range and error-tolerant search on encrypted data

Chenette, Nathan Lee 05 July 2012 (has links)
Large-scale data management systems rely more and more on cloud storage, where the need for efficient search capabilities clashes with the need for data confidentiality. Encryption and efficient accessibility are naturally at odds, as, for instance, strong encryption necessitates that ciphertexts reveal nothing about the underlying data. Searchable encryption is an active field in cryptography studying encryption schemes that provide varying levels of efficiency, functionality, and security, and efficient searchable encryption focuses on schemes enabling sub-linear (in the size of the database) search time. I present the first cryptographic study of efficient searchable symmetric encryption schemes supporting two types of search queries, range queries and error-tolerant queries. The natural solution to accommodate efficient range queries on ciphertexts is to use order-preserving encryption (OPE). I propose a security definition for OPE schemes, construct the first OPE scheme with provable security, and further analyze security by characterizing one-wayness of the scheme. Efficient error-tolerant queries are enabled by efficient fuzzy-searchable encryption (EFSE). For EFSE, I introduce relevant primitives, an optimal security definition and a (somewhat space-inefficient, but in a sense as efficient as possible) scheme achieving it, and more efficient schemes that achieve a weaker, but practical, security notion. In all cases, I introduce new appropriate security definitions, construct novel schemes, and prove those schemes secure under standard assumptions. The goal of this line of research is to provide constructions and provable security analysis that should help practitioners decide whether OPE or FSE provides a suitable efficiency-security-functionality tradeoff for a given application.
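As a toy illustration of the order-preserving property only (deliberately insecure, and not the provably secure construction from the thesis), the following is a minimal Python sketch showing why OPE ciphertexts support range queries without decryption.

```python
# Toy illustration of the order-preserving property: a secret, strictly
# increasing random mapping from plaintexts into a larger ciphertext space,
# so ciphertext comparisons answer range queries without decryption.
# NOT a secure construction and NOT the scheme from the thesis.
import random

def keygen(plaintext_space=256, ciphertext_space=2**16, seed=42):
    rng = random.Random(seed)                      # the seed acts as the key
    # Pick plaintext_space distinct ciphertexts and sort them: the i-th
    # plaintext maps to the i-th smallest ciphertext, preserving order.
    return sorted(rng.sample(range(ciphertext_space), plaintext_space))

def encrypt(key, m):
    return key[m]

key = keygen()
a, b, c = encrypt(key, 10), encrypt(key, 50), encrypt(key, 200)
assert a < b < c                 # order preserved
# A server storing only ciphertexts can answer "which records lie in [10, 200]?"
# with the test encrypt(key, 10) <= ct <= encrypt(key, 200), never decrypting.
```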
47

An Improved Utility Driven Approach Towards K-Anonymity Using Data Constraint Rules

Morton, Stuart Michael 14 August 2013 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / As medical data continues to transition to electronic formats, opportunities arise for researchers to use this microdata to discover patterns and increase knowledge that can improve patient care. Now more than ever, it is critical to protect the identities of the patients contained in these databases. Even after removing obvious “identifier” attributes, such as social security numbers or first and last names, that clearly identify a specific person, it is possible to join “quasi-identifier” attributes from two or more publicly available databases to identify individuals. K-anonymity is an approach that has been used to ensure that no one individual can be distinguished within a group of at least k individuals. However, the majority of proposed k-anonymity approaches have focused on improving the efficiency of the underlying algorithms; less emphasis has been placed on ensuring the “utility” of anonymized data from a researcher’s perspective. We propose a new data utility measurement, called the research value (RV), which extends existing utility measurements by employing data constraint rules that are designed to improve the effectiveness of queries against the anonymized data. To anonymize a given raw dataset, two algorithms are proposed that use predefined generalizations provided by the data content expert, and their corresponding research values, to assess an attribute’s data utility while generalizing the data to ensure k-anonymity. In addition, an automated algorithm is presented that uses clustering and the RV to anonymize the dataset. All of the proposed algorithms scale efficiently when the number of attributes in a dataset is large.
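As a generic illustration of the k-anonymity property that the proposed algorithms enforce (not the RV-driven algorithms themselves; the generalization rules and data below are hypothetical), the following is a minimal Python sketch.

```python
# Minimal sketch of the k-anonymity property: after generalizing
# quasi-identifiers (exact age -> age band, ZIP -> ZIP prefix), every
# combination of generalized values must occur at least k times.
# Generic illustration only, not the RV-driven algorithms of the thesis.
from collections import Counter

def generalize(record):
    # Hypothetical generalization rules supplied by a data content expert.
    low = (record["age"] // 10) * 10
    age_band = f"{low}-{low + 9}"
    zip_prefix = record["zip"][:3] + "**"
    return (age_band, zip_prefix)

def is_k_anonymous(records, k):
    groups = Counter(generalize(r) for r in records)
    return all(count >= k for count in groups.values())

data = [
    {"age": 34, "zip": "46202", "diagnosis": "A"},
    {"age": 37, "zip": "46218", "diagnosis": "B"},
    {"age": 52, "zip": "46033", "diagnosis": "C"},
    {"age": 55, "zip": "46077", "diagnosis": "A"},
]
print(is_k_anonymous(data, k=2))   # True: each generalized group holds 2 records
```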
