11

Data integrity in RFID systems

Alchazidis, Nikolaos. 09 1900
One of the main problems affecting the data integrity of passive RFID systems is collision between tags. A popular anticollision algorithm, which dominates the standards for HF and UHF passive RFID systems, is Framed Slotted Aloha (FSA), together with several variations of FSA. The throughput and average time delay of the RFID system, which determine its performance and efficiency, drop rapidly as the number of tags inside the interrogation zone increases, and using larger frame sizes is not always the solution. This thesis discusses and compares the existing protocols and proposes a variation of FSA called the Progressing Scanning (PS) algorithm. The PS algorithm divides the tags in the interrogation zone into smaller groups and lets the reader communicate with one group at a time. For performance analysis, the PS algorithm was evaluated with the parameters of a typical passive RFID system at 2.45 GHz. The results showed that the PS algorithm can improve the efficiency of the RFID system and provide a reliable solution for cases with a high density of tags in the area (over 800 tags).
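The abstract only summarizes the behaviour; as a minimal Monte Carlo sketch of why FSA throughput collapses at high tag density, and why splitting tags into smaller groups (the core of the PS idea) helps, consider the following. The frame sizes and tag counts are illustrative assumptions, not values from the thesis:

```python
import random

def fsa_frame(num_tags: int, frame_size: int) -> int:
    """One Framed Slotted Aloha frame: each tag replies in a random slot;
    only slots holding exactly one tag are read successfully."""
    slots = [0] * frame_size
    for _ in range(num_tags):
        slots[random.randrange(frame_size)] += 1
    return sum(1 for s in slots if s == 1)

def throughput(num_tags: int, frame_size: int, frames: int = 2000) -> float:
    reads = sum(fsa_frame(num_tags, frame_size) for _ in range(frames))
    return reads / (frames * frame_size)

# Throughput peaks when the tag count matches the frame size and then collapses;
# splitting 800 tags into groups near the frame size keeps each read close to
# the optimum, which is the effect the PS grouping exploits.
for n in (64, 256, 800):
    print(n, round(throughput(n, frame_size=256), 3))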
12

Projekt N10 : Project report

Simeon, Nika January 2007
Due to copyright restrictions, this paper is not available for download. The thesis describes a system that communicates in real time with data loggers. The system has been streamlined and integrated with an existing application so that each user can get a real-time graphical presentation of what has been sent to and from the units. The user interface and communication have been designed to be robust, user-friendly and secure, and to offer functionality that gives users of the system added value. The system is flexible from a design perspective and requires little maintenance.
14

Investigation of a strategy for implementing data integrity constraints in a large application system

Preibys, Justas 28 January 2008
This thesis reviews the types of data integrity constraints and the ways they can be implemented, analysing the advantages and disadvantages of each. For every constraint type, the most effective implementation technique is selected according to its run-time and implementation-complexity characteristics, and the hypotheses raised during the analysis are confirmed experimentally. Based on the experimental results, a methodology for selecting an implementation strategy for data integrity constraints is defined and described. It could help large enterprises implement data integrity constraints effectively, spending as little time, effort and system resources as possible while still ensuring data integrity and correctness.
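The abstract contains no code; as a generic sketch of the trade-off it studies, the following compares a declarative CHECK constraint with a trigger-based rule in SQLite. The accounts table and the withdrawal rule are invented for illustration only:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Declarative constraint: enforced by the engine itself on every write.
conn.execute("""
    CREATE TABLE accounts (
        id      INTEGER PRIMARY KEY,
        balance REAL NOT NULL CHECK (balance >= 0)
    )""")

# Procedural (trigger-based) constraint: more expressive, but adds per-row cost.
conn.execute("""
    CREATE TRIGGER limit_withdrawal
    BEFORE UPDATE ON accounts
    WHEN OLD.balance - NEW.balance > 1000
    BEGIN SELECT RAISE(ABORT, 'withdrawal limit exceeded'); END""")

conn.execute("INSERT INTO accounts VALUES (1, 5000)")

try:
    conn.execute("INSERT INTO accounts VALUES (2, -10)")  # violates the CHECK
except sqlite3.IntegrityError as err:
    print("rejected by CHECK:", err)

try:
    conn.execute("UPDATE accounts SET balance = 3500 WHERE id = 1")  # trips trigger
except sqlite3.IntegrityError as err:
    print("rejected by trigger:", err)
```

Both writes are rejected before any inconsistent data lands, illustrating the kind of run-time and complexity comparison the thesis performs across constraint types.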
15

A study regarding the effectiveness of game play as part of an information security awareness program for novices

Labuschagne, William Aubrey 09 1900
Technology has become intertwined with daily life, extending beyond personal life into the business world. Availability, integrity and confidentiality are critical information security factors to consider when interacting with technology, yet many unsuspecting users have fallen prey to cyber criminals. The majority of threats encountered could have been prevented if victims had sufficient knowledge to first identify and then mitigate them. Information security awareness programs provide a platform for informing users about such threats, but their success is significantly reduced if the content is not delivered in the most effective way to improve understanding and produce a change in behaviour. This dissertation addresses the effectiveness of using a gaming platform within an information security awareness program. Games allow users to apply knowledge within a realistic scenario, much as pilots use flight simulators; end users with no information security background gain a safe platform where threats can be identified and mitigation methods taught. A wide selection of security awareness frameworks exists, so the most appropriate framework was selected first, and its phases were then applied within the dissertation with the main objective of determining the effectiveness of games within security awareness programs. Data was collected during the implemented information security awareness program using quantitative instruments: questionnaires and an online game developed from the literature reviewed during the study. The analysed data highlighted the effects of extrinsic motivation on knowledge transfer and validated the positive impact of game play. / Computing / M. Tech. (Information Technology)
16

Development of Methods for Improved Data Integrity and Efficient Testing of Wind Tunnel Models for Dynamic Test Conditions in Unsteady and Nonlinear Flight Regimes

Heim, Eugene Henry DeWendt 05 February 2004
Today's high-performance aircraft operate in expanded flight envelopes, often maneuvering at high angular rates and high angles of attack, even above maximum lift. Current aerodynamic models are inadequate for predicting flight characteristics in the expanded envelope, such as rapid aircraft departures and other unusual motions, and unsteady flows are a real concern. The ability to accurately measure aerodynamic loads directly impacts the ability to accurately model and predict flight. Current wind tunnel testing techniques do not adequately address the fidelity of a test point under the influence of fluctuating loads and moments. Additionally, forced oscillation test techniques, one of the primary tools used to develop dynamic models, do not currently provide estimates of the uncertainty of the results during an oscillation cycle. Further, in testing models across a range of flight conditions, some parts of the envelope are well behaved and require few data points to arrive at a sound answer, while other parts are much more active and require a large sample of data to arrive at a statistically significant answer. Current test methods do not factor changes of flow physics into data acquisition schemes, so in many cases data are obtained over more iterations than required, or insufficient data are obtained to determine a valid estimate. Methods of providing a measure of data integrity for static and forced oscillation test techniques are presented with examples, along with a method for optimizing the required forced oscillation cycles based on the decay of uncertainty gradients and balance tolerances. / Master of Science
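As a toy illustration of the adaptive idea described above (not the thesis's actual estimator), one could stop accumulating oscillation cycles once the standard error of the running mean falls below a balance tolerance. All numbers here are hypothetical:

```python
import random
import statistics

def cycles_needed(cycle_means, tolerance):
    """Accumulate per-cycle mean loads until the standard error of the
    running mean falls below the balance tolerance; return cycles used."""
    for n in range(2, len(cycle_means) + 1):
        se = statistics.stdev(cycle_means[:n]) / n ** 0.5  # std error of the mean
        if se < tolerance:
            return n
    return len(cycle_means)

random.seed(1)
calm  = [random.gauss(0.5, 0.01) for _ in range(100)]  # benign flow: low scatter
stall = [random.gauss(0.5, 0.10) for _ in range(100)]  # separated flow: high scatter
print(cycles_needed(calm, 0.005), cycles_needed(stall, 0.005))
```

The well-behaved condition converges in a handful of cycles while the unsteady condition demands many more, which is exactly the mismatch a fixed-iteration acquisition scheme cannot accommodate.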
17

Understanding the Impacts of Data Integrity Attacks in the Context of Transactive Control Systems

Biswas, Shuchismita January 2018
The rapid growth of internet-connected smart devices capable of exchanging energy price information and adaptively controlling the consumption of connected loads has paved the way for transactive control to make inroads into the modern grid. Transactive control frameworks integrate the wholesale and retail energy markets and enable active participation by end users, thereby playing a key role in managing the rising number of distributed assets. However, the use of the internet for communicating data among the building, distribution, and transmission levels makes the system susceptible to external intrusions: a skilled adversary can manipulate the exchanged data with the intention of damaging the system or increasing financial gains. This thesis investigates the effect of such data integrity attacks on the information exchanged between the distribution systems operator and end users. The impact on grid operations is evaluated across operational, financial, user-comfort and reliability categories. It is shown that attack impact depends on a number of factors, such as attack duration, time of attack and penetration rate, besides the attack magnitude. The effect of an attack persists for some time after its removal, so effective detection and mitigation strategies will be required to ensure system resilience and robustness. / Master of Science / Transactive energy is a framework in which price-responsive loads adjust their energy consumption at a given time according to the real-time energy price sent by the utility. Field demonstrations in recent years have shown that transactive control can effectively manage grid objectives and monetarily benefit both the electric utility and end users, so transactive energy is expected to make inroads into conventional grid operations in the next few years. As successful operation of such a market depends on the information exchanged among different stakeholders, a malicious adversary may try to inject false data and affect system operations. This thesis investigates how manipulating data in the transactive energy platform affects system operations and the financial gains of different stakeholders. Understanding system behavior under attack conditions will help in formulating effective detection and mitigation strategies and enhancing system resilience.
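A minimal toy model (all prices, elasticities and intervals are hypothetical, not from the thesis) shows how a falsified price signal distorts a price-responsive load:

```python
def demand(price_per_kwh, base_kw=10.0, elasticity=-0.8):
    """Toy price-responsive load: consumption falls as the broadcast price rises."""
    return base_kw * (price_per_kwh / 0.10) ** elasticity

honest_prices = [0.08, 0.10, 0.15, 0.25, 0.15, 0.10]  # $/kWh over six intervals
attacked      = [p * 0.5 for p in honest_prices]       # adversary halves the signal

for t, (p, p_atk) in enumerate(zip(honest_prices, attacked)):
    print(f"interval {t}: honest {demand(p):5.2f} kW, attacked {demand(p_atk):5.2f} kW")
# Under-reported prices inflate consumption precisely when the real system is
# most stressed, hurting operational, financial and reliability metrics at once.
```

Even this crude model makes the thesis's point visible: the damage scales not only with attack magnitude but with when the false prices arrive and how many loads respond to them.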
18

An investigation into data management as a strategic information tool and its importance at the Durban University of Technology

Francis, Ramani 26 November 2012
Submitted in fulfilment of the requirements for the Degree of Master Technologiae: Public Management, Durban University of Technology, 2012. / The purpose of this study was to investigate data management as a strategic information tool and its importance at the Durban University of Technology. The problem revolved around, inter alia, data management and data accuracy as structured interventions impacting on sound decision making, and many challenges were identified with regard to data management at the university. The research design adopted a quantitative methodological approach, collecting data through a pre-coded, self-administered questionnaire. The empirical component involved a survey method; since it was an in-house investigation, the target population equated to only 174 respondents, and a significant response rate of 74% was obtained using the personal method of data collection. Several hypotheses were formulated relating to data quality initiatives, data owners and their responsibilities, and the frequency of data analysis to determine accuracy. These were tested using the Pearson chi-square test, and the data was analysed for frequencies and percentages of responses using the Statistical Package for the Social Sciences (SPSS). Some significant findings emerged from the empirical analysis; most notably, 95.31% of the respondents strongly agreed that data management and integrity are of utmost importance at the Durban University of Technology. One recommendation suggests that, as it is imperative for the Durban University of Technology to manage its data as an asset, a policy on data integrity and integration should be developed and implemented. Another recommendation highlighted that staff should strive to attain proper classification on the database, considering that this directly impacts the accuracy of the HEMIS submissions to the Ministry of Education for the state-allocated subsidy. The study concludes with directions for further research.
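The abstract names the Pearson chi-square test run in SPSS; a minimal equivalent in Python, with an invented contingency table purely for illustration, might look like this:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x3 contingency table: rows = staff category, columns = counts of
# agree / neutral / disagree with "data owners are responsible for accuracy".
observed = [
    [45, 10, 5],   # academic staff
    [60,  8, 2],   # administrative staff
]
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
# A small p-value suggests the response pattern depends on staff category,
# i.e. the two variables are not independent.
```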
19

A practical framework for harmonising welfare and quality of data output in the laboratory-housed dog

Hall, Laura E. January 2014
In the UK, laboratory-housed dogs are primarily used as a non-rodent species in the safety testing of new medicines and other chemical entities. The use of animals in research is governed by the Animals (Scientific Procedures) Act (1986, amended 2012), and the legislation is underpinned by the principles of humane experimental technique: Replacement, Reduction and Refinement. A link between animal welfare and the quality of data produced has been shown in other species (e.g. rodents, non-human primates); however, no established, integrated methodology for identifying or monitoring welfare and quality of data output previously existed for the laboratory-housed dog. In order to investigate the effects of planned Refinements to various aspects of husbandry and regulated procedures, this project sought to integrate behavioural, physiological and other measures (e.g. cognitive bias, mechanical pressure threshold) and to provide a means for staff to monitor welfare while also establishing the relationship between welfare and quality of data output. Affective state was identified using an established method of cognitive bias testing before welfare was measured at baseline using behavioural and physiological measures. Dogs then underwent 'positive' and 'negative' behavioural challenges to identify the measures most sensitive to changing welfare and most suitable for use in a framework. The resulting Welfare Assessment Framework, developed in three groups of dogs from contrasting backgrounds within the facility, found a consistent pattern of behaviour, cardiovascular function, affect and mechanical pressure threshold (MPT). Dogs with a negative affective state had higher blood pressure at baseline than those with positive affective states, and the magnitude of the effect of negative welfare suggests that welfare may act as a confound in the interpretation of cardiovascular data. The responses to restraint included increases in blood pressure and heart rate measures which approached ceiling levels, potentially reducing the sensitivity of measurement; if maintained over time, this response could have a negative health impact on other organ systems and affect the data obtained from them. Dogs with a negative welfare state also had a lower mechanical pressure threshold, meaning they potentially experienced greater stimulation from unpleasant physical stimuli. Taken together with the behaviours associated with a negative welfare state (predominantly vigilant or stereotypic behaviours), the data suggest that dogs with a negative welfare state have a greater behavioural and physiological response to stimuli in their environment; as such, data obtained from their use differ from data obtained from dogs with a positive welfare state. This was confirmed by examining the effect size (Cohen's d) resulting from the analysis of affective state on cardiovascular data. An increase in variance, particularly with the small dog numbers typical of safety assessment studies, reduces the power of a study to detect the effect under observation; a decrease in variation has the potential to reduce the number of dogs used, in line with the principle of Reduction and good scientific practice. The development of the framework also identified areas of the laboratory environment suitable for Refinement (e.g. restriction to single-housing and restraint) and other easily implemented Refinements (e.g. feeding toys and human interaction) which could be used to improve welfare.
As a result, a Welfare Monitoring Tool (WMT) in the form of a tick sheet was developed for technical and scientific staff to identify dogs at risk of reduced welfare and of producing poor-quality data, as well as to monitor the effects of Refinements to protocols. Oral gavage, a common regulated procedure known to be potentially aversive, was identified as an area in need of Refinement. A programme of desensitisation and positive reinforcement training was implemented in a study that also compared a sham dose condition with a control, no-training condition. A number of the measures used, including home-pen behaviour, behaviour during dosing, MPT and the WMT, showed significant benefits for the dogs in the Refined condition. Conversely, dogs in the sham dose condition showed more signs of distress and took longer to dose than dogs in the control condition; the welfare of control dogs was intermediate between the sham dose and Refined protocol dogs. This project identified a positive relationship between positive welfare and higher quality of data output. It developed and validated the Welfare Assessment Framework, a practical and feasible means of measuring welfare in the laboratory environment, identified areas in need of Refinement, and developed practical ways to implement such Refinements to husbandry and regulated procedures. As such, it should have wide implications for the pharmaceutical industry and other users of dogs in scientific research.
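The abstract reports effect sizes as Cohen's d; a small sketch with hypothetical blood-pressure readings (invented for illustration, not the thesis's data) shows how the statistic is computed and why extra variance shrinks it:

```python
import statistics

def cohens_d(a, b):
    """Standardised mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / pooled_sd

# Hypothetical systolic blood pressure (mmHg) at the small group sizes
# typical of safety assessment studies:
negative_affect = [148, 152, 155, 150, 157, 149]
positive_affect = [138, 141, 137, 143, 140, 139]
print(round(cohens_d(negative_affect, positive_affect), 2))
# For the same mean shift, larger within-group variance yields a smaller d,
# so more dogs are needed to detect a true treatment effect; reducing
# welfare-driven variance therefore supports Reduction directly.
```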
20

The major security challenges to cloud computing.

Inam ul Haq, Muhammad January 2013
Cloud computing is the computing model in which computing resources such as software, hardware and data are delivered as a service through a web browser or lightweight desktop machine over the internet (Wink, 2012). This model abolishes the necessity of sustaining computer resources locally and hence cuts off the cost of valuable resources (Moreno, Montero & Llorente, 2012). A typical cloud is affected by different security issues such as Temporary Denial of Service (TDOS) attacks, user identity theft, session hijacking and flashing attacks (Danish, 2011). The purpose of this study is to bridge the research gap between cloud security measures and the existing security threats. An investigation into the existing cloud service models, security standards, currently adopted security measures and their degree of flawless protection has been carried out. The theoretical study helped reveal the security issues and their solutions, whereas the empirical study facilitated acknowledging the concerns of users and security analysts regarding those solution strategies. The empirical methods used in this research were interviews and questionnaires, used to validate the theoretical findings and to grasp the innovativeness of practitioners dealing with cloud security. With the help of the theoretical and empirical research, a two-factor mechanism is proposed that can rule out the possibility of flashing attacks from a remote location and can help make cloud components safer. The problem of junk traffic can be solved by configuring routers to block junk data packets and extraneous queries at the cloud's outer border; this measure is highly beneficial because it offers a security mechanism at the outer boundary of the cloud. It was evaluated that a DoS attack can become a serious problem if it affects the routers, and effective isolation of router-to-router traffic will certainly diminish that threat. It is revealed that data packets that require a session state on the cloud server should be treated separately and with extra security measures, because conventional security measures cannot perform an in-depth analysis of every data packet. This problem can be solved by setting an extra bit in the IP header of packets that require a state and have a session. Although this change would have to be made at a universal level and would take time, it can provide a protocol-independent way to identify packets which require extra care, and it will also assist firewalls in dropping packets that request a session state without the state-bit being set. Cloud security analysts should consider that the interface and authentication layers should not be merged into a single layer, because doing so endangers the authentication system: the interface is already exposed to the world. The use of login-aiding devices along with secret keys can help protect cloud users. Moreover, a new cloud service model, "Dedicated cloud", is proposed in this research work to reinforce cloud security. It was discovered that the optimal blend of HTTPS and SSL protocols can resolve the problem of session hijacking: the client interface area should be protected by HTTPS, and secure cookies should be sent through an SSL link along with regular cookies. Disallowing multiple sessions and using trusted IP address lists help even further.
Reasonable care has been taken to ensure clarity, validity and trustworthiness in the research work, so as to present verifiable scientific knowledge in a reader-friendly manner. These security guidelines will enhance cloud security and make a cloud more responsive to security threats. / Program: Masterutbildning i Informatik
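Among the thesis's suggestions, the session-hijack mitigation pairs HTTPS with secure cookies. A minimal standard-library sketch of the relevant cookie flags follows; the token value is a placeholder, and this illustrates the general hardening pattern rather than the thesis's exact configuration:

```python
from http import cookies

# Hardened session cookie along the lines suggested above: sent only over
# TLS (Secure), hidden from client-side scripts (HttpOnly), and not sent
# on cross-site requests (SameSite=Strict).
c = cookies.SimpleCookie()
c["session"] = "opaque-random-token"   # placeholder; use a CSPRNG-generated value
c["session"]["secure"] = True          # only transmitted over HTTPS/SSL links
c["session"]["httponly"] = True        # inaccessible to JavaScript, limiting theft
c["session"]["samesite"] = "Strict"    # blocks cross-site replay of the cookie
print(c.output())                      # the Set-Cookie header a server would emit
```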
