11

Measurement and management of the impact of mobility on low-latency anonymity networks

Doswell, Stephen January 2016 (has links)
Privacy, including the right to privacy of correspondence, is a human right. Privacy-enhancing technologies, such as the Tor anonymity network, help maintain this right. The increasing use of Tor from mobile devices raises new challenges for the continued effectiveness of this low-latency anonymity network. Mobile Tor users may access the Internet from a range of wireless networks and service providers. Whenever a wireless network hands off a mobile device’s connection from one access point to another, its external Internet Protocol (IP) address changes and the connection to the Tor network is dropped. Every dropped connection requires the Tor circuit to be rebuilt, and the time required to rebuild the circuit negatively impacts client performance. This research is the first to highlight this negative impact and to investigate its likely extent for typical usage scenarios and mobility models. The increased network churn caused by circuit rebuilding also negatively impacts anonymity. A novel metric (q-factor) is proposed here to measure the trade-off between anonymity and performance over the duration of a communication session. Two new solutions to the problem of managing mobility in a low-latency anonymity network are proposed in this thesis. The first relies on adaptive client throttling, based on a Kaplan-Meier estimator of the likelihood of a mobile network hand-off. The second relies on the use of a static bridge relay (mBridge) that acts as a persistent ‘home’ for a mobile Tor connection, avoiding the need to recreate the Tor circuit whenever the mobile device is handed off. The effectiveness of these solutions has been measured using the new q-factor metric. Both solutions provide better performance for mobile Tor clients than the standard Tor client implementation, although some performance reduction remains by comparison with static Tor clients. The bridge relay solution (mBridge) has been shown to offer better performance than client throttling, but is more vulnerable to certain types of attack. A strength of both solutions is that changes are restricted to client devices; the existing algorithms and protocols of the interior Tor network are unaffected.
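The adaptive throttling solution above builds on a Kaplan-Meier estimate of how long a mobile client is likely to keep its current network attachment before a hand-off drops the Tor circuit. As a purely illustrative aid, here is a minimal sketch of the product-limit estimator applied to hypothetical access-point dwell times; the numbers, the censoring flags and the use made of the estimate are assumptions, not data or code from the thesis.

```python
# Minimal Kaplan-Meier (product-limit) survival estimate of the time a mobile
# client dwells on one access point before a hand-off drops the Tor circuit.
# The dwell times and censoring flags below are illustrative, not thesis data.

def kaplan_meier(durations, observed):
    """Return (times, survival) where survival[i] estimates S(times[i]).

    durations : dwell time in seconds until hand-off (or end of observation)
    observed  : True if a hand-off actually occurred, False if censored
    """
    pairs = sorted(zip(durations, observed))
    n_at_risk = len(pairs)
    survival = 1.0
    times, estimates = [], []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = censored = 0
        while i < len(pairs) and pairs[i][0] == t:   # group tied dwell times
            if pairs[i][1]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            survival *= 1.0 - deaths / n_at_risk     # product-limit step
            times.append(t)
            estimates.append(survival)
        n_at_risk -= deaths + censored
    return times, estimates


dwell = [42, 58, 58, 75, 90, 120, 130, 130, 160, 200]   # seconds on one AP (hypothetical)
handoff = [True, True, False, True, True, True, False, True, True, False]
for t, s in zip(*kaplan_meier(dwell, handoff)):
    print(f"P(attachment survives > {t:3d} s) ~ {s:.2f}")
```

A throttling client could, for instance, slow its send rate as the estimated survival probability of the current attachment falls; the thesis's actual throttling policy may differ from this assumed usage.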
12

The development and evaluation of a virtual simulation tool for testing emergency response planning strategies within the UK gas industry

Rogage, Kay January 2014 (has links)
Third party damage, from activities such as work carried out by contractors, poses risks to gas pipelines. Within the UK, emergency plans are drawn up in an attempt to mitigate the significant consequences of any pipeline failure. The Control of Major Accident Hazards Regulations 1999 and the Pipeline Safety Regulations 1996 place legislative requirements on UK gas infrastructure providers to regularly test emergency plans with simulation exercises. The exercises are intended to support the preparation of responders for dealing with incidents of failure. Software simulation is not currently utilised to facilitate the testing of emergency response plans in the UK gas pipeline industry. This project serves to evaluate the user acceptance of a software simulation prototype to enable the testing of emergency response planning strategies in the UK gas industry. Current emergency planning legislation, and the strategies applied to satisfy that legislation within the UK gas industry, are reviewed. The adoption and application of software simulation for the development of applied skill in other industries is examined, to determine its potential for use in testing emergency response planning for gas incidents. The Technology Acceptance Model (TAM) is the theoretical framework that underpins the study of user acceptance of a software simulation prototype designed for running exercises to test emergency response plans. A case study evaluation of the user acceptance of the prototype, by representatives experienced in testing emergency response planning strategies in the gas industry, is presented. The participants in this case study are drawn from the Police, the Fire and Rescue Service, a Local Authority and a gas infrastructure provider, and perform a range of job roles at Operational, Tactical and Strategic levels. The research findings demonstrate that the participants perceive software simulation of emergency response planning processes for gas incidents to be beneficial. The TAM claims that if users perceive a system to be useful they are likely to adopt that system; furthermore, according to the TAM, even if users do not perceive a system to be easy to use, they will still adopt it once the correct training has been provided. Users would therefore be most likely to adopt and use the software to facilitate emergency response planning exercises if the correct training is provided. Software simulation offers great potential for the testing of emergency plans: it provides a controlled environment where decisions and responses can be audited and mistakes can be made without serious consequence. Software simulation has been shown to enhance, rather than replace, existing emergency response planning processes.
13

RFID enabled constraint based scheduling for transport logistics

Choosri, Noppon January 2012 (has links)
This research aims to develop a realistic solution to enhance the efficiency of a transport logistics operation. The case study in this research is one of the largest agricultural suppliers in Northern Thailand. The cost of logistics in Thailand is relatively high compared to other countries, i.e. 11% of Gross Domestic Product (GDP) in 2007, and is particularly high in the agricultural sector. The focus of the study is to enhance and improve transportation activities, which typically account for the largest cost in logistics. The research is entitled ‘RFID enabled constraint based scheduling for transport logistics’. The dissertation studies two important research components: 1) data acquisition using Radio Frequency Identification (RFID) technology for monitoring vehicles in a depot, and 2) scheduling by solving a Constraint Satisfaction Optimisation Problem (CSOP) using Constraint Programming (CP). The scheduling problem of the research is to compose and schedule a fleet in which both private and subcontracted (outsourced) vehicles are available, while minimising the use of subcontractors. Several contributions can be identified at each stage of the study, ranging from an extensive review of the literature and field studies, through developing the RFID prototype system for vehicle tracking, modelling and solving the defined scheduling problems using Constraint Programming, and developing an RFID-CP based real-time scheduling system, to validating the proposed methods. A number of validations were carried out throughout the research. For instance, laboratory-based experiments were conducted to measure the performance of the developed RFID tracking system in different configurations, scenario tests were used to test the correctness of the proposed CP-based scheduling system, and structured interviews were used to collect feedback on the developed prototype from the case study company.
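The scheduling component described above formulates fleet composition as a constraint satisfaction optimisation problem. The toy model below, written against Google OR-Tools CP-SAT (a solver chosen here purely for illustration; the thesis does not necessarily use it), shows the general shape of such a model: jobs are assigned to vehicles subject to capacity, and the objective minimises the number of subcontracted vehicles used. All job demands, capacities and vehicle names are hypothetical.

```python
# Toy constraint model: assign delivery jobs to vehicles, respect capacity,
# and minimise how many subcontracted vehicles are used. Solver choice
# (OR-Tools CP-SAT), demands and capacities are illustrative assumptions.
from ortools.sat.python import cp_model

jobs = {"J1": 6, "J2": 4, "J3": 7, "J4": 3, "J5": 5}          # demand in tonnes (hypothetical)
vehicles = {"own1": 10, "own2": 10, "sub1": 10, "sub2": 10}    # capacity (hypothetical)
subcontracted = {"sub1", "sub2"}

model = cp_model.CpModel()
assign = {(j, v): model.NewBoolVar(f"{j}_on_{v}") for j in jobs for v in vehicles}
used = {v: model.NewBoolVar(f"use_{v}") for v in vehicles}

for j in jobs:
    # every job is carried by exactly one vehicle
    model.Add(sum(assign[j, v] for v in vehicles) == 1)

for v, cap in vehicles.items():
    # respect vehicle capacity, and mark a vehicle as used if it carries any job
    model.Add(sum(jobs[j] * assign[j, v] for j in jobs) <= cap)
    for j in jobs:
        model.AddImplication(assign[j, v], used[v])

# objective mirrors the stated goal: use as few subcontracted vehicles as possible
model.Minimize(sum(used[v] for v in subcontracted))

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for v in vehicles:
        print(v, [j for j in jobs if solver.Value(assign[j, v])])
```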
14

Subcarrier intensity modulated free-space optical communication systems

Popoola, Wasiu Oyewole January 2009 (has links)
This thesis investigates and analyses the performance of a terrestrial free-space optical communication (FSO) system based on phase shift keying pre-modulated subcarrier intensity modulation (SIM). The results are theoretically and experimentally compared with the classical On-Off keying (OOK) modulated FSO system in the presence of atmospheric turbulence. The performance analysis is based on the bit error rate (BER) and outage probability metrics. An optical signal traversing the atmospheric channel suffers attenuation due to scattering and absorption of the signal by aerosols, fog, atmospheric gases and precipitation. In the event of thick fog, the atmospheric attenuation coefficient exceeds 100 dB/km, which potentially limits the achievable FSO link length to less than 1 kilometre. But even in clear atmospheric conditions, when signal absorption and scattering are less severe with a combined attenuation coefficient of less than 1 dB/km, atmospheric turbulence significantly impairs the achievable error rate, the outage probability and the available link margin of a terrestrial FSO communication system. The effect of atmospheric turbulence on the symbol detection of an OOK based terrestrial FSO system is presented analytically and verified experimentally. It was found that atmospheric turbulence induced channel fading requires the OOK threshold detector to have knowledge of the channel fading strength and noise levels if the detection error is to be kept to a minimum. This poses a serious design difficulty that can be circumvented by employing phase shift keying (PSK) pre-modulated SIM. The results of the analysis and experiments showed that for a binary PSK-SIM based FSO system, the symbol detection threshold level does not require knowledge of the channel fading strength or noise level. As such, the threshold level is fixed at the zero mark in the presence or absence of atmospheric turbulence. Also, for the full and seamless integration of FSO into the access network, a study of SIM-FSO performance becomes compelling because existing networks already contain subcarrier-like signals such as radio over fibre and cable television signals. The use of multiple subcarrier signals as a means of increasing the throughput/capacity is also investigated, and the effect of optical source nonlinearity is found to result in intermodulation distortion, which can impose a BER floor of up to 10⁻⁴ on the system error performance. In addition, spatial diversity and subcarrier delay diversity techniques are studied as means of ameliorating the effect of atmospheric turbulence on the error and outage performance of SIM-FSO systems. The three spatial diversity linear combining techniques analysed are maximum ratio combining, equal gain combining and selection combining. The system performance based on each of these combining techniques is presented and compared under different strengths of atmospheric turbulence. The results predicted that achieving a 4 km SIM-FSO link with no diversity technique would require about 12 dB more power than a 4 × 4 transmitter/receiver array system at the same data rate in a weakly turbulent atmospheric channel. On the other hand, retransmitting a delayed copy of the data once on a different subcarrier frequency was found to result in a gain of up to 4.5 dB in a weak atmospheric turbulence channel.
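For readers who want a feel for the kind of turbulence-averaged BER analysis the abstract refers to, the sketch below numerically averages a conditional BPSK error probability over log-normal irradiance fading with unit mean. The conditional-BER form Q(h·sqrt(2·SNR)), the log-normal channel model and all parameter values are assumptions made for illustration; they are not taken from the thesis's exact analysis.

```python
# Monte Carlo estimate of average BER for a BPSK subcarrier-intensity-modulated
# FSO link over a log-normal turbulence channel. The conditional BER model
# Q(h * sqrt(2*snr)) and the scintillation parameters are illustrative assumptions.
import numpy as np
from math import erfc, sqrt

def q_function(x):
    """Gaussian Q-function expressed via the complementary error function."""
    return 0.5 * erfc(x / sqrt(2.0))

rng = np.random.default_rng(0)
sigma_l = 0.3                      # log-irradiance standard deviation (weak turbulence)
snr_db = np.arange(0, 21, 5)       # average electrical SNR per bit, in dB
n_samples = 100_000

# Log-normal irradiance h with E[h] = 1:  ln h ~ N(-sigma_l^2 / 2, sigma_l^2)
h = rng.lognormal(mean=-sigma_l**2 / 2, sigma=sigma_l, size=n_samples)

for db in snr_db:
    snr = 10 ** (db / 10)
    ber = np.mean([q_function(hi * sqrt(2 * snr)) for hi in h])
    print(f"SNR {db:2d} dB -> average BER ~ {ber:.2e}")
```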
15

Design and implementation of secure chaotic communication systems

Kharel, Rupak January 2011 (has links)
Chaotic systems have properties such as ergodicity, sensitivity to initial conditions and parameter mismatches, mixing, deterministic dynamics and structural complexity, which map naturally onto cryptographic requirements such as confusion, diffusion, deterministic pseudorandomness and algorithmic complexity. Furthermore, the possibility of chaotic synchronization, where the master system (transmitter) drives the slave system (receiver) through its output signal, suggested that chaotic systems could be used to implement security in communication systems. Many methods, such as chaotic masking, chaotic modulation, inclusion and chaotic shift keying (CSK), have been proposed; however, many attack methods have since shown them to be insecure. Different modifications of these methods also exist in the literature to improve security, but almost all suffer from the same drawback. Therefore, the implementation of chaotic systems in security still remains a challenge. In this work, different possibilities for improving the security of the existing methods are explored. The main problem with the existing methods is that an imprint of the message can be found in the dynamics of the transmitted signal, so that signal processing or pattern classification techniques can expose the hidden message. The challenge, therefore, is to remove any pattern or change in dynamics that the message might introduce into the transmitted signal.
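As background to the master-slave synchronization and chaotic masking mentioned above, the following minimal numerical sketch masks a small message with the x state of a Lorenz transmitter and recovers it with a synchronising receiver driven by the transmitted signal, in the style of Cuomo and Oppenheim. The parameters, integration scheme and toy message are illustrative assumptions; as the thesis argues, such simple masking schemes are known to be insecure.

```python
# Toy Lorenz-based chaotic masking: the transmitter adds a small message to its
# chaotic x state; the receiver is a synchronising copy driven by the received
# signal and recovers the message as a difference. All values are illustrative.
import numpy as np

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
dt, steps = 1e-3, 60_000

x, y, z = 1.0, 1.0, 1.0          # transmitter (master) state
xr, yr, zr = 5.0, 5.0, 5.0       # receiver (slave) state, different initial condition

t = np.arange(steps) * dt
message = 0.05 * np.sin(2 * np.pi * 2.0 * t)   # small, slow toy message signal
recovered = np.empty(steps)

for k in range(steps):
    s = x + message[k]                         # transmitted (masked) signal
    # master Lorenz system
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    # slave Lorenz system, driven by the received signal s instead of its own x
    dxr = sigma * (yr - xr)
    dyr = s * (rho - zr) - yr
    dzr = s * yr - beta * zr
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    xr, yr, zr = xr + dt * dxr, yr + dt * dyr, zr + dt * dzr
    recovered[k] = s - xr                      # receiver's estimate of the message

# after the synchronisation transient, the recovered signal approximates the message
err = np.abs(recovered[steps // 2:] - message[steps // 2:]).mean()
print(f"mean |recovered - message| over the second half: {err:.3f}")
```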
16

Quantitative assessment of factors in sentiment analysis

Chalorthorn, Tawunrat January 2016 (has links)
Sentiment can be defined as a tendency to experience certain emotions in relation to a particular object or person. Sentiment may be expressed in writing, in which case determining that sentiment algorithmically is known as sentiment analysis. Sentiment analysis is often applied to Internet texts such as product reviews, websites, blogs, or tweets, where automatically determining published feeling towards a product or service is very useful to marketers or opinion analysts. The main goal of sentiment analysis is to identify the polarity of natural language text. This thesis sets out to examine quantitatively the factors that have an effect on sentiment analysis. The factors that are commonly used in sentiment analysis are text features, sentiment lexica or resources, and the machine learning algorithms employed. The main aim of this thesis is to investigate systematically the interaction between sentiment analysis factors and machine learning algorithms in order to improve sentiment analysis performance as compared to the opinions of human assessors. A software system known as TJP was designed and developed to support this investigation. The research reported here has three main parts. Firstly, the role of data pre-processing was investigated with TJP using a combination of features together with publicly available datasets. This considers the relationship and relative importance of superficial text features such as emoticons, n-grams, negations, hashtags, repeated letters, special characters, slang, and stopwords. The resulting statistical analysis suggests that a combination of all of these features achieves better accuracy with the dataset and has a considerable effect on system performance. Secondly, the effect of human marked-up training data was considered, since this is required by supervised machine learning algorithms. The results gained from TJP suggest that training data greatly improves sentiment analysis performance. However, the combination of training data and sentiment lexica seems to provide optimal performance. Nevertheless, one particular sentiment lexicon, AFINN, performed better than the others in the absence of training data, and would therefore be appropriate for unsupervised approaches to sentiment analysis. Finally, the performance of two sophisticated ensemble machine learning algorithms was investigated. Both the Arbiter Tree and Combiner Tree were chosen since neither of them had previously been used for sentiment analysis. The objective here was to demonstrate their applicability and effectiveness compared to that of the leading single machine learning algorithms, Naïve Bayes and Support Vector Machines. The results showed that whilst either can be applied to sentiment analysis, the Arbiter Tree ensemble algorithm achieved better accuracy than either the Combiner Tree or any single machine learning algorithm.
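One of the single-classifier baselines mentioned above, Naïve Bayes over n-gram features, can be sketched in a few lines with scikit-learn. The tiny inline dataset and the pipeline choices below are illustrative assumptions; they are not the thesis's TJP system or its datasets.

```python
# Minimal Naive Bayes sentiment baseline of the kind compared in the thesis.
# The handful of example texts and labels are made up for illustration; the
# real experiments used publicly available datasets and the TJP system.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = [
    "love this phone, battery is great :)",
    "absolutely fantastic service, very happy",
    "worst purchase ever, totally disappointed",
    "screen broke after two days, terrible",
]
train_labels = ["positive", "positive", "negative", "negative"]

# unigrams + bigrams stand in for the n-gram features discussed above
classifier = make_pipeline(
    CountVectorizer(ngram_range=(1, 2), lowercase=True),
    MultinomialNB(),
)
classifier.fit(train_texts, train_labels)

print(classifier.predict(["really happy with the battery",
                          "terrible, very disappointed"]))
```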
17

Measuring the homogeneity and similarity of language corpora

Cavaglia, Gabriela Maria Chiara January 2005 (has links)
Corpus-based methods are now dominant in Natural Language Processing (NLP). Creating large corpora is no longer difficult, and the technology to analyze them is becoming faster, more robust and more accurate. However, when an NLP application performs well on one corpus, it is unclear whether this level of performance would be maintained on others. To make progress on such questions, we need methods for comparing corpora. This thesis investigates comparison methods based on the notions of corpus homogeneity and similarity.
18

Revitalising executive information systems for supporting executive intelligence activities

Ong, Koon Y. (Vincent) January 2006 (has links)
With the increasing amount, complexity and dynamism of operational and strategic information in electronic and distributed environments, executives are seeking assistance with continuous, self-reactive and self-adaptive activities or approaches for acquiring, synthesising and interpreting information for intelligence, with a view to determining a course of action - executive intelligence activities. Executive Information Systems (EIS) originally emerged as computer-based tools to help senior executives manage the search for and processing of information. EIS were popularised in the 1990s, but they have not advanced to a great extent in either research or practice since their prevalence in the mid and late 1990s. Conventional EIS studies have established some views and guidelines for EIS design and development, but the guidelines underpinned by preceding research have failed to produce robust yet rational EIS for handling the current executive information environment. The most common deficiency of traditional EIS is their static and inflexible functionality, with predetermined information needs and processes designed for static performance monitoring and control. The current emergence of the intelligent software agent, as a concept and a technology with applications, provides prospects and advanced solutions for supporting executives' information processing activities in the more integrated and distributed environment of the Internet. Although software agents offer the prospect of supporting information processing activities intelligently, executives' desires and perceptions of agent-based support must be elucidated in order to develop a system that executives consider valuable. This research attempts to identify executive criteria for an agent-based EIS to support executive intelligence activities. Firstly, four focus groups were conducted to explore and reveal the current state of executives' information environment and information processing behaviour in the light of the Internet era, and from this to examine the validity of conventional views of EIS purpose, functions and design guidelines. Initial executive criteria for agent-based EIS design were also identified in the focus group study. Secondly, 25 senior managers were interviewed for deeper insights into the value-added attributes and processes that constitute executive criteria for building an agent-based EIS. The findings suggest a "usability-adaptability-intelligence" trichotomy as an agent-based EIS design model, comprising executive criteria of value-added attributes and processes for building a usable, adaptable and intelligent EIS.
19

An autoethnography exploring the engagement of records management through a computer mediated communication focused co-operative inquiry

Lomas, Elizabeth January 2013 (has links)
This thesis is an autoethnography exploring the engagement of records management (RM) through the vehicle of a computer mediated communication (CMC) focused co-operative inquiry. CMC is defined as, “communication that takes place between human beings via the instrumentality of computers” (Herring, 1996, p.81). The PhD stance was that with the advent of new technologies, such as CMC, the role and place of RM has been challenged. RM practitioners needed to evaluate their principles and practice in order to discover why RM is not uniformly understood and also why it fails to engage many CMC users and information professionals. The majority of today’s information is generated as the result of unstructured communications (AIIM, 2005 and 2006) that no longer have a fixed reality but exist across fragmented globalised spaces through the Cloud, Web 2.0 and software virtualisation. Organisational boundaries are permanently perforated and the division between public and private spaces is blurred. Traditional RM has evolved in highly structured organisational information environments. Nevertheless, RM could lie at the heart of the processes required for dealing with this splintered data. RM takes a holistic approach to information management, establishing the legislative requirements, technical requirements and the training and support for individuals to communicate effectively, while simultaneously transmitting and processing the communications for maximum current and ongoing organisational benefits. However, RM is not uniformly understood or practiced. The focus of the thesis was to understand how RM engagement can and should be achieved. The research was conducted by establishing a co-operative inquiry consisting of 82 international co-researchers, from a range of disciplines, investigating the question, ‘How do organisations maximise the information potential of CMC for organisational benefit, taking into account the impact of the individual?’ The PhD established a novel approach to co-operative inquiry by separating, managing and merging three groups of co-researchers (UK Records Managers, UK CMC users, and international Records Managers and CMC users). I was embedded as a co-researcher within this wider inquiry, personally exploring as an autoethnography the relevance of RM to the wider research question, the ability of RM practitioners to advocate for RM, and the co-researchers’ responses to the place of RM within this context. The thesis makes several contributions to the research field. It examines how records managers and RM principles and practice engaged through the inquiry, articulating the reasons why users sometimes failed to engage with RM principles and practice, and what assists users to engage successfully with RM. It was found that national perspectives and drivers were more significant than age, gender or professional experience as to whether or not individuals engaged with RM concepts. In addition, users engaged with RM when it was naturally embedded within processes. Furthermore, as a result of the inquiry’s discussions and actions, the thesis suggests that RM principles and practice need to be refined, for example in regard to the characteristics that define a record. In this respect it concludes that there is rarely likely to be an original archival record surviving through time, given the need for migration. The research delivered a novel approach to co-operative inquiry whereby merging groups through time produced new learning at each merger point.
The thesis recommends further research to build upon its findings.
20

How do sociomaterial networks involving large-scale automation come into being, persist and change over time, within a healthcare environment?

Shaw, Christopher January 2014 (has links)
The aim of this thesis is to develop a theoretical model to explore how sociomaterial networks, involving large-scale automation, come into being, persist and change over time, within a healthcare environment. It does so by bridging the gap between design, implementation and use of large-scale pathology automation (LSPA) within two United Kingdom (UK) National Health Service (NHS) laboratories. A longitudinal, multi-site, ethnographic approach was used, along with semi-structured interviews, template analysis and participant observation of LSPA ‘in-practice’. This research has suggested that design features, embedded within the material properties of LSPA, were purposefully intended to bring about organisational change. In both user organisations, the material affordances of LSPA resulted in anticipated skill mix changes. However, material constraints required the enforcement of changes to organisational routines, creating operational difficulties, which were then subsequently transferred across organisational boundaries by the researcher/manager. The identification of these sociomaterial affordances and constraints, in conjunction with humans acting as boundary objects, had the unintended consequence of influencing strategic decision making and initiating structural and cultural change. The development and practical application of the resulting SociomANTerial model allowed the researcher to trace the analytical history of these organisational changes over time and consider the impact of broader social structures such as power. Ultimately it is suggested that a greater emphasis on collaboration between users, designers and corporate agents will result in more innovative approaches for technology adoption and improved organisational design.
