71. Interval order enumeration. Hannah, Stuart A. January 2015.
This thesis continues the study of interval orders and related structures, containing results on both the labeled and unlabeled variants. Following a result of Eriksen and Sjöstrand (2014), we identify a link between structures following the Fishburn distribution and Mahonian structures. This is used to detail a technique for the construction of Fishburn structures (structures in bijection with unlabeled interval orders) from appropriate Mahonian structures. The technique is first applied to a bivincular pattern of Bousquet-Mélou et al. (2010) and then used to introduce a previously unconsidered class of matchings: zero-alignment matchings, counted according to the number of arcs which are both right-crossed and left-nesting. The technique is then used to identify a statistic on the factorial posets of Claesson and Linusson (2011) following the Fishburn distribution. Factorial posets mapped to zero by this statistic are canonically labeled factorial posets, which may alternatively be viewed as unlabeled interval orders. As a consequence of our approach we find an identity for the Fishburn numbers in terms of the Mahonian numbers, and we discuss linear combinations of Fishburn patterns in a manner similar to that of the Mahonian combinations of Babson and Steingrímsson (2001). To study labeled interval orders we introduce ballot matrices, a signed combinatorial structure whose definition follows naturally from the generating function for labeled interval orders. A sign-reversing involution on ballot matrices is defined. Adapting a bijection of Dukes, Jelínek and Kubitzke (2011), we show that matrices fixed under this involution are in bijection with labeled interval orders and that they decompose into a pair consisting of a permutation and an inversion table. To fully classify such pairs, results pertaining to the enumeration of permutations having a given set of ascent bottoms are given. This allows for a new formula for the number of labeled interval orders.
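For concreteness, the Fishburn numbers mentioned above can be computed from the generating function for unlabeled interval orders established by Bousquet-Mélou et al. (2010). The following sketch is illustrative and not taken from the thesis; it recovers the first ten values with truncated integer polynomial arithmetic.

```python
# Illustrative sketch (not from the thesis): the n-th coefficient of
#   sum_{n>=0} prod_{i=1}^{n} (1 - (1-x)^i)
# is the n-th Fishburn number, counting unlabeled interval orders on n points.

N = 10  # number of coefficients to compute

def poly_mul(a, b, n=N):
    """Multiply coefficient lists a and b, truncating at degree < n."""
    out = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j >= n:
                break
            out[i + j] += ai * bj
    return out

total = [0] * N               # running partial sum of the series
term = [1] + [0] * (N - 1)    # current product; the empty product (n = 0) is 1
power = [1] + [0] * (N - 1)   # (1 - x)^i, starting from i = 0

for i in range(1, N + 1):
    total = [t + s for t, s in zip(total, term)]       # add the n = i-1 term
    power = poly_mul(power, [1, -1] + [0] * (N - 2))   # (1 - x)^i
    factor = [(1 if k == 0 else 0) - c for k, c in enumerate(power)]
    term = poly_mul(term, factor)                      # extend the product to n = i

print(total)  # [1, 1, 2, 5, 15, 53, 217, 869, 3961, 19513]
```

Since the n-th term of the series has lowest degree n, summing the first N terms suffices for the first N coefficients.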
72. NFC mobile coupon protocols: developing, formal security modelling and analysis, and addressing relay attack. Alshehri, Ali A. January 2015.
Near Field Communication (NFC) is a Radio Frequency (RF) technology that allows data to be exchanged between devices in close proximity. An NFC-based mobile coupon (M-coupon) is a cryptographically secured electronic message with some value, stored on the user's mobile, that is retrieved from a source such as a newspaper or a smart poster and redeemed afterwards. The M-coupon requires secure issuing and cashing (redeeming): uncontrolled copies of M-coupons would cause losses for a company and damage its reputation. The main goal of this thesis is to enhance the security of NFC mobile coupon protocols. In order to address the NFC M-coupon threats, there are both specific and general security requirements. For the specific NFC M-coupon requirements, a number of protocols have been proposed in the literature. We perform a formal security analysis of NFC M-coupon protocols, using formal methods (CasperFDR), in an effort to check the security of these protocols and whether they address their requirements. We develop a general framework for capturing the NFC M-coupon requirements and apply it to four existing protocols in the literature and to two new protocols that we have developed. The general security requirement that affects all NFC protocols is the issue of relay attacks. A relay attack happens when an intruder extends the distance between two NFC devices while both devices are under the impression that they are close to each other. We propose three NFC User Key Confirmation (UKC) protocols to address the NFC relay attack. The UKC protocols are a collaboration between the cryptographic protocol, the user and the NFC mobile in an effort to prove proximity. We formally verify all three protocols using CasperFDR.
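As a toy illustration of the "cryptographically secured electronic message" idea (a generic sketch, not one of the protocols analysed in the thesis), an issuer can bind a coupon's value to a serial number with an HMAC, so that forged coupons fail verification and uncontrolled copies fail on double redemption:

```python
# Conceptual sketch only: HMAC-authenticated coupons with double-spend
# detection at the cashier. Names and message format are invented.
import hmac, hashlib, secrets

ISSUER_KEY = secrets.token_bytes(32)   # shared between issuer and cashier
redeemed = set()                       # serial numbers already cashed

def issue(value: str) -> dict:
    serial = secrets.token_hex(8)
    msg = f"{serial}|{value}".encode()
    tag = hmac.new(ISSUER_KEY, msg, hashlib.sha256).hexdigest()
    return {"serial": serial, "value": value, "tag": tag}

def redeem(coupon: dict) -> bool:
    msg = f"{coupon['serial']}|{coupon['value']}".encode()
    expected = hmac.new(ISSUER_KEY, msg, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, coupon["tag"]):
        return False                   # forged or altered coupon
    if coupon["serial"] in redeemed:
        return False                   # uncontrolled copy: double redemption
    redeemed.add(coupon["serial"])
    return True

c = issue("10% off")
assert redeem(c) is True
assert redeem(c) is False              # the copy is rejected
```

Note that this toy scheme says nothing about relay attacks, which is precisely why the UKC protocols add a proximity-proving step involving the user.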
73. Semantic interoperability of large complex health datasets requires an ontological approach: a mixed method study. Liyanage, Harshana. January 2015.
The 'connected world' forces us to treat 'interoperability' as a primary requirement when building present-day health care databases. Whilst semantic interoperability has made a major contribution to data utilisation between systems, it has often been unable to integrate some of the large heterogeneous datasets required for research. As health data get 'bigger' and more complex, we are required to shift to rapid and flexible ways of resolving problems related to semantic interoperability. Ontological approaches accelerate the implementation of interoperability owing to the availability of robust tools and technology frameworks that promote reuse. This thesis reports the results of a mixed methods study proposing a pragmatic methodology that maximises the use of ontologies across a multilayered research readiness model for data-driven health care research projects. The research examined evidence for the use of ontologies across the majority of layers in the reference model. The first part of the thesis examines the methods used for assessing readiness to participate in research across six dimensions of health care. It reports on existing ontological elements that boost research readiness and proposes ontological extensions for modelling the semantics of data sources and research study requirements. The second part of the thesis presents an ontology toolkit that supports the rapid development of ontologies for use in health care research projects. It details how this toolkit for creating health care ontologies was developed through the consensus of a panel of informatics experts and clinicians, and how it evolved further to include a series of ontological building blocks that help clinicians rapidly build ontologies.
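To give a flavour of the ontological building blocks involved, the sketch below models a data source and a study requirement in OWL using rdflib. The class and property names are hypothetical examples, not the thesis's actual ontology.

```python
# Minimal sketch with invented terms (ex:DataSource, ex:ResearchStudy,
# ex:requiresDataSource): an OWL fragment built with rdflib and
# serialised to Turtle so it can be shared and reused.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/health#")
g = Graph()
g.bind("ex", EX)

# Classes for data sources and research studies
g.add((EX.DataSource, RDF.type, OWL.Class))
g.add((EX.ResearchStudy, RDF.type, OWL.Class))

# A property linking a study to the data sources it requires
g.add((EX.requiresDataSource, RDF.type, OWL.ObjectProperty))
g.add((EX.requiresDataSource, RDFS.domain, EX.ResearchStudy))
g.add((EX.requiresDataSource, RDFS.range, EX.DataSource))

# Instances: a primary-care database feeding a diabetes study
g.add((EX.PrimaryCareDB, RDF.type, EX.DataSource))
g.add((EX.DiabetesStudy, RDF.type, EX.ResearchStudy))
g.add((EX.DiabetesStudy, EX.requiresDataSource, EX.PrimaryCareDB))
g.add((EX.DiabetesStudy, RDFS.label, Literal("Diabetes cohort study")))

print(g.serialize(format="turtle"))
```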
74. Formal modelling and analysis of mix net implementations. Stathakidis, Efstathios. January 2015.
Elections are at the heart of democratic societies, and for this reason they should provide voters with the assurance that their votes have been cast as intended and that the final result is accurate, whilst at the same time delivering voter anonymity and secrecy of the votes. In turn, voters should be able to trust the voting systems and to verify the correctness of the final tally and the integrity of the elections. In this regard, in recent years, thanks to the improvement of cryptographic techniques, different electronic voting schemes have been implemented. However, such schemes are rather complex and, in order to preserve their properties, rely on the integrity and trustworthiness of single points of trust and failure. With the aim of achieving anonymity and public verifiability, Mix Net protocols have been developed for use in conjunction with a trusted and publicly accessible Web site, onto which all the produced data are posted for verification. However, implementing such distributed algorithms is not trivial, and many Mix Net constructions have been broken after they were introduced. This thesis identifies the problems existing in Mix Net implementations and proposes sound solutions to address them. The work presented in this thesis increases the rigour with which Mix Net protocols are verified against their security requirements. Moreover, it bridges the gap in the literature regarding the absence of formal modelling, analysis and automated verification of Mix Net implementations, using the process algebra Communicating Sequential Processes (CSP) and the model checker Failures-Divergence Refinement (FDR). In particular, the version of the source code taken on 7 January 2014, which formed the basis of the analysis in this thesis, did not meet all of the security requirements for the election to take place in November 2014, and solutions are proposed. In the event, the code was updated and the version used for the election did not have the flaws identified here. Moreover, a novel method is put forward for constructing and making conventional Mix Net implementations robust, by distributing the trust among their components and giving them the power to decide about the others' honesty; an approach that can be adopted for all constructions following the same design principle. Additionally, different and more efficient methods for devising such protocols are demonstrated. Finally, the automated analysis conducted in this thesis has been performed in the presence of a powerful intruder who can perform a number of active attacks.
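For readers unfamiliar with Mix Nets, the toy sketch below shows the core idea: each mix strips one layer of encryption and secretly permutes the batch, unlinking outputs from inputs. This is a conceptual decryption Mix Net only; real constructions, including those analysed in this thesis, involve re-encryption, proofs of correct shuffling and a public bulletin board.

```python
# Toy decryption Mix Net (conceptual sketch, not a secure design).
import random
from cryptography.fernet import Fernet

mix_keys = [Fernet.generate_key() for _ in range(3)]   # one key per mix

def submit(vote: bytes) -> bytes:
    """Sender wraps the vote in one layer per mix, outermost layer first."""
    for key in reversed(mix_keys):
        vote = Fernet(key).encrypt(vote)
    return vote

def run_mix(batch, key):
    """A single mix: strip one layer, then shuffle the whole batch."""
    batch = [Fernet(key).decrypt(c) for c in batch]
    random.shuffle(batch)
    return batch

batch = [submit(v) for v in [b"alice:yes", b"bob:no", b"carol:yes"]]
for key in mix_keys:
    batch = run_mix(batch, key)
print(batch)   # plaintext votes, in an order unlinkable to submission order
```

The fragility the thesis addresses is visible even here: a single misbehaving mix can drop, duplicate or fail to shuffle ciphertexts, which is why distributing trust and verifying the components' behaviour matters.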
75. Feature extraction and clustering techniques for digital image forensics. Alfraih, Areej S. January 2015.
This thesis proposes an adaptive algorithm which applies feature extraction and clustering techniques for cloning detection and localization in digital images. Multiple contributions have been made to test the performance of different feature detectors for forensic use. The first contribution improves a previously published algorithm by Wang et al. by localizing tampered regions using the grey-level co-occurrence matrix (GLCM) to extract texture features from the chromatic component of an image (the Cb or Cr component). The main trade-off is a diminishing detection accuracy as the region size decreases. The second contribution is based on extracting Maximally Stable Extremal Regions (MSER) features for cloning detection, followed by k-means clustering for cloning localization. Then, for comparison purposes, we implement the same approach using Speeded Up Robust Features (SURF) and the Scale-Invariant Feature Transform (SIFT). Experimental results show that we can detect and localize cloning in tampered images with an accuracy reaching 97% using MSER features. The usability and efficacy of our approach is verified by comparison with recent state-of-the-art approaches. For the third contribution we propose a flexible methodology for detecting cloning in images based on the use of feature detectors. We determine whether a particular match is the result of a cloning event by clustering the matches using k-means clustering and classifying the clusters with a Support Vector Machine (SVM). This descriptor-agnostic approach allows us to combine the results of multiple feature descriptors, increasing the potential number of keypoints in the cloned region. Results using MSER, SURF and SIFT outperform the state of the art, with the highest true positive rate reaching approximately 99.60% at a false positive rate of 1.6% when different descriptors are combined. A statistical filtering step, based on computing the median value of the dissimilarity matrix, is also proposed. Moreover, our algorithm uses an adaptive technique for selecting the optimal k value for each image independently, allowing our method to detect multiple cloned regions. Finally, we propose an adaptive technique that chooses feature detectors based on the type of image being tested. Some detectors are robust in detecting features in textured images while others are robust in detecting features in smooth images; combining the detectors makes them complementary to each other and can generate optimal results. The highest value for the area under the ROC curve reaches approximately 98.87%. We also test the performance of agglomerative hierarchical clustering for cloning localization; hierarchical and k-means clustering have similar performance, with true positive rates (TPR) for match-level localization of approximately 97.59% and 96.43% for k-means and hierarchical clustering, respectively. The robustness of our technique is analyzed against additive white Gaussian noise and JPEG compression; it remains reliable even at a low signal-to-noise ratio (SNR = 20 dB) or a low JPEG compression quality factor (QF = 50).
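A minimal sketch of the keypoint-matching-plus-clustering idea might look as follows. It is a simplified illustration with invented thresholds and a placeholder file name, not the thesis's exact pipeline: SIFT descriptors are matched against themselves, and the matched keypoints are split into source and clone regions with k-means (k = 2).

```python
# Simplified copy-move (cloning) detection sketch; "suspect.jpg" and the
# thresholds 0.6 and 10 px are illustrative placeholders.
import cv2
import numpy as np
from sklearn.cluster import KMeans

img = cv2.imread("suspect.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(img, None)

# Match descriptors against themselves: the best hit is always the keypoint
# itself, so ask for three neighbours and ratio-test the second and third.
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(descriptors, descriptors, k=3)

pts = []
for ms in matches:
    if len(ms) < 3:
        continue
    _, m, n = ms                       # ms[0] is the trivial self-match
    if m.distance < 0.6 * n.distance:  # distinctive match: candidate clone
        p1 = np.array(keypoints[m.queryIdx].pt)
        p2 = np.array(keypoints[m.trainIdx].pt)
        if np.linalg.norm(p1 - p2) > 10:   # skip near-coincident keypoints
            pts.extend([p1, p2])

# Localise: k = 2 separates the source patch from its copy; the thesis's
# adaptive choice of k generalises this to multiple cloned regions.
if pts:
    pts = np.array(pts)
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(pts)
    for c in (0, 1):
        cluster = pts[labels == c]
        print(f"region {c}: {len(cluster)} keypoints, centroid {cluster.mean(axis=0)}")
```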
76. An enhanced semantic VLE based on schema.org and social media. Aldaej, Abdulaziz A. January 2015.
After the arrival of the web in the 1990s, educational institutions started to maintain their learning materials within Virtual Learning Environments (VLEs), as the web is a significant source of material for many students and teachers. However, there has been little development of current VLEs in the past few years: they remain heavily centred on single institutions even though the web has kept developing (e.g., web 2.0, web 3.0). There is a clear need to integrate VLEs with the wider, social web and to keep their learning content freely open in order to support the sharing and reuse of learning resources. In this PhD project, we have prototyped a simple VLE that makes use of version 7 of the Semantic Content Management System (SCMS) Drupal in order to provide a more open, social and semantically structured learning environment. Essentially, we aim to add semantic markup based on Schema.org vocabularies (the semantic markup supported by the major search providers, including Bing, Google, Yahoo! and Yandex) and to integrate social networking and media, so as to develop and enhance VLEs by improving the sharing, discovery and reuse of learning content. In June 2011, the major search engines (Bing, Google, Yahoo! and Yandex) announced the Schema.org initiative. This PhD project also covers our proposal to Schema.org of additional concepts for describing VLE content with rich semantic information, motivated by the limited support for describing educational resources in the current schema. This proposal aims to extend the previous work included in the schema by the Learning Resource Metadata Initiative (LRMI), in order to provide an enhanced approach to describing learning content with rich semantic data in a VLE context. Throughout this thesis, we introduce, describe, evaluate and discuss the prototyped VLE in order to demonstrate the advantages of social and semantic web technologies for VLEs. We demonstrate how an advanced SCMS such as Drupal can offer advantages over existing VLE platforms in terms of sharing learning content with social networks and providing advanced media features. We also demonstrate how Drupal's support for Schema.org can be used to enhance the findability of online learning content, and propose enhancements to Schema.org that will make it more relevant to the needs of learning platforms. This proposal has been evaluated by Schema.org and LRMI, and a working group has been set up to take the proposal forward.
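For illustration, the kind of Schema.org/LRMI markup involved can be generated as JSON-LD. The snippet below is a generic example using published LRMI properties (learningResourceType, educationalUse, timeRequired), not the extension proposed in this thesis; the resource details are invented.

```python
# Sketch: emitting Schema.org/LRMI JSON-LD for a VLE learning resource.
import json

lecture = {
    "@context": "https://schema.org",
    "@type": "CreativeWork",
    "name": "Introduction to Semantic Web Technologies",
    "about": "RDF, ontologies and linked data",
    "learningResourceType": "lecture",      # LRMI property
    "educationalUse": "instruction",        # LRMI property
    "timeRequired": "PT1H",                 # ISO 8601 duration: one hour
    "author": {"@type": "Person", "name": "A. Lecturer"},
}

# Embedded in a VLE page, this block makes the resource discoverable by
# the search engines that consume Schema.org markup.
print('<script type="application/ld+json">')
print(json.dumps(lecture, indent=2))
print("</script>")
```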
77. Towards a better understanding of the evolution of senescence, apoptosis and tumour growth. Vinayak, Aasis. January 2015.
Senescence (ageing) and apoptosis (programmed cell death) are phenomena that have long troubled theoreticians and experimentalists. Previous research showed that the mortality curve of a yeast population follows the Gompertz-Makeham equation. We develop a generalised theoretical model which shows that the mortality of the organism can be expressed as a function of Ageing Factors such as ERCs (extrachromosomal rDNA circles). We use this idea to explain why senescence leads to apoptosis. Antagonistic pleiotropy and disposable soma theory suggest that senescence (and accordingly apoptosis) is a 'side effect'. Although the altruistic benefits of apoptosis have been suggested before, we attempt to show that in a resource-restricted environment apoptosis can be a strategic choice. We show that the interactions between apoptotic and non-apoptotic organisms can be modelled using game theory and differential equations. We find that switching to apoptotic mode gives the organism an advantage over non-apoptotic organisms in a resource-restricted environment. Mathematical analysis indicates that apoptosis is a stable strategy provided the conditions remain the same. We also find that a single apoptotic organism can invade a population of non-apoptotic organisms. This raises the question: why do tumours (which are non-apoptotic) occur if apoptosis is the best strategy? We show that apoptosis and angiogenesis play a significant role in the development of tumours, and we study the effects of these two parameters on the dynamics of tumour and apoptotic populations. We find that the mixed strategy of avoidance of apoptosis and angiogenesis gives neoplasms an advantage over apoptotic organisms in certain conditions; accordingly, tumour organisms can invade apoptotic tissues. We also find that this strategy is not beneficial in the long term.
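For reference, the Gompertz-Makeham law mentioned above is usually written as follows (standard form; the thesis's exact parameterisation may differ):

```latex
% Gompertz-Makeham mortality: an age-independent Makeham term \lambda plus
% an exponentially growing Gompertz term \alpha e^{\beta t}.
\mu(t) = \alpha e^{\beta t} + \lambda, \qquad \alpha, \beta, \lambda > 0,
\qquad
S(t) = \exp\!\Bigl(-\lambda t - \tfrac{\alpha}{\beta}\bigl(e^{\beta t} - 1\bigr)\Bigr)
```

Here \mu(t) is the mortality rate (hazard) at age t, and S(t) is the survival function obtained by integrating the hazard.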
78. Haptic feedback of rigid tool/soft object interaction in medical training and robot-assisted minimally invasive surgery. Li, Min. January 2014.
The sense of touch is crucial for surgeons to effectively identify tumours and boundaries and, thus, to achieve successful cancer resections. To overcome the loss of touch information that occurs during robot-assisted surgical procedures, researchers have proposed methods capable of acquiring partial haptic feedback and mimicking the physical interaction which takes place between surgical tools and human tissue during palpation. This thesis proposes and evaluates haptic palpation systems and suggests the combination of different feedback methods for tumour identification in medical training and robot-assisted minimally invasive surgery, using tissue models based on rolling indentation. A real-time visual tissue stiffness feedback method is proposed and compared to direct force feedback in terms of tumour identification performance, based on user studies with human subjects. The trade-off between system transparency and stability, which arises with direct force feedback in a tele-manipulation system, is circumvented by introducing an intra-operative haptic tissue model generation method capable of representing the stiffness distribution of the examined soft tissue; during palpation, force feedback is exerted based on this model. The thesis further proposes pseudo-haptic feedback with visualization of tissue surface deformation as an effective method of providing a realistic palpation experience that does not require expensive haptic devices and is capable of handling three-dimensional haptic information. Tumour identification results are compared using different input devices: a computer mouse, a 3-DOF motion tracking input device and force-sensitive 2D haptic surface input devices. Furthermore, it is shown that the performance of haptic systems can be improved beyond that of force-feedback-only systems by intelligently combining force feedback and pseudo-haptic feedback. Multi-fingered palpation is more effective in detecting differences in stiffness in the examined tissue than single-fingered palpation. Two approaches to multi-fingered palpation are proposed, studied and evaluated in this thesis: (1) methods using pseudo-haptic feedback and (2) methods using stiffness actuators. The performance of these methods is compared with that of single-fingered palpation approaches.
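A much-simplified sketch of model-mediated palpation feedback is given below: forces are rendered locally from a stored stiffness map with a spring law, rather than streamed from the remote site, which is the basic reason the transparency/stability trade-off is avoided. All values are illustrative; the thesis's intra-operative model generation is considerably richer.

```python
# Model-mediated force feedback sketch: F = k(x, y) * indentation depth,
# with a stiffness map standing in for the intra-operative tissue model.
import numpy as np

# Stiffness map (N/mm per cell), e.g. built from rolling indentation;
# the stiffer patch in the middle mimics an embedded tumour.
stiffness_map = np.full((50, 50), 0.2)
stiffness_map[20:30, 20:30] = 1.5

def feedback_force(x_mm: float, y_mm: float, depth_mm: float) -> float:
    """Force (N) for a probe at (x, y) indenting the tissue by depth_mm."""
    i = int(np.clip(y_mm, 0, 49))
    j = int(np.clip(x_mm, 0, 49))
    return stiffness_map[i, j] * max(depth_mm, 0.0)

print(feedback_force(10, 10, 2.0))   # soft tissue: 0.4 N
print(feedback_force(25, 25, 2.0))   # over the stiff nodule: 3.0 N
```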
79. Bioinformatic analysis of genomic sequencing data: read alignment and variant evaluation. Frousios, Kimon. January 2014.
The invention and rise in popularity of Next Generation Sequencing (NGS) technologies have led to a steep increase in sequencing data and to new challenges. This thesis contributes methods for the analysis of NGS data and focuses on two of the challenges presented by these data. The first challenge regards the need for NGS reads to be aligned to a reference sequence, as their short length complicates direct assembly. A great number of tools exist that carry out this task quickly and efficiently, yet they all rely on a mere count of mismatches to assess alignments, ignoring the knowledge that genome composition and mutation frequencies are biased. Thus, the use of a scoring matrix that incorporates the mutation and composition biases observed among humans was tested with simulated reads. The scoring matrix was implemented and incorporated into the in-house algorithm REAL, allowing a side-by-side comparison of the performance of the biased model and the mismatch count. The algorithm REAL was also used to investigate the applicability of NGS RNA-seq data to understanding the relationship between genomic expression and the compartmentalisation of genomic base composition into isochores. The second challenge regards the evaluation of the variants (SNPs) discovered by sequencing. NGS technologies have caused a sharp rise in the rate at which new SNPs are discovered, rendering the experimental validation of each one impossible. Several tools exist that take into account various properties of the genome, the transcripts and the protein products relevant to the location of a SNP and attempt to predict the SNP's impact. These tools are valuable in screening and prioritising SNPs likely to have a causative association with a genetic disease of interest. Despite the number of individual tools and the diversity of their resources, no attempt had been made to draw a consensus among them. Two consensus approaches were considered: one based on a simple majority vote among the tools considered, and one based on machine learning. Both methods proved to offer highly competitive classification, both against the individual tools and against other consensus methods published in the meantime.
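The contrast between mismatch counting and a biased substitution model can be made concrete with a toy example. The weights below are invented for illustration and are not REAL's actual matrix; the underlying biological fact is simply that transitions (A-G, C-T) occur more often in humans than transversions, so a biased scorer can separate two candidate alignments that a plain mismatch count ties.

```python
# Toy comparison of mismatch counting vs a biased substitution score
# (illustrative weights: transitions -1, transversions -2, matches +1).
TRANSITIONS = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}

def score(ref: str, read: str) -> tuple[int, float]:
    mismatches, weighted = 0, 0.0
    for r, q in zip(ref, read):
        if r == q:
            weighted += 1.0
        else:
            mismatches += 1
            weighted -= 1.0 if (r, q) in TRANSITIONS else 2.0
    return mismatches, weighted

# Both candidate loci show two mismatches, but the transition-only variant
# is the more plausible alignment under the biased model.
print(score("ACGTAC", "GCGTAT"))   # (2, 2.0): two transitions
print(score("ACGTAC", "CCGTAA"))   # (2, 0.0): two transversions
```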
80. Robotic granular jamming. Jiang, Allen. January 2014.
Granular jamming is a phenomenon whereby discrete granules can transition their macro-scale behaviour between a fluid-like and a solid-like state. In the context of soft robotics, this thesis examines granular jamming in three key aspects. The first aspect investigates the modelling of granule behaviour as the granules are jammed and unjammed. A simplified model was developed in which the macro stiffness of the granules in different states is encompassed in one variable, E. In an engineering context, the use of this one variable enables structures of different types to be compared quantitatively. The second aspect investigates the structure of mechanisms using granular jamming. For the granules, experiments were performed on different shapes, sizes and materials; the results show that there can be a significant change in the stiffness range and profile when a parameter of the granules is changed. The properties of the membrane holding the granules and the properties of the interparticle fluid were also studied for their effect on stiffness behaviour. For the membrane, the boundary layer applying the external stress, different materials and designs were found to have as significant an impact as changing the granule type. For the interparticle fluid, the effects of air and water as the inter-granule fluid were compared, showing no significant difference in stiffness range. However, the use of water can shrink the transition region between the fluid-like and solid-like states of granular behaviour. While this smaller transition region limits the tunability of compliance for a granular jamming mechanism, it brings the benefit of requiring a smaller volume of fluid to create a vacuum, owing to the incompressibility of water. This effectively untethers granular jamming mechanisms from cumbersome vacuum pumps, a key step in mobilizing the technology in the context of robotics. Lastly, this thesis focuses on the aspect of control for granular jamming, encompassing the development of granular-jamming-based actuators and the control of those actuators through impedance control.
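To illustrate how a single effective modulus E enables quantitative comparison between structures, the back-of-the-envelope sketch below compares the jammed and unjammed states of a cylindrical manipulator through the standard Euler-Bernoulli cantilever tip stiffness k = 3EI/L^3. All numerical values are invented for illustration and are not measurements from the thesis.

```python
# Comparing jammed vs unjammed states via an effective modulus E
# (illustrative numbers only; k = 3EI/L^3 for a cantilevered cylinder).
import math

def tip_stiffness(E_pa: float, diameter_m: float, length_m: float) -> float:
    I = math.pi * diameter_m**4 / 64      # second moment of area, circular section
    return 3 * E_pa * I / length_m**3     # N/m at the cantilever tip

unjammed_E = 5e4    # fluid-like state: low effective modulus (Pa)
jammed_E = 2e6      # vacuum applied, solid-like state (Pa)

for label, E in [("unjammed", unjammed_E), ("jammed", jammed_E)]:
    k = tip_stiffness(E, diameter_m=0.02, length_m=0.15)
    print(f"{label:9s} E={E:.0e} Pa -> tip stiffness {k:.2f} N/m")
```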