161

Wide spectrum attribution : using deception for attribution intelligence in cyber attacks

Nicholson, Andrew January 2015 (has links)
Modern cyber attacks have evolved considerably. The skill level required to conduct a cyber attack is low, computing power is cheap, and targets are diverse and plentiful. Point-and-click crimeware kits are widely circulated in the underground economy, while source code for sophisticated malware such as Stuxnet is available for all to download and repurpose. Despite decades of research into defensive techniques, such as firewalls, intrusion detection systems, anti-virus and code auditing, the quantity of successful cyber attacks continues to increase, as does the number of vulnerabilities identified. Measures to identify perpetrators, known as attribution, have existed for as long as there have been cyber attacks. The most actively researched technical attribution techniques involve the marking and logging of network packets. These techniques are performed by network devices along the packet journey, which most often requires modification of existing router hardware and/or software, or the inclusion of additional devices. These modifications require wide-scale infrastructure changes that are not only complex and costly, but invoke legal, ethical and governance issues. The usefulness of these techniques is also often questioned, as attack actors use multiple stepping stones, often innocent systems that have been compromised, to mask the true source. As such, this thesis identifies that no publicly known previous work has been deployed on a wide-scale basis in the Internet infrastructure. This research investigates the use of an often overlooked tool for attribution: cyber deception. The main contribution of this work is a significant advancement in the field of deception and honeypots as technical attribution techniques, specifically the design and implementation of two novel honeypot approaches: i) the Deception Inside Credential Engine (DICE), which uses policy and honeytokens to identify adversaries returning from different origins, and ii) the Adaptive Honeynet Framework (AHFW), an introspection-based adaptive honeynet framework that uses actor-dependent triggers to modify the honeynet environment and engage the adversary, increasing the quantity and diversity of interactions. The two approaches are based on a systematic review of the technical attribution literature, which was used to derive a set of requirements for honeypots as technical attribution techniques. Both approaches lead the way for further research in this field.
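The abstract does not give implementation detail for DICE; purely as a hypothetical sketch of the honeytoken idea it describes (the class, field names and IP addresses below are invented, not taken from the thesis), the following issues a unique decoy credential to each probing session and flags any later login that replays one, linking a returning adversary back to the origin that was first served the token.

```python
import secrets

class HoneytokenStore:
    """Minimal sketch of a honeytoken policy for attribution (hypothetical design).

    Each probing session is handed a unique decoy credential; reuse of that
    credential later links the new connection back to the original origin,
    even if the adversary returns from a different IP address.
    """

    def __init__(self):
        self._issued = {}  # honeytoken password -> origin IP it was served to

    def issue(self, origin_ip: str) -> tuple[str, str]:
        """Serve a unique decoy username/password pair to a probing client."""
        token = secrets.token_hex(8)
        self._issued[token] = origin_ip
        return ("backup_admin", token)

    def check_login(self, username: str, password: str, source_ip: str) -> str | None:
        """Return an attribution note if a known honeytoken is replayed."""
        first_seen = self._issued.get(password)
        if first_seen is None:
            return None
        if first_seen != source_ip:
            return f"token first served to {first_seen}, replayed from {source_ip}"
        return f"token replayed from original origin {first_seen}"


# Usage: correlate a returning adversary across two origins.
store = HoneytokenStore()
user, pwd = store.issue("203.0.113.7")               # initial probe
print(store.check_login(user, pwd, "198.51.100.9"))  # later reuse via a stepping stone
```

The same bookkeeping could, in principle, feed actor-dependent triggers of the kind AHFW uses, though the thesis's actual mechanisms are not described in the abstract.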
162

An empirical investigation to examine the usability issues of using adaptive, adaptable, and mixed-initiative approaches in interactive systems

Alshumari, Mansour January 2015 (has links)
The combination of graphical user interface (GUI) design and usability evaluation is key to mastering any piece of software and ensuring the quality of work. The demand for online learning is increasing and becoming more important, both individually and academically. This thesis introduces and describes an empirical study to investigate and compare how vocabulary can be learned using different interactive approaches; specifically, a static learning website (with straightforward words and meanings), an adaptable learning website (allowing the user to choose a learning method), an adaptive learning website (a system-chosen way of learning), and a mixed-initiative website (mixing approaches and techniques). The purpose of this study is to explore and determine the effects of these approaches on vocabulary-learning achievement, in order to enhance vocabulary learning for non-English speakers. The participants were Arabic speakers. The vocabulary learning activities were categorised into three levels: easy, medium, and hard. The independent variables (IVs) were controlled during the experiment to ensure consistency and were as follows: tasks, learning effects, and time. The dependent variables (DVs) were vocabulary-learning achievements and scores. Two aims were explored in relation to the effects of these approaches on achievement. The first related to learning vocabulary for non-English speakers tackling the difficulties of the English language, and the second related to studying the usability of a system for learning English vocabulary in terms of usability measures (efficiency, frequency of error occurrence, effectiveness, and satisfaction). For this purpose, a vocabulary-learning language website was designed, implemented, and tested empirically. To fulfil these requirements, two usability components (efficiency and effectiveness) were first measured with a within-subject design (n = 24 recruited subjects); users' satisfaction was then investigated with a between-subject design (n = 99 recruited subjects) using a System Usability Scale (SUS) survey. The results and data analysis are described, and overall the results were satisfactory.
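The satisfaction measure named in the abstract, the System Usability Scale, has a standard published scoring rule (not specific to this thesis); a minimal sketch of that rule is shown below for readers unfamiliar with how ten 1-5 ratings become a 0-100 score.

```python
def sus_score(responses: list[int]) -> float:
    """Standard SUS scoring: ten Likert items rated 1-5.

    Odd-numbered items (positively worded) contribute (rating - 1);
    even-numbered items (negatively worded) contribute (5 - rating);
    the summed contributions are scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten ratings between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)   # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5


# Example: a fairly positive questionnaire response.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```

Scores around 68 are commonly cited as the benchmark average, which is one reason SUS is a popular satisfaction measure in studies of this kind.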
163

An edition, from the manuscripts, of The cloud of unknowing, with an introduction, notes and glossary

Hodgson, Phyllis January 1936 (has links)
No description available.
164

The transformation of the medieval sermon

D'Avray, D. L. January 1977 (has links)
In the last few years research on medieval sermons has entered a new phase, and the thesis needs to be placed in the context of recent work in France, Germany, and Italy; a few years ago there would have been little point in attempting to write on the subject I have chosen. The work of a German scholar has provided us with systematic guides to the authors, incipits, and manuscripts of medieval sermons, so that it is now possible to make efficient use of the manuscript sources; monographs on vernacular preaching have cleared the way for a study of the Latin 'popular' sermons and their international circulation; moreover, the work of a small group of scholars, based in different parts of Europe, has reached a stage at which an attempt to write a synthesis is desirable. A general survey of the subject would be premature: instead I have tried to outline an interpretation of the main turning point in the history of high medieval preaching. The decisive change was the revival of preaching to the laity, which had ceased to play a major part in religious life between the fall of the Roman empire and the rise of the medieval towns. In the first two sub-sections of the thesis (pp. 1-22) I give a brief selective narrative of the external history of this transformation of preaching, but the greater part of the thesis is devoted to the less obvious changes which lay behind the revival. The remaining sub-sections of Part I deal with the pocket books of sermons which itinerant preachers used (p. 22 seqq.), the diffusion of stereotyped material - 'preaching aids' - to help the busy or inexperienced (p. 36 seqq.), the emergence of an educated lay 'sermon-hearing public' (p. 58 seqq.), and the training of preachers, especially the friars (p. 73 seqq.). Parts II and III are on the form and content of sermons respectively. The theme of Part II (p. 92 seqq.) is the new sermon form which came to maturity in the thirteenth century, and its relation to the revival of popular preaching. Here I state and attempt to explain the paradox that a form which was closely associated with academic milieux was also used with success in the vast majority of sermons to the laity. Part III (p. 134 seqq.) reaches a somewhat similar conclusion by a different route. I try to show how far the content of preaching was adapted to the new urban public, and here I discuss in detail a genre of preaching aid designed to provide ready-made sermons specially adapted to different sorts and conditions of men. However, I go on to argue that, apart from this genre, the content of sermons was less affected by the auditory than might have been expected: popular sermons do not differ greatly from academic sermons. I conclude by trying to show why the same sort of sermon could have been effective with both university and lay congregations. The fourth part of the thesis (p. 212 seqq.) is a case study of a sermon collection, variously called Legifer and the 'Collectiones fratrum', whose history seems to be an exceptionally clear illustration of a theme which deserves special emphasis. For although I try to give an idea of the variety of different aspects of the preaching revival, I also argue a thesis in the older sense of the term. It seems to me that the close relation between the academic world and popular preaching is a theme which deserves special attention, and that the University of Paris contributed to the revival of preaching in two distinct ways. Firstly, Paris was a centre for the diffusion of model sermon collections.
Popular preachers all over Europe preached from ready-made model sermons written and/or copied at Paris. Secondly, Paris provided a training for the preachers themselves. Biblical lectures imparted preachable doctrine, but in addition to this the system of university sermons ensured that theology students had a training of a more practical kind. Bachelors and auditores could be called upon to preach before the University, and this ordeal must have been a major hurdle - demanding careful preparation - for the more junior students. The evidence suggests that students would normally be asked to give a sermon after mid-day - a collatio - rather than a morning sermon. These were normally held in the houses of the Franciscans or Dominicans. The student friars, most of whom were destined to become 'professional' preachers, must have found the training especially valuable. It is the more significant in that it was the only direct and practical preparation for preaching that a friar was given. This argument presupposes the general similarity between academic and popular preaching which is discussed in Parts II and III. The Legifer collection has been singled out for special attention because it seems almost an 'ideal type' of the link between the two types of preaching. It is a handbook of model sermons for popular preachers which appears to have been based on collations given at the houses of the friars, probably at Paris. Since it was diffused by the pecia system of the university stationers it also represents the other contribution of Paris to popular preaching. After outlining the evidence for Legifer's unusual history (p. 212 seqq.) I make it the basis for a brief analysis of the theological culture which a section of the laity was beginning to share with educated clerics (p. 225 seqq.). The thesis concludes with a selection of illustrative texts and an appendix. The documents are transcriptions, not editions, and their purpose is to illustrate points made in the main body of the thesis. The first two texts are examples of the old and the new sermon forms respectively. The third text is included to illustrate the use of the new 'scholastic' form in a sermon to a lay congregation. It is also an example of a sermon in which the content is specifically orientated to one kind of lay audience. The fourth, fifth, and sixth sermons show the other side of the coin. Text IV is a 'popular' sermon which could easily be mistaken for an academic one; texts V and VI are a university sermon and a model sermon for popular preachers respectively: they have the same theme, and when they are read together the general similarity of form and content is more striking than the differences. The last text is followed by an appendix, on Franciscan preachers' pocket books, which gives some of the evidence too detailed to be included in the section (Part I.3, p. 22 seqq.) on 'Preachers' books'.
165

Investigation of the biological mechanisms activated by CD40 in prostate cancer cells

Ateeg, Salim January 2018 (has links)
CD40 is a prominent member of the TNFR family due to its ability to be expressed by, and regulate the fate of, not only immunocytes but also non-lymphoid cells. Previous studies have demonstrated that CD40 ligation by cell-surface-presented agonists, and in particular membrane CD40L (mCD40L), causes extensive apoptosis specifically in a variety of malignant epithelial cells (including bladder and colorectal). By contrast, soluble CD40 agonists are weakly pro-apoptotic and only become significantly pro-apoptotic with pharmacological intervention. Recent work from our laboratory has shed light on the tumour specificity of CD40 as well as the differences between soluble and membrane-presented agonists in terms of pro-apoptotic capacity. As the role of CD40 in prostate cancer remains unknown, the main aim of this study was to investigate the hypothesis that the CD40/CD40L dyad regulates prostate carcinoma (PCa) cell fate, and to explore the mechanisms of this in a panel of well-characterised human PCa lines. In order to achieve CD40 ligation by mCD40L, a co-culture in vitro model was used, whereby target PCa cells were co-cultured with third-party murine fibroblasts (3T3CD40L) engineered to express mCD40L. This mode of ligation was compared to an agonistic CD40 antibody. Flow cytometry allowed detection of CD40 expression in a panel of PCa lines comprising DU145, LNCaP and PC-3 cells. Apoptosis was detected using several assays, focusing on classical hallmarks of apoptosis (loss of cell membrane integrity, caspase activation, and DNA fragmentation). ELISA assays were employed for detection of pro-inflammatory cytokine secretion, and spectrophotometry and flow cytometry were used for detection of ROS. Immunoblotting techniques were also standardised and utilised for the accurate and sensitive detection of intracellular proteins involved in CD40 signalling. Experiments using retroviruses were also employed to engineer CD40 expression in CD40-negative PCa cells. Ligation of CD40 caused apoptosis in DU145 and LNCaP cells. By contrast, CD40-negative PC-3 cells were refractory to CD40 ligation; restoration of CD40 expression restored susceptibility to CD40-mediated apoptosis. Importantly, only receptor ligation by mCD40L could cause cell death, as the soluble agonist (cross-linked G28-5 mAb) was not pro-apoptotic. mCD40L, but not G28-5, induced rapid secretion of the pro-inflammatory cytokines IL-6, IL-8 and GM-CSF, thus CD40 killing was pro-inflammatory. CD40 induced apoptosis as evidenced by membrane integrity loss and DNA fragmentation, both hallmarks of apoptotic death. Yet it was found that CD40 triggers a type of death that is caspase-independent. The work showed that CD40 in PCa cells triggers death not through cross-talk with the extrinsic pathway, but via a direct signal involving the mitochondrial pathway, as indicated by the induction of Bak and Bax proteins. mCD40L triggered rapid induction of TRAF1 and TRAF3, whilst TRAF2 expression was downregulated. ASK1 was activated, followed by MKK7 (but not MKK4) activation, and this was followed by JNK phosphorylation. Functional inhibition experiments showed that both JNK/AP-1 and p38 are important for death induction. ROS production could not be detected upon CD40 activation, and functional inhibition experiments showed that ROS are not critical for CD40-mediated death in PCa cells, observations that raise the possibility of ROS-independent ASK1 activation.
Finally, preliminary experiments using prostate cancer stem cells (CSCs), well-established 'drivers' of PCa, showed that CSCs were CD40-positive; however, within the time constraints of this project it was not possible to assess whether CD40 ligation could induce CSC-targeted cell death. These findings have not only generated novel observations in terms of the ability of CD40 to induce PCa cell death, but have also added to our knowledge of the intriguingly multifaceted effects of CD40 in carcinoma cells. These observations imply that, whilst CD40 engages signalling pathways with some common intracellular mediators, its precise death pathways can differ both in their exact nature and their exact features. Moreover, in addition to providing biological evidence for the mechanisms of CD40-mediated apoptosis, these observations may represent a promising targeted approach for PCa therapy, given the ability of CD40 ligation to cause extensive apoptosis in PCa cells. Equally importantly, by efficiently killing PCa cells and causing rapid pro-inflammatory cytokine secretion, whilst at the same time targeting what is potentially the cellular driver of carcinogenesis (CSCs), CD40-mediated killing represents a very promising potential therapeutic tool for PCa therapy in the near future.
166

The application of advanced metrology techniques to ballistics and toolmark identification

Addinall, Katie Anne January 2017 (has links)
Since the early 1900s, the field of ballistic toolmark evidence has developed as the available instrumentation and computational power have advanced. However, the use of these advanced techniques has not yet been validated for use in criminal proceedings. This has resulted in ballistic toolmark evidence being presented using the same techniques that have been employed for decades, unable to utilise more advanced techniques that have not currently been deemed acceptable in courts worldwide. Ballistic toolmark evidence currently relies on the use of comparison microscopy: magnified optical 2D (greyscale) images of two separate surfaces are viewed side by side to ascertain the degree of similarity using visual comparison. Only highly trained expert examiners are able to carry out this comparison, and as such it is a time-expensive method. The technique is built on subjective methodology: there are no mathematical outputs, as the results are based on the opinion and experience of the expert examiner. With the advance of computational power and measurement techniques, it is now possible to create a digital system for the measurement and correlation of ballistic toolmark evidence. Evidence can be acquired as high-density topographical datasets, and these datasets can be correlated against one another using mathematical algorithms, resulting in a comparison result based on a mathematical score or percentage match. Consequently, using these techniques could result in a more time-efficient, repeatable and accurate system without problems of subjectivity or user bias. The novel contribution of this thesis has been to shift ballistic toolmark evidence and correlation from subjective, 2D, qualitative methods to the use of the most advanced topographical areal datasets and mathematical correlation. A direct comparison of the efficacy of 2D digital and areal-based systems was achieved; this showed that when the correct data are processed via the advanced system, there is a significant increase in the efficiency of hit-list information. Novel contributions also include optimising the pre-processing of areal datasets, for both cartridge cases and bullets, and analysing the effect of cartridge-case materials on the overall topography of the toolmark. It was found that using current methods of data pre-processing resulted in a less efficient correlation system; for both bullets and cartridge cases, however, using the pre-processing methods developed in this thesis resulted in a more efficient method of correlation. Bullet correlation was also achieved using a full areal dataset of the toolmark, and such a method has not been published previously. Material analysis was carried out across various cartridge manufacturers and compared to the overall topography of the toolmark. It was found that differences in material composition lead to differences in the topography of the toolmark; this is the first instance of such findings being published. Finally, a direct comparison of two separate advanced measurement systems was obtained, using the Alicona G4 focus variation instrument and the Alias ballistic imaging system. Using the same pre-processing methods for all datasets acquired, it was found that the quality of the dataset is significantly affected by the measurement method. Extraneous data such as optical spiking and data dropout were found to affect the efficacy of evidence correlation.
This thesis presents the use of advanced methods for ballistic toolmark evidence, while considering issues with data fidelity, substrate material differences and pre-processing techniques.
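The abstract does not publish the thesis's correlation algorithms; purely as an illustration of how two registered areal (height-map) datasets can be reduced to a single mathematical similarity score, the sketch below computes a normalised cross-correlation while ignoring dropout points marked as NaN (the function name and the scoring choice are assumptions, not the thesis's method).

```python
import numpy as np

def areal_similarity(surface_a: np.ndarray, surface_b: np.ndarray) -> float:
    """Normalised cross-correlation between two areal height maps (score in [-1, 1]).

    Assumes both surfaces are already registered to the same grid. NaN entries
    (e.g. optical spikes or data dropout removed in pre-processing) are ignored.
    """
    if surface_a.shape != surface_b.shape:
        raise ValueError("surfaces must share the same grid")
    mask = ~(np.isnan(surface_a) | np.isnan(surface_b))
    a = surface_a[mask] - surface_a[mask].mean()   # remove mean height offset
    b = surface_b[mask] - surface_b[mask].mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom else 0.0


# Example: a measured toolmark against a noisy copy of itself vs. an unrelated mark.
rng = np.random.default_rng(0)
mark = rng.normal(size=(64, 64))
print(areal_similarity(mark, mark + 0.1 * rng.normal(size=mark.shape)))  # high score
print(areal_similarity(mark, rng.normal(size=(64, 64))))                 # near zero
```

In practice such a score would only be meaningful after the levelling, filtering and registration steps the thesis optimises, which this sketch deliberately omits.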
167

Betriebswirtschaftliche Versicherungswissenschaft (BwVersWiss). Concepts for research, teaching and organisation at universities.

Eszler, Erwin January 2007 (has links) (PDF)
By extending the range of objects of study and through a particular methodological differentiation, the conventional "Versicherungsbetriebslehre" (insurance business administration) is developed into a "Betriebswirtschaftliche Versicherungswissenschaft" (roughly, a business-administration science of insurance; BwVersWiss). The paper sets out the placement of this BwVersWiss within a likewise newly developed taxonomy of risk- and insurance-related sciences, the subdivision of BwVersWiss into different fields of activity, and organisational aspects of implementing BwVersWiss at universities. (Author's abstract) / Series: Arbeitspapiere zum Tätigkeitsfeld Risikomanagement und Versicherung / Institut für Versicherungswirtschaft
168

A method to enhance the accuracy of digital forensics in the absence of complete evidence in Saudi Arabia

Alanazi, Fahad Mosalm January 2017 (has links)
The tremendous increase in the use of digital devices has led to their involvement in the vast majority of current criminal investigations. As a result, digital forensics has increasingly become one of the most important aspects of criminal investigations. The digital forensics process involves a number of important phases that must be considered in order to achieve the required level of accuracy and to reach a successful conclusion of the investigation into the digital aspects of crimes, through obtaining evidence acceptable for use in a court of law. A number of models have been developed and produced since 1984 to support the digital investigation processes. In this submission, I introduce a proposed model for the digital investigation processes which is based on the scope of the Saudi Arabian investigation process, which has been integrated with existing models of digital investigation processes, and which introduces a new phase to deal with situations where there is insufficient evidence. In this research, grounded theory has been adopted as the research method to investigate and explore the participants' perspectives and opinions regarding the adoption of a method for the digital forensics investigation process in the absence of complete evidence in the Saudi Arabian context. The interaction of investigators with digital forensics processes involves the social aspect of digital investigation, which is why a grounded theory approach was suitable. A semi-structured data collection approach was adopted to enable the participants to express their visions, concerns, opinions and feelings related to factors that impact the adoption of the DF model for use in cases where there is an absence of sufficient evidence in Saudi Arabia. The proposed model emerged after conducting a number of interviews and analysing the data of this research. The researcher developed the proposed model based on the participants' answers, which helped the researcher to find a solution for dealing with cases where there is insufficient evidence by adding a unique step to the investigation process, the "TraceBack" phase. This study is the first in Saudi Arabia developed to enhance the accuracy of digital forensics in the absence of sufficient evidence, which opens a new avenue of research. It is also the first time grounded theory has been employed in a digital forensics study in the Saudi context, which indicates the possibility of applying this methodology to this field.
169

A study of the diagnosis, treatment and epidemiology of Mycobacterium abscessus in patients with cystic fibrosis

Preece, Clair January 2016 (has links)
Members of the Mycobacterium abscessus complex (MABSC) are a highly antibiotic-resistant group of organisms within the genus Mycobacterium, increasingly acknowledged as a significant cause of lung infection in patients with cystic fibrosis (CF) and associated with poor clinical outcomes. Current methods of isolation of MABSC are hindered by the fact that these organisms grow at a slower rate in culture than other microorganisms, with many patient samples having to be discarded due to the overgrowth of more rapidly growing species. Decontamination of samples has been shown to have an adverse effect upon the viability of MABSC; therefore, improvements in the isolation of MABSC are urgently required in order to offer the possibility of a more rapid and accurate diagnosis. A novel medium (RGM) was developed for the isolation of MABSC. Commercially available pre-poured media were compared with RGM and challenged with isolates of rapidly growing mycobacteria and other species. In addition, in a multi-centre study, sputum samples collected from patients with CF were inoculated onto RGM medium and BCSA and into a standard automated liquid culture system, and assessed for growth. RGM demonstrated superior sensitivity over currently used methods without any requirement for decontamination and could easily be incorporated into any laboratory alongside routine culture for other CF pathogens. Chromogenic and fluorogenic substrates were investigated for the possibility of differentiating between subspecies within MABSC; however, the results established that these would not provide any additional benefit over RGM. Possible environmental sources were explored in order to establish how patients with CF were acquiring MABSC. Although person-to-person transmission has been suggested, there are very few reports to substantiate this at present and many questions remain unanswered. In this study, MABSC was not isolated from any of the environments screened. Finally, a selection of antimicrobials was investigated against MABSC with the purpose of ascertaining susceptibility and whether any might be used for a more successful treatment outcome. There were no clinically applicable results; therefore further work is required in this area. To conclude, RGM is a novel culture medium which can be embedded alongside routine culture for other CF pathogens without any requirement for decontamination. This means that all respiratory samples submitted from patients with CF can be conveniently cultured for NTM, considerably improving the service offered to clinicians and patients. Furthermore, it is likely that formal AFB culture methods could be replaced by use of such a medium, potentially enabling substantial savings in terms of materials and labour time.
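As a minimal illustration of how culture-medium sensitivity is typically compared in a study of this kind (the counts below are invented for the example and are not the thesis's data), isolation sensitivity is simply the proportion of confirmed MABSC-positive samples from which each medium recovers the organism.

```python
def sensitivity(recovered: int, confirmed_positive: int) -> float:
    """Isolation sensitivity: fraction of known-positive samples a medium detects."""
    return recovered / confirmed_positive

# Hypothetical counts for illustration only (not the study's results).
confirmed = 40
for medium, recovered in {"RGM": 38, "BCSA": 22, "Automated liquid culture": 30}.items():
    print(f"{medium}: {sensitivity(recovered, confirmed):.0%}")
```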
170

[1,2]-Sigmatropic rearrangement of benzylic ammonium ylids; Catalytic sp3-sp3 functionalisation of sulfonamides; Annulation of arynes in the synthesis of sultams

Abdulla, Othman January 2018 (has links)
The first chapter in this thesis describes research on the asymmetric [1,2]-sigmatropic rearrangement of benzylic ammonium ylids. Our group previously developed a method showing that the use of DMSO as solvent and BTPP as base, in the presence of 5Å molecular sieves, dramatically improves the yield of the reaction. Hence, we applied the developed method using (–)-8-phenylmenthol and (2S)-camphorsultam as chiral auxiliaries. In the second chapter, a new application of Pd-catalysed allylation is reported. This enabled the synthesis of 30 branched sp3-functionalised sulfonamides, a compound class for which few reported methods exist. By reacting benzyl sulfonamides with allylic acetates in the presence of Pd(0) catalysts and a base, at room temperature, direct allylation was performed efficiently, yielding products that are analogues of structural motifs seen in biologically active small molecules. The reaction was performed under mild conditions and could be applied to nanomolar sigma-receptor binders, thus enabling late-stage functionalisation and efficient expansion of drug-like chemical space. The third chapter describes a synthesis of benzosultams without recourse to transition-metal catalysis or stoichiometric amounts of organometallic building blocks. An iodomethanesulfonamide adds to benzyne (generated using fluoride sources), and the intermediate formed then undergoes an intramolecular cyclisation to afford the sultam. Using this method, which proceeds under simple reaction conditions, 11 benzosultams were synthesised in modest yield.
