51

Using the UTAUT2 Model to Explain the Intention to Use Phone Biometrics

Lais A McCartney (9306860) 13 May 2022 (has links)
<p>Biometric technology is used in daily life for authentication purposes, and perceptions of the privacy and security of biometrics are of great interest (Olorunsola et al., 2020). Ho et al. (2003) specifically added privacy to their biometric acceptance model as a potential influence on intention to use the technology, since privacy was found to be people’s primary concern about biometrics. Surveys of perceptions and use of technology (Buckley & Nurse, 2019; Carpenter et al., 2018; Olorunsola et al., 2020) have used many different models to predict people’s willingness to use biometrics. Venkatesh, Thong, et al. (2012) developed the reliable and valid UTAUT2 (Unified Theory of Acceptance and Use of Technology), a consumer-based model. Could the UTAUT2 model explain variance in intention to use phone biometrics? In this study, phone biometrics are defined as biometrics used on a mobile smartphone, for example to unlock the phone with a fingerprint or face, or to open or authenticate specific applications within the phone. A survey using the basic UTAUT2 questions was posed to <em>n</em> = 329 people who owned a mobile phone, lived in the United States, and used phone biometrics, to see whether the model explained their intention to use phone biometrics.</p> <p>Venkatesh developed the UTAUT2 model to explain intention to use in a consumer setting; his earlier model (UTAUT) examined intention to use in an organizational setting. The challenge is that these models are old (UTAUT2 was almost ten years old at the time of writing), while phone biometrics is a rapidly changing consumer technology. The overarching research question is whether the UTAUT2 model can explain the intention to use phone biometrics. The results showed that the UTAUT2 constructs accounted for 79.1% of the variance in intention to use phone biometrics.</p>
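The 79.1% figure is a coefficient of determination (R²): the share of variance in intention to use that the regression on the UTAUT2 constructs accounts for. A minimal sketch of how such a figure is computed, using made-up Likert-scale data and a single composite predictor rather than the study's full construct set:

```python
# Sketch only: computing "variance explained" (R^2) for a simple
# least-squares regression. The data below are hypothetical 7-point
# Likert responses, not the study's; the actual analysis regressed
# intention on the full set of UTAUT2 constructs.

def r_squared(x, y):
    """R^2 of a simple least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical responses: composite predictor score vs. intention to use.
predictor = [2, 3, 3, 4, 5, 5, 6, 7]
intention = [1, 3, 2, 4, 4, 6, 6, 7]
r2 = r_squared(predictor, intention)
```

A value near 1 means the predictor accounts for almost all of the variation in intention; the study's multi-construct model reached 0.791 on this scale.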
52

WITNESSING A GAP: HOW DIGITAL FORENSIC EXPERT WITNESS QUALIFICATIONS DIFFER FROM ATTORNEY EXPECTATIONS

Megan Celeste Piper (15209908) 13 April 2023 (has links)
<p>Due to the increasing use of technology in everyday life, electronic evidence has become a vital part of nearly every criminal and civil trial. Such evidence can carry so much weight that it has been the “smoking gun” in a wide variety of cases, from traffic accidents to intellectual property disputes to murder. A computer forensics examiner is required to thoroughly analyze and interpret the data gathered from various sources, and can be an important ally in court by helping jurors understand the significance of the evidence. It is therefore critical that expert witnesses be highly qualified to testify. Currently, however, only minimal qualifications are required to testify as a digital forensics expert witness, and without proper standards the wrong verdict could be reached. This research aimed to determine whether there is a gap between the qualifications preferred for digital forensics experts and the qualifications such experts have actually attained. A group of attorneys and officers participated in an online survey presenting questions on desired and attained qualifications. While this study showed that expert witnesses mostly met lawyer expectations, it also demonstrated a need for standardized qualifications for expert testimony.</p>
53

Investors' and Analysts' Reactions to Other Information Disclosure on the Auditor's Report

Liu, Weiqing 22 December 2021 (has links)
New and revised Canadian Auditing Standards, issued in April 2017, apply to audits of companies with fiscal periods ending on or after December 15, 2018. This paper examines the economic effects of one of the updates: the new auditor reporting requirement to disclose the auditor’s responsibilities over other information. We investigate the relationship between the presence of the auditor’s commentary about the MD&A within the other information paragraph of the auditor’s report and the reactions of users of the financial statements, namely investors and analysts, to the MD&A. We find that neither investors nor analysts respond to the auditor’s commentary about the MD&A within the other information paragraph of the auditor’s report. Our result indicates that although the disclosure may not provide the additional information value to users of the financial statements that standard setters intended, it also does not widen the audit expectation gap.
54

<b>DISCOVERY OF NOVEL DISEASE BIOMARKERS AND THERAPEUTICS USING MACHINE LEARNING MODELS</b>

Luopin Wang (14777575) 22 April 2024 (has links)
<p dir="ltr">In the fields of computational biology and bioinformatics, the identification of novel disease biomarkers and therapeutic strategies, especially for cancers, remains a crucial challenge. Advances in computer science, particularly machine learning, have greatly empowered computational biology and bioinformatics through their unprecedented predictive power. This thesis explores how to use classic and advanced machine learning models to predict prognostic pathways, biomarkers, and therapeutics associated with cancers.</p><p dir="ltr">First, the thesis presents a comprehensive overview of computational biology and bioinformatics, from past milestones to current groundbreaking advancements, providing context for the research. A centerpiece of the thesis is the introduction of the Pathway Ensemble Tool and Benchmark, an original methodology designed for the unbiased discovery of cancer-related pathways and biomarkers. This toolset not only enhances the identification of crucial prognostic components that distinguish clinical outcomes in cancer patients but also guides the development of targeted drug treatments based on these signatures. Inspired by Benchmark, we extended the methodology to single-cell technologies and proposed scBenchmark and PathPCA, which provide insights into the potential and limitations of novel techniques for biomarker and therapeutic discovery. Next, the research progresses to the development of DREx, a deep learning model trained on large-scale transcriptome data that predicts gene expression responses to drug treatments across multiple cell lines. DREx highlights the potential of advanced machine learning models in drug repurposing using a genomics-centric approach, which could significantly enhance the efficiency of initial drug selection.</p><p dir="ltr">The thesis concludes by summarizing these findings and highlighting their importance in advancing cancer biomarker and drug discovery. Several computational predictions in this work have already been experimentally validated, showcasing the real-life impact of these methodologies. By integrating machine learning models with computational biology and bioinformatics, this research sets new standards for novel biomarker and therapeutics discovery.</p>
55

The price of convenience : implications of socially pervasive computing for personal privacy

Ng-Kruelle, Seok Hian January 2006 (has links)
The literature has identified the need to study socially pervasive ICT in context in order to understand how user acceptability of innovation varies with different inputs. This thesis contributes to the existing body of knowledge on innovation studies (Chapter 2) and proposes a methodology for a conceptual model representing dynamic contextual changes in longitudinal studies. The foundation for this methodology is the 'Price of Convenience' (PoC) model (Chapter 4). As a theory-development thesis, it deals with two related studies of socially pervasive ICT implementation: (1) voluntary adoption of innovations and (2) acceptance of new socially pervasive and ubiquitous ICT innovations (Chapters 6 and 7).
56

Multicriteria analysis and GIS application in the selection of sustainable motorway corridor

Belka, Kamila January 2005 (has links)
<p>The effects of transportation infrastructure in operation are attracting increasing environmental and social concern. Nevertheless, preliminary corridor plans are usually developed on the basis of technical and economic criteria alone. By the time of the environmental impact assessment (EIA) that follows, relocation is practically impossible and only mitigating measures can be applied.</p><p>This paper proposes a GIS-based method for delimiting a motorway corridor that integrates social, environmental and economic factors into the early stages of planning. Multiple criteria decision making (MCDM) techniques are used to assess all possible alternatives, and a weighted shortest-path algorithm held in the GIS locates the corridor. The evaluation criteria are exemplary; they include nature conservation, buildings, forests and agricultural resources, and soils. The resulting evaluation surface is divided into a grid of cells, each assigned a suitability score derived from all evaluation criteria. A set of adjacent cells connecting two pre-specified points is then traced by the least-cost path algorithm; the best alternative has the lowest total suitability score.</p><p>As a result, the proposed motorway corridor is routed from origin to destination and then compared with an alternative derived by traditional planning procedures. The concluding remarks are that the location criteria need to be adjusted to meet construction requirements, and that the analysis process should be automated. Nevertheless, the geographic information system and the embedded shortest-path algorithm proved well suited for preliminary corridor location analysis. Future research directions are sketched.</p>
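The least-cost routing step this abstract describes can be sketched as a Dijkstra search over a grid of suitability scores. This is an illustrative reconstruction, not the thesis's implementation: the toy grid, the 4-neighbour adjacency, and the rule that entering a cell adds its score are all assumptions (GIS cost-path tools often use 8 neighbours and distance-weighted costs).

```python
import heapq

# Sketch: find the set of adjacent cells connecting an origin to a
# destination with the lowest total suitability score (lower = better).
def least_cost_path(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    dist = {start: grid[start[0]][start[1]]}   # cost includes start cell
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue                            # stale queue entry
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]           # entering a cell costs its score
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, cell = [], goal
    while cell != start:
        path.append(cell)
        cell = prev[cell]
    path.append(start)
    return path[::-1], dist[goal]

# Toy suitability surface: the 9s model e.g. protected nature areas.
grid = [
    [1, 9, 1, 1],
    [1, 9, 1, 9],
    [1, 1, 1, 9],
    [9, 9, 1, 1],
]
path, cost = least_cost_path(grid, (0, 0), (3, 3))
```

The returned path threads through the low-score cells, illustrating how the corridor avoids high-penalty areas even when a geometrically shorter route exists.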
57

Spatial Data Infrastructure (SDI) in China : Some potentials and shortcomings

Li, Tao January 2008 (has links)
<p>A Spatial Data Infrastructure (SDI) is required if spatial data are to be fully used and widely shared across society. In China, SDIs have been established progressively, and a thorough understanding of their potentials and shortcomings helps to identify future directions and actions. To find out the potentials and shortcomings of SDI in China, the current situation of SDI in general, and of SDI in China in particular, was assessed through a literature review and interviews, followed by a Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis. Based on current experiences of SDI development in China, the thesis concludes that China has good potential to develop its SDI. It also points out weaknesses that still need to be overcome, such as a lack of advanced technology, data duplication, and a shortage of skilled workforce. There remains considerable room and capacity to improve the development of the Chinese SDI in the future.</p>
58

Enterprise Systems Modifiability Analysis : An Enterprise Architecture Modeling Approach for Decision Making

Lagerström, Robert January 2010 (has links)
Contemporary enterprises depend to a great extent on software systems. During the past decades the number of systems has constantly increased and the systems have become more integrated with one another, leading to growing complexity in managing software systems and their environment. At the same time, business environments need to change rapidly to keep up with evolving markets, and as business processes change, the systems must be modified to continue supporting them. The increasing complexity and the growing demand for rapid change make the management of enterprise systems a very important issue. To achieve effective and efficient management, it is essential to be able to analyze system modifiability (i.e. to estimate future change cost). This is addressed in the thesis by employing architectural models. The contribution of this thesis is a method for software system modifiability analysis using enterprise architecture models. The contribution includes an enterprise architecture analysis formalism, a modifiability metamodel (i.e. a modeling language), and a method for creating metamodels. The proposed approach allows IT decision makers to model and analyze change projects, and thereby receive high-quality decision support regarding change project costs. This is a composite thesis consisting of five papers and an introduction. Paper A evaluates a number of analysis formalisms and proposes extended influence diagrams for enterprise architecture analysis. Paper B presents the first version of the modifiability metamodel. In Paper C, a method for creating enterprise architecture metamodels is proposed; this method aims to be general, i.e. applicable to other IT-related quality analyses such as interoperability, security, and availability, but uses modifiability as a running case. The second version of the modifiability metamodel, for change project cost estimation, is fully described in Paper D. Finally, Paper E validates the proposed method and metamodel by surveying 110 experts and studying 21 change projects at four large Nordic companies. The validation indicates that the method and metamodel are useful, contain the right set of elements, and provide good estimation capabilities.
59

Data mining of geospatial data: combining visual and automatic methods

Demšar, Urška January 2006 (has links)
Most of the largest databases currently available have a strong geospatial component and contain potentially useful information which might be of value. The discipline concerned with extracting this information and knowledge is data mining. Knowledge discovery is performed by applying automatic algorithms which recognise patterns in the data. Classical data mining algorithms assume that data are independently generated and identically distributed. Geospatial data are multidimensional, spatially autocorrelated and heterogeneous. These properties make classical data mining algorithms inappropriate for geospatial data, as their basic assumptions cease to be valid. Extracting knowledge from geospatial data therefore requires special approaches. One way to do that is to use visual data mining, where the data is presented in visual form for a human to perform the pattern recognition. When visual mining is applied to geospatial data, it is part of the discipline called exploratory geovisualisation. Both automatic and visual data mining have their respective advantages. Computers can treat large amounts of data much faster than humans, while humans are able to recognise objects and visually explore data much more effectively than computers. A combination of visual and automatic data mining draws together human cognitive skills and computer efficiency and permits faster and more efficient knowledge discovery. This thesis investigates if a combination of visual and automatic data mining is useful for exploration of geospatial data. Three case studies illustrate three different combinations of methods. Hierarchical clustering is combined with visual data mining for exploration of geographical metadata in the first case study. The second case study presents an attempt to explore an environmental dataset by a combination of visual mining and a Self-Organising Map. 
Spatial pre-processing and visual data mining methods were used in the third case study for emergency response data. Contemporary system design methods involve user participation at all stages. These methods originated in the field of Human-Computer Interaction, but have been adapted for the geovisualisation issues related to spatial problem solving. Attention to user-centred design was present in all three case studies, but the principles were fully followed only for the third case study, where a usability assessment was performed using a combination of a formal evaluation and exploratory usability.
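As an illustration of the automatic half of the first case study, here is a minimal single-linkage agglomerative (hierarchical) clustering sketch. The 1-D toy values and the absolute-difference distance are assumptions standing in for real metadata similarity measures; production work would use a library routine and a domain-specific distance.

```python
# Sketch: single-linkage agglomerative clustering. Starting from
# singletons, repeatedly merge the two clusters whose closest members
# are nearest, until k clusters remain.

def single_linkage(points, k):
    clusters = [[p] for p in points]
    while len(clusters) > k:
        best = None                    # (distance, i, j) of closest pair
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the closest pair
    return [sorted(c) for c in clusters]

# Toy 1-D "metadata scores"; three natural groups.
result = single_linkage([1.0, 1.2, 5.0, 5.1, 9.0], 3)
```

In the visual data mining step, the dendrogram of such merges would be shown to an analyst, who decides where to cut it; the automatic part only proposes the grouping.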
