31

Improve Requirement Prioritization By End-user Demands : Model Building and Evaluation

He, Yiyang, Zhong, Jiasong January 2021 (has links)
Background: Selecting and prioritizing requirements is one of the most difficult challenges in software development. Because requirement priority matters so much, many methods have been developed for prioritizing requirements. However, as software modules multiply and software platforms expand, a single requirement prioritization method can no longer keep pace with the growing number of requirements, and little is known about how to find and develop integrated requirement prioritization methods. Objectives: The main purpose of this research is to explore the main challenges and success criteria that practitioners consider when determining the priority of product requirements, to build a requirement prioritization model that tackles these challenges, and to evaluate the strengths and limitations of this model. Method: We conducted a questionnaire survey to learn more about the major challenges and success criteria for prioritizing product requirements. We then presented a model that combines the KANO model with the Analytic Hierarchy Process (AHP) and examined its practicality. Finally, using focus group research, we analyzed the benefits and limitations of the integrated model and possible improvements. Result: The results show that practitioners face many challenges in product requirement prioritization. The model we developed is suitable for a variety of scenarios; it helps practitioners manage priorities and improve end-user satisfaction, which addresses these challenges to a certain extent. Conclusion: Our research collected the major challenges encountered by requirement analysts and product managers during requirement prioritization, developed a new requirement prioritization model, and produced a better understanding of requirement prioritization that can inspire practitioners to build better prioritization models.
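The abstract does not give the combined model's mechanics, but the general shape of an AHP-plus-KANO prioritization can be sketched. The pairwise matrix, KANO categories, and category multipliers below are invented for illustration; the geometric-mean method is a standard approximation of the AHP principal eigenvector.

```python
import math

def ahp_weights(matrix):
    """Approximate the AHP priority vector via the geometric-mean method."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Pairwise comparisons for three requirements on Saaty's 1-9 scale (invented).
pairwise = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]

# Hypothetical KANO multipliers: must-be needs outrank attractive ones.
KANO_FACTOR = {"must-be": 1.5, "one-dimensional": 1.0, "attractive": 0.7}
kano = ["one-dimensional", "must-be", "attractive"]  # category per requirement

weights = ahp_weights(pairwise)
# Combined score: AHP weight scaled by the requirement's KANO category.
scores = {req: w * KANO_FACTOR[k] for req, (w, k) in enumerate(zip(weights, kano))}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # requirement indices, highest combined priority first
```

With these invented numbers the dominant pairwise preferences outweigh the KANO adjustment, so requirement 0 still ranks first; a stronger must-be multiplier would reorder the list.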
32

Investigations of Cleaning Product Sprays

Benjamin, Michael L. January 2017 (has links)
No description available.
33

Work Order Prioritization Using Neural Networks to Improve Building Operation

Ensafi, Mahnaz 20 October 2022 (has links)
Facility management involves a variety of processes with a large amount of data for managing and maintaining facilities. Processing and prioritizing work orders constitute a big part of facility management, given the large number of work orders submitted daily. Current practices for prioritizing work orders are mainly user-driven and lack consistency in collecting, processing, and managing a large amount of data. Decision-making methods have been used to address challenges such as inconsistency; however, they have limitations of their own, including variations between comparisons made during the actual prioritization task and those made outside the maintenance context. Data-driven methods can help bridge the gap by extracting meaningful and valuable information and patterns to support future decision-making. Through a review of the literature, interviews, and survey questionnaires, this research explored industry practices in various facilities and identified challenges and gaps in existing practices. Challenges include inconsistency in collecting data and prioritizing work orders, a lack of data requirements, and coping strategies and biases. The collected data showed the list of criteria and their rankings for different facilities and demonstrated the possible impact of facility type, size, and years of experience on criteria selection and ranking. Based on the results, this research proposed a methodology to automate the process of prioritizing work orders using neural networks. The research analyzed work order data obtained from an educational facility, explained data cleaning and preprocessing approaches, and provided insights. The data exploration and preprocessing revealed challenges such as the submission of multiple work orders as one, missing data for certain criteria, long durations for work order execution, and a lack of correlation between the collected criteria and the schedule.
Through hyperparameter tuning, the optimum neural network configuration was identified. The developed neural network predicts the schedule of new work orders based on the existing data. The outcome of this research can be used to develop requirements and guidelines for collecting and processing work order data, improve the accuracy of work order scheduling, and increase the efficiency of existing practices using data-driven approaches. / Doctor of Philosophy / Facility Management (FM) is a profession that integrates various disciplines to ensure the comfort and safety of occupants, the efficiency of the built environment, and the functionality of the building while meeting the main objectives of the owners. It involves various functions, including space management, communication, contract management, and inspection. Among these FM functions, maintenance-related tasks occupy 79% of facility managers' responsibilities and 60% of the building cost over its whole lifecycle (design, construction, and operation). Prioritizing and processing work orders constitute a big part of facility maintenance management and require a large amount of information, submitted with hundreds of orders that need to be prioritized and turned into actions on a daily basis. Although vast numbers of work orders are submitted daily, prioritization has been done manually or only partially through management systems, rendering the process very challenging. Existing practices are highly dependent on the knowledge, experience, and judgment of the responsible staff available, are impacted by human cognitive workload and coping strategies, and are challenged by inconsistency in data collection and uncertainty in decision-making. Delays in processing work orders can lead to asset downtimes and failures, impacting occupants' comfort, health, and safety while increasing the cost of operation.
Additionally, based on the results of previous studies, alternative comparisons for prioritizing work orders vary and are more realistic when performed during the actual work order prioritization task rather than outside the maintenance context. Artificial Intelligence (AI) and Machine Learning (ML) algorithms have provided opportunities to benefit from the historical data collected and stored by facilities. Since a large number of work orders are generated and stored by facilities, such methods can be used to address the challenges of existing practices, reduce errors, downtimes, and asset failures, and improve building operation by supporting automation within the systems. This dissertation first explores the existing practices for processing and prioritizing work orders and identifies their gaps and challenges. Second, it investigates the implementation of Artificial Neural Networks (ANNs) to automate the prioritization of future work orders. An ANN is a type of machine learning model that mimics the behavior of the human brain to learn the relationships within a set of data, allowing computer programs to solve complex problems. This research improves existing practices for processing work orders by enabling the automation of future work order prioritization. It also provides the basis for the development of data requirements to further support existing practices.
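As a rough illustration of the dissertation's idea (not its actual model or data), a small feed-forward network can learn to separate high-priority work orders from routine ones. The features, labels, and network size below are invented; a real system would train on the facility's historical work order data and tune hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy work orders: [criticality, safety-related flag, downtime risk].
X = np.array([[0.9, 1.0, 0.8], [0.8, 1.0, 0.6], [0.7, 0.0, 0.9],
              [0.2, 0.0, 0.1], [0.3, 0.0, 0.2], [0.1, 0.0, 0.3]])
y = np.array([1, 1, 1, 0, 0, 0], dtype=float)  # 1 = high priority, 0 = routine

# One hidden layer with sigmoid activations, trained by gradient descent
# on the cross-entropy loss.
W1, b1 = rng.normal(0.0, 0.5, (3, 4)), np.zeros(4)
W2, b2 = rng.normal(0.0, 0.5, 4), 0.0
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0

for _ in range(4000):
    h = sigmoid(X @ W1 + b1)        # hidden activations, shape (6, 4)
    p = sigmoid(h @ W2 + b2)        # predicted probability of high priority
    grad_out = p - y                # d(loss)/d(output pre-activation)
    grad_h = np.outer(grad_out, W2) * h * (1.0 - h)  # backprop to hidden layer
    W2 -= lr * h.T @ grad_out / len(X)
    b2 -= lr * grad_out.mean()
    W1 -= lr * X.T @ grad_h / len(X)
    b1 -= lr * grad_h.mean(axis=0)

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(preds.tolist())
```

On this linearly separable toy set the network recovers the labels; the point is only the shape of the pipeline (features in, priority class out), not any claim about the dissertation's accuracy.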
34

A Context-Aware Dynamic Spectrum Access System for Spectrum Research and Development

Kumar, Saurav 03 January 2024 (has links)
Our hunger for data has grown tremendously over the years, leading to demand for an increase in the radio spectrum available for communications. The Federal Communications Commission in the United States allowed sharing of the CBRS band (3550-3700 MHz) a few years ago. Since then, both industry and academia have sought to identify similar opportunities in other radio bands. This research is, however, being hampered by a lack of experimental frameworks in which the various aspects of spectrum sharing can be studied. To this end, we propose to develop an open-source spectrum access system that incorporates context awareness and multi-band operational support and serves as an R&D tool for the research community. We have developed a novel Prioritization Framework that takes the current operational context of each user into account to determine their relative priority, within or outside their user class/group, for transmission in the network. We also introduce a Policy Engine for the configuration and management of dynamic policies (or rules) that define the relationships between the various forms of context information and their relative impact on a user's overall priority. We have performed several experiments to show how context awareness impacts spectrum sharing efficiency and quality of service. Given its modular and extensible nature, we expect this tool to be used by researchers and policy-makers to implement their own policies and algorithms and test their efficacy in a simulated radio environment. / Master of Science / Over the years, advancements in internet and communication technology have made us more and more data-hungry. Consequently, the electromagnetic spectrum on which data is transmitted has become a scarce resource. Governments worldwide are working together with academia and industry to find the most efficient utilization of this resource.
If the current users of protected spectrum could share their bands with other licensed or opportunistic users, a tremendous amount of spectrum could be freed up for public and private use. To facilitate rapid research and development in this field, this thesis proposes the development of an open-source, modular, and extensible Context-Aware Dynamic Spectrum Access System. In this system, we explore the use of several traditional and novel forms of context information in spectrum allocation, which in turn helps improve the efficiency and resiliency of spectrum management while ensuring that incumbent users are not adversely affected by other licensed or unlicensed users. We develop cognitive modules for context-based prioritization of users through a novel Prioritization Framework, and for enabling the use of dynamic policies or rules (governing spectrum allocation) instead of the static policies that most systems use today. We simulate several operational scenarios and report our tool's performance in them. Through our experiments and discussions, we highlight the significance of this tool for researchers, policy-makers, and regulators studying spectrum sharing in general, and context-aware, dynamic policy-based spectrum sharing in particular.
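The thesis's Prioritization Framework and Policy Engine are not specified in the abstract; the snippet below is a minimal sketch of the general idea, with invented user classes, policy rules, and adjustment values. Dynamic policies are modeled as predicate/adjustment pairs applied on top of class base priorities.

```python
# Hypothetical user classes and base priorities (invented values).
BASE_PRIORITY = {"incumbent": 100, "licensed": 50, "opportunistic": 10}

# Dynamic policies: each maps a context predicate to a priority adjustment.
POLICIES = [
    (lambda ctx: ctx.get("mission") == "emergency", +40),
    (lambda ctx: ctx.get("mobility") == "high", -5),
    (lambda ctx: ctx.get("data_backlog_mb", 0) > 100, +10),
]

def effective_priority(user_class, ctx):
    """Base class priority plus every policy adjustment whose predicate fires."""
    score = BASE_PRIORITY[user_class]
    for predicate, adjustment in POLICIES:
        if predicate(ctx):
            score += adjustment
    return score

requests = [
    ("A", "licensed", {"mission": "emergency"}),       # 50 + 40 = 90
    ("B", "incumbent", {}),                            # 100
    ("C", "opportunistic", {"data_backlog_mb": 250}),  # 10 + 10 = 20
]
grant_order = sorted(requests, key=lambda r: -effective_priority(r[1], r[2]))
print([name for name, _, _ in grant_order])  # ['B', 'A', 'C']
```

Because policies are plain data, they can be swapped at runtime, which is the essence of dynamic (rather than static) spectrum-sharing rules; the incumbent still wins here, matching the requirement that incumbents not be adversely affected.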
35

Hybrid Approaches in Test Suite Prioritization

Nurmuradov, Dmitriy 05 1900 (has links)
The rapid advancement of web and mobile application technologies has posed numerous challenges to the software engineering community, including how to cost-effectively test applications that have complex event spaces. Many software testing techniques attempt to improve the quality of such software cost-effectively. This dissertation focuses primarily on hybrid test suite prioritization, in which two or more criteria are used to perform prioritization, since a single criterion is often insufficient. The dissertation makes the following contributions: (1) a weighted test suite prioritization technique that employs the distance between criteria as a weighting factor; (2) a coarse-to-fine-grained test suite prioritization technique that uses a multilevel approach to increase the granularity of the criteria at each subsequent iteration; (3) the Caret-HM tool for Android user session-based testing, which allows testers to record, replay, and create heat maps from user interactions with Android applications via a web browser; and (4) Android user session-based test suite prioritization techniques that utilize heuristics developed from user sessions created by Caret-HM. Each chapter empirically evaluates the respective techniques, which generally show improved or equally good performance compared to the baselines, depending on the application under test. Further, this dissertation provides guidance to testers on the use of the proposed hybrid techniques.
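As a hedged sketch of contribution (1) (the exact distance measure and weighting formula are not given in the abstract), one way to weight two criteria by their disagreement is to compare the rankings each criterion induces. All test names and scores below are invented.

```python
def rank(scores):
    """Map test ids to rank positions (0 = best) under one criterion."""
    order = sorted(scores, key=scores.get, reverse=True)
    return {test: i for i, test in enumerate(order)}

coverage = {"t1": 0.9, "t2": 0.4, "t3": 0.7, "t4": 0.2}    # statement coverage
fault_hist = {"t1": 0.1, "t2": 0.8, "t3": 0.5, "t4": 0.3}  # past fault detection

r_cov, r_fault = rank(coverage), rank(fault_hist)
# Distance between the two criteria: mean absolute rank difference per test.
distance = sum(abs(r_cov[t] - r_fault[t]) for t in coverage) / len(coverage)
w_cov = 1.0 / (1.0 + distance)   # the more they disagree, the more we hedge
w_fault = 1.0 - w_cov

# Hybrid score: distance-derived weighted combination of both criteria.
hybrid = {t: w_cov * coverage[t] + w_fault * fault_hist[t] for t in coverage}
ordering = sorted(hybrid, key=hybrid.get, reverse=True)
print(ordering)  # ['t2', 't3', 't1', 't4']
```

Here the two criteria disagree strongly on t1 and t2, so the weighting shifts toward fault history; any monotone function of the distance would serve the same illustrative role.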
36

Biological and clinical data integration and its applications in healthcare

Hagen, Matthew 07 January 2016 (has links)
Answers to the most complex biological questions are rarely determined solely from experimental evidence; they require subsequent analysis of many data sources that are often heterogeneous. Most biological data repositories focus on providing only one particular type of data, such as sequences, molecular interactions, protein structures, or gene expression. In many cases, researchers must visit several different databases to answer one scientific question. It is essential to develop efficient and seamless strategies for integrating disparate biological data sources to facilitate the discovery of novel associations and validate existing hypotheses. This thesis presents the design and development of different integration strategies for biological and clinical systems. The BioSPIDA system is a data warehousing solution that integrates many NCBI databases and other biological sources on protein sequences, protein domains, and biological pathways. It utilizes a universal parser, facilitating integration without developing separate source code for each data site. This enables users to execute fine-grained queries that can filter genes by their protein interactions, gene expression, functional annotation, and protein domain representation. Relational databases can quickly return filtered results to research questions, but they are not the most suitable solution in all cases. Clinical patients and genes are typically annotated with concepts from hierarchical ontologies, and the performance of relational databases weakens considerably when traversing and representing graph structures. This thesis illustrates when relational databases are most suitable and compares performance benchmarks of semantic web technologies and graph databases when comparing ontological concepts. Several approaches to analyzing integrated data are discussed to demonstrate the advantages over dependence on remote data centers.
Intensive care patients are prioritized by their length of stay, and their severity class is estimated from their diagnosis to help minimize wait time and preferentially treat patients by condition. In a separate study, semantic clustering of patients is conducted by integrating a clinical database and a medical ontology to help identify multi-morbidity patterns. In the biological area, gene pathways, protein interaction networks, and functional annotation are integrated to help predict and prioritize candidate disease genes. This thesis presents the results generated from each project by utilizing a local repository of genes, functional annotations, protein interactions, clinical patients, and medical ontologies.
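The patient-prioritization study's actual scheme is not detailed here, but a severity-then-length-of-stay ordering can be sketched with a priority queue. The field names, ordering rule, and values below are assumptions for illustration.

```python
import heapq

# Invented triage records: (id, severity class, predicted length of stay in days).
# Lower severity class = more severe; ties broken by shorter predicted stay.
patients = [
    ("P1", 2, 4.0),
    ("P2", 1, 7.5),
    ("P3", 1, 2.0),
    ("P4", 3, 1.0),
]

# Tuples compare lexicographically, so the heap orders by severity, then stay.
heap = [(severity, stay, pid) for pid, severity, stay in patients]
heapq.heapify(heap)

treatment_order = [heapq.heappop(heap)[2] for _ in range(len(heap))]
print(treatment_order)  # ['P3', 'P2', 'P1', 'P4']
```

A heap keeps insertion and removal at O(log n), which matters when new patients arrive continuously rather than in a batch.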
37

Exploring coordinated software and hardware support for hardware resource allocation

Figueiredo Boneti, Carlos Santieri de 04 September 2009 (has links)
Multithreaded processors are now common in industry, as they offer high performance at low cost. Traditionally, in such processors, the allocation of hardware resources between the multiple threads is done implicitly, by hardware policies. However, a new class of multithreaded hardware allows the allocation of resources to be explicitly controlled or biased by software. Currently, there is little or no coordination between the allocation of resources done by the hardware and the prioritization of tasks done by the software. This thesis aims to narrow the gap between software and hardware with respect to hardware resource allocation, by proposing a new explicit resource allocation hardware mechanism and novel schedulers that use the currently available hardware resource allocation mechanisms. It approaches the problem in two different types of computing systems. In the high performance computing domain, we characterize the first processor to provide a mechanism that allows software to bias the allocation of hardware resources, the IBM POWER5. In addition, we propose the use of hardware resource allocation as a way to balance high performance computing applications. Finally, we propose two new scheduling mechanisms that are able to transparently and successfully balance applications in real systems using hardware resource allocation. In the soft real-time domain, we propose a hardware extension to the existing explicit resource allocation hardware and, in addition, two software schedulers that use the explicit allocation hardware to improve the schedulability of tasks in a soft real-time system. In this thesis, we demonstrate that system performance improves by making the software aware of the mechanisms that control the amount of resources given to each running thread. 
In particular, for the high performance computing domain, we show that it is possible to decrease the execution time of MPI applications by biasing the hardware resource allocation between threads. In addition, we show that it is possible to decrease the number of missed deadlines when scheduling tasks in a soft real-time SMT system.
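The thesis's schedulers and the POWER5 priority mechanism are not reproduced here; the snippet below only sketches the balancing intuition: raise the hardware priority of the thread that is behind and lower that of the thread that is ahead, so MPI ranks reach their synchronization point together. The priority range, step, and progress values are invented.

```python
def rebalance(progress, priorities, step=1, lo=1, hi=6):
    """Raise the priority of the most-behind thread, lower the most-ahead.

    `progress` maps thread ids to fraction of the current iteration done;
    `priorities` maps thread ids to a bounded hardware priority level.
    """
    slowest = min(progress, key=progress.get)
    fastest = max(progress, key=progress.get)
    if slowest != fastest:
        priorities[slowest] = min(hi, priorities[slowest] + step)
        priorities[fastest] = max(lo, priorities[fastest] - step)
    return priorities

# Two SMT threads running MPI ranks: rank0 is behind at the barrier.
progress = {"rank0": 0.60, "rank1": 0.85}
prios = rebalance(progress, {"rank0": 4, "rank1": 4})
print(prios)  # {'rank0': 5, 'rank1': 3}
```

On real hardware the priority write would be a privileged or special-form instruction rather than a dictionary update; the sketch only shows the feedback loop a balancing scheduler would run each iteration.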
38

The role and use of experts and expert knowledge in spatial conservation prioritization

Galloway, Craig James 03 1900 (has links)
Thesis (MScConsEcol)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: Decision-making in conservation should be efficient and effective as time and resources are typically limited. Conservation planning is one process by which stakeholders collaboratively make decisions when attempting to ensure the persistence of biodiversity. Spatial prioritization is the activity of applying quantitative data to spatial analysis to select locations for conservation investment and is a distinct process within conservation planning. The use of experts in spatial prioritization, and more generally in conservation planning, is widely accepted and advocated, but there is no general operational model for how best to involve them. Acceptable standards of practice in selecting experts and in applying specific techniques for eliciting expert knowledge need to be developed and tested in different contexts to ensure robust and defensible results of spatial prioritization processes. Although experts and expert knowledge have limitations, including them in spatial prioritization can produce many benefits, such as increased robustness of decisions and time and cost savings. Timeous, decisive, cost-efficient and robust decision-making is essential when attempting to stem the continued loss of biodiversity across the world. Although widely used, very little research has been conducted into the role of experts in spatial prioritization processes. 
In this research, the role and use of experts and expert knowledge in spatial prioritization was explored through the following processes: 1) a review of the peer-reviewed literature examining the role, and different types, of experts included in spatial prioritization studies (Chapter 2), to identify the patterns of their involvement; 2) a study examining the process and the differences between individual and group expert outputs and outcomes produced from a typical spatial prioritization workshop, to provide a baseline against which experts might be identified for future involvement (Chapter 3); and 3) a study examining the knowledge of local community and land management experts and their ability to predict private landowners' self-reported attitudes towards conservation, willingness to partner with organizations, and conservation-related behavior, aimed at testing whether expert knowledge might replace interviews when mapping conservation opportunity (Chapter 4). The main findings of this research are, firstly, that experts primarily contributed to spatial prioritization through the mapping of species, habitats, and ecosystems (which can be input into spatial prioritization analyses), and specifically also the selection of areas important for implementing conservation action (e.g., priority conservation areas). Secondly, individual experts contributed different expertise to the spatial prioritization processes in which they were involved, sometimes despite being considered experts in the same field. Individual experts differed from each other in the knowledge they contributed, the decisions they made, and in the information content and its spatial representation. Groups of experts collaborating to produce the same information were more effective at capturing expert knowledge than individuals.
Thirdly, when seeking to map human and social data to inform the mapping of conservation opportunity, experts were unable to reliably score private landowners' attitudes towards conservation, willingness to partner with organizations, and conservation-related behavior. Experts were, however, able to provide accurate knowledge of the general attitudes of landowners, the context of the area in which the research was conducted, and the challenges that landowners in the area face. Collectively, this research can be used to inform the development of standards of best practice to ensure the most effective and cost-efficient approach to integrating spatial prioritization software with expert knowledge.
39

Weighing Animal Lives : A Critical Assessment of Justification and Prioritization in Animal-Rights Theories

Karlsson, Fredrik January 2009 (has links)
The project underlying this dissertation aims at analyzing three pro-animal-rights theories, evaluating those theories, and outlining an alternative theoretical account of animal rights. The analytical categories are the justification and function of animal rights, the definition of the right holder, and the approach to resolving rights conflicts. The categories are applied to a naturalist, a theocentric, and a contractarian defense of animal rights. The evaluation is substantiated by the assumption that rights are meant to protect less powerful beings against more powerful aggressors. The constructive segment investigates the extent to which the identified disadvantages of the theories can be avoided by outlining a new model for animal rights. The analyses and evaluation suggest that all three theories risk contradicting the proper function of rights-based theories. Tom Regan’s naturalist account of animal rights includes the logical possibility of sacrificing less capable beings for the sake of more capable beings. Andrew Linzey’s theocentric case for animal rights may sometimes mean that vulnerable human persons should be sacrificed for more powerful non-human beings. Mark Rowlands’ outlined contractarian model, further reconstructed in this work, fails to provide a way to resolve rights conflicts, making the function of rights inapplicable to conflicts. In conclusion, it is suggested that defining the right holder as a self-preservative being can be supported by, at least, the contractarian rationale. That would also conform to the proper function of rights-based theories. It is further suggested that rights conflicts should then be resolved by a voluntary sacrifice on the part of the most powerful being, and that practical circumstances should be created in which such voluntariness is both genuine and rationally possible.
40

Conservation and land use planning applications in Gabon, Central Africa

Lee, Michelle E. January 2014 (has links)
Spatial prioritization and systematic conservation planning methods are designed to improve land use decisions and conservation outcomes, yet remain underutilized in many biologically-rich places that need them most. This thesis applies the theory and methods developed in the discipline of spatial prioritization to conservation and land use decisions in the Central African country of Gabon. Creating a spatial information base of priority species, habitats and land uses in a region that is notoriously data-poor, I reveal that many features important for both conservation and natural resource production are highly localized; their coincidence has important implications for management. Setting conservation targets for species and habitats, I find that representation in existing protected areas is relatively low, and identify a number of near-optimal solutions that meet all targets, with minimal impact on land used for local livelihoods. I distill these solutions down to a handful of critical biodiversity sites that are top priority to protect, and make management actions explicit for the species and habitats they contain. To make the work more widely applicable, I also develop a novel method to identify where field surveys are most likely to improve decisions about protected area expansion, providing decision-makers with more options of places that could be protected to achieve conservation goals. This study contributes to the research, development and practice of conservation prioritization and spatial planning, particularly in data-poor contexts like Gabon, which still have a wealth of biodiversity, and need to carefully plan for its conservation alongside development.
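The near-optimal solutions mentioned above are typically found with algorithms such as greedy complementarity selection; the sketch below shows that idea on invented planning units and representation targets (it is not the thesis's actual data or software).

```python
# Planning unit -> species it contains (all data invented for illustration).
sites = {
    "u1": {"gorilla", "mandrill"},
    "u2": {"forest elephant"},
    "u3": {"mandrill", "forest elephant", "pangolin"},
    "u4": {"pangolin"},
}
# Representation target: minimum number of selected units containing each species.
targets = {"gorilla": 1, "mandrill": 1, "forest elephant": 1, "pangolin": 1}

chosen, covered = [], {sp: 0 for sp in targets}
while any(covered[sp] < targets[sp] for sp in targets):
    # Greedy complementarity: pick the unit adding the most unmet representation.
    def gain(unit):
        return sum(1 for sp in sites[unit] if covered[sp] < targets[sp])
    best = max((u for u in sites if u not in chosen), key=gain)
    chosen.append(best)
    for sp in sites[best]:
        covered[sp] += 1

print(sorted(chosen))  # ['u1', 'u3'] meets every target with two units
```

Greedy selection is not guaranteed optimal, which is why planning tools report multiple near-optimal solutions and let decision-makers weigh other costs, such as impact on local livelihoods.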
