771

Computing Least Common Subsumer in Description Logics with Existential Restrictions

Baader, Franz, Küsters, Ralf, Molitor, Ralf 20 May 2022 (has links)
Computing the least common subsumer (lcs) is an inference task that can be used to support the 'bottom-up' construction of knowledge bases for KR systems based on description logics. Previous work on how to compute the lcs has concentrated on description logics that allow for universal value restrictions, but not for existential restrictions. The main new contribution of this paper is the treatment of description logics with existential restrictions. More precisely, we show that, for the description logic ALE (which allows for conjunction, universal value restrictions, existential restrictions, negation of atomic concepts, as well as the top and the bottom concept), the lcs always exists and can effectively be computed. Our approach for computing the lcs is based on an appropriate representation of concept descriptions by certain trees, and a characterization of subsumption by homomorphisms between these trees. The lcs operation then corresponds to the product operation on trees. / An abridged version of this technical report is published in the Proceedings of IJCAI'99.
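To make the tree-product construction concrete, the sketch below computes the lcs for the sublogic EL (conjunction and existential restrictions only), where concept descriptions are trees whose nodes carry concept names and whose edges carry role names; the full ALE case treated in the report additionally handles value restrictions, atomic negation, and normalization. The data structure and example concepts are illustrative, not taken from the report.

```python
from dataclasses import dataclass, field

@dataclass
class DescriptionTree:
    """EL description tree: node labels are concept names, edges are role-labeled subtrees."""
    labels: frozenset                                 # concept names holding at this node
    children: list = field(default_factory=list)      # list of (role, DescriptionTree) pairs

def product(t1: DescriptionTree, t2: DescriptionTree) -> DescriptionTree:
    """Product of two description trees; for EL this yields the least common subsumer."""
    # The root keeps only the concept names common to both descriptions.
    labels = t1.labels & t2.labels
    children = []
    # Pair up existential edges with the same role name and recurse.
    for role1, sub1 in t1.children:
        for role2, sub2 in t2.children:
            if role1 == role2:
                children.append((role1, product(sub1, sub2)))
    return DescriptionTree(labels, children)

# lcs of (Person AND EXISTS child.Doctor) and (Person AND EXISTS child.(Doctor AND Rich))
a = DescriptionTree(frozenset({"Person"}), [("child", DescriptionTree(frozenset({"Doctor"})))])
b = DescriptionTree(frozenset({"Person"}), [("child", DescriptionTree(frozenset({"Doctor", "Rich"})))])
lcs = product(a, b)   # Person AND EXISTS child.Doctor
```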
772

GROCERY PRODUCT RECOMMENDATIONS: USING RANDOM INDEXING AND COLLABORATIVE FILTERING / Produktrekommendationer för matvaror med Random Indexing och Collaborative Filtering

Orrenius, Axel, Wiebe Werner, Axel January 2022 (has links)
The field of personalized product recommendation systems has seen tremendous growth in recent years. The usefulness of the algorithms' abilities to filter out data from vast sets has been shown to be crucial in today's information-heavy online experience. Our goal is therefore to compare two recommender models, one based on Random Indexing, the other on Collaborative Filtering, in order to find out if one is better suited to the task than the other. We bring up relevant previous research to set the context for our study, its limitations and possibilities. We then explain the theories, models and algorithms underlying our two recommender systems and finally we evaluate them, partly through empirical data collection from our employer Kavall's platform, and partly through analysing data from interviews. We judge that our study is scientifically relevant as it compares an algorithm that is rarely used in this context, Random Indexing, to a more established recommendation algorithm, Collaborative Filtering, and as such the result of this comparison might give useful insights into the further development of new or existing algorithms. While more testing is required, the study did show signs that Random Indexing has the potential to outperform Collaborative Filtering in some areas, and further development of the model might be a worthwhile endeavor. / Området för personliga produktrekommendationer har sett en enorm tillväxt under de senaste åren. Användbarheten av algoritmernas förmåga att filtrera ut data ur stora uppsättningar har visat sig vara avgörande i dagens informationstunga onlineupplevelse. Vårt mål är därför att jämföra två rekommendatormodeller, en baserad på Random Indexing, den andra på Collaborative Filtering, för att ta reda på om den ena är bättre lämpad för uppgiften än den andra. Vi tar upp relevant tidigare forskning för att sätta sammanhanget för vår studie, dess begränsningar och möjligheter. Vi förklarar sedan de teorier, modeller och algoritmer som ligger till grund för våra två rekommendationssystem och slutligen utvärderar vi dem, dels genom empirisk datainsamling från vår arbetsgivare Kavalls plattform, dels genom att analysera data från intervjuer. Vi bedömer att vår studie är vetenskapligt relevant då den jämför en algoritm som sällan används i detta sammanhang, Random Indexing, med en mer etablerad rekommendationsalgoritm, Collaborative Filtering, och som sådan kan resultatet av denna jämförelse ge användbara insikter i den fortsatta utvecklingen av nya eller befintliga algoritmer. Även om fler tester krävs, visade studien tecken på att Random Indexing har potentialen att överträffa Collaborative Filtering på vissa områden, och vidareutveckling av modellen kan vara ett givande åtagande.
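As a concrete illustration of the Random Indexing approach to product recommendation, the sketch below assigns each product a sparse random index vector, sums those vectors over a customer's purchase history into a context vector, and ranks unseen products by cosine similarity. The dimensionality, sparsity, and toy grocery data are illustrative assumptions, not the implementation used on Kavall's platform.

```python
import numpy as np

def random_index_vectors(items, dim=512, nnz=10, seed=0):
    """Assign each item a sparse ternary random index vector (+1/-1 at nnz random positions)."""
    rng = np.random.default_rng(seed)
    vecs = {}
    for item in items:
        v = np.zeros(dim)
        pos = rng.choice(dim, size=nnz, replace=False)
        v[pos] = rng.choice([-1.0, 1.0], size=nnz)
        vecs[item] = v
    return vecs

def recommend(basket_history, index_vecs, top_k=5):
    """Accumulate a context vector from past baskets and rank unseen items by cosine similarity."""
    context = np.sum([index_vecs[i] for basket in basket_history for i in basket], axis=0)
    seen = {i for basket in basket_history for i in basket}
    scores = {}
    for item, v in index_vecs.items():
        if item in seen:
            continue
        denom = np.linalg.norm(context) * np.linalg.norm(v)
        scores[item] = float(context @ v / denom) if denom else 0.0
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Hypothetical usage on toy grocery data
vecs = random_index_vectors(["milk", "bread", "butter", "oat_milk", "jam"])
print(recommend([["milk", "bread"], ["bread", "butter"]], vecs, top_k=2))
```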
773

Arc-Completion of 2-Colored Best Match Graphs to Binary-Explainable Best Match Graphs

Schaller, David, Geiß, Manuela, Hellmuth, Marc, Stadler, Peter F. 24 April 2023 (has links)
Best match graphs (BMGs) are vertex-colored digraphs that naturally arise in mathematical phylogenetics to formalize the notion of evolutionarily closest genes w.r.t. an a priori unknown phylogenetic tree. BMGs are explained by unique least resolved trees. We prove that the property of a rooted, leaf-colored tree to be least resolved for some BMG is preserved by the contraction of inner edges. For the special case of two-colored BMGs, this leads to a characterization of the least resolved trees (LRTs) of binary-explainable BMGs and a simple, polynomial-time algorithm for the minimum cardinality completion of the arc set of a BMG to reach a BMG that can be explained by a binary tree.
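For readers unfamiliar with BMGs, the sketch below builds a 2-colored best match graph directly from a small leaf-colored tree using the standard definition (y is a best match of x if no leaf of y's color has a last common ancestor with x strictly below lca(x, y)). It is not the paper's arc-completion algorithm, and the toy tree and colors are illustrative.

```python
def ancestors(node, parent):
    """Path from a node up to the root (inclusive), given a child -> parent map."""
    path = [node]
    while node in parent:
        node = parent[node]
        path.append(node)
    return path

def lca_depth(x, y, parent):
    """Depth of the last common ancestor of leaves x and y (root has depth 0)."""
    anc_x = ancestors(x, parent)
    anc_y = set(ancestors(y, parent))
    for i, a in enumerate(anc_x):
        if a in anc_y:
            return len(anc_x) - 1 - i
    raise ValueError("nodes are not in the same tree")

def best_match_arcs(leaves, colors, parent):
    """Arcs (x, y) of the BMG: y is among the closest leaves to x of y's color."""
    arcs = []
    for x in leaves:
        other_colors = {colors[y] for y in leaves if colors[y] != colors[x]}
        for c in other_colors:
            candidates = [y for y in leaves if colors[y] == c]
            best = max(lca_depth(x, y, parent) for y in candidates)
            arcs += [(x, y) for y in candidates if lca_depth(x, y, parent) == best]
    return arcs

# Toy tree ((a1,b1),(a2,b2)) with two colors; a1's best blue match is b1, not b2.
parent = {"a1": "v", "b1": "v", "a2": "w", "b2": "w", "v": "root", "w": "root"}
leaves = ["a1", "b1", "a2", "b2"]
colors = {"a1": "red", "a2": "red", "b1": "blue", "b2": "blue"}
print(best_match_arcs(leaves, colors, parent))
```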
774

An Empirical Investigation on the Critical Success Factors for Kaizen Events in Hospitals

Harry, Kimberly D.M. 06 September 2023 (has links)
A Kaizen event (KE) may be defined as a structured improvement project that uses a cross-functional team and specific improvement goals to improve a targeted work area or process in an accelerated time frame. KEs, also known as Rapid Improvement Events (RIEs), have been utilized within hospitals to achieve beneficial operations, stakeholder (i.e., social), financial, and clinical outcomes. Due to their potential to achieve positive results in a rapid timeframe, understanding the determinants of KE success within a hospital environment is a valuable research undertaking. To date, there has been limited rigorous empirical quantitative research focused on identifying success factors (SFs) influencing socio-technical outcomes of hospital-based KEs. Hence, this empirical research study seeks to determine the critical success factors (CSFs) for KEs in hospitals. In the first phase of this research work, a comprehensive systematic literature review (SLR) was conducted to identify the success factors (SFs) for KEs in hospitals as reported in the literature. This SLR resulted in the identification of 54 unique success factors mapping to four broad success factor categories: KE Task Design, KE Team Design, Organization, and KE Process. Thereafter, the second phase, which involved the variable reduction process, was performed to determine the strength of effect, or importance, of the SFs in order to arrive at a feasible number of SFs to include in further empirical work. Two robust methods, a Meta-synthesis Evaluation and an Expert Survey, were applied to query the SFs and to determine high-priority factors for the empirical study. As a result, a total of 30 factors were finalized for empirical study. Next, the last phase, the empirical study to investigate and determine the CSFs for KEs in hospitals, was executed using a retrospective field study survey research design. Specifically, a survey questionnaire was designed to elicit feedback on perceptual measures from targeted hospital KE facilitators/leaders on the criticality of SFs for socio-technical outcomes of KEs in hospitals. Sixty usable responses were obtained and subjected to Exploratory Factor Analysis (EFA) and Partial Least Squares-Structural Equation Modeling (PLS-SEM), which were used to identify latent factor constructs and to determine the significance of the SFs, respectively. The results of this study identified seven significant direct relationships. Kaizen Event Design Characteristics (KEDC) and Target Area Buy-in (TABI) were found to have significant direct effects on both dependent variables, Performance Impact (PI) and Growth in Kaizen Capabilities (KCG). In addition, KEDC had significant direct relationships with Performance Culture (PC) and Team Dynamics (TD). PC also had a significant direct relationship with TD. Furthermore, Logistic Regression was utilized to test the SFs' impact on the one objective technical outcome measure in the study, Goal Attainment (GOALATT). This analysis revealed one significant negative relationship, between TD and GOALATT. Overall, the study's findings provide evidence-based results for informing hospital managers, leaders, and continuous improvement practitioners about the key factors or value-added practices that can be adopted in their hospital KE initiatives to achieve beneficial socio-technical outcomes, as well as overall hospital KE success.
Furthermore, this research can enable academia and researchers to pursue more confirmatory analysis approaches for theory validation and generalizability. / Doctor of Philosophy / The focus of this research study is to identify the most significant factors for Kaizen events (KEs) in hospitals, referred to herein as critical success factors (CSFs). A KE may be defined as a structured improvement project that uses a cross-functional team and specific improvement goals to improve a targeted work area or process in an accelerated time frame. The aim of the study is ultimately to improve KE practice in hospitals through increased understanding of CSFs that can be planned or designed into KE processes to increase the likelihood of successful event outcomes. Various research formulation, development, and testing techniques are applied to frame the research study according to the aims and objectives and to achieve the targeted research outcomes. The overall research design encompasses a retrospective study approach, performing a large-scale field study using a survey questionnaire to empirically identify the CSFs for KEs in hospitals. To help frame the research, a systematic literature review (SLR), along with bibliometric analyses, was conducted. To help refine and select the success factors for empirical study, a meta-synthesis evaluation and an expert survey study were conducted. Exploratory factor analysis (EFA) and partial least squares-structural equation modeling (PLS-SEM), along with mediation analyses (MA), were performed to identify key factors, determine the significance of those factors, and understand the influential relationships of those factors to hospital KE success. Results from this study aim to inform healthcare managers, healthcare improvement practitioners, researchers, and other relevant stakeholders about the critical components needed to achieve hospital KE success. The dissertation is documented in a "manuscript style," using a journal/conference paper format to organize and report on the key findings and results obtained from the investigation. The Introduction chapter introduces the research study topic and its significance, indicates the overall research aims and objectives, presents the overall research approach and design methodology, and enumerates the main publication outputs and outcomes from this dissertation work. The Conclusions chapter summarizes the overall research outcomes, key study findings, and study limitations, and provides areas for future research.
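As an illustration of the final analysis step described above, the sketch below fits a logistic regression of a binary goal-attainment indicator on two factor scores. The variable names (TD, KEDC, GOALATT) mirror the abbreviations in the abstract, but the data and model specification are hypothetical, not the study's.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey data: two factor scores and a binary goal-attainment outcome.
# Column names echo the abstract's abbreviations; the values are made up for illustration.
df = pd.DataFrame({
    "TD":      [0.2, -0.5, 1.1, 0.7, -1.3, 0.4, -0.2, 0.9],   # Team Dynamics score
    "KEDC":    [1.0,  0.3, -0.4, 0.8, -0.9, 1.2, 0.1, -0.6],  # KE Design Characteristics score
    "GOALATT": [1, 0, 0, 1, 0, 1, 1, 0],                      # 1 = event goals attained
})

X = sm.add_constant(df[["TD", "KEDC"]])
result = sm.Logit(df["GOALATT"], X).fit(disp=False)
print(result.summary())   # coefficient signs and p-values indicate each factor's estimated effect
```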
775

An Investigation into the Relationship Between Economic Growth, Energy Consumption, and the Environment: Evidence from Nigeria

Ahmad, Ahmad January 2023 (has links)
This thesis employs the Autoregressive Distributed Lag (ARDL) model, Toda-Yamamoto causality analysis, and ordinary least squares (OLS, for robust estimation) techniques to empirically investigate the impact of economic growth and energy consumption on the environment in Nigeria from 1980 to 2020. The cointegration results demonstrate a long-term link between the model's variables. The outcome of the first objective of the study shows that trade and economic development in Nigeria worsen the state of the environment. Environmental quality is enhanced by financial development, while FDI is found to be insignificant in predicting environmental quality. The results also demonstrate that FDI and energy use both have the potential to significantly speed up the rate of environmental degradation. Nevertheless, trade has a negligible impact on the environment in the country, and financial development slows down environmental deterioration. The study also finds that the combination of energy consumption and economic development improves Nigeria's environmental quality. The outcome of the fourth objective shows that economic expansion and energy consumption have a favorable impact on the environment. Additionally, environmental degradation, energy use, and economic growth are all causally related. Moreover, the robust estimation reveals a positive and significant relationship of economic growth and energy consumption with the environment. Therefore, the study suggests economic policies with environmental control measures. This could be achieved through an emphasis on the use of low-emission energy alternatives, which would mitigate CO2 levels and enhance energy utilization for a better environment in the nation.
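A minimal sketch of the ARDL step, using statsmodels on simulated series standing in for the 1980-2020 Nigerian data; the variables, lag orders, and data are illustrative assumptions, not the thesis's specification or estimates.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL

# Illustrative only: simulated annual series standing in for the 1980-2020 data.
rng = np.random.default_rng(1)
n = 41                                                         # 41 annual observations
gdp = pd.Series(np.cumsum(rng.normal(0.03, 0.02, n)), name="gdp")
energy = pd.Series(0.8 * gdp + rng.normal(0, 0.02, n), name="energy")
co2 = pd.Series(0.5 * gdp + 0.4 * energy + rng.normal(0, 0.02, n), name="co2")

# ARDL(1, 1) of CO2 emissions on GDP and energy use; lag orders chosen for illustration,
# not by the information-criterion search a real application would use.
model = ARDL(co2, lags=1, exog=pd.concat([gdp, energy], axis=1), order=1, trend="c")
res = model.fit()
print(res.summary())   # estimated lag coefficients of the conditional model
```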
776

Validation and Optimization of Hyperspectral Reflectance Analysis-Based Predictive Models for the Determination of Plant Functional Traits in Cornus, Rhododendron, and Salix

Valdiviezo, Milton I 01 January 2020 (has links)
Near-infrared (NIR) spectroscopy has become increasingly widespread throughout various fields as an alternative method for efficiently phenotyping crops and plants at rates unparalleled by conventional means. With growing reliability, the convergence of NIR spectroscopy and modern machine learning represents a promising methodology offering unprecedented access to rapid, high-throughput phenotyping at negligible cost, a prospect that excites agronomists and plant physiologists alike. However, as is true of all emergent methodologies, progressive refinement towards optimization exposes potential flaws and raises questions, one of which is the cornerstone of this study. Spectroscopic determination of plant functional traits utilizes plants' morphological and biochemical properties to make predictions, and has been validated at the community (inter-family) and individual crop (intraspecific) levels alike, yielding equally reliable predictions at both scales, yet what lies between these poles on the spectrum of taxonomic scale remains unexplored territory. In this study, we replicated the protocol used in studies at the aforementioned taxonomic scale extremes and applied it to an intermediate scale. Interestingly, we found that predictive models built upon hyperspectral reflectance data collected across three genera of woody plants: Cornus, Rhododendron, and Salix, yielded inconsistent predictions of varying accuracy within and across taxa. Identifying the potential cause(s) underlying variability in predictive power at this intermediate taxonomic scale may reveal novel properties of the methodology, potentially permitting further optimization through careful consideration.
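Partial least squares regression is a common choice for this kind of spectra-to-trait prediction; the sketch below cross-validates such a model on simulated reflectance spectra. PLSR, the band grid, the component count, and the simulated trait are assumptions made for illustration, not necessarily the models used in the thesis.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

# Illustrative only: simulated leaf spectra (201 bands) and one functional trait.
rng = np.random.default_rng(0)
n_leaves, n_bands = 120, 201
spectra = rng.normal(0.3, 0.05, (n_leaves, n_bands))           # reflectance matrix
trait = spectra[:, 50] * 2.0 - spectra[:, 150] * 1.5 + rng.normal(0, 0.01, n_leaves)

# Cross-validated PLSR; the number of latent components would be tuned in practice.
pred = cross_val_predict(PLSRegression(n_components=10), spectra, trait, cv=5)
print(r2_score(trait, pred.ravel()))                            # out-of-sample R² as predictive power
```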
777

Application of Data-Driven Modeling Techniques to Wastewater Treatment Processes

Hermonat, Emma January 2022 (has links)
Wastewater treatment plants (WWTPs) face increasingly stringent effluent quality constraints as a result of rising environmental concerns. Efficient operation of the secondary clarification process is essential to be able to meet these strict regulations. Treatment plants can benefit greatly from making better use of available resources through improved automation and implementing more process systems engineering techniques to enhance plant performance. As such, the primary objective of this research is to utilize data-driven modeling techniques to obtain a representative model of a simplified secondary clarification unit in a WWTP. First, a deterministic subspace-based identification approach is used to estimate a linear state-space model of the secondary clarification process that can accurately predict process dynamics, with the ultimate objective of motivating the use of the subspace model in a model predictive control (MPC) framework for closed-loop control of the clarification process. To this end, a low-order subspace model which relates a set of typical measured outputs from a secondary clarifier to a set of typical inputs is identified and subsequently validated on simulated data obtained via Hydromantis's WWTP simulation software, GPS-X. Results illustrate that the subspace model is able to approximate the nonlinear process behaviour well and can effectively predict the dynamic output trajectory for various candidate input profiles, thus establishing its candidacy for use in MPC. Subsequently, a framework for forecasting the occurrence of sludge bulking--and consequently clarification failure--based on an engineered interaction variable that aims to capture the relationship between key input variables is proposed. Partial least squares discriminant analysis (PLS-DA) is used to discriminate between process conditions associated with clarification failure versus effective clarification. Preliminary results show that PLS-DA models augmented with the interaction variable demonstrate improved predictions and higher classification accuracy. / Thesis / Master of Applied Science (MASc)
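A minimal sketch of the PLS-DA step under stated assumptions: PLS regression is fit to a binary bulking/no-bulking label, the predictors are augmented with an engineered interaction term, and the continuous score is thresholded to classify. The input variables and simulated data are hypothetical, not the GPS-X outputs used in the thesis.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Illustrative only: simulated clarifier operating data labeled as bulking (1) or effective (0).
rng = np.random.default_rng(0)
n = 200
inflow = rng.normal(1.0, 0.2, n)                    # hypothetical scaled influent flow
srt = rng.normal(1.0, 0.3, n)                       # hypothetical scaled solids retention time
interaction = inflow * srt                          # engineered interaction variable
bulking = (interaction + rng.normal(0, 0.2, n) > 1.2).astype(float)

X = np.column_stack([inflow, srt, interaction])
pls_da = PLSRegression(n_components=2).fit(X, bulking)        # PLS-DA: PLS regression on the class label
predicted = (pls_da.predict(X).ravel() > 0.5).astype(float)   # threshold the continuous score at 0.5
print((predicted == bulking).mean())                          # in-sample classification accuracy
```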
778

HYPER-RECTANGLE COVER THEORY AND ITS APPLICATIONS

Chu, Xiaoxuan January 2022 (has links)
In this thesis, we propose a novel hyper-rectangle cover theory which provides a new approach to analyzing mathematical problems with nonnegativity constraints on variables. In this theory, two fundamental concepts, cover order and cover length, are introduced and studied in detail. In the same manner as determining the rank of a matrix, we construct a specific échelon form of the matrix to obtain the cover order of a given matrix efficiently and effectively. We discuss various structures of the échelon form for some special cases in detail. Based on the structure and properties of the constructed échelon form, the concepts of non-negative linear independence and non-negative linear dependence are developed. Using the properties of the cover order, we obtain necessary and sufficient conditions for the existence and uniqueness of solutions of systems of linear equations with nonnegativity constraints on the variables, for both the homogeneous and non-homogeneous cases. In addition, we apply the cover theory to analyze some typical problems in linear algebra and optimization with nonnegativity constraints on variables, including linear programming problems and non-negative least squares (NNLS) problems. For the linear programming problem, we study its three possible solution behaviors through hyper-rectangle cover theory, and show how a series of feasible solutions for the problem can be obtained from the zero-cover échelon form structure. On the other hand, we develop a method to obtain the cover length of a covered variable. In the process, we discover a relationship between the cover length determination problem and the NNLS problem. This enables us to obtain an analytical optimal value for the NNLS problem. / Thesis / Doctor of Philosophy (PhD)
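For reference, the NNLS problem that the cover-length analysis connects to has the standard form min ||Ax - b|| subject to x >= 0; a tiny worked instance using SciPy's numerical solver (not the analytical approach developed in the thesis) is sketched below.

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative only: a small non-negative least squares problem  min ||Ax - b||  s.t.  x >= 0.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
b = np.array([1.0, 2.0, 2.0])

x, residual_norm = nnls(A, b)
print(x, residual_norm)   # componentwise non-negative solution and the residual 2-norm
```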
779

Statistical quality assurance of IGUM: Statistical quality assurance and validation of IGUM in a steady and dynamic gas flow prior to proof of concept

Kornsäter, Elin, Kallenberg, Dagmar January 2022 (has links)
To further support and optimise the production of diving tables for the Armed Forces of Sweden, a research team has developed a new machine called IGUM (Inert Gas UndersökningsMaskin), which aims to measure how inert gas is taken up and exhaled. Because of the machine's new design, the goal of this thesis was to statistically validate its accuracy and verify its reliability.  In the first stage, a quality assurance of IGUM's linear-position conversion key in a steady and known gas flow was conducted. This was done by collecting and analysing data from 29 experiments, followed by examination with ordinary least squares, hypothesis testing, analysis of variance, bootstrapping and Bayesian hierarchical modelling. Autocorrelation among the residuals was detected but was concluded not to affect the results, based on the bootstrap analysis. The results showed an estimated conversion key of 1.276 ml/linear position, which was statistically significant for all 29 experiments.  In the second stage, it was examined whether and how well IGUM could detect small additions of gas in a dynamic flow. The breathing machine ANSTI was used to simulate the sinusoidal pattern of a breathing human in 24 experiments, in which three additions of 30 ml of gas were manually introduced into the system. The results were analysed through sinusoidal regression, where three dummy variables represented the three gas additions in each experiment. To examine whether IGUM detects 30 ml for each input, the previously established conversion key of 1.276 ml/linear position was used. An attempt was made to remove the seasonal trend in the data, which was not completely successful and could influence the estimations. The results showed that IGUM can indeed detect these small gas additions, although the detected amounts differed somewhat between dummy variables and experiments. This is most likely because the trend was not fully removed, rather than because IGUM is not working properly.
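A minimal sketch of the first-stage estimation under stated assumptions: ordinary least squares through the origin recovers a conversion key (ml per linear position) from simulated calibration data, and a nonparametric bootstrap gives a confidence interval, mirroring the bootstrap analysis mentioned above. The data are simulated for illustration, not IGUM measurements.

```python
import numpy as np

# Illustrative only: simulated calibration data relating linear-position change to injected
# gas volume (ml); 1.276 ml/position is the estimate reported in the abstract above.
rng = np.random.default_rng(42)
position_change = rng.uniform(5, 100, 60)
volume_ml = 1.276 * position_change + rng.normal(0, 0.5, 60)

# OLS through the origin: conversion key = sum(x*y) / sum(x^2)
key_hat = np.sum(position_change * volume_ml) / np.sum(position_change ** 2)

# Nonparametric bootstrap confidence interval for the conversion key.
boot = []
for _ in range(2000):
    idx = rng.integers(0, len(volume_ml), len(volume_ml))
    x, y = position_change[idx], volume_ml[idx]
    boot.append(np.sum(x * y) / np.sum(x ** 2))
ci = np.percentile(boot, [2.5, 97.5])
print(key_hat, ci)
```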
780

School Administrator Impact Upon Physical Restraints in Public Schools

Dowell, Richard Marshall 19 June 2014 (has links)
No description available.
