291

Designing Applications for Smart Cities: A designerly approach to data analytics

Bücker, Dennis January 2017 (has links)
The purpose of this thesis is to investigate the effects of a designerly approach to data analytics. The research was conducted during the Interaction Design Master's programme at Malmö University in 2017 and follows a research-through-design approach in which the material-driven design process itself becomes a way to acquire new knowledge. The thesis uses big data as a design material with which designers can ideate connected products and services in the context of smart city applications. More specifically, it conducts a series of material studies that show the potential of this new perspective on data analytics. As a result of this research, a set of designs and exercises is presented and structured into a guide. Furthermore, the results emphasize the need for this type of research and highlight data as a departure material of special interest for HCI.
292

Conditions and Practices of Data-Driven Innovation in Swedish Healthcare

Shenouda, Ramy, Herz, Stefan January 2022 (has links)
This study explores the conditions and practices for data-driven innovation in the Swedish healthcare sector. While innovation nowadays is often based on large amounts of data, the laws for handling data in healthcare are restrictive. Nevertheless, remarkable innovation is happening in Swedish healthcare. We therefore examined the legal situation and talked to different actors to learn how they deal with these circumstances and how they innovate. We related those findings to the literature on data-driven innovation, platforms, artificial intelligence, and the healthcare sector context. It turned out that legislation is only one of the barriers to innovation; organizational and structural factors also play a large role. Furthermore, we point out the actors' strategic responses and their use of artificial intelligence. Our contributions are that we mapped the Swedish healthcare landscape with a focus on data-driven innovation and applied Huang et al.'s (2017) model of rapid platform scaling to this context. Moreover, we point out how the healthcare sector differs from commercial business and how that is reflected in innovation practices. Finally, we show which barriers need to be removed in order to improve the conditions for data-driven innovation.
293

Bottleneck Identification using Data Analytics to Increase Production Capacity

Ganss, Thorsten Peter January 2021 (has links)
This thesis develops an automated, data-driven bottleneck detection procedure based on real-world data. Following a seven-step process, it is possible to determine the average as well as the shifting bottleneck by automatically applying the active period method. A detailed explanation of how to pre-process the extracted data is presented, which serves as a guideline for other analysts to customize the available code to their needs. The obtained results show a deviation between the expected bottleneck and the bottleneck calculated from production data collected during one week of full production. The expected bottleneck is currently determined by the case company by measuring cycle times physically at the machine, but this procedure does not capture the whole picture of the production line and is therefore recommended to be replaced by the developed automated analysis. Based on the analysis results, different optimization potentials are elaborated to improve both the data quality and the overall production capacity of the investigated production line. In particular, the installed gantry systems need further analysis to decrease their impact on overall capacity. As for data quality, improving the machine data itself and standardizing timestamps should be the focus in order to enable better analyses in the future. Finally, the recommendations mainly suggest running the analysis several times with new data sets to validate the results and improve the overall understanding of the production line's behavior.
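As a rough illustration of the active period method referenced in this abstract, the sketch below marks, at each time instant, the machine whose current uninterrupted active period is the longest as the momentary bottleneck and reports each machine's share of bottleneck time. The input table and its columns (machine, start, end) are hypothetical stand-ins for pre-processed production data; this is not the thesis's own code.

```python
import numpy as np
import pandas as pd

# Hypothetical input: one row per uninterrupted active period of a machine.
periods = pd.DataFrame({
    "machine": ["M1", "M2", "M1", "M3", "M2"],
    "start":   [0.0,  5.0,  60.0, 70.0, 130.0],
    "end":     [50.0, 120.0, 110.0, 200.0, 180.0],
})

def momentary_bottlenecks(periods: pd.DataFrame, step: float = 1.0) -> pd.Series:
    """At each time step, the machine whose covering active period is the
    longest is taken as the momentary bottleneck (active period method)."""
    t_grid = np.arange(periods["start"].min(), periods["end"].max(), step)
    bottleneck = []
    for t in t_grid:
        covering = periods[(periods["start"] <= t) & (periods["end"] > t)]
        if covering.empty:
            bottleneck.append(None)          # no machine active at this instant
            continue
        durations = covering["end"] - covering["start"]
        bottleneck.append(covering.loc[durations.idxmax(), "machine"])
    return pd.Series(bottleneck, index=t_grid, name="bottleneck")

bn = momentary_bottlenecks(periods)
# Share of observed time each machine is the momentary bottleneck
print(bn.value_counts(normalize=True, dropna=True))
```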
294

Stochastic distribution tracking control for stochastic non-linear systems via probability density function vectorisation

Liu, Y., Zhang, Qichun, Yue, H. 08 February 2022 (has links)
This paper presents a new control strategy for stochastic distribution shape tracking in non-Gaussian stochastic non-linear systems. The objective can be summarised as adjusting the probability density function (PDF) of the system output to any given desired distribution. To achieve this objective, the system output PDF, which is time-variant, is first formulated analytically. Then, PDF vectorisation is implemented to simplify the model description. Using the vector-based representation, system identification and control design are performed to achieve PDF tracking. In practice, the PDF evolution is difficult to implement in real time, so a data-driven extension is also discussed, in which the vector-based model is obtained using kernel density estimation (KDE) with real-time data. Furthermore, the stability of the presented control design is analysed and validated by a numerical example. As an extension, multi-output stochastic systems are also discussed for joint PDF tracking using the proposed algorithm, and perspectives for advanced controllers are discussed. The main contributions of this paper are: (1) a new sampling-based PDF transformation that reduces the modelling complexity, (2) a data-driven approach for online implementation without model pre-training, and (3) a feasible framework for integrating existing control methods. / This paper is partly supported by the National Science Foundation of China under Grants 61603262 and 62073226, the Liaoning Province Natural Science Joint Foundation in Key Areas (2019-KF-03-08), the Natural Science Foundation of Liaoning Province (20180550418), the Liaoning BaiQianWan Talents Program, the i5 Intelligent Manufacturing Institute Fund of Shenyang Institute of Technology (i5201701), and the Central Government Guides Local Science and Technology Development Funds of Liaoning Province (2021JH6/10500137).
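A minimal sketch of the KDE-based PDF vectorisation step described above: output samples collected over a window are turned into a fixed-length vector by evaluating a Gaussian kernel density estimate on a grid. The grid, sample generation, and window handling are invented for illustration and are not taken from the paper.

```python
import numpy as np
from scipy.stats import gaussian_kde

def pdf_vector(samples: np.ndarray, grid: np.ndarray) -> np.ndarray:
    """Estimate the output PDF from samples with Gaussian KDE and evaluate it
    on a fixed grid, turning the distribution into a finite-dimensional vector."""
    v = gaussian_kde(samples)(grid)
    dx = grid[1] - grid[0]
    return v / (v.sum() * dx)        # renormalise on the truncated support

# Toy illustration: output samples from two consecutive sampling windows
rng = np.random.default_rng(0)
grid = np.linspace(-5.0, 5.0, 101)
v_k  = pdf_vector(rng.normal(0.0, 1.0, 500), grid)   # PDF vector at step k
v_k1 = pdf_vector(rng.normal(0.5, 1.2, 500), grid)   # PDF vector at step k+1
# A vector-based model would then relate v_{k+1} to v_k and the control input,
# and the controller would drive v_k towards the vectorised target PDF.
print(v_k.shape, v_k.sum() * (grid[1] - grid[0]))
```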
295

School Practices and Student Achievement

Atkins, Rosa Stocks 08 December 2008 (has links)
After implementing a statewide standardized testing program in 1998, the Virginia Department of Education realized that some schools were making great gains in student achievement while other schools continued to struggle. The Department conducted a study to identify the practices used by schools showing improvement, and six effective practice domains were identified. The current study was a follow-up to that research. A questionnaire measuring the six effective practice domains, (a) curriculum alignment, (b) time and scheduling, (c) use of data, (d) professional development, (e) school culture, and (f) leadership, was administered to teachers in 148 schools in Virginia; 80 schools participated. Two questions guided the study: (1) How frequently do schools use the Virginia Department of Education effective practices, and (2) what is the relationship between the use of the effective practices and school pass rates on the 3rd-grade 2005 Standards of Learning (SOL) reading test? Descriptive statistics, linear regression, and discriminant function analysis were applied to explore the relationships between the predictor variables (percentage of students receiving free or reduced-price lunch and the use of the effective practices) and the criterion variable (school pass rate on the 2005 SOL 3rd-grade reading test). Academic culture and the percentage of students receiving free or reduced-price lunch accounted for significant amounts of the variance in school pass rates; the remaining five effective practice measures were not related to school pass rates. The measurement approach may have affected the results: in most cases, one person served as the proxy for the school, and this person may have provided a biased assessment of what was happening in the school. / Ed. D.
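As a sketch of the kind of analysis described in this abstract, the snippet below regresses a school-level pass rate on the free/reduced-price lunch percentage and a few effective-practice scores. The data frame is filled with synthetic random numbers purely to make the example runnable; the variable names only mirror the study's constructs and imply nothing about its findings.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data: one row per school (values are random, illustrative only).
rng = np.random.default_rng(1)
n_schools = 80
df = pd.DataFrame({
    "pass_rate":            rng.uniform(50, 100, n_schools),  # SOL reading pass rate (%)
    "frl_pct":              rng.uniform(0, 90, n_schools),    # % free/reduced-price lunch
    "academic_culture":     rng.uniform(1, 5, n_schools),     # survey scale scores
    "curriculum_alignment": rng.uniform(1, 5, n_schools),
    "use_of_data":          rng.uniform(1, 5, n_schools),
})

# Linear regression of the criterion (pass rate) on the predictors.
X = sm.add_constant(df[["frl_pct", "academic_culture",
                        "curriculum_alignment", "use_of_data"]])
model = sm.OLS(df["pass_rate"], X).fit()
print(model.summary())   # coefficients, R^2, and per-predictor p-values
```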
296

Numerical Analysis for Data-Driven Reduced Order Model Closures

Koc, Birgul 05 May 2021 (has links)
This dissertation contains work that addresses both theoretical and numerical aspects of reduced order models (ROMs). In an under-resolved regime, the classical Galerkin reduced order model (G-ROM) fails to yield accurate approximations. Thus, we propose a new ROM, the data-driven variational multiscale ROM (DD-VMS-ROM), built by adding a closure term to the G-ROM, aiming to increase the numerical accuracy of the ROM approximation without decreasing the computational efficiency. The closure term is constructed based on the variational multiscale framework. To model the closure term, we use data-driven modeling: using the available data, we find ROM operators that approximate the closure term. To present the closure term's effect on the ROMs, we numerically compare the DD-VMS-ROM with other standard ROMs. In numerical experiments, we show that the DD-VMS-ROM is significantly more accurate than the standard ROMs. Furthermore, to understand the closure term's physical role, we present a theoretical and numerical investigation of its role in long-time integration. We theoretically prove and numerically show that, under long-time averaging, there is energy exchange from the most energetic modes to the least energetic modes in the closure term. One of the promising contributions of this dissertation is providing the numerical analysis of the data-driven closure model, which has not been studied before. At both the theoretical and the numerical levels, we investigate what conditions guarantee that a small difference between the data-driven closure model and the full order model (FOM) closure term implies that the approximated solution is close to the FOM solution. In other words, we perform theoretical and numerical investigations to show that the data-driven model is verifiable. Apart from studying the ROM closure problem, we also investigate the setting in which the G-ROM converges optimally. We explore the optimality of the ROM error bounds by considering the difference quotients (DQs). We theoretically prove and numerically illustrate that both the ROM projection error and the ROM error are suboptimal without the DQs, and optimal if the DQs are used. / Doctor of Philosophy / In many realistic applications, obtaining an accurate approximation to a given problem can require a tremendous number of degrees of freedom. Solving these large systems of equations can take days or even weeks on standard computational platforms. Thus, lower-dimensional models, i.e., reduced order models (ROMs), are often used instead. The ROMs are computationally efficient and accurate when the underlying system has dominant and recurrent spatial structures. Our contribution to reduced order modeling is adding a data-driven correction term, which carries important information and yields better ROM approximations. This dissertation's theoretical and numerical results show that the new ROM equipped with a closure term yields more accurate approximations than the standard ROM.
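A heavily simplified sketch of fitting a data-driven ROM closure: given snapshots of the ROM coefficients and of the "exact" closure term obtained by projecting full-order data, a closure operator is fit by least squares. The actual DD-VMS-ROM uses a variational multiscale closure with linear and quadratic terms; everything below (dimensions, data, the purely linear ansatz) is an illustrative assumption, not the dissertation's implementation.

```python
import numpy as np

# Assumed inputs: ROM coefficient snapshots a_k (r modes) and the corresponding
# exact closure terms tau_k; here both are synthesized from a known operator.
rng = np.random.default_rng(0)
r, n_snap = 6, 200
A_true = 0.1 * rng.normal(size=(r, r))
a = rng.normal(size=(n_snap, r))                             # ROM coefficients
tau = a @ A_true.T + 0.01 * rng.normal(size=(n_snap, r))     # exact closure + noise

# Least-squares fit: find A_tilde minimising sum_k ||tau_k - A_tilde a_k||^2
A_tilde, *_ = np.linalg.lstsq(a, tau, rcond=None)
A_tilde = A_tilde.T                                          # so that tau ≈ A_tilde @ a

# The closed ROM would then evolve   da/dt = F(a) + A_tilde @ a
print(np.linalg.norm(A_tilde - A_true) / np.linalg.norm(A_true))
```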
297

Data-driven Parameter Estimation of Stochastic Models with Applications in Finance

Ayorinde, Ayoola January 2024 (has links)
Parameter estimation is a powerful and adaptable framework that addresses the inherent complexities and uncertainties of financial data. We provide an overview of likelihood functions and likelihood estimation, as well as the essential numerical approximations and techniques. In the financial domain, where unpredictable and non-stationary market dynamics prevail, parameter estimation for relevant stochastic differential equation (SDE) models is highly relevant. We delve into practical applications, showcasing how SDEs can effectively capture the inherent uncertainties and dynamics of financial models for the time evolution of interest rates. We work with the Vašíček and Cox-Ingersoll-Ross (CIR) models, which describe the dynamics of interest rates over time, and use maximum likelihood and quasi-maximum likelihood estimation methods to estimate their parameters.
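A minimal sketch of maximum likelihood estimation for the Vašíček model dr = κ(θ − r)dt + σ dW, whose Gaussian transition density is available in closed form, so the exact log-likelihood of a discretely observed path can be maximised directly. The simulated data, starting values, and optimiser choice below are illustrative assumptions, not the thesis's setup.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def vasicek_neg_loglik(params, r, dt):
    """Exact negative log-likelihood of the Vasicek model
    dr = kappa*(theta - r) dt + sigma dW, via its Gaussian transition density."""
    kappa, theta, sigma = params
    if kappa <= 0 or sigma <= 0:
        return np.inf
    mean = theta + (r[:-1] - theta) * np.exp(-kappa * dt)
    var = sigma**2 * (1 - np.exp(-2 * kappa * dt)) / (2 * kappa)
    return -np.sum(norm.logpdf(r[1:], loc=mean, scale=np.sqrt(var)))

# Illustration with a simulated short-rate path (parameters are arbitrary).
rng = np.random.default_rng(0)
dt, n = 1 / 252, 2000
kappa0, theta0, sigma0 = 1.5, 0.03, 0.02
r = np.empty(n); r[0] = 0.05
for i in range(1, n):
    m = theta0 + (r[i - 1] - theta0) * np.exp(-kappa0 * dt)
    v = sigma0**2 * (1 - np.exp(-2 * kappa0 * dt)) / (2 * kappa0)
    r[i] = m + np.sqrt(v) * rng.standard_normal()

res = minimize(vasicek_neg_loglik, x0=[1.0, 0.05, 0.01], args=(r, dt),
               method="Nelder-Mead")
print(res.x)   # estimated (kappa, theta, sigma)
```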
298

Dissertation_XiaoquanGao.pdf

Xiaoquan Gao (12049385) 04 December 2024 (has links)
Public sector services often face challenges in allocating limited resources effectively. Despite their fundamental importance to societal welfare, these systems often operate without sufficient analytical support, and their decision-making processes remain understudied in academic literature. While data-driven analytical approaches offer promising solutions for addressing complex tradeoffs and resource constraints, the unique characteristics of public systems create significant challenges for modeling and developing efficient solutions. This dissertation addresses these challenges by applying stochastic models to enhance decision-making in two critical areas: emergency medical services in healthcare and jail diversion in the criminal justice system. The first part focuses on integrating drones into emergency medical services to shorten response times and improve patient outcomes. We develop a Markov Decision Process (MDP) model to address the coordination between aerial and ground vehicles, accounting for uncertain travel times and bystander availability. To solve this complex problem, we develop a tractable approximate policy iteration algorithm that approximates the value function through neural networks, with basis functions tailored to the spatial and temporal characteristics of the EMS system. Case studies using historical data from Indiana provide valuable insights for managing real-time EMS logistics. Our results show that drone augmentation can reduce response times by over 30% compared to traditional ambulances. This research provides practical guidelines for implementing drone-assisted emergency medical services while contributing to the literature on hybrid delivery systems. The second part develops data-driven analytical tools to improve placement decisions in jail diversion programs, balancing public safety and individual rehabilitation. Community corrections programs offer promising alternatives to incarceration but face their own resource constraints. We develop an MDP model that captures the complex tradeoffs between individual recidivism risks and the impacts of overcrowding. Our model extends beyond traditional queueing problems by incorporating criminal justice-specific features, including deterministic service times and convex occupancy-dependent costs. To overcome the theoretical challenges, we develop a novel unified approach that combines system coupling with policy deviation bounds to analyze value functions, ultimately establishing superconvexity. This theoretical foundation enables us to develop an efficient algorithm based on time-scale separation, providing practical tools for optimizing diversion decisions. A case study based on real data from our community partner shows our approach can reduce recidivism rates by 28% compared to current practices. Beyond academic impact, this research has been used by community partners to secure program funding for future staffing.
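A toy sketch of approximate policy iteration with a neural-network value function, in the spirit of the EMS dispatch algorithm described above: truncated Monte Carlo rollouts under a greedy one-step-lookahead policy provide value targets, and a small regressor is refit on them. The MDP here (a single queue of pending calls with two dispatch actions), its dynamics, costs, and all parameters are invented for illustration and bear no relation to the dissertation's model or data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
GAMMA, N_STATES, ACTIONS = 0.95, 21, (0, 1)   # state = number of pending calls

def step(s, a):
    """Hypothetical dynamics: action 1 (drone) clears calls faster but costs extra."""
    arrivals = rng.poisson(0.7)
    served = min(s, 2 if a == 1 else 1)
    cost = s + (0.5 if a == 1 else 0.0)        # waiting cost plus drone usage cost
    return min(N_STATES - 1, s - served + arrivals), -cost

def greedy_action(s, value_fn, n_samples=10):
    """One-step lookahead with Monte Carlo estimates of the expected value."""
    q = [np.mean([r + GAMMA * value_fn(s2)
                  for s2, r in (step(s, a) for _ in range(n_samples))])
         for a in ACTIONS]
    return int(np.argmax(q))

# Neural-network value function, initialised to roughly zero everywhere.
value = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
states = np.arange(N_STATES).reshape(-1, 1)
value.fit(states, np.zeros(N_STATES))
v = lambda s: float(value.predict([[s]])[0])

for _ in range(3):                             # approximate policy iteration loop
    targets = []
    for s0 in range(N_STATES):                 # policy evaluation by truncated rollout
        s, ret, disc = s0, 0.0, 1.0
        for _ in range(30):
            a = greedy_action(s, v)
            s, r = step(s, a)
            ret += disc * r
            disc *= GAMMA
        targets.append(ret)
    value.fit(states, np.array(targets))       # refit the value network on new targets
print([greedy_action(s, v) for s in range(N_STATES)])   # learned dispatch policy
```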
299

Without data, is HR just a function with an opinion? : A qualitative study of data-driven HR in the public sector

Clemensson, Lisa January 2019 (has links)
The HR function has undergone a number of changes in recent years, chiefly a shift from a personnel-administration function to an increasingly strategic one. This has placed new demands on the HR function's role and its work; among other things, HR has had to become more data-driven. The greatest challenge has been for HR functions in the public sector, which, unlike the private sector, still lags behind in data-driven development, and little research has been done in the public context. The purpose of this master's thesis is therefore to increase understanding of data-driven HR work in the public sector. To do this, three research questions were examined: (1) How is the HR function's role affected by data-driven HR work? (2) What opportunities and challenges come with data-driven HR work? (3) What is unique about data-driven HR in the public sector? To answer the research questions, a qualitative multiple-case study was conducted in nine municipalities in Västra Götaland and Halland. Data were collected through semi-structured interviews, and a thematic analysis method was applied when compiling the results. The results show that the HR function's role and work are steered more in line with a rational management philosophy when a more data-driven way of working is applied. The study shows that data-driven HR risks removing the HR function's focus on soft HR and instead promoting hard HR, where measurement and control occur to a greater degree. The opportunity of data-driven HR work lies in using large amounts of available data to make better decisions and carry out effective interventions in the HR area, enabling more strategic HR work that can contribute to increased legitimacy and trust in the HR function. The challenge is that HR staff often lack the competence to work in a data-driven way. The results also show that the public sector is affected by the expectation to apply increasingly trust-based governance, a management philosophy that can be considered incompatible with data-driven HR. The study contributes a deeper understanding of, and fills a knowledge gap regarding, data-driven HR work in the public sector. It offers insights into the major trade-off the HR function is entering as it must find ways to integrate both soft and hard HR: HR functions stand between a significant opportunity to use data and a counter-force in trust-based governance which argues that the public sector must stop measuring everything, since this can create an excessive focus on control and micromanagement. The challenge of data-driven HR is therefore how HR functions should use large amounts of data without becoming overly data-driven. The study highlights that data-driven HR requires knowledge of when measurements can be considered reliable and used for predictions or evaluations, and when measurements can create false incentives or distractions. The results can serve as an aid for HR functions in the public sector and as a framework for evaluating their data-driven HR work.
300

Development of a Data-Driven Algorithm to Determine the W+Jets Background in tt̄ Events in ATLAS

Mehlhase, Sascha 30 August 2010 (has links)
The physics of the top quark is one of the key components of the physics programme of the ATLAS experiment at the Large Hadron Collider at CERN. In this thesis, general studies of the jet trigger performance for top quark events using fully simulated Monte Carlo samples are presented, and two data-driven techniques are introduced to the ATLAS experiment: one to estimate the multi-jet trigger efficiency and one to estimate the W+Jets background in top pair events. In a tag-and-probe-based method, using a simple and common event selection and a high transverse momentum lepton as the tag object, the possibility to estimate the multi-jet trigger efficiency from data in ATLAS is investigated, and it is shown that the method is capable of estimating the efficiency without introducing any significant bias through the given tag selection. In the second data-driven analysis, a new method to estimate the W+Jets background in a top-pair event selection is introduced to ATLAS. By defining signal- and background-dominated regions by means of the jet multiplicity and the pseudo-rapidity distribution of the lepton in the event, the W+Jets contribution is extrapolated from the background-dominated into the signal-dominated region. The method is found to estimate the given background contribution as a function of the jet multiplicity with an accuracy of about 25% for most of the top-dominated region with an integrated luminosity of above 100 pb^−1 at sqrt(s) = 10 TeV. This thesis also covers a study summarising the thermal behaviour and expected performance of the Pixel Detector of ATLAS. All measurements performed during the commissioning phase of 2008/09 yield results within the specifications of the system, and the performance is expected to stay within them even after several years of running under LHC conditions.
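A bare-bones sketch of the region-based extrapolation idea: if the two discriminating variables (here jet multiplicity and lepton pseudo-rapidity) are approximately uncorrelated for W+Jets, the background expected in the signal-dominated region can be estimated ABCD-style from three control regions. The region definitions and event counts below are invented for illustration; the thesis's actual procedure and numbers differ.

```python
import numpy as np

# Hypothetical control-region yields, W+jets dominated:
N_B = 5200.0   # high jet multiplicity, forward lepton
N_C = 48000.0  # low jet multiplicity, central lepton
N_D = 61000.0  # low jet multiplicity, forward lepton

# Extrapolation into the signal-dominated region A (high multiplicity, central lepton),
# assuming the two variables factorise for the W+jets background.
N_Wjets_A = N_B * N_C / N_D
stat_unc = N_Wjets_A * np.sqrt(1 / N_B + 1 / N_C + 1 / N_D)   # Poisson propagation
print(f"W+jets expected in signal region: {N_Wjets_A:.0f} +/- {stat_unc:.0f} (stat.)")
```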
