261
Evaluation of Data-Driven Gating for 68Ga-ABY-025 PET/CT in Breast Cancer Patients. Ncuti Nobera, Alain-Klaus. January 2020.
Respiratory motion during PET acquisition degrades image quality; it is mainly the area around the thorax and abdomen that is affected. External devices provide respiratory gating solutions but are time-consuming to set up on patients and may not always be available. A data-driven gating (DDG) method based on principal component analysis (PCA) has been found to provide a reliable respiratory gating signal, eliminating the need for external gating systems when imaging with FDG, but it remains to be investigated how well it performs with other PET tracers. The HER2-targeting radiotracer 68Ga-ABY-025 is currently in phase 3 development, with the aim of developing methods to select breast cancer patients who benefit from HER2-targeted treatment; hence, absolute quantification is important. Respiratory motion correction will be important for improved quantitative accuracy, since many patients have metastases in the lower part of the lungs or the liver. DDG was applied retrospectively to PET/CT list-mode data using quiescent-period gating. Gated images were then compared with reconstructions without gating using a matched number of coincidences. Two iterative reconstructions were evaluated: TOF OSEM (3 iterations, 16 subsets, and a 5 mm Gaussian post-processing filter) and TOF BSREM with β = 400. Images were evaluated for changes in standardized uptake value (SUV) for well-defined lesions in the thorax and abdomen, where respiratory motion is prevalent. Respiratory motion was detected in a mean of 2.1 bed positions per examination. Applying DDG resulted in a mean increase of 12.7% in SUVmax for the TOF OSEM reconstruction (p = 0.0156).
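Below is a minimal sketch of the PCA step at the core of such a data-driven gating method, assuming the list-mode data have already been binned into short, coarse time frames; the frame rate, array sizes, and the 50% quiescent fraction are illustrative assumptions, not values from the thesis.

```python
# Minimal sketch of PCA-based data-driven gating (DDG), assuming list-mode
# data have already been binned into short time frames (illustrative only).
import numpy as np
from sklearn.decomposition import PCA

def respiratory_signal(frames: np.ndarray) -> np.ndarray:
    """frames: (n_frames, n_bins) array of coarsely binned PET counts.
    Returns a candidate respiratory trace, one value per frame."""
    X = frames - frames.mean(axis=0)       # remove the static background
    pca = PCA(n_components=3)
    scores = pca.fit_transform(X)          # principal-component weights over time
    # In practice one picks the component whose spectrum peaks in the
    # respiratory band (~0.1-0.4 Hz); here simply the first component.
    return scores[:, 0]

def quiescent_mask(signal: np.ndarray, keep_fraction: float = 0.5) -> np.ndarray:
    """Quiescent-period gating: keep frames near end-expiration (illustrative)."""
    threshold = np.quantile(signal, keep_fraction)
    return signal <= threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(600) * 0.5                     # 0.5 s frames, hypothetical
    breathing = np.sin(2 * np.pi * 0.25 * t)     # ~15 breaths per minute
    frames = 100 + 5 * np.outer(breathing, rng.random(256)) + rng.normal(0, 1, (600, 256))
    sig = respiratory_signal(frames)
    print("frames kept for quiescent gate:", quiescent_mask(sig).sum())
```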
262
Data-driven and real-time prediction models for iterative and simulation-driven design processes. Arjomandi Rad, Mohammad. January 2022.
The development of more complex products has increased dependency on virtual/digital models and emphasized the role of simulations as a means of validation before production. This level of dependency on digital models and simulation, together with the degree of customization and continuous requirement changes, leads to a large number of iterations in each stage of the product development process. This research studies the group of products that have multidisciplinary, highly iterative, and simulation-driven design processes. It is shown that these technically advanced products, which are commonly outsourced to suppliers, often suffer from long development lead times. The literature points to several research tracks, including design automation and data-driven design, as possible sources of support. After studying the advantages and disadvantages of each track, a data-driven approach is chosen and studied through two case studies, leading to two supporting tools that are expected to shorten the development lead time of the associated design processes. Feature extraction in CAD as a way to facilitate metamodeling is proposed as the first tool. This support uses the concept of the medial axis to find highly correlated features that can be used in regression models. For the second supporting tool, an automated CAD script is used to produce a library of images associated with design variants. Dynamic relaxation is used to label each variant with its finite element solution output. Finally, the library is used to train a convolutional neural network that maps screenshots of CAD models (input) to finite element field results (output). Both supporting tools can be used to create real-time prediction models in the early conceptual phases of the product development process, exploring the design space faster and reducing lead time and cost.
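As a rough illustration of the second supporting tool, the sketch below trains a small convolutional network to regress a finite element output field from a CAD screenshot; the architecture, image resolution, and randomly generated "library" are placeholders, not the thesis's actual setup.

```python
# Minimal sketch of the image-to-field idea: a small CNN maps a grayscale CAD
# screenshot to a finite-element output field of the same size (illustrative).
import torch
import torch.nn as nn

class Screenshot2Field(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),   # one output channel, e.g. displacement
        )

    def forward(self, x):
        return self.net(x)

model = Screenshot2Field()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in "library": 32 grayscale screenshots and matching FE fields (64x64).
images = torch.rand(32, 1, 64, 64)
fields = torch.rand(32, 1, 64, 64)

for epoch in range(5):                        # a few illustrative epochs
    optimizer.zero_grad()
    loss = loss_fn(model(images), fields)
    loss.backward()
    optimizer.step()
print("final training loss:", float(loss))
```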
263
The transition to data-driven production logistics: Opportunities and challenges. Zafarzadeh, Masoud. January 2021.
A data-driven approach is considered a viable means of dealing with the high degree of dynamics caused by the constant changes that occur within production logistics systems. However, there is a dearth of knowledge regarding the consequences of employing a data-driven approach in production logistics in real industrial environments. This thesis aims to extend the existing body of knowledge concerning the opportunities and challenges of a transition to a data-driven state in production logistics by investigating real industrial cases. In addition to reviewing the literature, this thesis aims to answer three research questions. First, it seeks to determine how enabling technologies contribute to value creation in a data-driven production logistics system. Second, it studies three industrial companies, analyses their production logistics flows, and compares the traditional approach to a data-driven approach by means of discrete event simulation (a flavor of this comparison is sketched below). Third, through interviews with several experts with different competences who work for the case companies, it aims to identify the challenges associated with the transition to a data-driven approach. The results show that following a systematic and balanced approach to technology implementation is important with regard to value creation. The potential benefits include improved operational performance, improved visibility through real-time control, and the possibility of dynamic scheduling and planning. The challenges associated with the transition can be divided into two major categories: organisational and technical. Moreover, the identified challenges can be mapped against each step in the production logistics data life cycle. Among the identified challenges, some represent potentially valuable avenues for future research. Investigating the possibilities for addressing the data ownership challenge among stakeholders is one such avenue. Additionally, future studies could address the fact that the technologies related to data analytics, such as artificial intelligence, big data, and blockchain, lack a large-scale implementation history when compared with technologies such as radio frequency identification. Given the limitations of prior studies, another possible research avenue involves analysing data analytics use cases in more detail within real industrial environments. Keywords: production logistics, data-driven, smart, transition, technology, simulation.
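The sketch below gives a flavor of the discrete event simulation comparison mentioned above, contrasting a "traditional" material handling policy with a data-driven one in a toy flow; the process, the handling-time distributions, and the interpretation of "data-driven" are hypothetical and not taken from the case companies.

```python
# Minimal discrete-event comparison sketch using simpy (illustrative only).
import random
import simpy

def material_flow(env, forklift, handling_time, waits):
    """One transport order served by a single shared forklift."""
    arrive = env.now
    with forklift.request() as req:
        yield req
        yield env.timeout(handling_time())
    waits.append(env.now - arrive)

def run(handling_time, seed=1, n_orders=200, interarrival=4.0):
    random.seed(seed)
    env = simpy.Environment()
    forklift = simpy.Resource(env, capacity=1)
    waits = []

    def generator():
        for _ in range(n_orders):
            env.process(material_flow(env, forklift, handling_time, waits))
            yield env.timeout(random.expovariate(1.0 / interarrival))

    env.process(generator())
    env.run()
    return sum(waits) / len(waits)

# "Traditional": longer, more variable handling; "data-driven": shorter and
# less variable because real-time location data removes search time (assumed).
traditional = run(lambda: random.uniform(3.0, 7.0))
data_driven = run(lambda: random.uniform(2.0, 4.0))
print(f"mean wait, traditional: {traditional:.1f}  data-driven: {data_driven:.1f}")
```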
264
Transitioning Business Intelligence from reactive to proactive decision-making systems: A qualitative usability study based on the Technology Acceptance Model. Abormegah, Jude Edem; Bahadin Tarik, Dashti. January 2020.
Companies today operate in dynamic environments and compete to find new revenue streams that strengthen their market positions, using new technologies that provide capabilities to organize resources while accounting for changes that can occur in their environment. Decision making is therefore unavoidable in combating uncertainty, and taking the optimal action by leveraging concepts and technologies that support decision making, such as Business Intelligence (BI) tools and systems, can determine a company's future. Companies can optimize their decision making with BI features such as data-driven alerts, which send messages when a monitored metric fluctuates beyond a supervised threshold that reflects the state of business operations. The purpose of this research was to conduct an empirical study on how Swedish companies and enterprises in different industries apply BI tools with data-driven alert features for decision making. We further studied the characteristics of data-driven alerts in terms of usability from the perspectives of different industry professionals, through the thematic lens of the Technology Acceptance Model (TAM), in a qualitative approach. We conducted interviews with professionals from diverse organizations and applied thematic coding to the empirical results for further analysis. We found that allowing users to analyze data according to their own preferences provides managers and leaders with the information needed to empower strategic and tactical decision making. Despite the emergence of state-of-the-art predictive analytics technologies such as machine learning and AI, the literature states that these processes are too technical and complex for the decision maker to comprehend. Ultimately, prescriptive analytics will end up presenting descriptive options to the end user as we move towards automated decision making. We see this as an opportunity for reporting tools and data-driven alerts to exist in a symbiotic relationship with advanced analytics in decision-making contexts, improving their outcomes, quality, and user friendliness.
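As a simple illustration of a data-driven alert of the kind discussed above, the sketch below checks a monitored metric against a threshold band and emits a message when it drifts outside; the rule, metric name, and notify() stub are hypothetical, not taken from any of the studied BI tools.

```python
# Minimal sketch of a threshold-based, data-driven alert rule (illustrative).
from dataclasses import dataclass
from typing import Optional

@dataclass
class AlertRule:
    metric: str
    lower: float
    upper: float

    def check(self, value: float) -> Optional[str]:
        """Return an alert message if the observed value leaves the band."""
        if value < self.lower or value > self.upper:
            return f"ALERT: {self.metric}={value:.1f} outside [{self.lower}, {self.upper}]"
        return None

def notify(message: str) -> None:
    print(message)   # stand-in for an e-mail or dashboard push

rule = AlertRule(metric="daily_orders", lower=800, upper=1200)
for observed in [950, 1010, 1320, 760]:
    msg = rule.check(observed)
    if msg:
        notify(msg)
```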
265
Toward predictive maintenance in surface treatment processes: A DMAIC case study at Seco Tools. Berg, Martin; Eriksson, Albin. January 2021.
Surface treatments are often used in the manufacturing industry to change the surface of a product, including its related properties and functions. The occurrence of degradation and corrosion in surface treatment processes can lead to critical breakdowns over time. Critical breakdowns may impair the properties of the products and shorten their service life, which causes increased lead times or additional costs in the form of rework or scrapping. Prevention of critical breakdowns due to machine component failure requires a carefully selected maintenance policy. Predictive maintenance is used to anticipate equipment failures so that maintenance can be scheduled before component failure. Developing predictive maintenance policies for surface treatment processes is problematic due to the vast number of attributes that modern surface treatment processes involve. The emergence of smart sensors and big data has led companies to pursue predictive maintenance. One company that strives for predictive maintenance of its surface treatment processes is Seco Tools in Fagersta. The purpose of this master's thesis has been to investigate the occurrence of critical breakdowns and failures in the machine components of the chemical vapor deposition (CVD) and post-treatment wet blasting processes by mapping the interactions between their respective process variables and their impact on critical breakdowns. The work has been conducted as a Six Sigma project utilizing the problem-solving methodology DMAIC. Critical breakdowns were investigated by combining principal component analysis (PCA), computational fluid dynamics (CFD), and statistical process control (SPC) to create an understanding of the failures in both processes. For both processes, two predictive solutions were created: a short-term solution utilizing existing dashboards and a long-term solution utilizing a PCA model and an Orthogonal Partial Least Squares (OPLS) regression model for batch statistical process control (BSPC). The short-term solutions were verified and implemented during the master's thesis at Seco Tools, and recommendations were given for future implementation of the long-term solutions. In this thesis, insights are shared regarding the applicability of OPLS and Partial Least Squares (PLS) regression models for batch monitoring of the CVD process. We also demonstrate that prediction of a certain critical breakdown, clogging of the aluminum generator in the CVD process, can be accomplished through the use of SPC. For the wet blasting process, a PCA methodology is suggested to be effective for visualizing breakdowns.
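The sketch below illustrates the general idea behind PCA-based process monitoring of the sort applied here: fit a PCA model on in-control sensor data, then flag new observations whose Hotelling T² statistic exceeds a control limit. The synthetic data, two-component model, and 99% limit are illustrative choices, not the thesis's actual BSPC setup.

```python
# Minimal sketch of PCA-based process monitoring with a Hotelling T^2 limit.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
healthy = rng.normal(size=(200, 6))            # 200 in-control samples, 6 sensors
mean, std = healthy.mean(axis=0), healthy.std(axis=0)

pca = PCA(n_components=2).fit((healthy - mean) / std)

def hotelling_t2(samples: np.ndarray) -> np.ndarray:
    """T^2 = sum of squared, variance-scaled PCA scores per observation."""
    scores = pca.transform((samples - mean) / std)
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

# 99% control limit for T^2 with k components and n training samples.
k, n = 2, healthy.shape[0]
limit = k * (n - 1) * (n + 1) / (n * (n - k)) * stats.f.ppf(0.99, k, n - k)

new_batch = rng.normal(size=(5, 6))
new_batch[-1] += 4.0                            # inject a drifted observation
print("T2:", np.round(hotelling_t2(new_batch), 2), "limit:", round(limit, 2))
```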
266
Acquisition of Costly Information in Data-Driven Decision Making. Janásek, Lukáš. January 2021.
This thesis formulates and solves an economic decision problem of the acquisition of costly information in data-driven decision making. The thesis assumes an agent predicting a random variable utilizing several costly explanatory variables. Prior to the decision making, the agent learns about the relationship between the random variables utilizing their past realizations. During the decision making, the agent decides which costly variables to acquire and predicts using the acquired variables. The agent's utility consists of the correctness of the prediction and the costs of the acquired variables. To solve the decision problem, the thesis divides the decision process into two parts: acquisition of variables and prediction using the acquired variables. For the prediction, the thesis presents a novel approach for training a single predictive model that accepts any combination of acquired variables. For the acquisition, the thesis presents two novel methods using supervised machine learning models: a backward estimation of the expected utility of each variable and a greedy acquisition of variables based on a myopic increase in the expected utility of variables. Next, the thesis formulates the decision problem as a Markov decision process, which allows approximating the optimal acquisition via deep...
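A minimal sketch of the greedy, myopic acquisition idea is given below: at each step the agent buys the variable whose estimated gain in predictive utility most exceeds its cost. The utility estimate used here (validation accuracy of a refit logistic regression) and the synthetic data are illustrative stand-ins for the thesis's supervised estimators.

```python
# Minimal sketch of greedy, myopic acquisition of costly variables (illustrative).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def subset_utility(X_tr, y_tr, X_val, y_val, features):
    """Predictive utility of a feature subset: validation accuracy of a refit model."""
    if not features:
        return max(np.mean(y_val), 1 - np.mean(y_val))   # majority-class baseline
    model = LogisticRegression(max_iter=1000).fit(X_tr[:, features], y_tr)
    return model.score(X_val[:, features], y_val)

def greedy_acquire(X_tr, y_tr, X_val, y_val, costs):
    acquired, remaining = [], list(range(X_tr.shape[1]))
    current = subset_utility(X_tr, y_tr, X_val, y_val, acquired)
    while remaining:
        gains = {j: subset_utility(X_tr, y_tr, X_val, y_val, acquired + [j])
                    - current - costs[j] for j in remaining}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:                              # no variable is worth its cost
            break
        acquired.append(best)
        remaining.remove(best)
        current = subset_utility(X_tr, y_tr, X_val, y_val, acquired)
    return acquired

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=500) > 0).astype(int)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)
print("acquired variables:", greedy_acquire(X_tr, y_tr, X_val, y_val, costs=[0.01] * 5))
```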
267
The value of data: a study on how ski resorts can benefit from data. Neu Jönsson, Yvonne; Lindström, Oskar. January 2021.
Given ongoing digitalization, data-driven decision making is becoming the new normal in many industries. The competitive advantages are widely known, as it helps companies evolve. This case study aims to highlight the possibilities that data-driven optimization provides for ski resorts when it comes to improving services and adapting for the future. Our focus is skier movement patterns, which we derived by analyzing ski lift transportation data with a process mining tool and other visualization methods. Hence, our research questions: What information can be extracted from lift usage data? In what way can this information create value in an organization? Previous studies in the field demonstrate many possibilities with data mining and urge continued research. This study contributes to the field by examining specific age groups, which has not previously been done. It presents findings on differences in movement patterns between skier age groups, which point to possible areas of optimization at the ski resorts. In addition to highlighting possible ways to improve decision making using data, the study shows a significant shift in the type of skier visiting Swedish ski resorts in 2021, probably because the Alps were closed during the ski season. In the future, this study could play an important role in research on how Covid-19 affected Swedish ski resorts.
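The sketch below shows, in miniature, the kind of movement-pattern extraction described above: lift-gate logs are ordered per skier and lift-to-lift transitions are counted by age group. The column names and the tiny data set are invented for illustration; the thesis itself used a process mining tool rather than this hand-rolled aggregation.

```python
# Minimal sketch of counting lift-to-lift transitions per age group (illustrative).
import pandas as pd

logs = pd.DataFrame({
    "skier_id":  [1, 1, 1, 2, 2, 3, 3, 3],
    "age_group": ["child", "child", "child", "adult", "adult", "adult", "adult", "adult"],
    "lift":      ["A", "A", "B", "A", "C", "C", "C", "B"],
    "timestamp": pd.to_datetime([
        "2021-02-20 09:00", "2021-02-20 09:20", "2021-02-20 09:45",
        "2021-02-20 09:05", "2021-02-20 09:40",
        "2021-02-20 10:00", "2021-02-20 10:25", "2021-02-20 10:50",
    ]),
})

logs = logs.sort_values(["skier_id", "timestamp"])
logs["next_lift"] = logs.groupby("skier_id")["lift"].shift(-1)   # following lift ride
transitions = (logs.dropna(subset=["next_lift"])
                   .groupby(["age_group", "lift", "next_lift"])
                   .size()
                   .rename("count")
                   .reset_index())
print(transitions)   # e.g. how often each age group rides A -> B vs. C -> C
```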
268
Quantifying Uncertainty in the Residence Time of the Drug and Carrier Particles in a Dry Powder Inhaler. Badhan, Antara; Krushnarao Kotteda, V. M.; Afrin, Samia; Kumar, Vinod. 01 September 2021.
Dry powder inhalers (DPIs), used as a means of pulmonary drug delivery, typically contain a combination of active pharmaceutical ingredient (API) particles and significantly larger carrier particles. The micro-sized drug particles, which have a strong propensity to aggregate and poor aerosolization performance, are mixed with the significantly larger carrier particles, which cannot penetrate the mouth-throat region, to deagglomerate and entrain the smaller API particles in the inhaled airflow. A DPI's performance therefore depends on the entrainment of the carrier-API combination particles and on the timing and thoroughness of the individual API particles' deagglomeration from the carrier particles. Since DPI particle transport is significantly affected by particle-particle interactions and by particle sizes and shapes, modeling regional lung deposition from a DPI presents significant challenges to computational fluid dynamics (CFD) modelers. In the present work, we employed the Particle-In-Cell (PIC) method to study the transport/deposition and the agglomeration and deagglomeration of DPI carrier and API particles. The proposed development leverages CFD-PIC and sensitivity analysis capabilities from Department of Energy laboratories: the Multiphase Flow Interface Flow Exchange and Dakota UQ software. A data-driven framework is used to obtain reliable low-order statistics of the particles' residence time in the inhaler. The framework is further used to study the effect of drug particle density, carrier particle density and size, fluidizing agent density and velocity, and some numerical parameters on the particles' residence time in the inhaler.
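As a rough illustration of the uncertainty quantification step, the sketch below samples the uncertain inputs, evaluates a residence-time model for each sample, and reports low-order statistics plus a crude sensitivity indicator. The surrogate function and parameter ranges are invented stand-ins for the CFD-PIC simulation and are not the authors' model.

```python
# Minimal Monte Carlo sketch for low-order statistics of residence time (illustrative).
import numpy as np

rng = np.random.default_rng(7)
n_samples = 5000

# Uncertain inputs (illustrative ranges): drug density, carrier density,
# carrier size, and inlet air velocity.
drug_rho    = rng.uniform(1.1e3, 1.5e3, n_samples)    # kg/m^3
carrier_rho = rng.uniform(1.4e3, 1.6e3, n_samples)    # kg/m^3
carrier_d   = rng.uniform(50e-6, 100e-6, n_samples)   # m
air_vel     = rng.uniform(20.0, 60.0, n_samples)      # m/s

def residence_time(drug_rho, carrier_rho, carrier_d, air_vel):
    """Hypothetical surrogate: heavier/larger particles linger, faster air clears them."""
    return 1e-3 * (carrier_rho * carrier_d) ** 0.5 * (drug_rho / 1e3) / air_vel

t = residence_time(drug_rho, carrier_rho, carrier_d, air_vel)
print(f"mean = {t.mean():.3e} s, std = {t.std():.3e} s")

# Crude sensitivity indicator: correlation of each input with the output.
for name, x in [("drug_rho", drug_rho), ("carrier_rho", carrier_rho),
                ("carrier_d", carrier_d), ("air_vel", air_vel)]:
    print(name, round(float(np.corrcoef(x, t)[0, 1]), 2))
```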
269
Man-Hour Estimations in ETO: A case study involving the use of regression to estimate man-hours in an ETO environment. Anand Alagamanna, Aravindh; Juneja, Simarjit Singh. January 2020.
The competition in the manufacturing industry has never been higher. Owing to technological changes and advancements in the market, readily available data is no longer a rarity. Numerous studies have discussed the impact of Industry 4.0, digital transformation, and better production planning methods on the manufacturing industry. The mass-manufacturing industry, in particular, has reached production efficiency levels that were previously unimaginable. Industry 4.0 has been discussed as the 'next big thing' in the manufacturing context; in fact, it is seen as a necessity for manufacturing companies to stay competitive. However, efficient production planning methodologies are a prerequisite for successfully adopting the new manufacturing paradigms. The engineering-to-order (ETO) industry is still widely unexplored by academia, and ETO companies barely have any production planning methodologies to rely on, owing to their complex production processes and high reliance on manual labour. Regression techniques have repeatedly been used in the production planning context. Considering their statistical prowess, it is no surprise that even newer machine-learning techniques are based on regression. Given regression's success in production planning in the mass-manufacturing industry, could its use in the ETO industry lead to the same results? This thesis involves a case study performed at an electrical transformer manufacturing plant in Sweden. After mapping the operations performed in the production process, regression techniques are employed to estimate man-hours. The results reconfirm the statistical prowess of regression and show that it can be used to estimate man-hours in the ETO industry. In addition, several factors that can affect the successful adoption of this tool in the production planning context are discussed. It is hoped that this study will lay the foundation for better production planning methodologies for ETO industries, which might subsequently result in decision making driven by data rather than instinct.
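A minimal sketch of a regression-based man-hour estimate in this spirit is shown below; the predictor columns (power rating, number of windings, core mass) and the synthetic data are invented for illustration and are not from the case company.

```python
# Minimal sketch of estimating man-hours with linear regression (illustrative data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 120
power_mva   = rng.uniform(5, 100, n)          # hypothetical transformer rating
windings    = rng.integers(2, 6, n)           # hypothetical winding count
core_tonnes = rng.uniform(10, 80, n)          # hypothetical core mass
man_hours   = 40 + 6 * power_mva + 55 * windings + 3 * core_tonnes + rng.normal(0, 60, n)

X = np.column_stack([power_mva, windings, core_tonnes])
X_tr, X_te, y_tr, y_te = train_test_split(X, man_hours, random_state=0)

model = LinearRegression().fit(X_tr, y_tr)
pred = model.predict(X_te)
print("MAE (hours):", round(mean_absolute_error(y_te, pred), 1))
print("coefficients:", np.round(model.coef_, 1))
```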
270
Comparing Fountas and Pinnell's Reading Levels to Reading Scores on the Criterion Referenced Competency Test. Walker, Shunda F. 01 January 2016.
Reading competency is related to individuals' success at school and in their careers. Students who experience significant problems with reading may be at risk of long-term academic and social problems. High-quality measures that determine student progress toward curricular goals are needed for early identification and interventions to improve reading abilities and ultimately prevent subsequent failure in reading. The purpose of this quantitative, nonexperimental, ex post facto research study was to determine whether a correlation existed between student achievement scores on the Fountas and Pinnell Reading Benchmark Assessment and reading comprehension scores on the Criterion Referenced Competency Test (CRCT). Item response theory served as the conceptual framework for examining whether a relationship exists between Fountas and Pinnell Benchmark Instructional Reading Levels and the CRCT reading comprehension scores of students in Grades 3, 4, and 5 in the 2013-2014 school year. Archival data for 329 students in Grades 3-5 were collected and analyzed through Spearman's rank-order correlation. The results showed positive relationships between the scores. The findings promote positive social change by supporting the use of benchmark assessment data to identify at-risk readers early.
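For illustration, the sketch below runs the same kind of Spearman rank-order analysis on made-up benchmark levels and test scores rather than the study's archival data.

```python
# Minimal sketch of a Spearman rank-order correlation analysis (synthetic data).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_students = 329
benchmark_level = rng.integers(1, 27, n_students)               # ordinal reading level
crct_score = 780 + 4 * benchmark_level + rng.normal(0, 25, n_students)

rho, p_value = spearmanr(benchmark_level, crct_score)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")
```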