61 |
What Attracts Students To A Small, Private University?
Schumacher, Ronald M., Jr, 17 December 2015 (has links)
No description available.
|
62 |
Automated Detection and Prediction of Sleep Apnea Events
Shewinvanakitkul, Prapan, 05 June 2017 (has links)
No description available.
|
63 |
Human Inspection Variability in Infrastructure Asset Management: A Focus on HVAC Systems
Pratt, Clayton Michael, 05 January 2024 (has links)
Human inspection is a pivotal component of infrastructure asset management within a systems thinking approach to civil engineering. Skilled inspectors are tasked with the evaluation of various civil infrastructure components, conducting assessments of their conditions, identifying maintenance needs, and determining necessary repairs. Despite the growing interest in advanced technologies and automated inspections, the use of human-in-the-loop procedures is still widely practiced. Humans are susceptible to cognitive bias, variability, or uncertainty when inspecting infrastructure, and finding solutions to reduce these factors is paramount.
This study presents a comprehensive exploration of inspection variability within infrastructure asset management, drawing insights from datasets of the BUILDER Sustainment Management System (SMS) program. The research delves into infrastructure inventory, inspector data, and inspection data components of an asset management database, shedding light on variability in human inspection. Variations in inspection ratings revealed significant concerns, particularly in Mechanical, Electrical, and Plumbing (MEP) systems, with notable disparities between inspection ratings and condition ratings. Inspector variability analysis, through Coefficient of Variation calculations, indicated substantial disparities within and among inspectors. Further analysis, including Tukey's HSD test, pinpointed significant variability in heating, ventilation, and air conditioning (HVAC) and Fire Protection system inspections.
Moreover, this study addresses the specific challenge of reducing inspection uncertainty in HVAC systems. HVAC systems play a critical role in facility energy consumption, and their maintenance is vital to energy efficiency and occupant comfort. However, HVAC-specific inspections primarily require human involvement, making them time-consuming and prone to error. Addressing the challenges surrounding human inspection of HVAC systems, this research presents a multifaceted approach to reduce variability. Drawing from a review of existing literature on HVAC inspection uncertainty, this study extends its focus to the development of predictive models. These models considered parameters including inspection ratings, age-based obsolescence, section condition indices, component characteristics, and unique inspectors. Utilizing Linear Regression, Random Forest, and Gradient Boosting Regression, these models accurately predicted Variability Ratings, signifying their potential for implementation as a decision support tool. Importantly, the findings highlight the need not only to understand the factors affecting HVAC inspection variability but also to actively implement technological solutions that can reduce human error and variability in inspections. / Master of Science / Infrastructure inspection is crucial for maintaining buildings and facilities, but it often comes with human errors and uncertainties. This study looks at the inspection process, focusing on case studies and data from the BUILDER Sustainment Management System (SMS) program. It reveals that inspectors sometimes evaluate the condition of parts of a building differently, leading to inconsistencies and poor overall management.
One significant area of concern is heating, ventilation, and air conditioning (HVAC) systems. These systems play a critical role in facility energy use and can be challenging to inspect accurately. Previous research has shown that work experience, training, education, and other factors tend to contribute to variability in how inspectors assess HVAC systems.
This research not only highlights these issues but also develops predictive models to reduce the variability of HVAC inspections. By doing so, infrastructure can be managed more consistently, ultimately improving building lifecycles.
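The Coefficient of Variation screening described in this abstract can be sketched in a few lines; the ratings below are hypothetical, not data from the BUILDER SMS study:

```python
from statistics import mean, stdev

def coefficient_of_variation(ratings):
    """CV = sample standard deviation divided by the mean; a unitless
    measure of how consistent one inspector's ratings are."""
    return stdev(ratings) / mean(ratings)

# Hypothetical condition ratings (0-100 scale) from two inspectors
# assessing the same set of HVAC components.
inspector_a = [85, 82, 88, 84, 86]
inspector_b = [90, 60, 75, 95, 55]

cv_a = coefficient_of_variation(inspector_a)  # ~0.026: consistent rater
cv_b = coefficient_of_variation(inspector_b)  # ~0.236: highly variable rater
```

A follow-up test such as Tukey's HSD (e.g. `scipy.stats.tukey_hsd`) can then locate which inspector pairs differ significantly, as the study's variability analysis does.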
|
64 |
Using a connectome-based model of working memory to predict emotion regulation in older adults
Fisher, Megan E., 25 July 2022 (has links)
No description available.
|
65 |
A Predictive Time-to-Event Modeling Approach with Longitudinal Measurements and Missing Data
Zhu, Lili, January 2019 (has links)
An important practical problem in survival analysis is predicting the time to a future event, such as the death or failure of a subject. Investigating how the predictor variables, including repeated measurements on the same subjects, affect the future time-to-event is of great importance for medical decision making. Such a prediction problem is particularly challenging because the future values of the predictor variables are unknown and may vary dynamically over time. In this dissertation, we consider a predictive approach based on modeling the forward intensity function. To handle the practical difficulty of missing data in longitudinal measurements, and to accommodate observations at irregularly spaced time points, we propose a smoothed composite likelihood approach for estimation. The forward intensity function intrinsically incorporates the future dynamics of the predictor variables that affect the stochastic occurrence of the future event. The proposed framework is thus advantageous and parsimonious, requiring no separate modeling step for the stochastic mechanism of the predictor variables. Our theoretical analysis establishes the validity of the forward intensity modeling approach and the smoothed composite likelihood method. To model the parameters as continuous functions of time, we introduce the penalized B-spline method into the proposed approach. Extensive simulations and real-data analyses demonstrate the promising performance of the proposed predictive approach. / Statistics
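In general terms (this is the generic definition, not necessarily the dissertation's exact notation), a forward intensity evaluated at prediction time t for horizon \tau, and the survival probability it implies, can be written as:

```latex
% Forward intensity at horizon \tau, conditioning only on the
% information \mathcal{F}_t available at the prediction time t:
f_t(\tau) \;=\; \lim_{\Delta \downarrow 0} \frac{1}{\Delta}\,
  P\!\left( t+\tau \le T < t+\tau+\Delta \,\middle|\, T \ge t+\tau,\ \mathcal{F}_t \right),

% so the predicted probability of surviving beyond t+\tau is
S_t(\tau) \;=\; \exp\!\left( -\int_0^{\tau} f_t(u)\, du \right).
```

Because f_t(\cdot) conditions only on information available at time t, predictions need no separate model for the future path of the covariates, which is the parsimony the abstract refers to.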
|
66 |
Analytics Models for Corporate Social Responsibility in Global Supply Chains
Habboubi, Sameh, 12 March 2019 (has links)
There have been several infamous incidents in which world-renowned corporations were caught by surprise when a low-tier downstream supplier was publicly found to be non-compliant with basic corporate social responsibility (CSR) codes. In such instances, the company's reputation, and consequently its financial health, suffers greatly. Motivated by advances in predictive modeling, we present a predictive analytics model for detecting possible supplier deviations before they become a corporate liability. The model is built on publicly available data such as news and online content. We apply text mining and machine learning tools to design a corporate social responsibility "early warning system" for the upstream side of the supply chain. Our literature review found a lack of studies that focus on the social aspect of sustainability.
Our research helps fill this gap by providing performance measures that can be used to build prescriptive analytics models to support supplier selection. To this end, we use the output of the predictive model to create a supplier selection optimization model that accounts for CSR compliance in a global supply chain context. We propose a heuristic to solve the problem and computationally study its effectiveness, as well as the impact of introducing CSR on procurement costs and on ordering and supplier selection patterns. Our models give companies analytics tools to detect deviant supplier behaviour and act on it so as to contain its impact and the possible disruptions that could shake the whole supply chain. / Thesis / Master of Science (MSc)
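A minimal sketch of how a CSR-aware selection rule might look; the suppliers, costs, and risk scores are invented for illustration, and the thesis's actual optimization model and heuristic are more involved:

```python
# Hypothetical supplier records: (name, unit_cost, predicted_csr_risk in [0, 1]),
# where the risk score would come from a text-mining early warning system.
suppliers = [
    ("S1", 10.0, 0.9),
    ("S2", 12.0, 0.2),
    ("S3", 11.0, 0.4),
    ("S4", 15.0, 0.1),
]

def select_suppliers(suppliers, needed, max_risk):
    """Greedy heuristic: scan suppliers in order of increasing cost and
    skip any whose predicted CSR risk exceeds the tolerance."""
    chosen = []
    for name, cost, risk in sorted(suppliers, key=lambda s: s[1]):
        if risk <= max_risk:
            chosen.append(name)
        if len(chosen) == needed:
            break
    return chosen

# With a strict CSR tolerance, the cheapest supplier S1 is passed over.
selected = select_suppliers(suppliers, needed=2, max_risk=0.5)  # ['S3', 'S2']
```

Relaxing `max_risk` to 1.0 brings S1 back into the solution, which is exactly the cost-versus-CSR trade-off the computational study examines.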
|
67 |
Addressing Challenges of Modern News Agencies via Predictive Modeling, Deep Learning, and Transfer Learning
Keneshloo, Yaser, 22 July 2019 (has links)
Today's news agencies are moving from traditional journalism, where publishing just a few news articles per day was sufficient, to modern content generation mechanisms that create thousands of news pieces every day.
With the growth of these modern news agencies comes the arduous task of properly handling this massive amount of data that is generated for each news article.
Therefore, news agencies are constantly seeking solutions to facilitate and automate some of the tasks that have been previously done by humans.
In this dissertation, we focus on two broad problems and provide solutions that help a news agency not only gain a wider view of reader behaviour around an article but also give editors automated tools to ease the job of summarizing news articles.
These two disjoint problems aim at improving the users' reading experience by helping content generators monitor and focus on poorly performing content while allowing them to promote the well-performing pieces.
We first focus on the task of popularity prediction of news articles via a combination of regression, classification, and clustering models.
We next focus on the problem of generating automated text summaries for a long news article using deep learning models.
The first problem aims at helping the content developer understand how a news article performs over the long run, while the second provides automated tools for content developers to generate summaries for each news article. / Doctor of Philosophy / Nowadays, each person is exposed to an immense amount of information from social media, blog posts, and online news portals. Among these sources, news agencies are one of the main content providers for people around the world. Contemporary news agencies are moving from traditional journalism to modern techniques from different angles, either by building smart tools to track readers' reactions around a specific news article or by providing automated tools that help editors deliver higher-quality content to readers. These systems should not only scale well with the growth of readership but also handle ad-hoc requests quickly, since most policies and decisions in these agencies are made based on the results of these analytical tools. As part of this movement towards adopting new technologies for smart journalism, we have worked with The Washington Post on various problems: building tools for predicting the popularity of a news article and an automated text summarization model. We develop a model that monitors each news article after its publication and predicts the number of views the article will receive within the next 24 hours. This model helps content creators not only promote potentially viral articles on the main page of the web portal or on social media, but also provides intuition for editors on potentially poorly performing articles so that they can edit the content of those articles for better exposure.
On the other hand, current news agencies generate more than a thousand news articles per day, and writing three to four summary sentences for each of these news pieces is not only becoming infeasible but is also very expensive and time-consuming. Therefore, we also develop a separate model for automated text summarization that generates summary sentences for a news article. Our model generates summaries by selecting the most salient sentences in the news article and paraphrasing them into shorter sentences that can serve as a summary of the entire document.
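A toy version of the selection step (score sentences by word frequency and keep the most salient); the deep-learning paraphrasing stage the dissertation builds is well beyond a sketch like this:

```python
import re
from collections import Counter

def most_salient_sentences(text, n=1):
    """Score each sentence by the summed document-wide frequency of its
    words and return the n highest-scoring sentences in document order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w] for w in re.findall(r"[a-z']+", sentences[i].lower())),
    )
    return " ".join(sentences[i] for i in sorted(ranked[:n]))

article = ("The market fell sharply. "
           "The market fell after the market report. "
           "Unrelated filler text follows.")
summary = most_salient_sentences(article)  # picks the second sentence
```

Frequency scoring favors sentences that reuse the document's dominant vocabulary, which is one classic proxy for salience in extractive summarization.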
|
68 |
Risk assessment of severe weather events affecting animal production facilities
Hassan Swamy Gowda, Impana, 13 August 2024 (has links) (PDF)
Our research primarily focuses on the risk assessment and vulnerabilities of commercial broiler farms/houses due to severe weather events. The area of focus is counties within a 60-mile radius of broiler processing plants, as most broiler farms/houses are centrally located around these plants. Using data from the National Oceanic and Atmospheric Administration (NOAA), we calculate a risk index to evaluate the vulnerability of these counties. This study incorporates historical weather event data to identify patterns and predict future risks, providing valuable insights for disaster preparedness and mitigation strategies. Through a detailed analysis of various severe weather events, including tornadoes, hurricanes, floods, and lightning, we aim to highlight the specific vulnerabilities of broiler farms/houses and propose targeted strategies to enhance their resilience against these climatic threats. By focusing on high-risk areas, we ensure that our findings are both relevant and actionable, enabling stakeholders to implement effective risk management practices. This research fills an important gap in the existing literature and contributes significantly to the development of robust predictive models and comprehensive risk assessments for the poultry industry.
Key Words: broiler production plants, severe weather events, risk assessment, NOAA data, vulnerability analysis, predictive modeling, climatic threats, resilience enhancement, machine learning models, Bayesian risk, geospatial analysis, hypothesis testing, nowcasting.
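One simple form such a county risk index can take is a severity-weighted sum of historical event counts; the weights and counts below are illustrative placeholders, not the study's calibrated values or actual NOAA records:

```python
# Illustrative severity weights per event type (not NOAA-derived values).
weights = {"tornado": 3.0, "hurricane": 3.0, "flood": 2.0, "lightning": 1.0}

# Hypothetical historical event counts for two counties near a processing plant.
county_events = {
    "County A": {"tornado": 4, "flood": 2, "lightning": 10},
    "County B": {"hurricane": 1, "lightning": 3},
}

def risk_index(events, weights):
    """Severity-weighted count of severe weather events for one county."""
    return sum(weights.get(kind, 0.0) * n for kind, n in events.items())

scores = {county: risk_index(ev, weights) for county, ev in county_events.items()}
# County A scores 26.0 versus 6.0 for County B, so it ranks as higher risk.
```

Ranking counties by such a score is one way to direct preparedness resources toward the highest-exposure areas around each plant.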
|
69 |
Predicting Post-School Outcomes of Transition Aged Students With High Incidence Disabilities
Brewer, Jacob, 01 January 2024 (links) (PDF)
This dissertation explores predictive modeling to forecast post-school outcomes for transition-aged students with high-incidence disabilities, such as Specific Learning Disabilities (SLD) and Other Health Impairments (OHI). Transitioning from secondary to post-secondary environments is critical for students with disabilities, significantly impacting their independence, self-confidence, and employability (Morningstar et al., 2017; Carter et al., 2021). Through a mixed-methods approach, this study integrates Office of Special Education Programs (OSEP) indicators to determine which factors most effectively predict post-secondary enrollment and graduation. Key variables such as parental involvement, socio-economic status (SES), and self-determination skills are analyzed using logistic regression and machine learning techniques, including neural networks, decision trees, and Naive Bayes models (Raschka & Mirjalili, 2019; Chan et al., 2023).
The findings indicate that parental involvement, SES, and self-determination are significant predictors of post-school success (Sirin, 2005; Anders et al., 2020). Additionally, machine learning models outperform traditional methods in terms of accuracy and precision, providing nuanced insights into student trajectories (Raschka & Mirjalili, 2019). The research underscores the importance of early identification and customized interventions tailored to the specific needs of students with disabilities, emphasizing the need for stronger family-school partnerships and targeted policies to address socio-economic disparities (Kearns et al., 2020; Schmidt et al., 2020). These insights offer educators and policymakers evidence-based strategies to improve educational and vocational outcomes for students with disabilities, facilitating smoother transitions into adulthood.
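The logistic-regression baseline named in this abstract can be sketched with plain gradient descent; the predictors and outcomes below are toy values standing in for the OSEP-derived variables, not study data:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Batch gradient descent for logistic regression (weights + intercept)."""
    w, b, n = [0.0] * len(X[0]), 0.0, len(X)
    for _ in range(epochs):
        gw, gb = [0.0] * len(w), 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            gw = [gj + err * xj for gj, xj in zip(gw, xi)]
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# Toy rows: [parental_involvement, self_determination] scaled to [0, 1];
# outcome 1 = enrolled in a post-secondary program. Values are invented.
X = [[0.9, 0.8], [0.8, 0.9], [0.7, 0.7], [0.2, 0.3], [0.1, 0.2], [0.3, 0.1]]
y = [1, 1, 1, 0, 0, 0]
w, b = fit_logistic(X, y)

def predict(x):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)
# High involvement and self-determination map to a high predicted probability.
```

The neural network, decision tree, and Naive Bayes models the study compares against would be fit to the same feature rows.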
|
70 |
MRI functional connectivity-based predictive models of brain organization and cognitive state for healthy and clinical populations
Tripathi, Vaibhav, 05 February 2025 (has links)
Connectome Fingerprinting (CF) and Connectome-Based Predictive Modeling (CPM) are emerging data science approaches within cognitive neuroscience that utilize brain connectivity data to predict the functional organization of the brain and behavior. Here, I examined the robustness and data needs of these approaches and developed applications for specific clinical populations, including neurosurgical glioma patients and early-stage Alzheimer's Disease (AD) patients. In the first study, I utilized the Human Connectome Project dataset (n=208) to understand the relationship between function and connectivity using CF across cognitive tasks and the functional connectivity paradigms of fixation and movie-watching. I benchmarked the CF method to establish the quantity and quality of connectivity and task activation data required to build robust models across different cortical regions, cognitive tasks, scan qualities, learning algorithms, and scanners. I also modeled the link between cerebellar-cortical connectivity and function, suggesting an integrated role of the cerebellum across cognitive domains.
Neurosurgeons routinely use task fMRI protocols to map out motor and language networks to aid in presurgical planning for brain tumor resection to minimize cognitive damage. In the second study, I investigated the possibility of using CF models to help neurosurgeons predict motor and language networks in presurgical patients with gliomas (n=16), who are unable to perform complex tasks in the scanner. I tested the model validity across healthy control adults (n=16) from different scanners with varying data quality and quantity.
The deposition of tau and amyloid-β plaques in AD results in brain degeneration, memory loss, and behavioral changes, and resting-state functional connectivity is altered as AD progresses. In the third study, I analyzed the relationship between functional connectivity and tau/amyloid deposition and tested the predictive ability of CPM across two cohorts: an autosomal dominant Presenilin1-mutation cohort with early-onset AD from the Colombia-Boston study (n=32) and a late-onset sporadic AD cohort with the APOE4 marker from the Harvard Aging Brain Study (n=78), comparing them with healthy age-matched adults (n=35/n=206) and healthy young adults (n=1570) from the Genomic Superstruct Project.
Combined, these studies highlight the potential of data-driven approaches to model human brain function and connectivity, facilitating fMRI's translational applications.
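In its standard form, CPM selects connectivity edges whose strength correlates with a behavioral score across subjects and summarizes each subject by the retained edges; a minimal positive-network sketch with fabricated numbers (the dissertation's pipelines are far richer):

```python
from statistics import fmean

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length samples."""
    mx, my = fmean(xs), fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def cpm_positive_network(conn, behavior, r_thresh=0.3):
    """conn: one row per subject, one column per connectivity edge.
    Keep edges correlating with behavior above r_thresh; each subject's
    network score is the sum of their retained edge strengths."""
    n_edges = len(conn[0])
    selected = [j for j in range(n_edges)
                if pearson_r([row[j] for row in conn], behavior) > r_thresh]
    return [sum(row[j] for j in selected) for row in conn], selected

# Fabricated data: edge 0 tracks the behavioral score, edge 1 runs against it.
behavior = [1.0, 2.0, 3.0, 4.0, 5.0]
conn = [[1.1, 5.0], [2.0, 4.0], [2.9, 3.0], [4.2, 2.0], [5.0, 1.0]]
scores, selected = cpm_positive_network(conn, behavior)  # retains only edge 0
```

In practice the network scores feed a cross-validated linear model, and a parallel negative network collects edges with r below -r_thresh.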
|