41. Evaluating Program Diversity and the Probability of Gifted Identification Using the Torrance Test of Creative Thinking
Lee, Lindsay Eryn, 08 1900
Multiple criteria systems are recommended as best practice to identify culturally, linguistically, and economically diverse students for gifted services, and schools often incorporate measures of creativity into these systems. However, the role of creativity in identification systems and its effect on the recruitment of diverse student populations is unclear. The Torrance Test of Creative Thinking (TTCT) is the most widely used norm-referenced creativity test in gifted identification. Although commonly used for identifying talent, little is known about how composite scores on the TTCT-Figural vary across student demographics (i.e., race/ethnicity, sex, socioeconomic status, English language learning status). This study evaluated student demographic subgroup differences that exist after the initial phase of an identification process (i.e., universal screening, referrals) and examined the relationship of student demographics (i.e., race/ethnicity, free/reduced lunch status, English language learning status, sex), cognitive ability, academic achievement, and creativity, as measured by the TTCT-Figural Form A or B, to the probability of being identified for gifted programs. In a midsized school district in the state of Texas, findings indicate several demographic differences among students who were referred or universally screened across the measures of cognitive ability, academic achievement, and creativity. These differences, however, were smaller on the TTCT-Figural. Results of a hierarchical generalized linear regression indicate that underrepresented groups showed no difference in the probability of being identified after controlling for measures of cognitive ability, academic achievement, and creativity, although cognitive ability and academic achievement tests were more predictive of identification than the TTCT-Figural. Implications and recommendations for future research are discussed.
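The probability model described here is, at its core, a logistic (generalized linear) regression on identification status with demographic, cognitive, achievement, and creativity predictors entered in blocks. The sketch below is a minimal illustration of that idea using statsmodels; the file, the variable names (identified, frl, ell, cogat, achievement, ttct), and the blockwise entry are assumptions, not the study's actual specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical screening data: one row per referred or universally screened student.
df = pd.read_csv("screening_data.csv")  # assumed file and column names

# Block 1: demographics only.
m1 = smf.logit("identified ~ C(race) + sex + frl + ell", data=df).fit()

# Block 2: add cognitive ability, academic achievement, and TTCT-Figural scores.
m2 = smf.logit(
    "identified ~ C(race) + sex + frl + ell + cogat + achievement + ttct",
    data=df,
).fit()

print(m2.summary())
# Comparing the demographic coefficients across the two blocks shows whether
# subgroup gaps in identification probability persist once ability,
# achievement, and creativity are controlled for.
```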
42. Knotilus: A Differentiable Piecewise Linear Regression Framework
Gormley, Nolan D., 27 May 2021
No description available.
43. A Computer Program for Survival Comparisons to a Standard Population
Moon, Steven Y., Woolson, Robert F., Bean, Judy A., 01 January 1979
PROPHAZ is a computer program created for the analysis of survival data using the general proportional hazards model. It was designed specifically for the situation in which the underlying hazard function may be estimated from the mortality experience of a large reference population, but it may be used for other problems as well. Input for the program includes the variables of interest as well as the information necessary for estimating the hazard function (demographic and mortality data). Regression coefficients for the variables of interest are obtained iteratively using the Newton-Raphson method. Using large-sample asymptotic theory, χ² statistics are derived that may be used to test hypotheses of the form Cβ = 0. Input format is completely flexible for the variables of interest as well as the mortality data.
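For readers unfamiliar with the notation, the general proportional hazards model and the Wald-type test of Cβ = 0 referred to above take the following standard forms; this is the textbook formulation, not an equation or output taken from PROPHAZ itself.

```latex
% Proportional hazards model with baseline hazard h_0(t), here estimated from
% the reference population, and covariate vector x_i for subject i:
\[
  h_i(t \mid x_i) = h_0(t)\,\exp\!\left(x_i^{\top}\beta\right)
\]
% Wald-type chi-square statistic for the linear hypothesis C\beta = 0, where
% \hat{I}(\hat\beta) is the observed information obtained from Newton-Raphson:
\[
  W = (C\hat{\beta})^{\top}
      \left[\, C\,\hat{I}(\hat{\beta})^{-1} C^{\top} \right]^{-1}
      (C\hat{\beta})
  \;\sim\; \chi^{2}_{\operatorname{rank}(C)}
  \quad \text{under } H_0 : C\beta = 0 .
\]
```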
44. Stock market estimation: Using Linear Regression and Random Forest
Kastberg, Daniel, January 2022
Stock market speculation is captivating to many people; millions of people worldwide buy and sell stocks in the hope of turning a profit. This thesis asks whether machine learning, in the form of Random Forest or Linear Regression, can estimate which direction the trend of the stock market is heading, and whether Random Forest outperforms Linear Regression given its more complex methods. To explore the subject, several stocks from Nasdaq and the Swedish OMX index are studied and used to evaluate the machine learning models. The data was transformed into percentage changes to accommodate Random Forest's inability to extrapolate, and the return on investment in percent was chosen as the dependent variable. Without technical-analysis features both models performed poorly, but when RSI 14, EMA 10, and SMA 10 were added, both models proved significant, with Random Forest the superior of the two. Hyperparameter optimization was applied to Random Forest to evaluate whether its advantage over Linear Regression could be made even clearer, but it only produced an improvement on half of the datasets, which made the result inconclusive. This thesis adds to the existing literature on predicting stock prices and explores the difference between Random Forest and Linear Regression to see if there are any obvious differences in their ability to estimate the direction of a stock's price in the near future.
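A minimal sketch of the kind of pipeline described above is shown below: compute SMA 10, EMA 10, and RSI 14 from closing prices, target a forward percentage return, and compare Linear Regression against Random Forest. The file and column names, the five-day forecast horizon, and the simple-moving-average variant of RSI are assumptions; the thesis's exact feature construction and evaluation may differ.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

prices = pd.read_csv("omx_stock.csv", parse_dates=["Date"])  # assumed file with a 'Close' column
close = prices["Close"]

# Technical-analysis features.
prices["sma10"] = close.rolling(10).mean()
prices["ema10"] = close.ewm(span=10, adjust=False).mean()

delta = close.diff()
gain = delta.clip(lower=0).rolling(14).mean()
loss = (-delta.clip(upper=0)).rolling(14).mean()
prices["rsi14"] = 100 - 100 / (1 + gain / loss)

# Target: forward 5-day return in percent (assumed horizon).
prices["fwd_return"] = close.pct_change(5).shift(-5) * 100

data = prices.dropna()
X, y = data[["sma10", "ema10", "rsi14"]], data["fwd_return"]

# Chronological split so both models are evaluated on later, unseen data.
split = int(len(data) * 0.8)
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

for model in (LinearRegression(), RandomForestRegressor(n_estimators=200, random_state=0)):
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    hit_rate = ((pred > 0) == (y_test > 0)).mean()  # directional accuracy
    print(type(model).__name__, round(r2_score(y_test, pred), 3), round(hit_rate, 3))
```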
45. Leveraging Overtime Hours to Fit an Additional Arthroplasty Surgery Per Day: A Feasibility Study
Khalaf, Georges, 30 June 2023
The COVID-19 pandemic resulted in the cancellation of many hip and knee replacements, creating a backlog of patients on top of an already long waiting list. To reduce wait lists at no additional cost, we aim to evaluate the possibility of leveraging our previous efficiency-improving work to add an additional case to a typical 4-joint day. To do this, 761 total operating days from 2012 to 2019 were analyzed, capturing variables such as case number, success (completion of 4 cases before 3:45 pm), and patient out-of-room time. Linear regression was used on the 301 successful days to predict fifth cases, while overtime hours saved were calculated from the remaining unsuccessful days. Different cost distributions were then analyzed for a 77% 4-joint-day success rate (our baseline) and a 100% success rate. Our predictions show that increasing performance to a 77% success rate can lead to approximately 35 extra cases per year at our institution, while a 100% success rate can produce 56 extra cases per year. Overall, this shows the extent of resources consumed by overtime costs and the potential for redirecting them toward reducing wait times. Future work can explore optimal staffing procedures to account for these extra cases.
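A simple illustration of the prediction step described above: fit a linear regression of daily patient out-of-room time on case count using the successful days, then project how long a five-case day would run and compare it against the regular-day cutoff plus recovered overtime. The file and column names, the assumed start time, and the conversion of the 3:45 pm cutoff into minutes are illustrative assumptions, not the study's actual variables.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

days = pd.read_csv("or_days.csv")  # assumed columns: n_cases, out_of_room_min, success
ok = days[days["success"] == 1]    # the 301 successful 4-joint days

# Regress total out-of-room time (minutes after first patient in) on case count.
reg = LinearRegression().fit(ok[["n_cases"]], ok["out_of_room_min"])

# Projected duration of a 5-case day.
five_case_min = reg.predict(pd.DataFrame({"n_cases": [5]}))[0]
cutoff_min = (15 * 60 + 45) - (7 * 60 + 45)  # assumed 7:45 am start, 3:45 pm cutoff = 480 min

print(f"Projected 5-case day length: {five_case_min:.0f} min (cutoff {cutoff_min} min)")
# Overtime minutes recovered on unsuccessful days can then be pooled and compared
# against the extra minutes a fifth case requires to estimate extra cases per year.
```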
46. Assessment and Modeling of Indoor Air Quality
Green, Christopher Frank, 15 September 2002
No description available.
47. A Model to Predict Student Matriculation from Admissions Data
Khajuria, Saket, 20 April 2007
No description available.
48. Using data analytics and laboratory experiments to advance the understanding of reservoir rock properties
Li, Zihao, 01 February 2019
Conventional and unconventional reservoirs are both critical in oilfield development. After waterflooding treatments spanning decades, the petrophysical properties of a conventional reservoir may change in many respects. It is crucial to identify the variations in these petrophysical properties after long-term waterflooding, at both the pore and core scales. For unconventional reservoirs, the productivity and performance of hydraulic fracturing in shales are challenging because of the complicated petrophysical properties. The confining pressure imposed on a shale formation has a tremendous impact on the permeability of the rock, and the correlation between confining pressure and rock permeability is complicated and might be nonlinear. In this thesis, a series of laboratory tests was conducted on core samples extracted from four U.S. shale formations to measure their petrophysical properties. In addition, a specialized 2D microfluidic device that simulates the pore structure of a sandstone formation was developed to investigate the influence of injection flow rate on the development of high-permeability flow channels. Moreover, a multiple linear regression (MLR) model with predictors based on the development stages was applied to quantify the variations in reservoir petrophysical properties. The MLR model outcome indicated that certain variables were effectively correlated with permeability. The 2D microfluidic model demonstrated the development of viscous fingering when the injection water flow rate was higher than a certain level, which resulted in reduced overall sweep efficiency. These comprehensive laboratory experiments demonstrate the role of confining pressure, the Klinkenberg effect, and bedding plane direction on gas flow in the nanoscale pore space of shales.

Master of Science

Conventional and unconventional hydrocarbon reservoirs are both important in oil and gas development. Waterflooding, a widely used enhanced oil recovery method, is the injection of water into a petroleum reservoir to increase reservoir pressure and displace residual oil. However, several decades of waterflooding can change many properties of a conventional reservoir. To optimize subsequent oilfield development plans, it is essential to identify the variations in these properties after long-term waterflooding, at both the pore and core scales. In unconventional reservoirs, hydraulic fracturing has been widely used to produce hydrocarbon resources from shale or other tight rocks at an economically viable production rate. The operation of hydraulic fracturing in shales is challenging because of the complicated reservoir pressure. The external pressure imposed on a shale formation has a tremendous impact on the permeability of the rock, and the correlation between pressure and rock permeability is intricate. In this thesis, a series of laboratory tests was conducted on core samples to measure their properties and the effect of pressure. Moreover, a statistical model was applied to quantify the variations in reservoir properties. The results indicated that certain reservoir properties were effectively correlated with permeability. These comprehensive investigations demonstrate the role of pressure, special gas-flow effects, and rock bedding direction in gas flow through the extremely small pores in shales.
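For context, the Klinkenberg effect mentioned above is usually expressed through the standard slip correction relating apparent gas permeability to the intrinsic (liquid-equivalent) permeability. The relation below is the textbook form, not an equation taken from the thesis itself.

```latex
% Klinkenberg slip correction: the measured (apparent) gas permeability k_g
% exceeds the intrinsic permeability k_inf at low mean pore pressure.
\[
  k_g = k_\infty \left( 1 + \frac{b}{\bar{p}} \right),
\]
% where \bar{p} is the mean pore pressure during the measurement and b is the
% Klinkenberg slip factor, which depends on the gas and the pore structure.
% Plotting k_g against 1/\bar{p} and extrapolating to 1/\bar{p} \to 0 gives k_\infty.
```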
49. Improving Turbidity-Based Estimates of Suspended Sediment Concentrations and Loads
Jastram, John Dietrich, 12 June 2007
As the impacts of human activities increase sediment transport by aquatic systems, the need to quantify this transport accurately becomes paramount. Turbidity is recognized as an effective tool for monitoring suspended sediments in aquatic systems, and with recent technological advances turbidity can be measured in situ remotely, continuously, and at much finer temporal scales than was previously possible. Although turbidity provides an improved method for estimating suspended-sediment concentration (SSC) compared to traditional discharge-based methods, there is still significant variability in turbidity-based SSC estimates and in sediment loadings calculated from those estimates. The purpose of this study was to improve the turbidity-based estimation of SSC. At two monitoring sites on the Roanoke River in southwestern Virginia, stage, turbidity, and other water-quality parameters were monitored with in-situ instrumentation; suspended sediments were sampled manually during elevated turbidity events and analyzed for SSC and physical properties; and rainfall was quantified by geologic source area. The study identified physical properties of the suspended-sediment samples that contribute to SSC-estimation variance and hydrologic variables that contribute to variance in those physical properties. Results indicated that including any of the measured physical properties (grain-size distributions, specific surface area, and organic carbon) in turbidity-based SSC estimation models reduces unexplained variance. Further, using hydrologic variables, which were measured remotely and on the same temporal scale as turbidity, to represent these physical properties resulted in a model that was equally capable of predicting SSC. A square-root transformed turbidity-based SSC estimation model developed for the Roanoke River at Route 117 monitoring station, which included a water-level variable, provided 63% less unexplained variance in SSC estimates and 50% narrower 95% prediction intervals for an annual loading estimate when compared to a simple linear regression using a logarithmic transformation of the response and regressor (turbidity). Unexplained variance and prediction-interval width were also reduced using this approach at a second monitoring site, Roanoke River at Thirteenth Street Bridge, although the log-based transformation of SSC and regressors was found to be most appropriate at that station. Furthermore, this study demonstrated the potential for a single model, generated from a pooled set of data from the two monitoring sites, to estimate SSC with less variance than a model generated from data collected at a single site. When applied at suitable locations, this pooled-model approach could provide many benefits to monitoring programs, such as developing SSC-estimation models for multiple sites that individually do not have enough data to generate a robust model, extending the model to monitoring sites between those for which it was developed, and significantly reducing sampling costs for intensive monitoring programs.

Master of Science
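The two model forms being compared can be written out as follows. The exact set of regressors in the final models may differ from what is shown, so treat this as an illustrative reconstruction of the square-root multiple-regression model versus the log-log simple linear regression baseline.

```latex
% Square-root transformed multiple regression with an added water-level (stage) term:
\[
  \sqrt{\mathrm{SSC}} = \beta_0 + \beta_1 \sqrt{\mathrm{Turbidity}} + \beta_2\,\mathrm{Stage} + \varepsilon
\]
% Baseline simple linear regression on log-transformed response and regressor:
\[
  \log(\mathrm{SSC}) = \beta_0 + \beta_1 \log(\mathrm{Turbidity}) + \varepsilon
\]
% Back-transforming either model to concentration units generally requires a
% retransformation bias correction before SSC estimates are combined with
% discharge to compute loads.
```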
50. An Intrusion Detection System for Battery Exhaustion Attacks on Mobile Computers
Nash, Daniel Charles, 15 June 2005
Mobile personal computing devices continue to proliferate, and individuals' reliance on them for day-to-day needs necessitates that these platforms be secure. Mobile computers are subject to a unique form of denial-of-service attack known as a battery exhaustion attack, in which an attacker attempts to rapidly drain the battery of the device. Battery exhaustion attacks greatly reduce the utility of mobile devices by decreasing battery life. If steps are not taken to thwart these attacks, they have the potential to become as widespread as the attacks currently mounted against desktop systems.
This thesis presents steps in the design of an intrusion detection system for detecting these attacks, a system that takes into account the performance, energy, and memory constraints of mobile computing devices. This intrusion detection system uses several parameters, such as CPU load and disk accesses, to estimate the power consumption of two test systems using multiple linear regression models, allowing the energy used on a per-process basis to be found and thus identifying processes that are potentially battery exhaustion attacks.

Master of Science
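A minimal sketch of the regression step described above: fit a multiple linear regression of measured power draw on CPU load and disk accesses, attribute energy to individual processes through their share of those parameters, and flag outliers. The file and column names and the z-score flagging rule are illustrative assumptions, not the thesis's actual implementation.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Assumed training log: per-interval system samples with measured power draw in watts.
samples = pd.read_csv("power_samples.csv")  # columns: cpu_load, disk_accesses, power_w
reg = LinearRegression().fit(samples[["cpu_load", "disk_accesses"]], samples["power_w"])

# Assumed per-process accounting over a monitoring window.
procs = pd.read_csv("process_stats.csv")    # columns: pid, cpu_load, disk_accesses, seconds
procs["est_power_w"] = reg.predict(procs[["cpu_load", "disk_accesses"]])
procs["est_energy_j"] = procs["est_power_w"] * procs["seconds"]

# Flag processes whose estimated energy use is anomalously high (simple z-score rule).
z = (procs["est_energy_j"] - procs["est_energy_j"].mean()) / procs["est_energy_j"].std()
suspects = procs[z > 3]
print(suspects[["pid", "est_energy_j"]])
```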