  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Leveraging Overtime Hours to Fit an Additional Arthroplasty Surgery Per Day: A Feasibility Study

Khalaf, Georges 30 June 2023 (has links)
The COVID-19 pandemic resulted in the cancellation of many hip and knee replacements, adding a backlog of patients to an already long waiting list. To reduce wait lists without added financial burden, we aimed to evaluate whether our previous efficiency-improving work could be leveraged to add a fifth case to a typical 4-joint day at no extra cost. To do this, 761 total operation days from 2012 to 2019 were analyzed, capturing variables such as case number, success (completion of 4 cases before 3:45pm), and patient out-of-room time. Linear regression on the 301 successful days was used to predict the feasibility of fifth cases, while the overtime hours saved were calculated from the remaining unsuccessful days. Different cost distributions were then analyzed for a 77% 4-joint-day success rate (our baseline) and a 100% success rate. Our predictions show that performing at the 77% success rate could yield approximately 35 extra cases per year at our institution, while a 100% success rate could produce 56 extra cases per year. Overall, this shows the extent of resources consumed by overtime costs and their potential for reducing wait times. Future work can explore optimal staffing procedures to accommodate these extra cases.
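The extrapolation step described in the abstract can be sketched as a small ordinary-least-squares fit on one day's case times. The timings, the 7:30am start, and the overtime cutoff below are invented illustrations, not the study's data.

```python
# Hedged sketch: fit ordinary least squares to cumulative out-of-room times
# on a "successful" 4-joint day, then extrapolate to a hypothetical 5th case.
# All timings, the 7:30am start, and the cutoff are invented illustrations.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Minutes after a 7:30am start at which each case left the room.
case_numbers = [1, 2, 3, 4]
out_of_room = [110, 225, 345, 470]   # 4th case out around 3:20pm

a, b = fit_line(case_numbers, out_of_room)
predicted_fifth = a + b * 5          # extrapolated 5th-case finish, in minutes
cutoff = 600                         # 5:30pm, an assumed overtime limit
fits = predicted_fifth <= cutoff     # True: a 5th case would fit on this day
```

Aggregating this per-day feasibility check over all successful days would give the kind of annual extra-case counts the abstract reports.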
52

ASSESSMENT AND MODELING OF INDOOR AIR QUALITY

GREEN, CHRISTOPHER FRANK 15 September 2002 (has links)
No description available.
53

A Model to Predict Student Matriculation from Admissions Data

Khajuria, Saket 20 April 2007 (has links)
No description available.
54

Using data analytics and laboratory experiments to advance the understanding of reservoir rock properties

Li, Zihao 01 February 2019 (has links)
Conventional and unconventional reservoirs are both critical in oilfield development. After decades of waterflooding treatments, the petrophysical properties of a conventional reservoir may change in many respects, and it is crucial to identify these changes at both the pore and core scales. For unconventional reservoirs, predicting the productivity and performance of hydraulic fracturing in shales is challenging because of their complicated petrophysical properties. The confining pressure imposed on a shale formation has a tremendous impact on the permeability of the rock, and the correlation between confining pressure and permeability is complicated and may be nonlinear. In this thesis, a series of laboratory tests was conducted on core samples extracted from four U.S. shale formations to measure their petrophysical properties. In addition, a 2D microfluidic device that simulates the pore structure of a sandstone formation was developed to investigate the influence of injection flow rate on the development of high-permeability flow channels. Moreover, a multiple linear regression (MLR) model with predictors based on the development stages was applied to quantify the variations in reservoir petrophysical properties. The MLR model indicated that certain variables were effectively correlated with permeability. The 2D microfluidic model demonstrated the development of viscous fingering when the injection flow rate exceeded a certain level, which reduced overall sweep efficiency. These comprehensive laboratory experiments demonstrate the roles of confining pressure, the Klinkenberg effect, and bedding-plane direction in gas flow through the nanoscale pore space of shales. / Master of Science / Conventional and unconventional hydrocarbon reservoirs are both important in oil and gas development.
Waterflooding, the injection of water into a petroleum reservoir to increase reservoir pressure and displace residual oil, is a widely used enhanced oil recovery method. However, after several decades of waterflooding, many properties of a conventional reservoir may change. To optimize subsequent oilfield development plans, it is essential to identify these changes at both the pore and core scales. In unconventional reservoirs, hydraulic fracturing has been widely used to produce hydrocarbons from shale and other tight rocks at economically viable rates. Hydraulic fracturing in shales is challenging because of the complicated reservoir pressure. The external pressure imposed on a shale formation has a tremendous impact on the permeability of the rock, and the correlation between pressure and permeability is intricate. In this thesis, a series of laboratory tests was conducted on core samples to measure their properties under pressure. Moreover, a statistical model was applied to quantify the variations in reservoir properties. The results indicated that certain reservoir properties were effectively correlated with permeability. These comprehensive investigations demonstrate the roles of pressure, a special gas-flow effect, and rock bedding direction in gas flow through the extremely small pores in shales.
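The MLR step the abstract mentions can be sketched as follows. The predictor choice (confining pressure, porosity), units, and all data are synthetic assumptions for illustration, not the thesis's measurements.

```python
import numpy as np

# Hedged MLR sketch: permeability modeled as a linear function of confining
# pressure and porosity. Predictors, units, and data are synthetic
# assumptions, not measurements from the thesis.
rng = np.random.default_rng(0)
n = 50
confining_pressure = rng.uniform(5, 40, n)   # MPa (hypothetical)
porosity = rng.uniform(0.02, 0.12, n)        # fraction (hypothetical)

# Synthetic "truth": permeability falls with pressure, rises with porosity.
permeability = (3.0 - 0.05 * confining_pressure + 20.0 * porosity
                + rng.normal(0.0, 0.05, n))

# Least-squares fit of the multiple linear regression.
X = np.column_stack([np.ones(n), confining_pressure, porosity])
coef, *_ = np.linalg.lstsq(X, permeability, rcond=None)
intercept, beta_pressure, beta_porosity = coef
```

The sign and magnitude of the fitted slopes are what such a model reads off: here, a negative pressure coefficient recovers the assumed pressure sensitivity of permeability.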
55

Improving Turbidity-Based Estimates of Suspended Sediment Concentrations and Loads

Jastram, John Dietrich 12 June 2007 (has links)
As human activities increase sediment transport by aquatic systems, the need to quantify this transport accurately becomes paramount. Turbidity is recognized as an effective tool for monitoring suspended sediments in aquatic systems, and recent technological advances allow turbidity to be measured in situ remotely, continuously, and at much finer temporal scales than was previously possible. Although turbidity provides an improved basis for estimating suspended-sediment concentration (SSC) compared with traditional discharge-based methods, significant variability remains in turbidity-based SSC estimates and in the sediment loads calculated from them. The purpose of this study was to improve turbidity-based estimation of SSC. At two monitoring sites on the Roanoke River in southwestern Virginia, stage, turbidity, and other water-quality parameters were monitored with in-situ instrumentation; suspended sediments were sampled manually during elevated-turbidity events and analyzed for SSC and physical properties; and rainfall was quantified by geologic source area. The study identified physical properties of the suspended-sediment samples that contribute to SSC-estimation variance, and hydrologic variables that contribute to variance in those physical properties. Results indicated that including any of the measured physical properties (grain-size distributions, specific surface area, and organic carbon) in turbidity-based SSC estimation models reduces unexplained variance. Further, using hydrologic variables, which were measured remotely and on the same temporal scale as turbidity, to represent these physical properties resulted in a model equally capable of predicting SSC.
A square-root transformed turbidity-based SSC estimation model developed for the Roanoke River at Route 117 monitoring station, which included a water-level variable, yielded 63% less unexplained variance in SSC estimates and 50% narrower 95% prediction intervals for an annual load estimate than a simple linear regression with logarithmic transformations of the response and regressor (turbidity). Unexplained variance and prediction-interval width were also reduced with this approach at a second monitoring site, Roanoke River at Thirteenth Street Bridge, although the log-based transformation of SSC and regressors was found to be most appropriate at that station. Furthermore, this study demonstrated the potential for a single model, generated from a pooled dataset from the two monitoring sites, to estimate SSC with less variance than a model generated from data collected at a single site. When applied at suitable locations, this pooled-model approach could benefit monitoring programs in several ways: developing SSC-estimation models for multiple sites that individually lack enough data for a robust model, extending the model to monitoring sites between those for which it was developed, and significantly reducing sampling costs for intensive monitoring programs. / Master of Science
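A minimal sketch of the square-root transformed model form described above, with a back-transformed prediction. The data, units, and coefficients are synthetic assumptions, not the Roanoke River monitoring records.

```python
import numpy as np

# Hedged sketch of the square-root transformed model form: sqrt(SSC)
# regressed on turbidity and water level (stage), then back-transformed.
# Data, units, and coefficients are synthetic, not the Roanoke records.
rng = np.random.default_rng(1)
n = 120
turbidity = rng.uniform(10, 400, n)      # turbidity units (hypothetical)
stage = rng.uniform(1.0, 4.0, n)         # water level, m (hypothetical)
sqrt_ssc = 1.5 + 0.08 * turbidity + 0.8 * stage + rng.normal(0.0, 0.5, n)

# Fit sqrt(SSC) = b0 + b1*turbidity + b2*stage by least squares.
X = np.column_stack([np.ones(n), turbidity, stage])
coef, *_ = np.linalg.lstsq(X, sqrt_ssc, rcond=None)
b0, b_turb, b_stage = coef

# Back-transform a prediction for one hypothetical observation (mg/L).
ssc_hat = (b0 + b_turb * 150.0 + b_stage * 2.5) ** 2
```

Squaring the fitted value undoes the square-root transformation; the study's comparison is between this form and the conventional log-log form.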
56

An Intrusion Detection System for Battery Exhaustion Attacks on Mobile Computers

Nash, Daniel Charles 15 June 2005 (has links)
Mobile personal computing devices continue to proliferate, and individuals' reliance on them for day-to-day needs necessitates that these platforms be secure. Mobile computers are subject to a unique form of denial-of-service attack known as a battery exhaustion attack, in which an attacker attempts to drain the device's battery rapidly. Battery exhaustion attacks greatly reduce the utility of mobile devices by decreasing battery life. If steps are not taken to thwart these attacks, they have the potential to become as widespread as the attacks currently mounted against desktop systems. This thesis presents steps in the design of an intrusion detection system for detecting such attacks, one that accounts for the performance, energy, and memory constraints of mobile computing devices. The system uses several parameters, such as CPU load and disk accesses, to estimate the power consumption of two test systems with multiple linear regression models, allowing the energy used to be found on a per-process basis and thus identifying processes that are potentially battery exhaustion attacks. / Master of Science
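The detection idea can be sketched as follows: a linear model of expected power draw from activity counters, with an anomaly flag when observed drain far exceeds the prediction. The coefficients, threshold, and process readings are invented placeholders, not the thesis's fitted values.

```python
# Hedged sketch of the detection idea: a linear model of expected power draw
# from CPU load and disk accesses, with an anomaly flag when observed drain
# far exceeds the prediction. Coefficients and the threshold are invented,
# not the thesis's fitted values.

# Assumed offline fit: power_mW = b0 + b1 * cpu_pct + b2 * disk_ops
COEFFS = (120.0, 8.5, 0.3)
THRESHOLD_MW = 200.0

def expected_power(cpu_pct, disk_ops):
    b0, b1, b2 = COEFFS
    return b0 + b1 * cpu_pct + b2 * disk_ops

def is_suspicious(cpu_pct, disk_ops, observed_mw):
    """Flag a possible battery exhaustion attack: drain well above the model."""
    return observed_mw - expected_power(cpu_pct, disk_ops) > THRESHOLD_MW

# A busy but benign process vs. one draining far beyond its measured activity.
benign = is_suspicious(cpu_pct=40, disk_ops=100, observed_mw=500.0)
attack = is_suspicious(cpu_pct=5, disk_ops=10, observed_mw=900.0)
```

The point of the regression is that a nearly idle process should not be drawing heavy power; a large positive residual is what marks a candidate attack.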
57

Zavedení a aplikace obecného regresního modelu / The Introduction and Application of General Regression Model

Hrabec, Pavel January 2015 (has links)
This thesis summarizes the general linear regression model in detail, including test statistics for coefficients, submodels, and predictions, and especially tests for outliers and high-leverage points. It describes how to include categorical variables in a regression model. The model was applied to describe the saturation of photographs of bread, where the input variables were flour type, additive type, and flour concentration. After identifying outliers, it was possible to create a mathematical model with a high coefficient of determination, which will be useful to food-industry experts for preliminary identification of the possible composition of bread.
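Two steps the abstract describes (dummy-coding a categorical predictor and screening for high-leverage points via the hat matrix) can be sketched as follows; the bread-related data are invented placeholders, with one deliberately extreme concentration.

```python
import numpy as np

# Hedged sketch: dummy-code a categorical predictor (flour type) and screen
# for high-leverage points via the hat matrix diagonal. The bread data are
# invented placeholders; the last concentration is deliberately extreme.
concentration = np.array([10., 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 60])
flour_type = ["wheat", "rye"] * 6            # alternating categories
is_rye = np.array([1.0 if f == "rye" else 0.0 for f in flour_type])
saturation = 0.2 + 0.01 * concentration + 0.05 * is_rye  # noiseless toy response

# Design matrix: intercept, numeric predictor, dummy for the category.
X = np.column_stack([np.ones(len(saturation)), concentration, is_rye])
coef, *_ = np.linalg.lstsq(X, saturation, rcond=None)

# Hat matrix H = X (X'X)^{-1} X'; large diagonal entries flag leverage points.
H = X @ np.linalg.inv(X.T @ X) @ X.T
leverage = np.diag(H)
rule_of_thumb = 2 * X.shape[1] / len(saturation)   # common 2p/n cutoff
high_leverage = leverage > rule_of_thumb
```

Only the extreme-concentration observation exceeds the 2p/n cutoff here; the trace of H always equals the number of fitted parameters, a handy sanity check.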
58

Performance Comparison of Imputation Algorithms on Missing at Random Data

Addo, Evans Dapaa 01 May 2018 (has links)
Missing data continues to be an issue not only in statistics but in any field that deals with data, because almost all widely accepted, standard statistical software and methods assume complete data for every variable included in an analysis. As a result, in most studies statistical power is weakened and parameter estimates are biased, leading to weak conclusions and generalizations. Many studies have established that multiple imputation methods are effective ways of handling missing data. This paper examines three imputation methods (predictive mean matching, Bayesian linear regression, and non-Bayesian linear regression) in the MICE package for the statistical software R, to ascertain which of the three imputes data yielding parameter estimates closest to those of the complete data at different percentages of missingness. The parameter estimates from the complete data and from the imputed data were evaluated and compared for each model. The paper extends the analysis by generating pseudo-data from the original data to establish how the imputation methods perform under varying conditions.
59

Um modelo matemático para estudo de otimização do consumo de energia elétrica / A mathematical model for studying the optimization of electric energy consumption

Silva, Mariellen Vital da. January 2007 (has links)
Abstract: In this work, the operation of a fodder-dehydrating plant located in Spain is optimized. The plant has sequential processes (drying, hay-bale production, and grain production) that consume different amounts of energy. The production periods for each process are established, together with the quantities in tons to be produced, given that electricity in Spain has twenty-four prices, one for each hour of the day. A model for the objective function is proposed, using historical data on production (tons), consumption (kWh), and time (h), that portrays the plant's operation. This model is obtained by multiple linear regression and implemented in the Lingo software. The results of this implementation give the total daily hours that each process should run, the tonnage of hay bales and grain to be produced, and the daily electricity cost of the production. / Advisor: Francisco Villarreal Alvarado / Co-advisor: Antonio Padilha Feltrin / Committee member: Evaristo Bianchini Sobrinho / Committee member: José Carlos de Melo Vieira Júnior / Master's
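The core scheduling intuition (run each process during the cheapest of the day's 24 hourly price slots) can be sketched greedily. The tariff, hours, and rates below are invented; the thesis solves a richer regression-based model in Lingo.

```python
# Hedged sketch of the scheduling idea: given 24 hourly electricity prices
# and the hours a process must run, pick the cheapest hours. Tariff and
# rates are invented; the thesis's Lingo model is richer than this.

def cheapest_schedule(prices, hours_needed, kwh_per_hour):
    """Pick the `hours_needed` cheapest hours; return (hours, energy cost)."""
    ranked = sorted(range(24), key=lambda h: prices[h])
    chosen = sorted(ranked[:hours_needed])
    cost = sum(prices[h] * kwh_per_hour for h in chosen)
    return chosen, cost

# Hypothetical tariff: cheap at night (hours 0-6), expensive in the evening.
prices = [0.05] * 7 + [0.10] * 11 + [0.20] * 6   # price per kWh by hour
hours, cost = cheapest_schedule(prices, hours_needed=5, kwh_per_hour=800)
# Five cheapest hours are night hours at 0.05/kWh: cost = 5 * 800 * 0.05 = 200
```

A full model would add sequencing constraints between drying, baling, and grain production, which is where a solver such as Lingo earns its keep over this greedy sketch.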
60

Um modelo matemático para estudo de otimização do consumo de energia elétrica / A mathematical model for studying the optimization of electric energy consumption

Silva, Mariellen Vital da [UNESP] 22 March 2007 (has links) (PDF)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
