41

Pilot Study for Quantifying LEED Energy & Atmosphere Operational Savings in Healthcare Facilities

Daniels, Patrick Rudolph, August 2012 (has links)
Owner groups and facility managers of healthcare facilities interested in reducing operation and maintenance (O&M) expenses for new facilities have often been placed in the difficult position of making cost-benefit assessments without a complete understanding of the cumulative impact of building systems selection on their internal rate of return. This is particularly true when owners are evaluating the initial cost and operational benefit (if any) of obtaining various levels of "Leadership in Energy and Environmental Design" (LEED) certification for their buildings. Heating, Ventilation, and Air Conditioning, and Lighting (HVAC&L) loads comprise 51% of the total energy demand in the typical outpatient facility; however, in order to estimate the likelihood of achieving a particular LEED rating for a new building, a "Whole Building Energy Simulation" is necessary to evaluate HVAC&L system performance. The convention of requiring a design upon which to base an analysis presents owner-operators attempting to perform a Lifecycle Cost Analysis (LCCA) early in the concept phase with two distinct problems: how to estimate energy use without an actual "design" to model, and how to estimate a system's first cost without knowing its performance requirements. This study outlines a process by which existing energy metrics from the Department of Energy (DOE), the Commercial Building Energy Consumption Survey (CBECS), and Energy Star can be applied early during the developer's pro forma phase, without the need for a building design. Furthermore, preliminary business decisions targeted at determining the likelihood of obtaining a particular LEED rating, and specifying the corresponding building systems, can be made without the cost required to employ an Architect and Engineer (A&E) team or the time necessary to develop a design. This paper concludes that regional factors can dramatically affect a building's required level of energy performance, and that the highest-performing HVAC&L system, irrespective of cost, will not always provide the best return on investment. Accordingly, the national averages used to establish LEED EA1 thresholds do not reflect the cost particularities owners may encounter when developing in various climate zones, and therefore may be less relevant to lifecycle considerations than previously believed.
42

The Exploration of the Relationship Between Guessing and Latent Ability in IRT Models

Gao, Song 01 December 2011 (has links)
This study explored the relationship between successful guessing and latent ability in IRT models. A new IRT model was developed with a guessing function that integrates the probability of guessing an item correctly with the examinee's ability and the item parameters. The conventional 3PL IRT model was compared with the new 2PL-Guessing model on parameter estimation using the Monte Carlo method, with a SAS program used to implement the data simulation and the maximum likelihood estimation. Compared with the traditional 3PL model, the new model was expected to show: a) a maximum probability of guessing no greater than 0.5, even for the highest-ability examinees; b) probabilities of successful guessing that differ by examinee ability, since a basic assumption of the new model is that higher-ability examinees have a higher probability of successful guessing than lower-ability examinees; c) smaller standard errors in parameter estimation; and d) faster running time. The results illustrated that the new 2PL-Guessing model was superior to the 3PL model in all four aspects.
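The abstract states the new model's required properties but not its exact functional form. The sketch below (Python) contrasts the standard 3PL item response function with one hypothetical "2PL-Guessing" form in which the guessing probability rises logistically with ability and is capped at 0.5; the cap and the ability-dependence come from the abstract, while the logistic shape and the slope parameter k are assumptions for illustration only.

```python
import numpy as np

def p_3pl(theta, a, b, c):
    """Standard 3PL: the guessing parameter c is constant across ability."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

def p_2pl_guessing(theta, a, b, k=1.0):
    """Hypothetical 2PL-Guessing form (not the thesis's exact equation):
    guessing probability g(theta) increases with ability but stays below
    0.5, per the properties listed in the abstract."""
    g = 0.5 / (1.0 + np.exp(-k * theta))             # guessing, in (0, 0.5)
    p_know = 1.0 / (1.0 + np.exp(-a * (theta - b)))  # 2PL "knows the item"
    return p_know + (1.0 - p_know) * g

theta = np.linspace(-3, 3, 7)
print(p_3pl(theta, a=1.2, b=0.0, c=0.2))
print(p_2pl_guessing(theta, a=1.2, b=0.0))
```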
43

On the Lp-Integrability of Green’s function for Elliptic Operators

Alharbi, Abdulrahman 30 May 2019 (has links)
In this thesis, we discuss some of the results that were proven by Fabes and Stroock in 1984. Our main purpose is to give a self-contained presentation of the proof of these results. The first result is the existence of a "reverse Hölder inequality" for the Green's function. We utilize the work of Muckenhoupt on the reverse Hölder inequality and its connection to the A∞ class to establish a comparability property for the Green's functions. Additionally, we discuss some of the underlying preliminaries: we prove the Alexandrov-Bakelman-Pucci estimate, give a treatment of the Ap and A∞ classes of Muckenhoupt, and establish two intrinsic lemmas on the behavior of the Green's function.
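For reference, the standard statement of the reverse Hölder inequality for a weight w in the A∞ class (the textbook form, not necessarily the exact formulation used in the thesis) reads:

```latex
% If w \in A_\infty, there exist q > 1 and C > 0 such that,
% for every ball (or cube) B,
\[
  \left( \frac{1}{|B|} \int_B w(x)^{q} \, dx \right)^{1/q}
  \;\le\; C \, \frac{1}{|B|} \int_B w(x) \, dx .
\]
```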
44

Effect of selection of censoring times on survival analysis estimation of disease incidence and association with risk factors

Himali, Jayandra Jung 24 September 2015 (has links)
In longitudinal cohort studies, potential risk factors are measured at baseline, subjects are followed over time, and disease endpoints are ascertained via extensive surveillance. Individual follow-up time runs from baseline to the event, if one is observed during the study period. Follow-up time is censored for subjects who are not observed to have the event: at the end of the study period for those who remain event-free, or earlier within the study period for those who leave the study by choice or through mortality, or whose last evaluation was before the end of the study. Survival analytic techniques are unique in that the unit of analysis is not the individual but the person-time contributed by the individual. Surveillance in longitudinal studies is generally quite rigorous: subjects are examined in waves and their event status is ascertained, and surveillance continues between waves, so events come to the attention of the investigator. If there is a long time between waves, analyses can be conducted on all available data, with non-events censored early at the last examination and events followed beyond the general examination to the incident event. Motivated by analyses of cardiovascular endpoints in the Framingham Heart Study (FHS), we consider four censoring methods for non-events and evaluate their impact on estimates of incidence and on tests of association between risk factors and incidence. We further investigate the impact of early censoring of non-events (as compared to events) under various scenarios with respect to incidence estimation, robustness, and power, using a simulation study of Weibull survival models over a range of sample sizes and distribution parameters. Our FHS and simulation investigations show that early censoring of non-events causes overestimation of incidence, particularly when the baseline incidence is low. Early censoring of non-events did not affect the robustness of the Wald test [H0: Hazard Ratio (HR) = 1]. However, in both the FHS and over the range of simulation scenarios, under early censoring of non-events, estimates of HR were closer to the null (1.0), and the power to detect associations with risk factors was markedly reduced.
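A minimal simulation sketch of the central comparison, censoring non-events at the end of the study versus at their last examination, is given below. The Weibull parameters, follow-up length, and exam spacing are illustrative assumptions, not values from the FHS or the thesis, and the last attended exam is taken to be one exam interval before the study end for every non-event.

```python
import numpy as np

rng = np.random.default_rng(0)
n, study_end, exam_gap = 10_000, 10.0, 2.0   # years (assumed values)

# Illustrative Weibull event times; shape and scale are not the thesis's.
t_event = 25.0 * rng.weibull(1.5, size=n)
observed = t_event <= study_end

def incidence(follow_up, events):
    """Crude incidence rate: events per person-year of follow-up."""
    return events.sum() / follow_up.sum()

# Method A: non-events censored at the end of the study period.
fu_end = np.where(observed, t_event, study_end)
print("censor at study end:", incidence(fu_end, observed))

# Method B: non-events censored early at the last examination, while
# events keep full follow-up to the incident event. Shrinking the
# denominator inflates the incidence estimate, as the thesis reports.
fu_early = np.where(observed, t_event, study_end - exam_gap)
print("censor at last exam:", incidence(fu_early, observed))
```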
45

Estimating the risks in defined benefit pension funds under the constraints of PF117

Mahmood, Ra'ees January 2017 (has links)
With the issuing of Pension Funds circular PF117 in 2004 in South Africa, regulation required valuation assumptions for defined benefit pension funds to be on a best-estimate basis. Allowance for prudence was to be made through explicit contingency reserves, in order to increase reporting transparency. These reserves for prudence, however, were not permitted to put the fund into deficit (the no-deficit clause). Analysis is conducted to understand the risk that PF117 poses to pension fund sponsors and members under two key measures: contribution rate risk and solvency risk. A stochastic model of a typical South African defined benefit fund is constructed with simulations run to determine the impact of the PF117 requirements. Findings show that a best-estimate funding basis, coupled with the no-deficit clause, results in significant risk under both contribution rate and solvency risk measures, particularly in the short-term. To mitigate these risks, alternative ways of introducing conservatism into the funding basis are required, with possible options including incorporating margins into investment return assumptions or the removal of the no-deficit clause.
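A toy Monte Carlo sketch of the solvency-risk measure, the probability that assets fall below best-estimate liabilities at some point over the horizon, is given below. All figures (returns, liability growth, horizon) are illustrative assumptions and do not come from PF117 or the thesis's fund model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims, n_years = 10_000, 10

# Illustrative assumptions only, not PF117 figures or the thesis's basis.
liab0, asset0 = 100.0, 100.0      # fund starts exactly fully funded
mu, sigma = 0.07, 0.12            # lognormal asset return parameters
liab_growth = 0.06                # best-estimate liability roll-forward

assets = np.full(n_sims, asset0)
liabs = np.full(n_sims, liab0)
ever_in_deficit = np.zeros(n_sims, dtype=bool)
for _ in range(n_years):
    assets *= np.exp(rng.normal(mu - 0.5 * sigma**2, sigma, n_sims))
    liabs *= 1.0 + liab_growth
    ever_in_deficit |= assets < liabs

# Solvency risk: probability the fund falls into deficit at least once.
print("P(deficit within 10y):", ever_in_deficit.mean())
```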
46

A Confidence Interval Estimate of Percentile

Jou, How Coung 01 May 1980 (has links)
The confidence interval estimate of a percentile and its applications were studied. Three methods of estimating a confidence interval were introduced, and some properties of order statistics were reviewed. The Monte Carlo method, used to estimate the confidence interval, was the most important of the three. The generation of ordered random variables and the estimation of parameters were discussed in detail. A comparison of the three methods showed that the Monte Carlo method would always work, whereas the K-S and the simplified methods would not.
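The order-statistic construction behind such intervals can be sketched as follows: a distribution-free confidence interval for the p-th percentile whose endpoints are order statistics chosen from binomial quantiles. This is the standard textbook construction, not necessarily the thesis's exact procedure.

```python
import numpy as np
from scipy.stats import binom

def percentile_ci(x, p, conf=0.95):
    """Distribution-free CI for the p-th percentile. The number of sample
    values below the true percentile is Binomial(n, p), so approximate
    0-based order-statistic indices come from the binomial quantiles."""
    x = np.sort(x)
    n = len(x)
    lo = int(max(binom.ppf((1 - conf) / 2, n, p), 0))
    hi = int(min(binom.ppf(1 - (1 - conf) / 2, n, p) + 1, n - 1))
    return x[lo], x[hi]

rng = np.random.default_rng(0)
sample = rng.normal(size=200)
print(percentile_ci(sample, p=0.90))   # CI for the 90th percentile
```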
47

Unbalanced Analysis of Variance Comparing Standard and Proposed Approximation Techniques for Estimating the Variance Components

Pugsley, James P. 01 May 1984 (has links)
This paper considers the estimation of the components of variation for a two-factor unbalanced nested design and compares standard techniques with proposed approximation procedures. Current procedures are complicated and assume the unbalanced sample sizes to be fixed. This paper tests some simpler techniques that assume the sample sizes are random variables. Monte Carlo techniques were used to generate data for testing these new procedures.
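As an illustration of the random-sample-size premise, the sketch below simulates a one-factor analogue (simpler than the paper's two-factor nested design) with randomly drawn group sizes and recovers the variance components with the standard unbalanced ANOVA method-of-moments estimators; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# One-factor analogue with group sizes drawn at random, as the paper
# proposes; true components sigma2_A and sigma2_E are illustrative.
a, sigma2_A, sigma2_E = 20, 4.0, 1.0
n_i = rng.integers(2, 10, size=a)                # random unbalanced sizes
groups = np.repeat(np.arange(a), n_i)
y = (rng.normal(0, np.sqrt(sigma2_A), a)[groups]
     + rng.normal(0, np.sqrt(sigma2_E), groups.size))

# ANOVA (method-of-moments) estimators for unbalanced one-way data.
N = groups.size
means = np.array([y[groups == i].mean() for i in range(a)])
ms_within = sum(((y[groups == i] - means[i]) ** 2).sum()
                for i in range(a)) / (N - a)
ms_between = (n_i * (means - y.mean()) ** 2).sum() / (a - 1)
n0 = (N - (n_i ** 2).sum() / N) / (a - 1)        # effective group size
print("sigma^2_E ~", ms_within)
print("sigma^2_A ~", (ms_between - ms_within) / n0)
```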
48

Estimating VO2max Using a Personalized Step Test

Webb, Catherine 27 March 2012 (has links) (PDF)
The purpose of this study was to develop a personalized step test and a valid regression model that used non-exercise data and data collected during the step test to estimate VO2max in males and females 18 to 30 years of age. All participants (N = 80) successfully completed a step test, with the starting step rate and step height determined by the self-reported perceived functional ability (PFA) score and the participant's height, respectively. All participants also completed a maximal graded exercise test (GXT) to measure VO2max. Multiple linear regression analysis yielded the following equation (R = 0.90, SEE = 3.43 mL/kg/min) to predict VO2max (mL/kg/min): 45.938 + 9.253(G) - 0.140(KG) + 0.670(PFA) + 0.429(FSR) - 0.149(45sRHR), where G is gender (0 = female; 1 = male), KG is body mass in kg, PFA is the sum of the two PFA questions, FSR is the final step rate (step-ups/min), and 45sRHR is the recovery heart rate 45 seconds following the conclusion of the step test. Each independent variable was significant (p < 0.05) in predicting VO2max, and the resulting regression equation accounted for roughly 83% (R² = 0.8281) of the variance in measured VO2max. Based on the standardized beta weights, gender (0.606) explained the largest proportion of variance in VO2max values, followed by PFA (0.315), body mass (-0.256), FSR (-0.248), and the 45sRHR (-0.238). The cross-validation statistics (R_PRESS = 0.88, SEE_PRESS = 3.57 mL/kg/min) show minimal shrinkage in the accuracy of the regression model. This study presents a relatively accurate model to predict VO2max from a submaximal step test that is convenient, easy to administer, and individualized.
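The reported equation can be coded directly from the abstract; the example inputs at the bottom are hypothetical.

```python
def predict_vo2max(gender, mass_kg, pfa, final_step_rate, rhr_45s):
    """VO2max (mL/kg/min) from the regression equation reported in the
    abstract (R = 0.90, SEE = 3.43 mL/kg/min).
    gender: 0 = female, 1 = male
    pfa: sum of the two perceived functional ability questions
    final_step_rate: step-ups/min at the end of the test
    rhr_45s: recovery heart rate 45 s after the test ends"""
    return (45.938 + 9.253 * gender - 0.140 * mass_kg
            + 0.670 * pfa + 0.429 * final_step_rate - 0.149 * rhr_45s)

# Hypothetical example: 75 kg male, PFA sum 18, 28 step-ups/min, RHR 120.
print(round(predict_vo2max(1, 75.0, 18, 28, 120), 1))
```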
49

Inverse Methods in Parameter Estimation for High Intensity Focused Ultrasound (HIFU)

Fox-Neff, Kristen 26 May 2016 (has links)
No description available.
50

Flathead catfish stock characteristics in the Pascagoula River following Hurricane Katrina

Barabe, Russell M 11 December 2009 (has links)
Flathead catfish stocks in the Pascagoula River were decimated by the passage of Hurricane Katrina. Age-0 fish survived the storm, producing a strong 2005 year-class. Reproduction by the remaining adults and/or downstream movement from tributaries produced an additional strong cohort in 2006. The strong 2005 year-class resulted in the capture of a high proportion of two-year-old fish in 2007. In 2008, a high proportion of two- and three-year-old fish were captured, illustrating the high rate of survival of the 2005 year-class, and the presence of a strong 2006 year-class. The flathead catfish population of the Pascagoula River was dominated by immature fish that should begin to reproduce in 2009, and most of these fish should reach sexual maturity by 2011. Density estimates are low when compared to other populations, indicating that a management option of a minimum length limit of 610 mm could prove useful in protecting these future spawners.
