41

Development of a prediction model for bacteremia in hospitalized adults with cellulitis to aid in the efficient use of blood cultures: a retrospective cohort study

Lee, Chun-Yuan, Kunin, Calvin M., Chang, Chung, Lee, Susan Shin-Jung, Chen, Yao-Shen, Tsai, Hung-Chin 19 October 2016 (has links)
Background: Cellulitis is a common infectious disease. Although blood culture is frequently used in the diagnosis and subsequent treatment of cellulitis, it is a contentious diagnostic test. To help clinicians determine which patients should undergo blood culture in the management of cellulitis, a diagnostic scoring system referred to as the Bacteremia Score of Cellulitis was developed. Methods: Univariable and multivariable logistic regression analyses were performed as part of a retrospective cohort study of all adults diagnosed with cellulitis in a tertiary teaching hospital in Taiwan in 2013. Patients who underwent blood culture were used to develop a diagnostic prediction model in which the main outcome measure was true bacteremia in cellulitis cases. The area under the receiver operating characteristic curve (AUC) was used to quantify the predictive power of the model, and bootstrapping was then used to validate its performance. Results: Three hundred fifty-one patients with cellulitis who underwent blood culture were enrolled. The overall prevalence of true bacteremia was 33/351 cases (9.4 %). Multivariable logistic regression analysis showed optimal diagnostic discrimination for the combination of age >= 65 years (odds ratio [OR] = 3.9; 95 % confidence interval [CI], 1.5-10.1), involvement of non-lower extremities (OR = 4.0; 95 % CI, 1.5-10.6), liver cirrhosis (OR = 6.8; 95 % CI, 1.8-25.3), and systemic inflammatory response syndrome (SIRS) (OR = 15.2; 95 % CI, 4.8-48.0). These four independent factors were included in the initial formula, and the AUC for this combination of factors was 0.867 (95 % CI, 0.806-0.928). The rounded formula was 1 x (age >= 65 years) + 1.5 x (involvement of non-lower extremities) + 2 x (liver cirrhosis) + 2.5 x (SIRS).
The overall prevalence of true bacteremia (9.4 %) in this study could be lowered to 1.0 % (low-risk group, score <= 1.5) or raised to 14.7 % (medium-risk group, score 2-3.5) and 41.2 % (high-risk group, score >= 4.0), depending on the clinical score. Conclusions: Determining the risk of bacteremia in patients with cellulitis will allow more efficient use of blood cultures in the diagnosis and treatment of this condition. External validation of this preliminary scoring system in future trials is needed to optimize the test.
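The rounded formula and risk bands above can be sketched as a small scoring function. The weights and cut-offs come directly from the abstract; the function and variable names are illustrative, not the authors':

```python
def bacteremia_score(age_ge_65: bool, non_lower_extremity: bool,
                     liver_cirrhosis: bool, sirs: bool) -> float:
    """Rounded Bacteremia Score of Cellulitis as reported in the abstract."""
    return (1.0 * age_ge_65 + 1.5 * non_lower_extremity
            + 2.0 * liver_cirrhosis + 2.5 * sirs)

def risk_group(score: float) -> str:
    """Risk bands reported in the study: <=1.5 low, 2-3.5 medium, >=4 high."""
    if score <= 1.5:
        return "low"      # observed bacteremia prevalence 1.0 %
    elif score <= 3.5:
        return "medium"   # 14.7 %
    else:
        return "high"     # 41.2 %
```

For example, a 70-year-old presenting with SIRS scores 1 + 2.5 = 3.5, placing them in the medium-risk band.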
42

Validation of operational global wave prediction models with spectral buoy data

Wingeart, Karen M. 12 1900 (has links)
Global wave predictions produced at two U.S. forecasting centers, the Fleet Numerical Meteorology and Oceanography Center (FNMOC) and the National Centers for Environmental Prediction (NCEP), are evaluated with spectral buoy measurements. In this study, the fidelity of frequency-directional spectra predicted by WAM and WAVEWATCH III at the operational centers is examined with data from 3-meter discus and 6-meter nomad buoys operated by the National Data Buoy Center in the Atlantic and Pacific Oceans, and from Datawell Directional Waverider buoys deployed along the California coast by the Scripps Institution of Oceanography Coastal Data Information Program. Only buoys located in deep water are used in the comparisons. Model nowcasts of frequency spectra and mean wave directions are compared to buoy measurements over a six-month period from 1 October 2000 to 31 March 2001. At the Pacific buoy locations, individual swell events were identified in the spectra from the three models and the buoy data. Predicted and observed swell frequencies and arrival directions are compared, as well as the total energy transported past the buoy over the duration of each individual event. At all buoy locations, predicted and observed wave energy fluxes integrated over fixed frequency ranges are compared. All three models yield reliable nowcasts of swell arrivals at the buoy locations. In most cases, the models under-predict the energy measured by the buoys. WAVEWATCH III better resolves low-frequency swells than WAM, possibly owing to a superior numerical scheme. Swell predictions at NCEP forced with AVN winds are more accurate than those at FNMOC forced with NOGAPS winds. / US Navy (USN) author
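The energy-flux comparison described above can be sketched as follows. For deep water, linear wave theory gives a group velocity of c_g = g/(4πf), so the flux per unit crest length is ρg ∫ E(f) c_g(f) df. The sketch below integrates that with the trapezoidal rule; the function name and the discretization are assumptions for illustration, not the thesis's code:

```python
import math

RHO = 1025.0  # seawater density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def deep_water_energy_flux(freqs, energy_density):
    """Integrate rho*g*E(f)*cg(f) df over a frequency band (trapezoidal rule).

    freqs          : frequencies in Hz, sorted ascending
    energy_density : spectral energy density E(f) in m^2/Hz
    Returns energy flux per unit crest length in W/m.
    """
    # deep-water group velocity cg = g / (4*pi*f)
    flux = [RHO * G * e * (G / (4.0 * math.pi * f))
            for f, e in zip(freqs, energy_density)]
    total = 0.0
    for i in range(len(freqs) - 1):
        total += 0.5 * (flux[i] + flux[i + 1]) * (freqs[i + 1] - freqs[i])
    return total
```

Because c_g falls off as 1/f, the same spectral energy carries more flux at low frequencies, which is one reason resolving low-frequency swell matters for this comparison.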
43

Snow avalanches in Scotland, with particular reference to the Cairngorm Mountains

Ward, Rodney G. W. January 1981 (has links)
The thesis describes a method for predicting avalanche activity in Scotland, based primarily on meteorological and topographical data, and this is incorporated into a framework for avalanche forecasting. A literature review identifies the main meteorological and topographical factors causing avalanche release. Snow accumulation, cold temperatures or thaws are identified as the major meteorological factors causing avalanches, and the thesis thus distinguishes two basic avalanche types: the direct-action avalanche, caused by fresh snow accumulation, and the climax avalanche, caused by temperature changes. Slope angle, surface roughness, slope geometry and catchment area are identified as the major topographical controls on the location of avalanche activity. The thesis presents data on avalanche activity in Scotland over the last two hundred and eighty years, with particular emphasis on the Cairngorm Mountains. Avalanche locations, types and magnitudes for almost a thousand avalanches are described. Data on Scottish weather and terrain conditions are also presented, again with particular emphasis on the Cairngorm area. Major periods of snow accumulation, cold temperatures and thaws are identified between the winters of 1977/78 and 1979/80, and the location of steep slopes, smooth surfaces and large catchment areas in the Cairngorms is described. Data on the characteristics of the Scottish snow cover, based on work performed in the Cairngorms by E. Langmuir, B. Beattie and the author, are then presented. Correlations between avalanche activity and meteorological, topographical and glaciological conditions are demonstrated, and this enables a predictive model to be developed which assesses avalanche probability on the basis of the amount of fresh snowfall and the prevailing temperature.
Fresh snowfall of 200 mm, combined with either several days of cold weather with maximum temperatures below -4°C or two or three days of warm weather with maximum temperatures above 0°C, leads to avalanche activity. The model can be continually updated and permits a continuous avalanche probability assessment to be made.
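The forecasting rule stated above can be sketched as a simple predicate. The snowfall and temperature thresholds come from the thesis; the exact day counts used for "several" cold days and "two or three" warm days are assumptions made for this sketch:

```python
def avalanche_conditions(fresh_snow_mm, daily_max_temps_c,
                         cold_days=3, warm_days=2):
    """Sketch of the thesis's rule: roughly 200 mm of fresh snowfall
    combined with either a cold spell (daily maxima below -4 C,
    favoring direct-action avalanches) or a short thaw (daily maxima
    above 0 C, favoring climax avalanches)."""
    if fresh_snow_mm < 200:
        return False
    cold_spell = sum(1 for t in daily_max_temps_c if t < -4) >= cold_days
    thaw = sum(1 for t in daily_max_temps_c if t > 0) >= warm_days
    return cold_spell or thaw
```

Re-evaluating this predicate as each day's snowfall and temperature data arrive is one way the continuously updated probability assessment could operate.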
44

The determination of crack propagation rates of reflection cracking through asphalt surfacings

Brooker, Timothy Nicholas January 1986 (has links)
A large proportion of the U.K. highway network constructed in the 1960s and 1970s contains lean concrete roadbase with bituminous surfacing. Pavements containing relatively high strength lean concrete have rarely required structural maintenance (thick overlay or reconstruction) but have required maintenance because of reflection cracking, where the surfacing cracks above cracks in the lean concrete. The time of appearance of this cracking is very variable (2-20 years). Field observations indicate that roadbase transverse crack spacings are often greater than 5 m. Reflection cracking at these long spacings can be caused by thermal stresses. This project identifies conditions under which thermal reflection cracking will occur and develops a predictive model that allows estimation of the combined effect of thermal and traffic stresses. Finite element analyses indicate that initial crack development is likely to be caused by thermal stresses and final cracking will be assisted by traffic stresses. A temperature model has been developed to determine roadbase daily temperature range and surfacing temperature on a mean monthly basis. Thermal reflection cracking is considered to result from daily-cycle fatigue rather than an extreme low temperature mechanism. A test rig has been developed to apply cyclic crack opening movements, and simulative tests have been accelerated to 0.1 Hz by using a "bitumen stiffness" fatigue criterion. Finite element results, displacements recorded during tests, and tensile creep tests to determine mix stiffness enable dc/dN and K1 values and material constants (A, n) to be determined. This fracture mechanics interpretation of test results serves as the basis of the predictive model for thermal reflection cracking, which is consistent with observations from an untrafficked road. The combined estimate of thermal and traffic stresses cannot, however, explain reflection cracking at <5 m spacings.
This cracking apparently initiates at the surface and is probably influenced by other mechanisms.
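The fracture-mechanics quantities above (dc/dN, K1, and material constants A, n) are those of the standard Paris-law form dc/dN = A·ΔK^n with ΔK = Y·Δσ·√(πc). As a minimal illustration of how such constants feed a life prediction, the sketch below numerically integrates the cycles needed to grow a crack between two lengths; all parameter values are illustrative, not taken from the thesis:

```python
import math

def cycles_to_grow(c0, cf, delta_sigma, A, n, Y=1.0, steps=10000):
    """Integrate N = ∫ dc / (A * ΔK^n) from c0 to cf (midpoint rule),
    with stress intensity range ΔK = Y * Δσ * sqrt(pi * c)."""
    dc = (cf - c0) / steps
    N = 0.0
    for i in range(steps):
        c = c0 + (i + 0.5) * dc          # midpoint of the i-th slice
        dK = Y * delta_sigma * math.sqrt(math.pi * c)
        N += dc / (A * dK ** n)
    return N
```

For n != 2 the integral has a closed form, which makes a convenient check on the numerical scheme.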
45

The use of discriminant analysis in the prediction of performance of junior college students in a program of general education at Boston University Junior College.

Calia, Vincent F. January 1959 (has links)
Thesis (Ed.D.)--Boston University.
46

A comparison of the grade point averages of athletic participants, intramural participants, and non-participants in Mississippi South Big Eight Conference

Gilbert, Elliott January 2010 (has links)
Digitized by Kansas Correctional Industries
47

Updating Risk Engine for Diabetes Progression and Mortality in the United States: the Building, Relating, Acting, Validating for Outcomes (BRAVO) of Diabetes Risk Engine

January 2017 (has links)
acase@tulane.edu / Background: The prediction of diabetes-related comorbidities and mortality over a lifetime has significant clinical and policy implications. A prediction model can be used for economic evaluation of diabetes medications, comparative effectiveness review (CER) of different therapeutic plans, and estimation of the expected long-term outcomes of different treatment goals (e.g., HbA1c). Most current diabetes prediction models rely heavily on the UKPDS risk engine and the Framingham equation, which used data from the 1970s on European populations. These populations differ significantly from the current US population in various ways, including race, health-related concepts, treatment algorithms, screening methods for comorbidities, and even the definition of diabetes. In addition, the UKPDS risk engine does not include the impact of hypoglycemia, which has emerged as an important issue in the management of diabetes because of its effects on quality of life, cardiovascular events, and mortality. Furthermore, with advances in medical technology and revised treatment guidelines over the last decades, the rates of cardiovascular events, all-cause mortality, and event-related mortality have fundamentally changed; in particular, survival rates after CVD events have substantially increased. There is an urgent need to develop a new risk engine better adapted to the current US population. Objective: The objective of this study was to update the risk engine using a cohort of patients with type 2 diabetes in the United States. Methods: A total of 21 equations for forecasting diabetes-related microvascular and macrovascular events, hypoglycemia, mortality, and progression of diabetes risk factors were estimated using data on 10,251 patients from the Action to Control Cardiovascular Risk in Diabetes (ACCORD) trial.
A left-truncated proportional hazards model was fitted for each event equation using diabetes duration as the time index, and a variety of distributions, including the Weibull and Gompertz distributions, were tested. Ten-fold cross-validation or bootstrapped validation was applied to account for overfitting. Predicted cumulative incidence rates were plotted against the observed cumulative incidence as an internal validation of the prediction accuracy of the BRAVO risk engine on the ACCORD data. External validation was performed by applying the BRAVO risk engine to populations from other clinical trials. Results: The BRAVO risk engine's forecasts fell within the 95% confidence interval of the observed events at each time point through 40 years after diabetes onset. The model provided accurate predictions in both the internal and external validation processes, and good face validity of the risk factors was established by endocrinologists. Severe hypoglycemia was found to be an important risk factor for congestive heart failure (CHF), myocardial infarction (MI), angina, and blindness, and was associated with increased mortality. Race was included in more than half of the event equations (e.g., MI, revascularization surgery, blindness, SPSL, hypoglycemia). The BRAVO risk engine can therefore capture racial differences in diabetes outcomes in the US population, a significant improvement over the UKPDS risk engine. Conclusion: The BRAVO risk engine for the US diabetes cohort has good internal validity, simulating events that closely match observed outcomes in the ACCORD trial. It is also capable of accurately predicting diabetes comorbidities in other US and non-US clinical trials. The risk engine can be extrapolated over a lifetime to provide long-term evaluations of effect.
The BRAVO risk engine can potentially provide more accurate predictions over a range of long-term outcomes than other current models, thereby assisting clinical and policy decisions. / 1 / Hui Shao
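The proportional hazards setup with a Gompertz baseline described in the Methods can be illustrated as follows. The Gompertz hazard h(t) = a·exp(b·t) gives survival S(t) = exp(-(a/b)(exp(b·t) - 1)); conditioning on survival to the entry duration is where left truncation enters. All coefficient values here are invented for the sketch and are not the BRAVO estimates:

```python
import math

def gompertz_survival(t, shape_b, rate_a):
    """S(t) = exp(-(a/b) * (exp(b*t) - 1)) for Gompertz hazard h(t) = a*exp(b*t)."""
    return math.exp(-(rate_a / shape_b) * (math.exp(shape_b * t) - 1.0))

def event_probability(duration_start, duration_end, shape_b, base_rate,
                      betas, covariates):
    """Probability of an event between two diabetes durations, conditional
    on being event-free at duration_start (left truncation). Covariates
    scale the baseline rate proportionally via the linear predictor."""
    rate = base_rate * math.exp(sum(b * x for b, x in zip(betas, covariates)))
    s_start = gompertz_survival(duration_start, shape_b, rate)
    s_end = gompertz_survival(duration_end, shape_b, rate)
    return 1.0 - s_end / s_start
```

In a simulation engine, an equation of this form would be evaluated each cycle for each of the 21 modeled events, with the covariate vector updated as risk factors progress.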
48

Predicting Success in Shorthand I

Bell, Meredith Cragun 01 May 1968 (has links)
The purpose of this study was to find the best predictors and best combination of predictors of shorthand success. Those used were the Educational Research Corporation Stenographic Aptitude Test and its subtests, I.Q. scores, a Self-Success Rating, and the Digit Symbol subtest taken from the Wechsler Adult Intelligence Scale. Students enrolled in Shorthand I at Sky View High School were the subjects of this study. The criteria of shorthand success were the final dictation speed and the final grade. The best single predictor of shorthand dictation speed was the total E.R.C. Stenographic Aptitude Test. The best combination of predictors was the total E.R.C. Stenographic Aptitude Test and the E.R.C. Phonetic Spelling subtest. The best single predictor of final grade was the E.R.C. Phonetic Spelling subtest. The best combination of predictors of final grade was I.Q., total E.R.C. Stenographic Aptitude Test, E.R.C. Word Discrimination, E.R.C. Phonetic Spelling, and E.R.C. Dictation.
49

Prediction of Slamming Occurrence of Catamarans

Grande, Kristoffer January 2002 (has links)
In this work the problem of slamming on the cross structure of catamarans is studied. An introduction and overview of the problem is given. Methods for predicting the slamming occurrence of high-speed power catamarans and sailing catamarans are presented. Emphasis is placed on developing methods that are practical to use in order to facilitate prediction of slamming occurrence at the design stage. The methods used consist of three steps: ship motion prediction, slamming identification and slamming pressure calculations. Existing linear and non-linear ship motion prediction theories are used for high-speed power catamarans, while a new strip theory has been developed specifically for motion prediction of sailing multihulls. Predicted ship motion results are compared to full-scale experiments, both for high-speed power catamarans and sailing catamarans. A new direct method for identification of slamming occurrence in the time domain is presented, as well as results using probabilistic methods. A comparison between the two methods is presented. Slamming pressure calculations are done using an existing two-dimensional slamming theory and are compared with analytical results. A parametric study is done on two case study ships to investigate the effect of various hull-form parameters on the slamming occurrence. The methods and results presented are of use to designers of high-speed power catamarans and sailing catamarans.
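A time-domain identification step of the kind described can be sketched as follows: flag the instants where the relative vertical motion at the cross structure exceeds the air-gap clearance while the relative velocity at contact exceeds a threshold. This is a generic wetdeck-slamming criterion offered for illustration only; the thesis's own direct method and thresholds may differ:

```python
def identify_slams(rel_motion, rel_velocity, clearance, v_threshold):
    """Return sample indices at which a slam is flagged: the relative
    motion first exceeds the clearance (wetdeck contact) and the
    relative velocity at that instant exceeds the impact threshold."""
    slams = []
    contact = False
    for i, (z, v) in enumerate(zip(rel_motion, rel_velocity)):
        if z > clearance and not contact:
            contact = True                 # new contact event begins
            if v > v_threshold:
                slams.append(i)            # fast enough to count as a slam
        elif z <= clearance:
            contact = False                # cross structure clear again
    return slams
```

Counting flagged events per unit time in a simulated seaway gives a slamming-occurrence estimate that can be compared with the probabilistic approach.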
50

Robust Linear Prediction Analysis for Low Bit-Rate Speech Coding

Koestoer, Nanda Prasetiyo, npkoestoer@yahoo.com.au January 2002 (has links)
Speech coding is a very important area of research in digital signal processing. It is a fundamental element of digital communications and has progressed at a fast pace in parallel with the increase in demand for telecommunication services and capabilities. Most of the speech coders reported in the literature are based on linear prediction (LP) analysis. The Code Excited Linear Predictive (CELP) coder is a typical and popular example of this class of coders. This coder performs LP analysis of speech to extract LP coefficients and employs an analysis-by-synthesis procedure to search a stochastic codebook to compute the excitation signal. The method used for performing LP analysis plays an important role in the design of a CELP coder. The autocorrelation method is conventionally used for LP analysis. Though this works reasonably well for noise-free (clean) speech, its performance degrades when the signal is corrupted by noise. Spectral analysis of speech signals in noisy environments is an aspect of speech coding that deserves more attention. This dissertation studies the application of recently proposed robust LP analysis methods for estimating the power spectrum envelope of speech signals. These methods are the moving average, moving maximum and average threshold methods. The proposed methods are compared to the more commonly used methods of LP analysis, such as the conventional autocorrelation method and the Spectral Envelope Estimation Vocoder (SEEVOC) method. The Linear Predictive Coding (LPC) spectrum calculated from these proposed methods is shown to be more robust. These methods work as well as the conventional methods when the speech signal is clean or has a high signal-to-noise ratio. Also, these robust methods give less quantisation distortion than the conventional methods. The application of these robust methods for speech compression using the CELP coder provides better speech quality when compared to the conventional LP analysis methods.
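The conventional autocorrelation method that serves as the baseline above is classically solved with the Levinson-Durbin recursion. The sketch below is a minimal textbook implementation for illustration, not the dissertation's code:

```python
def lp_coefficients(signal, order):
    """Autocorrelation method of LP analysis via the Levinson-Durbin
    recursion. Returns (a, err) where a[0] = 1 and the prediction
    residual is e[n] = x[n] + sum(a[i] * x[n-i], i = 1..order)."""
    n = len(signal)
    # biased autocorrelation estimates r[0..order]
    r = [sum(signal[i] * signal[i + k] for i in range(n - k))
         for k in range(order + 1)]
    a = [1.0] + [0.0] * order
    err = r[0]
    for m in range(1, order + 1):
        acc = r[m] + sum(a[i] * r[m - i] for i in range(1, m))
        k = -acc / err                      # reflection coefficient
        new_a = a[:]
        for i in range(1, m):
            new_a[i] = a[i] + k * a[m - i]  # update earlier coefficients
        new_a[m] = k
        a = new_a
        err *= (1.0 - k * k)                # residual energy shrinks each order
    return a, err
```

For a first-order autoregressive signal x[n] = 0.9 x[n-1] + e[n], the recursion recovers a[1] close to -0.9, i.e. the LPC spectrum envelope matches the signal's resonance; the robust methods studied in the dissertation aim to preserve this property when noise corrupts the autocorrelation estimates.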
