11

Forecasting corporate performance

Harrington, Robert P. January 1985 (has links)
For the past twenty years, the usefulness of accounting information has been emphasized. In 1966 the American Accounting Association, in A Statement of Basic Accounting Theory, asserted that usefulness is the primary purpose of external financial reports. In 1978 the Statement of Financial Accounting Concepts No. 1 affirmed the usefulness criterion: "Financial reporting should provide information that is useful to present and potential investors and creditors and other users..." Information is useful if it facilitates decision making. Moreover, all decisions are future-oriented; they are based on a prognosis of future events. The objective of this research, therefore, is to examine some factors that affect the decision maker's ability to use financial information to make good predictions and thereby good decisions. The study has two major purposes. The first is to gain insight into the increase in prediction accuracy that can be expected when a model replaces the human decision maker in the selection of cues. The second is to examine the information overload phenomenon and provide research evidence on the point at which additional information may degrade prediction accuracy. The research methodology is based on the lens model developed by Egon Brunswik in 1952. Multiple linear regression equations are used to capture the participants' models, and correlation statistics are used to measure prediction accuracy. / Ph. D.
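A minimal sketch of the lens-model analysis described above, assuming illustrative data: a judge's predictions are regressed on the financial cues with ordinary least squares to capture the judgment policy, and correlation coefficients measure prediction accuracy for the judge and for the captured model. The cue weights, noise levels, and sample sizes are hypothetical, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: financial cues (e.g., ratios) and the actual outcome to be predicted.
n_cases, n_cues = 60, 4
cues = rng.normal(size=(n_cases, n_cues))                        # cues shown to the judge
actual = cues @ np.array([0.6, 0.3, 0.1, 0.0]) + rng.normal(scale=0.5, size=n_cases)

# Hypothetical judge predictions: imperfect, inconsistent use of the cues.
judgments = cues @ np.array([0.5, 0.2, 0.2, 0.1]) + rng.normal(scale=0.8, size=n_cases)

# Capture the judge's policy with ordinary least squares (the "captured model").
X = np.column_stack([np.ones(n_cases), cues])
beta, *_ = np.linalg.lstsq(X, judgments, rcond=None)
model_predictions = X @ beta

# Achievement: correlation between the judge's predictions and the actual outcome.
achievement = np.corrcoef(judgments, actual)[0, 1]
# Accuracy of the captured model when it replaces the judge.
model_accuracy = np.corrcoef(model_predictions, actual)[0, 1]

print(f"judge achievement (r): {achievement:.3f}")
print(f"captured model (r):    {model_accuracy:.3f}")
```

Comparing the two correlations is the kind of evidence used to judge how much accuracy might be gained when a model replaces the human decision maker in weighting the cues.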
12

Process parameter optimisation of steel components laser forming using a Taguchi design of experiments approach

Sobetwa, Siyasanga January 2017 (has links)
A research report submitted to the Faculty of Engineering and the Built Environment, University of the Witwatersrand, Johannesburg, in partial fulfilment of the requirements for the degree of Master of Science in Engineering. Date: September 2017, Johannesburg / This research investigates process parameter optimisation of the Laser Beam Forming (LBF) process, using a 4.4 kW Nd:YAG laser system (Rofin DY 044) to form 200 × 50 × 3 mm³ mild steel (AISI 1008) samples. The laser power P, beam diameter B, scan velocity V, number of scans N, and cooling flow C were the five input parameters of interest because of their influence on the final formed product. A Taguchi Design of Experiments (DoE) approach was used to select and combine the input parameters for the LBF process, and the investigation was carried out both experimentally and computationally. The LBF input parameters were categorised into three levels, low (L), medium (M), and high (H), to identify the settings that yield maximum bending and the best surface finish. The conclusion drawn from the LBF process is that samples formed with the low parameter settings showed negligible bending and a good surface finish, samples formed with the medium settings showed visible bending and a rougher surface finish, while samples processed with the high settings yielded maximum bending and greater surface roughness than the other two parameter settings. / MT2018
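A sketch of the Taguchi-style analysis step, under stated assumptions: the run layout below assigns the five factors (P, B, V, N, C) to coded levels 0/1/2, and a larger-the-better signal-to-noise ratio ranks the level of each factor that maximises bending. The first four columns follow the standard L9 array; the fifth column and the bend-angle values are purely illustrative placeholders (an L18 or L27 array would normally be used for five three-level factors), so none of these numbers come from the thesis.

```python
import numpy as np

# Hypothetical results from a small Taguchi-style experiment: each row is one run
# with coded levels (0=low, 1=medium, 2=high) for laser power P, beam diameter B,
# scan velocity V, number of scans N, cooling flow C, and a measured bend angle.
levels = np.array([
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1],
    [0, 2, 2, 2, 2],
    [1, 0, 1, 2, 2],
    [1, 1, 2, 0, 0],
    [1, 2, 0, 1, 1],
    [2, 0, 2, 1, 0],
    [2, 1, 0, 2, 1],
    [2, 2, 1, 0, 2],
])
bend_deg = np.array([0.4, 0.9, 1.6, 2.1, 2.8, 1.9, 4.2, 3.5, 3.9])  # illustrative angles

# Larger-the-better signal-to-noise ratio (single replicate per run here).
sn = -10.0 * np.log10(1.0 / bend_deg**2)

factors = ["P", "B", "V", "N", "C"]
for j, name in enumerate(factors):
    means = [sn[levels[:, j] == lvl].mean() for lvl in range(3)]
    best = int(np.argmax(means))
    print(f"{name}: mean S/N by level {np.round(means, 2)} -> best level {best}")
```

Ranking factor levels by mean S/N is the standard Taguchi step for picking the parameter combination that maximises the response, here the bend angle.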
13

Determining the most appropiate [sic] sampling interval for a Shewhart X-chart

Vining, G. Geoffrey January 1986 (has links)
A common problem encountered in practice is determining when it is appropriate to change the sampling interval for control charts. This thesis examines this problem for Shewhart X̅ charts. Duncan's economic model (1956) is used to develop a relationship between the most appropriate sampling interval and the present rate of "disturbances," where a disturbance is a shift to an out-of-control state. A procedure is proposed which switches the interval to convenient values whenever a shift in the rate of disturbances is detected. An example using simulation demonstrates the procedure. / M.S.
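A rough sketch of the switching idea, not Duncan's (1956) model itself: the disturbance rate is estimated from recent times between detected shifts, and the sampling interval is moved to the nearest convenient value using an assumed inverse-square-root rule as a stand-in for the interval implied by the economic model. All names and numbers are illustrative.

```python
import numpy as np

# Convenient sampling intervals (in hours) that a plant might actually be willing to use.
CONVENIENT = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0])

def estimated_disturbance_rate(times_between_shifts):
    """Estimate the rate of disturbances (shifts to an out-of-control state) per hour
    from the observed times between detected shifts."""
    return 1.0 / np.mean(times_between_shifts)

def switch_interval(rate_per_hour, scale=1.0):
    """Pick the convenient interval closest to an illustrative target h = scale / sqrt(rate).
    This inverse-square-root rule is only a placeholder for the interval implied by
    Duncan's economic model, not the model itself."""
    target = scale / np.sqrt(rate_per_hour)
    return CONVENIENT[np.argmin(np.abs(CONVENIENT - target))]

# Example: when disturbances become more frequent, the procedure switches to a shorter interval.
print(switch_interval(estimated_disturbance_rate([40.0, 55.0, 35.0])))  # low rate  -> long interval
print(switch_interval(estimated_disturbance_rate([4.0, 6.0, 5.0])))     # high rate -> short interval
```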
14

The Fixed v. Variable Sampling Interval Shewhart X-Bar Control Chart in the Presence of Positively Autocorrelated Data

Harvey, Martha M. (Martha Mattern) 05 1900 (has links)
This study uses simulation to examine differences between fixed sampling interval (FSI) and variable sampling interval (VSI) Shewhart X-bar control charts for processes that produce positively autocorrelated data. The influence of sample size (1 and 5), autocorrelation parameter, shift in process mean, and length of time between samples is investigated by comparing the average time to signal (ATS) and the average number of samples to signal (ANSS) for FSI and VSI Shewhart X-bar charts. These comparisons are conducted in two ways: with control chart limits pre-set at ±3σ_x/√n and with limits computed from the sampling process. Proper interpretation of the Shewhart X-bar chart requires the assumption that observations are statistically independent; however, process data are often autocorrelated over time. Results of this study indicate that increasing the time between samples decreases the effect of positive autocorrelation between samples, so with sufficient time between samples the independence assumption is essentially not violated. Samples of size 5 produce a faster signal than samples of size 1 with both the FSI and VSI Shewhart X-bar charts when positive autocorrelation is present; however, the two sample sizes require approximately the same time when the data are independent, indicating that this effect is a result of autocorrelation. This research determined that the VSI Shewhart X-bar chart signals increasingly faster than the corresponding FSI chart as the shift in the process mean increases. If the process is likely to exhibit a large shift in the mean, the VSI technique is therefore recommended, although the faster signaling time of the VSI chart is undesirable when the process is operating on target. However, if the control limits are estimated from process samples, results show that when the process is in control the ARL for the FSI chart and the ANSS for the VSI chart are approximately the same, and both exceed the value expected when the limits are fixed.
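A simulation sketch in the spirit of the comparison above, assuming an AR(1) model for the positively autocorrelated observations and illustrative interval lengths, warning limit, shift size, and autocorrelation parameter (none taken from the study). The chart limits are the usual ±3σ limits computed as if observations were independent, which is exactly the assumption the study examines.

```python
import numpy as np

rng = np.random.default_rng(1)

def time_to_signal(phi, shift, n=5, vsi=True, short_interval=1, long_interval=10,
                   warning=2.0, limit=3.0):
    """Time (in base units) until an X-bar chart signals a step shift in the mean of a
    positively autocorrelated AR(1) process.  The VSI rule uses the short interval
    whenever the previous statistic fell in the warning region; with vsi=False the
    chart is a fixed-interval (FSI) chart.  All settings are illustrative."""
    sigma = 1.0 / np.sqrt(1.0 - phi ** 2)       # marginal sd of the AR(1) observations
    x, t, interval = 0.0, 0, long_interval
    for _ in range(10_000):                     # safety cap on the number of subgroups
        # Advance the process during the waiting time (one AR(1) step per base unit),
        # so longer intervals leave less autocorrelation between successive subgroups.
        for _ in range(interval):
            x = phi * x + rng.normal()
        sample = []
        for _ in range(n):                      # subgroup of n consecutive observations
            x = phi * x + rng.normal()
            sample.append(x + shift)
        t += interval
        z = np.sqrt(n) * np.mean(sample) / sigma   # standardized as if observations were independent
        if abs(z) > limit:
            return t
        interval = short_interval if (vsi and abs(z) > warning) else long_interval
    return t

for label, vsi in (("FSI", False), ("VSI", True)):
    ats = np.mean([time_to_signal(phi=0.7, shift=1.0, vsi=vsi) for _ in range(300)])
    print(f"{label} average time to signal: {ats:.1f}")
```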
15

A Clinical Decision Support System for the Identification of Potential Hospital Readmission Patients

Unknown Date (has links)
Recent federal legislation has incentivized hospitals to focus on quality of patient care. A primary metric of care quality is patient readmissions. Many methods exist to statistically identify the patients most likely to require hospital readmission, and correct identification of high-risk patients allows hospitals to use limited resources intelligently in mitigating readmissions. However, these methods have seen little practical adoption in the clinical setting. This research attempts to identify the many open research questions that have impeded widespread adoption of predictive hospital readmission systems. Current systems often rely on structured data extracted from health records systems; this data can be expensive and time-consuming to extract. Unstructured clinical notes are agnostic to the underlying records system and would decouple the predictive analytics system from it, but additional concerns in clinical natural language processing must be addressed before such a system can be implemented. Current systems also often perform poorly on standard statistical measures. The misclassification cost of patient readmissions has yet to be addressed, and there currently exists a gap between the evaluation metrics used for readmission systems and those most appropriate in the clinical setting. Additionally, data availability for localized model creation has yet to be addressed by the research community: large research hospitals may have sufficient data to build models, but many others do not, and simply combining data from many hospitals often results in a model that performs worse than one built from a single hospital's data. Current systems often produce a binary readmission classification, yet patients are often readmitted for reasons that differ from those of the index admission, and there exists little research into predicting the primary cause of readmission. Furthermore, co-occurring evidence discovery of clinical terms with the primary diagnosis has seen only simplistic methods applied. This research addresses these concerns to increase adoption of predictive hospital readmission systems. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2017. / FAU Electronic Theses and Dissertations Collection
16

Multivariate Quality Control Using Loss-Scaled Principal Components

Murphy, Terrence Edward 24 November 2004 (has links)
We consider a principal components based decomposition of the expected value of the multivariate quadratic loss function, i.e., MQL. The principal components are formed by scaling the original data by the contents of the loss constant matrix, which defines the economic penalty associated with specific variables being off their desired target values. We demonstrate the extent to which a subset of these "loss-scaled principal components" (LSPC) accounts for the two components of expected MQL, namely the trace-covariance term and the off-target vector product. We employ the LSPC to solve a robust design problem of full and reduced dimensionality with deterministic models that approximate the true solution and demonstrate comparable results in less computational time. We also employ the LSPC to construct a test statistic called loss-scaled T² for multivariate statistical process control. We show for one case how the proposed test statistic detects shifts in location faster than Hotelling's T² for variables with high weighting in the MQL. In addition we introduce a principal component based decomposition of Hotelling's T² to diagnose the variables responsible for driving the location and/or dispersion of a subgroup of multivariate observations out of statistical control. We demonstrate the accuracy of this diagnostic technique on a data set from the literature and show its potential for diagnosing the loss-scaled T² statistic as well.
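A hedged illustration of one plausible reading of the construction above: deviations from target are scaled by a square root of the loss constant matrix, principal components are taken of the scaled data, and a T²-style statistic is built from the leading components. The loss matrix, targets, and data are made up, so this is a sketch of the idea, not the thesis's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative setup: 3 quality characteristics, loss constant matrix C (economic penalties),
# target vector tau, and a sample of observations.  None of these values come from the thesis.
C = np.diag([4.0, 1.0, 0.25])                  # heavier penalty on the first variable
tau = np.array([10.0, 5.0, 2.0])
X = rng.multivariate_normal(mean=[10.3, 5.0, 2.0],
                            cov=[[1.0, 0.3, 0.1], [0.3, 1.0, 0.2], [0.1, 0.2, 1.0]],
                            size=200)

# Expected multivariate quadratic loss splits into a trace-covariance term and an
# off-target term:  E[(X-tau)' C (X-tau)] = tr(C S) + (xbar-tau)' C (xbar-tau).
S = np.cov(X, rowvar=False)
xbar = X.mean(axis=0)
trace_term = np.trace(C @ S)
off_target_term = (xbar - tau) @ C @ (xbar - tau)
print(f"MQL estimate: {trace_term + off_target_term:.3f} "
      f"(trace {trace_term:.3f} + off-target {off_target_term:.3f})")

# One plausible construction of loss-scaled principal components: scale deviations from
# target by a square root of C, then take eigenvectors of the scaled covariance matrix.
L = np.linalg.cholesky(C)
Z = (X - tau) @ L                               # loss-scaled deviations from target
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]               # components sorted by explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# A T^2-style statistic built from the leading k loss-scaled components.
k = 2
scores = (Z - Z.mean(axis=0)) @ eigvecs[:, :k]
t2 = np.sum(scores**2 / eigvals[:k], axis=1)
print(f"loss-scaled T^2 for the first 5 observations: {np.round(t2[:5], 2)}")
```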
17

An analysis of the California State Department of Parks and Recreation's "Quality Management Program"

Turney, Celena 01 January 1997 (has links)
No description available.
18

An empirical investigation of the extension of servqual to measure internal service quality in a motor vehicle manufacturing setting

Booi, Arthur Mzwandile January 2004 (has links)
This research explores the role that the construct of service quality plays in an internal marketing setting. This is achieved by evaluating the perceptions and expectations of the production department with regard to the service quality provided by the maintenance department of a South African motor vehicle manufacturer. This was done using the INTSERVQUAL instrument, which was found to be a reliable instrument for measuring internal service quality within this context. A positivist approach was adopted in conducting this research. There are two main hypotheses for this study: the first concerns the relationship between overall internal service quality and the five dimensions of service quality, namely tangibles, empathy, reliability, responsiveness and assurance. The second focuses on the relationship between the front line staff segments of the production department and the five dimensions of internal service quality. The results of this research suggest that the perceptions and expectations of internal service customer segments play a major role in achieving internal service quality. In addition, the importance of the INTSERVQUAL instrument in measuring internal service quality within the motor vehicle manufacturing environment is confirmed.
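A small sketch of a SERVQUAL-style gap analysis of the kind the abstract describes, with made-up respondent scores and an assumed item-to-dimension grouping; the INTSERVQUAL instrument's actual items and weighting are not reproduced here.

```python
import numpy as np

# Illustrative gap analysis: perception minus expectation scores (7-point scale)
# averaged within each service-quality dimension.  All values below are invented.
dimensions = {
    "tangibles":      {"expectation": [6.2, 6.0, 5.8], "perception": [5.1, 5.4, 5.0]},
    "reliability":    {"expectation": [6.5, 6.4, 6.6], "perception": [5.0, 5.2, 4.9]},
    "responsiveness": {"expectation": [6.3, 6.1, 6.2], "perception": [5.5, 5.6, 5.3]},
    "assurance":      {"expectation": [6.0, 6.1, 5.9], "perception": [5.7, 5.8, 5.6]},
    "empathy":        {"expectation": [5.9, 5.8, 6.0], "perception": [5.2, 5.1, 5.4]},
}

gaps = {}
for name, scores in dimensions.items():
    gap = np.mean(scores["perception"]) - np.mean(scores["expectation"])
    gaps[name] = gap
    print(f"{name:15s} gap = {gap:+.2f}")

# Negative gaps mean perceived internal service falls short of expectations;
# the overall score here is an unweighted mean across the five dimensions.
print(f"overall gap = {np.mean(list(gaps.values())):+.2f}")
```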
19

An Organizational Informatics Analysis of Colorectal, Breast, and Cervical Cancer Screening Clinical Decision Support and Information Systems within Community Health Centers

Carney, Timothy Jay 06 March 2013 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / A study design has been developed that employs a dual modeling approach to identify factors associated with facility-level cancer screening improvement and how this improvement is mediated by the use of clinical decision support. This dual modeling approach combines principles of (1) Health Informatics, (2) Cancer Prevention and Control, (3) Health Services Research, and (4) Organizational Change/Theory. The study design builds upon the constructs of a conceptual framework developed by Jane Zapka, namely (1) organizational and/or practice settings, (2) provider characteristics, and (3) patient population characteristics. These constructs have been operationalized as measures in a 2005 HRSA/NCI Health Disparities Cancer Collaborative inventory of 44 community health centers. The first, statistical, models will use sequential multivariable regression to test for the organizational determinants that may account for the presence and intensity of use of clinical decision support (CDS) and information systems (IS) within community health centers for colorectal, breast, and cervical cancer screening; a subsequent test will assess the impact of CDS/IS on provider-reported cancer screening improvement rates. The second, computational, models will use a multi-agent model of network evolution called CONSTRUCT® to identify the agents, tasks, knowledge, groups, and beliefs associated with cancer screening practices and CDS/IS use, informing both CDS/IS implementation and cancer screening intervention strategies. This virtual experiment will facilitate hypothesis generation through computer simulation exercises. The outcome of this research will be to identify barriers and facilitators to improving community health center facility-level cancer screening performance using CDS/IS as an agent of change. Stakeholders for this work include both national and local community health center IT leadership, as well as clinical managers deploying IT strategies to improve cancer screening among vulnerable patient populations.
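A minimal sketch of a sequential (block-wise) multivariable regression of the kind described for the statistical models, using simulated placeholder data rather than the HRSA/NCI inventory; the block names mirror the Zapka constructs, and the outcome variable stands in for CDS/IS intensity of use.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated placeholder data for the sketch: predictor blocks are entered one at a
# time and the incremental R^2 is tracked.  None of these measures are the real ones.
n = 44                                               # the abstract mentions 44 community health centers
org      = rng.normal(size=(n, 3))                   # organizational / practice-setting measures
provider = rng.normal(size=(n, 2))                   # provider characteristics
patient  = rng.normal(size=(n, 2))                   # patient population characteristics
cds_intensity = (org @ [0.5, 0.3, 0.2] + provider @ [0.4, 0.1]
                 + patient @ [0.2, 0.1] + rng.normal(scale=0.8, size=n))

def r_squared(X, y):
    """Ordinary least squares R^2 with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

blocks = [("organizational", org), ("+ provider", provider), ("+ patient", patient)]
X, prev = np.empty((n, 0)), 0.0
for name, block in blocks:
    X = np.column_stack([X, block])
    r2 = r_squared(X, cds_intensity)
    print(f"{name:15s} R^2 = {r2:.3f}  (increment {r2 - prev:+.3f})")
    prev = r2
```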
