321 |
Real time quality control for hydrometeorological data / Kotwica, Kyle, 26 November 1996 (has links)
This thesis investigates the feasibility of implementing a real-time quality control program in a stream of hydrometeorological data. The vast array of data used in forecasting river levels and avalanches calls for a point-of-entry quality control method that is both efficient from a communications standpoint and practical given the available computer resources.
The first step in this process is to find a normalization scheme that enables direct comparison of precipitation events between different stations. The normalization scheme derived uses the climatic database of historical records. The largest set of historical records available is in the daily time frame, but the quick response needed in this type of forecasting requires testing data in an hourly format. A transformation between events of differing time scales must therefore be developed.
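Purely as an illustration of what such a normalization and time-scale transformation can look like (the actual scheme is derived in the thesis from the climatic database; the symbols and the scaling functions below are assumptions), an accumulation can be expressed as a standardized anomaly against the station's own climatology, with hourly statistics estimated from the daily record:

```latex
% Illustrative form only: mu_T and sigma_T are the station's climatological mean and
% standard deviation for accumulation period T; f and g are empirical scaling functions
% (assumed here, not taken from the thesis) relating daily to hourly statistics.
z_T = \frac{x_T - \mu_T}{\sigma_T},
\qquad
\mu_{\text{hourly}} \approx f(\mu_{\text{daily}}),
\quad
\sigma_{\text{hourly}} \approx g(\sigma_{\text{daily}}).
```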
Once the normalization scheme is in place, four tests are used to analyze the data. These tests compare the incoming data to what is expected given the climate, the forecasted value, the previous weather, and what is occurring at neighboring stations. The results of the four tests are composited into a final opinion of the validity of the incoming data. The data are then assigned two descriptive parameters, which quantify the sophistication of the tests performed on the data and the believed accuracy of the data. The two scores are combined to give a final broad description of the program's "opinion" as to whether the data should be rejected, questioned, screened, or verified.
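The abstract does not give the tests' internal thresholds or weighting, so the following is only a minimal sketch of the compositing step it describes; the score ranges, weights, and cutoff values are hypothetical placeholders, not the thesis's parameters.

```python
# Minimal sketch of compositing four QC tests into a single verdict.
# The cutoffs below are illustrative placeholders, not values from the thesis.

def composite_qc(climate_score, forecast_score, persistence_score, spatial_score):
    """Each *_score is assumed to lie in [0, 1], 1 meaning 'fully consistent';
    None means the corresponding test could not be run."""
    scores = [climate_score, forecast_score, persistence_score, spatial_score]
    performed = [s for s in scores if s is not None]

    sophistication = len(performed) / len(scores)              # how much testing was possible
    accuracy = sum(performed) / len(performed) if performed else 0.0

    # Broad verdict from the two descriptive parameters (cutoffs are assumptions).
    if sophistication < 0.5:
        verdict = "screened"        # too few tests to say much
    elif accuracy >= 0.75:
        verdict = "verified"
    elif accuracy >= 0.4:
        verdict = "questioned"
    else:
        verdict = "rejected"
    return sophistication, accuracy, verdict


print(composite_qc(0.9, 0.8, None, 0.7))   # roughly (0.75, 0.8, 'verified')
```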
Generally, the program performs very well, although the accuracy and precision of the tests are left somewhat vague at this point. The emphasis in development was on the modularity and portability of the program; the testing scheme is not meant to be limited to flood forecasting or even to precipitation data. The threshold parameters therefore need to be set by the end user and will be defined by the type of data as well as the purpose and accuracy of the data checking needed. / Graduation date: 1997
|
322 |
The volunteer experience: predictors of success in the long-term care Ombudsman role / DeHart, Kimberly N., 17 August 1999 (has links)
This study explored the influence of motivations on the volunteer experience. The relationships among motivations, volunteer satisfaction, acceptance and support of the organizational goals, and outcomes of success in the volunteer role (pattern of participation and ombudsman effectiveness) were explored using multiple linear regression analyses. Motivational Systems Theory (Ford, 1992) was applied to the investigation of relationships among these variables. It was proposed that alignment between the individual volunteer's motivations and the organization's goals should predict higher levels of satisfaction, organizational commitment, and success.
Psychological aspects of the volunteer experience proved valuable to the explanation of certain indicators of success in the Ombudsman role. The rates of case reporting and the time devoted to the Ombudsman role seemed to be influenced by the importance of particular motivations toward volunteerism, the extent to which these motivations are fulfilled by involvement with the Ombudsman program, and the commitment expressed toward the organization.
Communal (offering) motivations were rated among the most important by the majority of volunteers. However, satisfaction scores were higher for both the agentic and the affiliation motivational factors than for the communal motivational factor. Overall, Ombudsmen were least motivated by motivations characterized as agentic or self-oriented. Volunteers with lower importance ratings for agentic motivations had moderately higher reporting rates than did participants attributing greater importance to these self-oriented motivations.
Volunteers expressed high levels of organizational commitment and overall satisfaction in the role. The more committed these participants were to the organization, the more likely they were to experience satisfaction in their roles, and the more likely they were to express high levels of importance for all factors of motivation in this model. A significant effect was found for the influence of organizational commitment on time commitment, case reporting, and the frequency of visits. Motivational Systems Theory was found to be a useful framework for analyzing the effects of personal characteristics and psychological aspects of the volunteer experience on success and satisfaction in the Ombudsman role. / Graduation date: 2000
|
323 |
Verification of dose calculations in radiotherapy / Nyholm, Tufve, January 2008 (has links)
External radiotherapy is a common treatment technique for cancer. It has been shown that radiation therapy is both a clinically and an economically effective treatment for many types of cancer, even though the equipment is expensive. The technology is in constant evolution, and more and more sophisticated and complex techniques are being introduced. One of the main tasks for physicists at a radiotherapy department is quality control, i.e. making sure that the treatments are delivered in accordance with the dosimetric intentions. Overdosage of radiation can lead to severe side effects, while underdosage reduces the probability of patient cure. The present thesis is mainly focused on the verification of the calculated dose. Requirements for independent dose calculation software are identified, and the procedures using such software are described. In the publications included in the thesis, an algorithm specially developed for verification of dose calculations is described and tested. The calculation uncertainties connected with the described algorithm are investigated and modeled. A brief analysis of the quality assurance procedures available and used in external radiotherapy is also included in the thesis. The main conclusion of the thesis is that independent verification of the dose calculations is feasible in an efficient and cost-effective quality control system. The independent calculations not only serve as a protection against accidents but can also be the basis for comparisons of dose calculation performance at different clinics.
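As a hedged illustration of what such an independent check can look like in practice (the tolerance, the uncertainty model, and the function name below are assumptions, not Nyholm's algorithm), a point-dose comparison can be reduced to a deviation test against an action level:

```python
# Illustrative sketch of an independent dose-verification check.
# The 5% floor and the quadrature combination of uncertainties are placeholder
# assumptions for illustration, not values from the thesis.
import math

def verify_point_dose(d_tps, d_independent, u_tps=0.02, u_indep=0.03, n_sigma=2.0):
    """Compare the treatment-planning-system dose with an independently calculated
    dose and flag the plan if the deviation exceeds what the combined calculation
    uncertainty can explain."""
    deviation = (d_independent - d_tps) / d_tps              # relative difference
    combined_u = math.hypot(u_tps, u_indep)                  # assumed independent uncertainties
    action_level = max(0.05, n_sigma * combined_u)           # never tighter than 5%
    return abs(deviation) <= action_level, deviation, action_level

ok, dev, level = verify_point_dose(d_tps=2.00, d_independent=2.07)   # doses in Gy
print(ok, round(dev, 3), round(level, 3))                            # True 0.035 0.072
```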
|
324 |
Automated Morphology Analysis of Nanoparticles / Park, Chiwoo, August 2011 (has links)
The functional properties of nanoparticles depend highly on the surface morphology of the particles, so precise measurements of a particle's morphology enable reliable characterization of the nanoparticle's properties. Obtaining these measurements requires image analysis of electron microscope pictures of nanoparticles. Today's labor-intensive image analysis of electron micrographs of nanoparticles is a significant bottleneck for efficient material characterization. The objective of this dissertation is to develop automated morphology analysis methods.
Morphology analysis comprises three tasks: separating individual particles from an agglomerate of overlapping nano-objects (image segmentation); inferring each particle's missing contours (shape inference); and ultimately, classifying the particles by shape based on their complete contours (shape classification). Two approaches are proposed in this dissertation: the divide-and-conquer approach and the convex shape analysis approach. The divide-and-conquer approach solves each task separately, taking less than one minute to complete the required analysis even for the largest micrograph. However, its capability for separating particle overlaps is limited, meaning that it can split only touching particles. The convex shape analysis approach solves shape inference and classification simultaneously for better accuracy but requires more computation time, ten minutes for the largest electron micrograph. With this modest sacrifice of time efficiency, the second approach achieved far better separation than the divide-and-conquer approach and handled the chain-linked structure of particle overlaps well.
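A minimal skeleton of the three-stage flow described above is sketched below; the function names and placeholder bodies are hypothetical, intended only to make the segmentation / shape inference / shape classification pipeline concrete, and do not reproduce the dissertation's algorithms.

```python
# Hypothetical skeleton of the morphology-analysis pipeline
# (segmentation -> shape inference -> shape classification).
from dataclasses import dataclass

@dataclass
class Particle:
    contour: list          # (x, y) points, possibly with gaps where particles overlap
    shape_label: str = ""

def segment(micrograph):
    """Split an agglomerate of overlapping nano-objects into candidate particles."""
    return [Particle(contour=blob) for blob in micrograph]    # placeholder

def infer_shape(particle):
    """Fill in contour segments hidden by overlapping neighbours."""
    particle.contour = sorted(particle.contour)               # placeholder completion
    return particle

def classify(particle):
    """Assign a shape class (e.g. sphere or rod) from the completed contour."""
    particle.shape_label = "sphere" if len(particle.contour) > 8 else "rod"
    return particle

def analyze(micrograph):
    return [classify(infer_shape(p)) for p in segment(micrograph)]

print(analyze([[(0, 0), (1, 2), (2, 1)]]))   # one toy 'blob' -> one classified Particle
```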
The capabilities of the two proposed methods cannot be matched by generic image processing and bio-imaging methods, owing to the unique features of electron microscope pictures of nanoparticles, including special particle overlap structures and the large number of particles to be processed. Application of the proposed methods to real electron microscope pictures showed that they extracted morphology information more capably than state-of-the-art methods. When nanoparticles did not have many overlaps, the divide-and-conquer approach performed adequately. When nanoparticles had many overlaps, forming chain-linked clusters, the convex shape analysis approach performed much better than the state-of-the-art alternatives in bio-imaging. The author believes that the capabilities of the proposed methods expedite the morphology characterization process for nanoparticles, and further conjectures that their technical generality could make them a competent alternative to current methods for analyzing general overlapping convex-shaped objects other than nanoparticles.
|
325 |
Comparison of response surface model and Taguchi methodology for robust design / Sudasna-na-Ayudthya, Prapaisri, 01 December 1992 (has links)
The principal objective of this study was to compare the results of a proposed method based upon the response surface model with those of the Taguchi method. To modify the Taguchi method, the proposed model was developed to encompass the following objectives. The first, with the exception of the Taguchi inner array, was to obtain optimal design variable settings with minimum variation while achieving the target value of nominal-the-best performance quality characteristics. The second was to eliminate the need for a noise matrix (that is, the Taguchi outer array), resulting in a significant reduction of the number of experimental runs required to implement the model. The final objective was to provide a method whereby signal-to-noise ratios could be eliminated as performance statistics. To implement the proposed method, a central composite design (CCD) experiment was selected as a second-order response surface design for the estimation of mean response functions. A Taylor series expansion was applied to obtain estimated variance expressions for the fitted second-order model. Performance measures, including mean squared error, bias, and variance, were obtained by simulations at optimal settings.
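As a hedged illustration of the kind of variance expression a Taylor series expansion yields for a fitted second-order model (a generic propagation-of-error form; whether it matches the thesis's exact derivation is an assumption), variation in the input variables can be transmitted through the fitted surface:

```latex
% Generic sketch, not necessarily the thesis's exact expressions.
% x_i are input variables with variances sigma_{x_i}^2, Sigma_x = diag(sigma_{x_i}^2).
\hat{y}(\mathbf{x}) = \hat\beta_0 + \mathbf{x}^{\top}\hat{\mathbf{b}} + \mathbf{x}^{\top}\hat{\mathbf{B}}\,\mathbf{x},
\qquad
\widehat{\operatorname{Var}}[y(\mathbf{x})]
  \approx \sum_{i}\left(\frac{\partial \hat{y}}{\partial x_i}\right)^{2}\sigma_{x_i}^{2}
        + \hat\sigma^{2}
  = \bigl(\hat{\mathbf{b}} + 2\hat{\mathbf{B}}\mathbf{x}\bigr)^{\top}
    \boldsymbol{\Sigma}_{x}
    \bigl(\hat{\mathbf{b}} + 2\hat{\mathbf{B}}\mathbf{x}\bigr)
        + \hat\sigma^{2}.
```

Choosing settings that minimize such an estimated variance while the fitted mean meets the nominal-the-best target is consistent with the stated objectives of dispensing with the outer noise array and with signal-to-noise ratios as performance statistics.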
Nine test problems were developed to test the accuracy of the proposed CCD method, and statistical comparisons of the proposed method with the Taguchi method were performed. Experimental results indicated that the proposed response surface model can provide significant improvement in product quality. Moreover, because it reduces the number of experimental runs required by the Taguchi method, the CCD method allows lower-cost process design. / Graduation date: 1993
|
326 |
Development of dosimetry using detectors of diagnostic digital radiography systems / Ariga, Eiji; Ito, Shigeki; Deji, Shizuhiko; Saze, Takuya; Nishizawa, Kunihide, 01 1900 (has links)
No description available.
|
327 |
Analysis of dynamic robust design experiment and modeling approach for degradation testing / Bae, Suk Joo, 01 December 2003 (has links)
No description available.
|
328 |
Development of User Interface for Multibeam Echo Sounder Quality Control / Hu, Shing-wen, 23 July 2007 (has links)
Multi-beam echo sounder systems have been around now for some 40 years, and they have been used in shallow waters for the last 14 years. With modern shallow-water systems running at up to 9,600 soundings/second, data collection at a rate of approximately 250 million soundings/day per system is possible. Processing of multibeam echo sounder (MBES) data is a challenging task from both hydrographic and technological perspectives. We recognize that a completely automatic system is improbable, but propose that significant benefits can still be had if we can automatically process good-quality data and highlight areas that probably need further attention.
We propose an algorithm that takes uncleaned MBES data and attempts to pick out as many outliers as possible. The traditional method still in use today by numerous software applications is based on a line-by-line processing approach. Automatic filtering by depth window, beam number, slope between points, quality flags, and, recently, by whether the beam's error is outside the IHO order for the survey are a number of ways in which the line-by-line approach has been sped up. The fundamental differences between our method and the previous methods are that our algorithm does not actually delete any soundings at all and that it transforms the original one-dimensional information into two dimensions. Finally, we use hierarchical clustering to classify MBES data into outliers and normal soundings.
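A minimal sketch of the final classification step is shown below, assuming SciPy's agglomerative clustering and a toy two-dimensional feature per sounding; the feature choice, the two-cluster cut, and the "smaller cluster = outliers" rule are illustrative assumptions, not the thesis's exact transform.

```python
# Illustrative sketch: flag MBES soundings as outliers via hierarchical clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def flag_outliers(features_2d):
    """features_2d: (n_soundings, 2) array of per-sounding features."""
    Z = linkage(features_2d, method="ward")          # agglomerative clustering
    labels = fcluster(Z, t=2, criterion="maxclust")  # cut the dendrogram into 2 groups
    # Treat the smaller group as the outlier class; nothing is deleted, only flagged.
    counts = np.bincount(labels)
    outlier_label = np.argmin(counts[1:]) + 1
    return labels == outlier_label

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 0.1, size=(200, 2))          # dense cloud of normal soundings
spikes = rng.normal(3.0, 0.2, size=(5, 2))           # a few gross depth errors
flags = flag_outliers(np.vstack([clean, spikes]))
print(flags.sum(), "soundings flagged")              # expected: roughly 5
```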
We develop a user interface for multi-beam echo sounder quality control. It provides almost all of the tools and modules necessary to perform a survey. Standard modules are survey planning (track guidance lines, waypoints), channel design and 3D modeling, data acquisition, data QC, and data processing/flagging. It also visualizes the soundings to aid the decision-making process.
|
329 |
An investigation of an alternative to acceptance sampling through a Markov chain analysis of a manufacturing process quality control programHarrington, Daniel F. January 1990 (has links) (PDF)
Thesis (M.S. in Operations Research)--Naval Postgraduate School, September 1990. / Thesis Advisor(s): Lindsay, Glenn F. Second Reader: Bailey, Michael P. "September 1990." Description based on title screen as viewed on December 21, 2009. DTIC Identifier(s): Quality control, sampling, acceptance tests, production control, theses, mathematical models, vendors. Author(s) subject terms: Markov chain, P-chart, fraction nonconforming vs AQL. Includes bibliographical references (p. 50). Also available in print.
|
330 |
Analysis of dynamic robust design experiment and modeling approach for degradation testingBae, Suk Joo, January 2003 (has links) (PDF)
Thesis (Ph. D.)--School of Industrial and Systems Engineering, Georgia Institute of Technology, 2004. Directed by Paul H. Kvam. / Vita. Includes bibliographical references (leaves 108-113).
|