101 |
Density separation of clay minerals / Nelsen, Terry A. 30 November 1970 (has links)
Illite, chlorite, montmorillonite and kaolinite, as well as
natural marine sediments and mixtures of the standards were subjected
to density separation by centrifugation in a linear heavy-liquid
gradient.
The density layers yielded by centrifugation were recovered
and analyzed by X-ray diffraction. Separates were not monomineralic
but rather were polymineralic, usually with minerals of invariant
density, such as quartz, in more than one layer. This incomplete
separation is attributed to insufficient centrifugation time. The
X-ray character of minerals of variable density changes in such a
way as to suggest increased crystallinity with depth in the density
gradient.
Although several samples were known to contain montmorillonite,
none of the density layers showed X-ray evidence of this material. The anomalous behavior of montmorillonite is attributed
to its imbibing of the polar organic chemicals used as surfactants
into its expandable crystal structure to produce an extremely large
basal spacing. This problem can be overcome in some cases by
heating the clay to 110°C for 8 hours, but in other cases this had
little or no effect in collapsing the expanded structure.
Even though the project was not a total success, the method
holds promise, provided that the duration and intensity of the centrifugation
are increased and complete purging of polar organic molecules
from expandable layered clays can be accomplished. / Graduation date: 1971
|
102 |
Estimation of discriminant analysis error rate for high dimensional data / Lebow, Patricia K. 23 October 1992 (has links)
Methodologies for data reduction, modeling, and classification of grouped
response curves are explored. In particular, the thesis focuses on the analysis of
a collection of highly correlated, high-dimensional response-curve data of
spectral reflectance curves of wood surface features.
In the analysis, questions arise about the application of cross-validation
estimation of discriminant function error rates to data previously
transformed by principal component analysis. Performing cross-validation
requires re-calculating the principal component transformation and discriminant
functions of the training sets, a very lengthy process. A more efficient approach
of carrying out the cross-validation calculations, plus the alternative of
estimating error rates without the re-calculation of the principal component
decomposition, are studied to address questions about the cross-validation
procedure.
If populations are assumed to have common covariance structures, the
pooled covariance matrix can be decomposed for the principal component
transformation. The leave-one-out cross-validation procedure results in a rank-one
update in the pooled covariance matrix for each observation left out.
Algorithms have been developed for calculating the updated eigenstructure
under rank-one updates and they can be applied to the orthogonal
decomposition of the pooled covariance matrix. Use of these algorithms results
in much faster computation of error rates, especially when the number of
variables is large.
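The leave-one-out downdate of the covariance matrix described above can be sketched in a few lines. This is a minimal illustration for a single group rather than the thesis's pooled, multi-group algorithm, and NumPy is an assumption; the point is that removing one observation changes the scatter matrix by a single rank-one term, so nothing needs recomputing from scratch:

```python
import numpy as np

def loo_covariance_downdate(X, S, mu, j):
    """Leave-one-out mean and covariance via a rank-one downdate.

    Given the full-sample mean `mu` and unbiased covariance `S` of the
    rows of X, return the mean and covariance with row j removed,
    without revisiting the other n - 1 rows.
    """
    n = X.shape[0]
    d = X[j] - mu                      # deviation of the held-out point
    mu_out = mu - d / (n - 1)          # downdated mean
    # The sum of outer products loses d d^T scaled by n/(n-1):
    S_out = ((n - 1) * S - (n / (n - 1)) * np.outer(d, d)) / (n - 2)
    return mu_out, S_out

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))
mu = X.mean(axis=0)
S = np.cov(X, rowvar=False)            # unbiased estimate, divisor n - 1

# The downdate must match a brute-force recomputation for every j.
for j in range(X.shape[0]):
    mu_j, S_j = loo_covariance_downdate(X, S, mu, j)
    X_rest = np.delete(X, j, axis=0)
    assert np.allclose(mu_j, X_rest.mean(axis=0))
    assert np.allclose(S_j, np.cov(X_rest, rowvar=False))
```

In the thesis's setting, the eigenstructure of the downdated matrix would then be obtained by a rank-one eigenvalue-update algorithm rather than a fresh decomposition, which is where the speedup for large numbers of variables comes from.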
The bias and variance of an estimator that performs leave-one-out cross-validation
directly on the principal component scores (without re-computation
of the principal component transformation for each observation) are also
investigated. / Graduation date: 1993
|
103 |
An approach to parameter sensitivity analyses in model assessment / Wong, Cecilia Sau Yen 06 1900 (has links)
No description available.
|
104 |
A method of separation of exponentials and its relationship to time domain synthesis of a finite lumped-parameter relaxation system / Vongsuri, Sommai. January 1967 (has links)
Thesis (Ph. D.)--Oregon State University, 1967. / Typescript (photocopy). Includes bibliographical references (p. 47-48). Also available on the World Wide Web.
|
105 |
Minima of functions of lines ... / Le Stourgeon, Elizabeth. January 1900 (has links)
Thesis (Ph. D.)--University of Chicago, 1917. / "Private edition, distributed by the University of Chicago Libraries, Chicago, Illinois, 1920." "Reprinted from Transactions of the American mathematical society, Vol. 21, No. 4, October, 1920." Includes bibliographical references. Also available on the Internet.
|
106 |
An algorithm for the numerical calculation of the degree of a mapping / Stynes, Martin J. January 1977 (has links)
Thesis (Ph. D.)--Oregon State University, 1977. / Typescript (photocopy). Includes bibliographical references. Also available on the World Wide Web.
|
107 |
Development of a thin layer electrochemical cell for anodic stripping voltammetry / McGill, Jim. January 1972 (has links)
Thesis (M.S.)--Oregon Graduate Center, 1972.
|
108 |
Estimating uncertainties in integrated reservoir studies / Zhang, Guohong 30 September 2004 (has links)
To make sound investment decisions, decision makers need accurate estimates of the uncertainties present in forecasts of reservoir performance. In this work I propose a method, the integrated mismatch method, that incorporates the misfit in the history match into the estimation of uncertainty in the prediction. I applied the integrated mismatch method, which overcomes some deficiencies of existing methods, to uncertainty estimation in two reservoir studies and compared results to estimations from existing methods. The integrated mismatch method tends to generate smaller ranges of uncertainty than many existing methods. When starting from nonoptimal reservoir models, in some cases the integrated mismatch method is able to bracket the true reserves value while other methods fail to bracket it. The results show that even when starting from a nonoptimal reservoir model, as long as the experimental designs encompass the true case parameters, the integrated mismatch method brackets the true reserves value. If the experimental designs do not encompass all the true case parameters, but the true reserves value is covered by the experiments, the integrated mismatch method may still bracket the true case. This applies if there is a strong correlation between mismatch and closeness to the true reserves value. The integrated mismatch method does not need a large number of simulation runs for the uncertainty analysis, while some other methods need hundreds of runs.
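One plausible reading of the integrated mismatch idea can be sketched as follows. The Gaussian-likelihood weight exp(-M/2), the function name, and the toy ensemble are all assumptions for illustration, not the thesis's exact formulation; the point is simply that forecasts from poorly matched models should contribute less to the uncertainty range:

```python
import numpy as np

def mismatch_weighted_interval(forecasts, mismatches, lo=0.1, hi=0.9):
    """Uncertainty interval that folds history-match misfit into the forecast.

    Each model in the ensemble contributes its reserves forecast weighted by
    how well it matched history: a Gaussian-likelihood weight exp(-M/2) for
    mismatch M. Percentiles are then read off the weighted distribution.
    """
    w = np.exp(-0.5 * np.asarray(mismatches))
    w = w / w.sum()                      # normalise to a probability mass
    order = np.argsort(forecasts)
    f = np.asarray(forecasts, dtype=float)[order]
    cdf = np.cumsum(w[order])            # weighted cumulative distribution
    return np.interp([lo, hi], cdf, f)

# Toy ensemble: reserves forecasts from experimental-design runs
# together with their history-match mismatches (good matches are low).
forecasts  = [95.0, 102.0, 110.0, 88.0, 105.0]
mismatches = [0.5,   0.2,   4.0,  6.0,  0.8]
low, high = mismatch_weighted_interval(forecasts, mismatches)
assert low < high
```

Because the weights come from runs already performed for the history match, no additional simulations are needed, consistent with the abstract's remark about avoiding hundreds of extra runs.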
|
109 |
Mobile Imaging: A Market Analysis : MBA thesis in marketing / Svensson, J Håkan; Abbas, Fadi. January 2008 (has links)
Abstract Camera phones are moving into the rapid growth stage and will rapidly become the most common image capture device in the world. Analysis agencies Gartner, ABI-Research and Future Image estimate that over 650 million camera phones were shipped in 2007 and that by the end of the decade there will be a global population of over one billion mobile imaging handsets -- more than double the number of digital still cameras (DSC). Although handset shipments are flourishing, consumers are not using their camera phones to the fullest extent. Researchers reveal that the number of photos taken, shared, and printed is very low relative to DSCs, leaving significant unrealised revenue potential for the mobile imaging industry. Despite the current limitations, recent research by Nokia marketing reveals that 40% of camera phone users consider the camera phone their primary camera. The research suggests that with improvements in functionality, quality, usability and usage model, camera phones have the potential to be the most common and most frequently used type of camera. We believe the industry needs to stimulate more photo activity among camera phone owners to speed up (1) handset purchases, (2) picture taking, and (3) sharing, storing and printing services. To achieve this, we believe companies with niche imaging technologies, such as faster decoding, lower memory usage, modest processor (CPU) demands, rich features, and a rich user experience, have a somewhat higher chance of standing out in this market. Companies within the mobile imaging sphere that can solve these inhibitors should likewise have a competitive edge. Five out of every six cameras sold will be embedded in mobile phones. We believe the dominance of camera phones will impact the imaging market in a variety of ways that will benefit the industry.
This report aims to provide a simple, high-level view for companies searching for mobile imaging opportunities, and should help extend thinking about the mobile focus area. Initiating a study on mobile imaging has been a challenge for two main reasons: the speed of development within this particular industry, and access to credible sources, whether commercial or scientific. Competition in the industry is fierce, and it has been a great advantage to the authors to have had first-hand access to commercial reports and information sources. The theories and methods have been drawn from both the mainstream marketing literature and guerrilla marketing. There was no obvious reason to exclude mainstream marketing theories for this fast-growing and quickly changing industry, and the methods described have proven worthwhile for the outcome of this study. The mainstream marketing literature has been utilised in the market analysis performed on the material obtained, and when looking at future possibilities and opportunities as well. The MIO model, or MIO perspective, has been an excellent tool for digesting the information in a structured way; the three perspectives that form the foundation of the model -- Market, Interaction and Organisation -- are all needed in any successful marketing activity, whether in a fast-moving business like mobile imaging or a more traditional industry such as the car industry. The model identifies the present situation, the future, the strategy and the action plan, all important components in forming the business plan. When describing the present environment and when searching for new opportunities, the classic 4Ps are an outstanding tool.
Product, Price, Promotion and Place are all important parameters to elaborate on, and as the MIO model points out, one should first focus on the industry as a whole, not one's own enterprise, in order to find profitable ways to develop the business. Some conclusions drawn from the study are: as camera phones with ever more megapixels are released, customers' awareness and education are raised and refined. Customers thus become better at getting the best out of their phones, and the myth of the free digital camera no longer holds. Customers are becoming better informed, and they demand a single converged high-quality device in which the camera is as important as the phone's call functions.
|
110 |
Hypervelocity spectroscopy from impacts in polymeric materials / Buettner, Douglas J. 30 April 1991 (has links)
Stacked sheets of polyethylene terephthalate and polystyrene provide a means
for recovering projectiles travelling at hypervelocities. The transparency of these
multiple diaphragms is utilized so that light generated from hypervelocity impacts
can be studied.
A method for gathering visible as well as ultraviolet light from less
transparent polymer foams has been confirmed. Distinct spectral characteristics
as well as a blackbody temperature can now be used as tools for characterizing the
response of polymer foams to hypervelocity impact.
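The blackbody-temperature diagnostic mentioned above can be illustrated with Wien's displacement law, a standard textbook relation; the thesis presumably fits the full measured spectrum rather than just its peak, so this is only a sketch of the idea:

```python
# Wien's displacement law: lambda_max * T = b, where b ~ 2.898e-3 m*K,
# relates the peak wavelength of a blackbody spectrum to its temperature.
WIEN_B = 2.897771955e-3  # m*K

def blackbody_temperature(peak_wavelength_m):
    """Temperature (K) of a blackbody whose emission peaks at this wavelength."""
    return WIEN_B / peak_wavelength_m

# An impact flash whose spectrum peaks near 600 nm would correspond
# to a temperature of roughly 4800 K.
T = blackbody_temperature(600e-9)
assert 4500 < T < 5000
```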
The identification of newly discovered excited atomic transitions from the
incident projectile suggests a method for observing the thermal history of the
projectile as it progresses through the capture medium. / Graduation date: 1991
|