31. Regression methods in multidimensional prediction and estimation. Björkström, Anders (January 2007)
In regression with near-collinear explanatory variables, the least squares predictor has large variance, and ordinary least squares regression (OLSR) often leads to unrealistic regression coefficients. Several regularized regression methods have been proposed as alternatives; well-known examples are principal components regression (PCR), ridge regression (RR) and continuum regression (CR). The latter two involve a continuous metaparameter, offering additional flexibility. For a univariate response variable, CR incorporates OLSR, partial least squares regression (PLSR) and PCR as special cases, obtained at particular values of the metaparameter. CR is also closely related to RR. However, CR can in fact yield regressors that vary discontinuously with the metaparameter, so the relation between CR and RR is not always one-to-one. We develop a new class of regression methods, LSRR, essentially the same as CR but without discontinuities, and prove that any optimization principle will yield a regressor proportional to a RR, provided only that the principle implies maximizing some function of the regressor's sample correlation coefficient and its sample variance. For a multivariate response vector we demonstrate that a number of well-established regression methods are related, in that they are special cases of essentially one general procedure. We try a more general method based on this procedure, with two metaparameters. In a simulation study we compare this method to ridge regression, multivariate PLSR and repeated univariate PLSR. For most types of data studied, all methods perform approximately equally well. There are cases where RR and LSRR yield larger errors than the other methods, and we conclude that one-factor methods are not adequate for situations where more than one latent variable is needed to describe the data. Among the methods based on latent variables, none of those tried is superior to the others in any obvious way.
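To make the contrast concrete, here is a minimal sketch, not taken from the thesis, that compares OLSR, RR and PLSR on simulated near-collinear data; the simulation setup, variable names and regularization settings are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, p = 50, 10

# Near-collinear predictors: every column is a noisy copy of one latent factor.
latent = rng.normal(size=(n, 1))
X = latent + 0.05 * rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(scale=0.5, size=n)

models = {
    "OLSR": LinearRegression(),
    "RR (alpha=1.0)": Ridge(alpha=1.0),
    "PLSR (1 component)": PLSRegression(n_components=1),
}
for name, model in models.items():
    # Cross-validated R^2: the regularized methods should be more stable than
    # OLSR when the columns of X are nearly collinear.
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R^2 = {scores.mean():.3f}")
```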
32. The Impacts of Competence and Knowledge Transfer Climate on ERP Knowledge Transfer. Jou, Jau-jeng (7 February 2012)
While prior studies on ERP implementation have largely focused on the importance of best practices, this paper examines the impact of the knowledge transfer climate and the competence of the players (i.e., the implementing firm and the consultant team). The model divides the factors that influence the outcome of knowledge transfer during ERP implementation into three categories: those attributable to the implementing firm, those attributable to the consultant team, and those related to the knowledge transfer climate. Competence factors from the first two categories facilitate the building of a better knowledge transfer climate. Survey results from 101 respondents were subjected to multivariate analysis. The significance of the player competence factors is verified, and an understanding is developed of the role the knowledge transfer climate plays in the knowledge transfer process and of its impact on that process.
This paper provides a broader, richer model of knowledge transfer networks to promote insight into successful ERP implementation. In practice, the key to effective knowledge transfer is the establishment of a positive knowledge transfer climate. To achieve a successful ERP implementation, practitioners should ensure that they and their ERP implementation partners possess robust competences. Additional research may help ERP implementation project teams promote knowledge transfer more effectively, in a wider range of conditions, and with greater confidence and precision.
33. Acoustic Emission in Composite Laminates - Numerical Simulations and Experimental Characterization. Johnson, Mikael (January 2002)
No description available.
34. Partial least squares structural equation modelling with incomplete data: an investigation of the impact of imputation methods. Mohd Jamil, J. B. (January 2012)
Despite considerable advances in missing data imputation methods over the last three decades, the problem of missing data remains largely unsolved. Many techniques have emerged in the literature as candidate solutions. These techniques fall into two classes: statistical methods of data imputation and computational intelligence methods of data imputation. Because statistical methods have long been the standard approach to missing data problems, computational intelligence methods have been slower to gain attention, even though their accuracy is comparable. The merits of both classes have been discussed at length in the literature, but only a limited number of studies compare the two classes directly. This thesis contributes to knowledge by, firstly, conducting a comprehensive comparison of standard statistical methods of data imputation, namely mean substitution (MS), regression imputation (RI), expectation maximization (EM), tree imputation (TI) and multiple imputation (MI), on missing completely at random (MCAR) data sets. Secondly, this study compares the efficacy of these methods with a computational intelligence method of data imputation, namely a neural network (NN), on missing not at random (MNAR) data sets. The significance of the differences in performance between the methods is presented. Thirdly, a novel procedure for handling missing data is presented: a hybrid combination of each of these statistical methods with a NN, referred to here as the post-processing procedure, was adopted to approximate MNAR data sets. Simulation studies for each of these imputation approaches have been conducted to assess the impact of missing values on partial least squares structural equation modelling (PLS-SEM), based on the estimated accuracy of both structural and measurement parameters. The best method for dealing with each missing data mechanism is identified. Several significant insights were deduced from the simulation results. For MCAR data, MI performs better than the other statistical methods at all percentages of missing data. Another unique contribution emerges when comparing the results before and after the NN post-processing procedure. The improvement in accuracy may result from the neural network's ability to derive meaning from the imputed data sets produced by the statistical methods. Based on these results, the NN post-processing procedure is capable of helping MS produce a significant improvement in the accuracy of the approximated values. This is a promising result, as MS is the weakest method in this study. This evidence is also informative, as MS is often the default method available to users of PLS-SEM software.
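As a rough illustration of the kind of comparison described above, the sketch below contrasts mean substitution with a regression-based iterative imputer on an artificially masked MCAR data set; the data generation, missingness rate and imputer settings are assumptions for illustration, not the thesis's design, and the NN post-processing step is not reproduced.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer, SimpleImputer

rng = np.random.default_rng(1)
n, p = 200, 5
cov = np.full((p, p), 0.6) + 0.4 * np.eye(p)           # correlated variables
X_true = rng.multivariate_normal(np.zeros(p), cov, size=n)

# Introduce 20% missing-completely-at-random (MCAR) values.
mask = rng.random(X_true.shape) < 0.2
X_missing = X_true.copy()
X_missing[mask] = np.nan

imputers = {
    "MS (mean substitution)": SimpleImputer(strategy="mean"),
    "Regression-based iterative imputer": IterativeImputer(max_iter=20, random_state=1),
}
for name, imputer in imputers.items():
    X_imp = imputer.fit_transform(X_missing)
    # Score each method by RMSE on the cells that were masked out.
    rmse = np.sqrt(np.mean((X_imp[mask] - X_true[mask]) ** 2))
    print(f"{name}: RMSE on imputed cells = {rmse:.3f}")
```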
35. Analysis of Additive Risk Model with High Dimensional Covariates Using Partial Least Squares. Zhou, Yue (9 June 2006)
In this thesis, we consider the problem of constructing an additive risk model based on right-censored survival data to predict the survival times of cancer patients, especially when the dimension of the covariates is much larger than the sample size. For microarray gene expression data, the number of gene expression levels is far greater than the number of samples. Such "small n, large p" problems have attracted researchers to investigate the association between cancer patient survival times and gene expression profiles in recent years. We apply partial least squares to reduce the dimension of the covariates and obtain the corresponding latent variables (components), and these components are used as new regressors to fit the extended additive risk model. We also employ the time-dependent AUC curve (area under the receiver operating characteristic (ROC) curve) to assess how well the model predicts survival time. Finally, the approach is illustrated by re-analysis of the well-known AML and breast cancer data sets. The results show that the model fits both data sets very well.
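A minimal sketch of this two-step idea on simulated data is shown below; it uses Aalen's additive hazards model from the lifelines package as a stand-in for the additive risk model, takes the observed follow-up time as the PLS response (which ignores censoring), and makes up the data dimensions, so it should be read as an illustration of the workflow rather than the thesis's method.

```python
import numpy as np
import pandas as pd
from lifelines import AalenAdditiveFitter
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
n, p, k = 60, 500, 3                       # "small n, large p": 60 patients, 500 genes

X = rng.normal(size=(n, p))                                  # gene expression matrix
event_time = rng.exponential(scale=np.exp(0.3 * X[:, 0]))    # latent survival times
cens_time = rng.exponential(scale=2.0, size=n)               # censoring times
T = np.minimum(event_time, cens_time)                        # observed follow-up
E = (event_time <= cens_time).astype(int)                    # event indicator

# Step 1: PLS reduces the p covariates to k latent components (regressing on T
# is a simplification; censoring needs more careful handling in practice).
pls = PLSRegression(n_components=k).fit(X, T)
Z = pls.transform(X)

# Step 2: fit an additive hazards model on the components.
df = pd.DataFrame(Z, columns=[f"comp{i + 1}" for i in range(k)])
df["T"], df["E"] = T, E
aaf = AalenAdditiveFitter()
aaf.fit(df, duration_col="T", event_col="E")
print(aaf.cumulative_hazards_.tail())      # time-varying cumulative coefficients
```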
36. PRODUCT MANAGEMENT AS FIRM CAPABILITY. Roach, David (22 August 2011)
Product management as an organizational system has a long history of practice, which predates most modern academic management research. Its activities span the external environment of the firm, while simultaneously spanning the internal functional specialties of the organization. Thus product management obtains, codifies, simplifies and stores external information, making it available to a responsive organization, which uses it to establish competitive advantage and ultimately superior performance.
Building on the resource-based view of the firm and boundary theory, these spanning activities, which are heterogeneously dispersed across firms, are considered organizational capabilities. Drawing upon the extant product management literature, this research uses product management as a proxy for the boundary spanning capabilities of the firm. These capabilities are then empirically measured against two well-established firm capabilities: market orientation and firm-level innovativeness.
This research addresses a gap in the literature by establishing product management as a set of firm-level capabilities, distinct from the well-established constructs of market orientation and innovativeness. Results indicate that external product management capability, defined as channel bonding activities, fully mediates the market orientation – firm performance relationship, while firm-level innovativeness continues to have a small mediating effect on performance. Internal product management capabilities, defined as market and technical integration, are shown to negatively moderate the external product management capability – firm performance relationship.
Theoretical implications include establishing a link between boundary theory and the resource-based view of the firm. Practical implications include the strong relationship between external spanning capabilities and firm performance and the dampening effect of cross-functional integration on firm performance. This empirical link between product management boundary spanning practices and how firms ultimately perform could assist practitioners in allocating resources and managing the relationship between the marketing and technological factions of the organization. Most importantly, this research establishes the hitherto untested link between product management capability and firm performance.
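For readers unfamiliar with the mediation logic invoked above, the following sketch works through a basic mediation test on simulated data; the variable names, effect sizes and the plain OLS approach are illustrative assumptions and do not reproduce the thesis's structural model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200

# Simulate full mediation: market orientation -> external PM capability -> performance.
market_orientation = rng.normal(size=n)
pm_capability = 0.7 * market_orientation + rng.normal(scale=0.7, size=n)
performance = 0.6 * pm_capability + rng.normal(scale=0.7, size=n)

def ols(y, *xs):
    """Fit an OLS regression of y on the given predictors plus an intercept."""
    X = sm.add_constant(np.column_stack(xs))
    return sm.OLS(y, X).fit()

total = ols(performance, market_orientation)                  # total effect (c path)
a_path = ols(pm_capability, market_orientation)               # a path
direct = ols(performance, market_orientation, pm_capability)  # c' and b paths

print("total effect c   :", round(total.params[1], 3))
print("indirect a * b   :", round(a_path.params[1] * direct.params[2], 3))
print("direct effect c' :", round(direct.params[1], 3))       # near zero under full mediation
```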
37. Hepatic Gene Expression Profiling to Predict Future Lactation Performance in Dairy Cattle. Doelman, John (7 October 2011)
An experiment was conducted to obtain a hepatic gene expression dataset from postpubertal dairy heifers that could be fitted to a computational model capable of predicting future lactation performance values. The initial animal experiment was conducted to characterize the hepatic transcriptional response to 24-hour total feed withdrawal in one hundred and two postpubertal Holstein dairy heifers, using an 8329-gene oligonucleotide microarray in a randomized block design. Plasma concentration of non-esterified fatty acids was significantly higher, while levels of beta-hydroxybutyrate, triacylglycerol and glucose were significantly lower, with the 24-hour total feed withdrawal. In total, 505 differentially expressed genes were identified, and the microarray results were confirmed by real-time PCR. Upregulation of key gluconeogenic genes occurred despite diminished dietary substrate and lower hepatic glucose synthesis. Downregulation of ketogenic genes was contrary to the non-ruminant response to feed withdrawal, but was consistent with a lower ruminal supply of short-chain fatty acids as precursors. Following the microarray experiment, a first series of regression analyses was employed to identify relationships between gene expression signal and lactation performance measurements taken over the first lactation of 81 of the subjects from the original study. Regression models were evaluated using mean square prediction error (MSPE) and concordance correlation coefficient (CCC) analysis. The strongest validated stepwise regression models were constructed for milk protein percentage (r = 0.04) and lactation persistency (r = 0.09). To determine whether another type of regression analysis would better predict lactation performance, partial least squares (PLS) regression analysis was then applied. Selection of gene expression data was based on an assessment of the linear dependence of all genes in the normalized datasets for the 81 subjects against 18 dairy herd index (DHI) variables, using Pearson correlation analysis. Results were distributed into two lists based on correlation coefficient. Each gene expression dataset was used to construct PLS models for the purpose of predicting lactation performance. The strongest predictive models were generated for protein percentage (r = 0.46), 305-d milk yield (r = 0.44) and 305-d protein yield (r = 0.47). These results demonstrate the suitability of using hepatic gene expression in young animals to quantitatively predict future lactation performance. Funding: Ontario Centre for Agricultural Genomics, NSERC Canada, and the Ontario Ministry of Agriculture, Food and Rural Affairs (OMAFRA).
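A minimal sketch of the PLS prediction-and-evaluation step on simulated data is given below; the expression matrix, trait, train/test split and number of components are all made up for illustration, and only the MSPE, CCC and correlation metrics mirror the evaluation described above.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

def ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient."""
    mx, my = y_true.mean(), y_pred.mean()
    vx, vy = y_true.var(), y_pred.var()
    cov = np.mean((y_true - mx) * (y_pred - my))
    return 2 * cov / (vx + vy + (mx - my) ** 2)

rng = np.random.default_rng(4)
n, p = 81, 300                                   # 81 animals, 300 selected genes (illustrative)
X = rng.normal(size=(n, p))                      # hepatic expression matrix
y = X[:, :5].sum(axis=1) + rng.normal(scale=2.0, size=n)   # e.g. a 305-d yield trait

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=4)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()

mspe = np.mean((y_te - y_hat) ** 2)
r = np.corrcoef(y_te, y_hat)[0, 1]
print(f"MSPE = {mspe:.3f}, CCC = {ccc(y_te, y_hat):.3f}, r = {r:.3f}")
```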
38. Data analysis for the classification of gas-liquid and liquid-solid (slurry) flows using digital signal processing. Fedon S., Roberto J. (date unknown)
No description available.
39. Customer perceived value: reconceptualisation, investigation and measurement. Bruce, Helen Louise (09 1900)
The concept of customer perceived value occupies a prominent position within the strategic agenda of organisations, as firms seek to maximise the value perceived by their customers as arising from their consumption, and to equal or exceed that perceived in relation to competitor propositions. Customer value management is similarly central to the marketing discipline. However, the nature of customer value remains ambiguous and its measurement is typically flawed, due to the poor conceptual foundation upon which previous research endeavours are built.
This investigation seeks to address the current poverty of insight regarding the nature and measurement of customer value. The development of a revised conceptual framework synthesises the strengths of previous value conceptualisations while addressing many of their limitations. A multi-dimensional depiction of value arising from customer experience is presented, in which value is conceptualised as arising at both first-order dimension and overall, second-order levels of abstraction.
The subsequent operationalisation of this conceptual framework within a two-phase investigation combines qualitative and quantitative methodologies in a study of customer value arising from subscription TV (STV) consumption. Sixty semi-structured interviews with 103 existing STV customers give rise to a multi-dimensional model of value, in which dimensions are categorised as restorative, actualising and hedonic in type, and as arising via individual, reflected or shared modes of perception. The quantitative investigation entails two periods of data collection via questionnaires developed from the qualitative findings, and the gathering of 861 responses, also from existing STV customers. A series of scales with which to measure value dimensions is developed and an index enabling overall perceived value measurement is produced.
Contributions to theory of customer value arise in the form of enhanced insights regarding its nature. At the first-order dimension level, the derived dimensions are of specific relevance to the STV industry. However, the empirically derived framework of dimension types and modes of perception has potential applicability in multiple contexts. At the more abstract, second-order level, the findings highlight that value perceptions comprise only a subset of potential dimensions. Evidence is thus presented of the need to consider value at both dimension and overall levels of perception. Contributions to knowledge regarding customer value measurement also arise, as the study produces reliable and valid scales and an index. This latter tool is novel in its formative measurement of value as a second-order construct, comprising numerous first-order dimensions of value, rather than quality as incorporated in previously derived measures. This investigation also results in a contribution to theory regarding customer experience through the identification of a series of holistic, discrete, direct and indirect value-generating interactions.
Contributions to practice within the STV industry arise as the findings present a solution to the immediate need for enhanced value insight. Contributions to alternative industries are methodological, as this study presents a detailed process through which robust value insight can be derived. Specific methodological recommendations arise in respect of the need for empirically grounded research, an experiential focus and a two-stage quantitative methodology.
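As a rough illustration of the measurement ideas above, the sketch below computes Cronbach's alpha for a hypothetical multi-item reflective scale and builds a simple weighted formative index from dimension scores; the simulated responses, item counts and index weights are assumptions and do not correspond to the thesis's scales or estimated weights.

```python
import numpy as np

rng = np.random.default_rng(5)
n_resp, n_items = 300, 4

# Four items of one hypothetical value-dimension scale, on a 1-7 Likert range.
truth = rng.normal(size=(n_resp, 1))
items = np.clip(np.round(4 + truth + rng.normal(scale=0.8, size=(n_resp, n_items))), 1, 7)

def cronbach_alpha(item_matrix):
    """Reliability of a reflective multi-item scale."""
    k = item_matrix.shape[1]
    item_var = item_matrix.var(axis=0, ddof=1).sum()
    total_var = item_matrix.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")

# A formative overall-value index as a weighted sum of dimension scores
# (the other two dimensions and all weights are placeholders).
dimension_scores = np.column_stack([
    items.mean(axis=1),
    rng.normal(4, 1, n_resp),
    rng.normal(4, 1, n_resp),
])
weights = np.array([0.4, 0.35, 0.25])
overall_value_index = dimension_scores @ weights
print("Index for first 3 respondents:", np.round(overall_value_index[:3], 2))
```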
40. Using satellite hyperspectral imagery to map soil organic matter, total nitrogen and total phosphorus. Zheng, Baojuan (January 2008)
Thesis (M.S.), Indiana University, 2008. Department of Earth Science, Indiana University-Purdue University Indianapolis (IUPUI). Advisors: Lin Li, Pierre Jacinthe, Gabriel M. Filippelli. Includes vita and bibliographical references (leaves 78-81).