211

An Application of the Means-End Theory: Measurement of Delivery and Consumption of an Educational Service

Anıtsal, Melek Meral 01 May 2007
Gaining competitive advantage in today's fragmented markets is a powerful incentive for marketers to create superior customer value. Multiple research streams have sought to understand the meaning of customer value and to determine the most effective ways to provide it. A review of the extant literature revealed that these streams fall into four categories: cultural values of the customer, customer value as a trade-off, customer value as a process, and customer value as an experience. This dissertation focuses on the third path, which uses means-end theory to understand customer value by exploring the meanings attached to the consumption of services. It posits that a service provides superior value to customers if it contributes more toward the attainment of customer goals than competing offerings do. Although there is a great deal of research on means-end theory with products, limited empirical research has been conducted on services. Recently, some studies have used quantitative methods to explicate customer value hierarchies; however, means-end theory and its hierarchical framework of attributes, consequences, and goals have not been tested with quantitative methods. This dissertation fills that gap by successfully testing means-end theory in the context of an educational service. The hierarchical framework was analyzed using structural equation modeling to test the four proposed hypotheses simultaneously. The fit of the model was good and all hypothesized paths were significant. In sum, this dissertation presents a fully quantitative approach for analyzing dominant paths among many important goals, consequences, and attributes in the presence of high multicollinearity. Using this approach, it is possible to measure the types of customer value that may be created by the consumption of services and products.
212

Exploring the Role of Customer Value Change and Relationship Adaptation in Global Business Services

Blocker, Christopher P. 01 May 2007
Global business executives have recently highlighted the importance of understanding the sources of value creation for customers around the world. Beyond a push to better grasp what customers currently value, firms interact with dynamic customers whose needs do not stand still. In response, managers are searching for innovative ways to sense ongoing changes in customers' desires and effectively adapt their company's value propositions. Yet an extensive research review suggests there is little, if any, evidence that managers can rely on to understand how business customers are changing what they value across global markets, or what these changes mean for fostering loyalty in those relationships. This global study responds to these challenges by exploring the sources of value creation and the effects of value change for 939 customers of business services in the United States, Sweden, India, Singapore, and the United Kingdom. A theoretical framework is proposed that builds on research in customer value, international buyer behavior, and buyer-seller relationships, and 22 hypotheses are tested across three models. Two new constructs are developed, value change responsiveness and value change anticipation, which demonstrate significant effects on customer value. Significant results and close fit across the three models, tested with structural equation modeling, generate a number of interesting implications for global and domestic managers. For executives and strategists concerned with growing a profitable base of loyal customers, this study provides insight into how customers in different market segments around the world are changing what they value, and specifically into the role that this change plays in their perceptions of satisfaction and loyalty.
213

Work, Family, and Community in a Triciprocal Relationship: An Exploratory Study of Enrichment

Crowder, Cindy L 01 December 2007
The purpose of this study was to expand the work-life literature through the introduction of a triciprocal enrichment model that examines work-, family-, and community-related support antecedents and satisfaction variables. The main objectives were to incorporate the concept of enrichment and the domain of community into work-life research, providing a more accurate portrayal of the myriad ways in which all three domains interact and affect one another. Data were collected from 202 respondents, including their level of community involvement; their level of enrichment within work, family, and community; their satisfaction on the job, with their family, and with their community; and the availability and usefulness of resources and support in their community, at work, and from their families. The survey instrument was designed online using the nTreePoint® Web Forms software package. Although the proposed model was rejected, this study should promote further empirical investigation of enrichment and of the relationships between work, family, and community. The scale modified for this study to measure the enriching relationships between work, family, and community should be further tested and validated. The results revealed that antecedents drawn from the work-life conflict literature do not produce enrichment; research should therefore be conducted to determine the specific factors that do.
214

Conditional Conservatism in Accounting: New Measures and Test of Determinants of the Asymmetric Timeliness in the Recognition of Good and Bad News in Reported Earnings

Gotti, Giorgio 01 May 2007
Accounting standards mandate different, more conservative rules for the recognition of unrealized gains than for unrealized losses in reported earnings. Conditional conservatism, defined as asymmetric timeliness in the recognition of unrealized losses versus gains in reported earnings, has been a peculiar characteristic of the accounting system since its origins. Understanding conservatism's role, its determinants, and its variation across firms is important for interpreting the nature, purposes, and valuation implications of accounting. Basu (1995; 1997) proposed a model to detect conditional conservatism and provided empirical evidence that bad news is recognized more quickly than good news in earnings for a sample covering 1963-1991. Following his seminal work,¹ the accounting literature adopted the Basu single-period model to measure conditional conservatism (Ball et al. 2000; Ball et al. 2005; Ball and Shivakumar 2005; Lobo and Zhou 2006). However, Basu's proxy for the arrival of good/bad news, the firm's stock price, may be influenced in part by factors that will never be recorded in reported earnings, introducing inaccuracy into the measure of conditional conservatism. To address this problem, I introduce a new measure of conditional conservatism that results from a least absolute deviation (LAD) piecewise regression and adopts the number of changes in financial analysts' EPS forecasts as the proxy for good/bad news. I then use this new measure to test the determinants of conditional conservatism suggested by previous literature. Results show that companies with (1) a lower debt-to-assets ratio, (2) a large proportion of executives' annual compensation independent of the firm's accounting performance, (3) one of the Big 4/Big 7 audit firms as auditor, and (4) an audit opinion qualified by a going-concern assumption in the previous year exhibit greater timeliness in the recognition of bad news than of good news in annual earnings.
¹ As of December 7, 2006, 102 citations for Basu (1997) are recorded on Thomson ISI's Social Sciences Citation Index (http://portal.isiknowledge.com) and 291 on Google Scholar (http://scholar.google.com).
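The contrast drawn above, between the Basu piecewise OLS regression with stock returns as the news proxy and the proposed LAD piecewise regression with analyst EPS forecast revisions as the proxy, can be illustrated with a short sketch. The following Python code (using statsmodels) is a minimal illustration only; the column names, the sign convention for the revision variable, and the exact specification are assumptions, not the dissertation's specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

def basu_ols(df: pd.DataFrame):
    """Basu (1997) asymmetric-timeliness regression.
    Assumes columns: earn (earnings scaled by lagged price), ret (annual stock return)."""
    df = df.assign(neg=(df["ret"] < 0).astype(int))
    # The coefficient on ret:neg measures the incremental timeliness of bad-news recognition.
    return smf.ols("earn ~ ret + neg + ret:neg", data=df).fit()

def lad_piecewise(df: pd.DataFrame):
    """LAD piecewise regression with analyst forecast revisions as the news proxy.
    Assumes columns: earn, revis (a hypothetical net count of downward EPS forecast revisions)."""
    df = df.assign(bad=(df["revis"] > 0).astype(int))
    # Quantile regression at q = 0.5 is equivalent to least absolute deviations (LAD).
    return smf.quantreg("earn ~ revis + bad + revis:bad", data=df).fit(q=0.5)
```

In both specifications, conditional conservatism appears as a significantly positive coefficient on the interaction term, i.e., earnings responding more strongly to bad news than to good news.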
215

Measuring the Impact of Workplace Design on Training Transfer Relative to Other Organizational Factors

Hillsman, Terron L 01 August 2007
This ethnographic research extends the findings of an earlier study examining the impact of workplace design on training transfer. The study triangulates data and methods of inquiry through field observation, archival records, interviews, and a survey developed from the interview responses. Linking the earlier, more qualitative data and analysis with the later, more quantitative data and analysis helped to extend several theoretical considerations. Purposeful sampling was used to identify participants who held nonacademic supervisory positions at a major land-grant university. Participants had attended a performance review workshop and had been applying the learned skills for at least six months. The findings indicate that workplace design appears to play a vital role in both facilitating and impeding the transfer of the supervisory skills examined in this study. The study also offers a conceptual model that proposes where workplace design fits among other organizational factors perceived to affect training transfer. The findings alert and direct organizations to where they should channel their finite resources to support training transfer, and help them differentiate critical design features from features that are more marginal to transfer. Because this is a case study, organizations should not infer that the findings apply to all work settings; applicability depends on relevance to the particular work situation and circumstances. Methods of analysis included domain and taxonomic analyses, descriptive statistics, the binomial distribution, ANOVA with post hoc procedures, and hierarchical clustering.
216

Risk Management in the Post-SOX Era: Do Audit Firms Effectively Retain Clients

Hollingsworth, Carl 01 May 2007
Since the initial disclosure of accounting irregularities at Enron in late 2001, the landscape of public company audits has undergone substantial change, including the conviction of Arthur Andersen in June 2002 and the enactment of the Sarbanes-Oxley Act of 2002 (SOX). These two changes have had a significant impact on the amount of work required to issue an audit report and on the number of clients that can be serviced by the remaining Big Four audit firms. While the existing literature provides some insight into how audit firms make client acceptance and continuance decisions, almost all of it predates SOX. I extend this literature by investigating how audit firms make client continuance decisions in the post-SOX era, whether these decisions are effective at identifying better clients, and why audit firms retain some risky clients while dismissing others. It is interesting to note that the Big Four audit firms use the same basic set of criteria when making a client continuance decision in the post-SOX era, even though the processes at the firms differ slightly. My findings also indicate that the client continuance process is much more formal and rigorous post-SOX. Additionally, I find that clients who are retained by their audit firms have better subsequent financial performance than clients who are not retained. Finally, I find that audit firms appear to overweight client size when making the client continuance decision; specifically, they retain large clients whose risk profiles are consistent with those of smaller clients they dismiss.
217

The Importance of Market Opportunity Recognition Mechanisms in Interfunctional Management Teams

Bonney Jr., Frederick Lefferts 01 August 2008
In today's fast-moving business environments, managers must be able to gather and interpret data in such a way as to identify lucrative market opportunities. However, exploiting these opportunities is contingent on management's ability to sense important changes in the market, or to see the market in a new way, and ultimately to craft an appropriate response to these insights. Unfortunately, this ability to identify market opportunities has not been explored in the marketing literature, and very little is known about the cognitive processes managers use as they seek out market opportunities. The purpose of this dissertation is to shed light on these cognitive processes by developing a conceptualization of market opportunity recognition mechanisms. Specifically, market opportunity recognition mechanisms are conceptualized as a set of interrelated constructs that include management team situational awareness, management team creative problem solving, and management team strategic and tactical agreement. This conceptualization is built from a thorough review of the entrepreneurship, creativity, cognitive science, and market orientation literatures, as well as from insights gained from field interviews and observations. The market opportunity recognition mechanisms are tested in a nomological framework that includes a contingency-based view of firm responsiveness. The dissertation hypotheses were tested using participants engaged in a dynamic market simulation. The results suggest that situational awareness is the foundational construct in market opportunity recognition mechanisms and that the interaction between situational awareness and team agreement on tactical and strategic actions increases the probability that the team will effectively align resources to market conditions, ultimately resulting in increased financial performance.
218

The Lack of Consequences for Audit Committee Members Following Accounting Restatements and the Resulting Impact on Investors

Carver, Brian Todd 01 August 2008
Prior research has assumed that financial reporting failures indicate that individual directors have provided inferior monitoring of the reporting process, and has found that directors lose board positions following reporting failures. These penalties, however, are not uniformly applied across all outside directors. Using a sample of firms that have experienced multiple reporting failures and a matched sample of non-restating firms, I collect information on individual audit committee members and investigate whether retention on the audit committee is related to the quality of the director or to the influence of the CEO over the board of directors. I then examine whether the retention of directors on the audit committee is related to further aggressive accounting practices and to additional negative long-run consequences for investors. I find that retention on the audit committee is positively related to director quality and negatively related to CEO influence over the board for both the restating and non-restating samples. I further find that the retention of directors on the audit committee following a reporting failure is not related to future aggressive accounting practices. Tests examining other long-term consequences to investors are inconclusive. Overall, these results suggest that the labor market for directors operates in an efficient and effective manner.
219

Bayesian Shrinkage Estimation and Model Selection

Armagan, Artin 01 August 2008
We introduce a new shrinkage variable selection operator, which we term the Adaptive Ridge Selector (ARiS). The approach is inspired by the Relevance Vector Machine (RVM) of Tipping (2001), which uses a Bayesian hierarchical linear model to perform sparse estimation. The RVM was originally introduced to obtain sparse solutions in kernel regression, where one has many highly correlated bases (features). Extending the RVM algorithm, we include a proper prior distribution for the precisions of the regression coefficients, along with a hyper-parameter to be chosen. Based on this model, we derive the full set of conditional posterior distributions for the parameters, as would typically be done when applying Gibbs sampling. However, instead of simulating samples from the posterior distribution to estimate posterior means, we apply the Lindley-Smith mechanism (Lindley and Smith, 1972), which sequentially maximizes the conditional distributions in order to find the joint maximum of the posterior distribution for a given value of the hyper-parameter. An empirical Bayes method is proposed for choosing this hyper-parameter, leading to ARiS-eB. Having arrived at the estimator from a Bayesian argument, we also examine the problem from a penalized least squares angle. From this conventional viewpoint, the proposed method eliminates the need for combinatorial search over a discrete model space, converting the model selection problem into the maximization of the marginal likelihood over a one-dimensional continuous space. Close similarities exist between the resulting estimator and lasso-type shrinkage estimators; the lasso (Tibshirani, 1996) and its variants, as will be thoroughly discussed, use the 1-norm for regularization, leading to sparse solutions. The proposed estimator is contrasted with various other shrinkage estimators through simulation studies and real data examples. Inference is also possible using a very straightforward Gibbs sampling procedure once the active variables in the model are determined. The model is also extended to handle departures from normality in the likelihood.
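To make the estimation mechanism concrete, the sketch below applies the sequential conditional-maximization idea (maximize each full conditional in turn, rather than Gibbs-sampling it) to a Gaussian linear model with independent Gamma priors on the coefficient precisions. It is a generic Python illustration under assumed prior and noise settings, not the ARiS algorithm as specified in the dissertation.

```python
import numpy as np

def conditional_max_shrinkage(X, y, a=1.5, b=1e-3, sigma2=1.0, n_iter=200):
    """Iteratively maximize the full conditionals of (beta, alpha) for
    y ~ N(X beta, sigma2 I), beta_j ~ N(0, 1/alpha_j), alpha_j ~ Gamma(a, b).
    Illustrative only; a, b, and sigma2 are assumed fixed here."""
    n, p = X.shape
    alpha = np.ones(p)                      # coefficient precisions
    XtX, Xty = X.T @ X, X.T @ y
    for _ in range(n_iter):
        # Mode of beta | alpha, y: a generalized ridge estimate.
        beta = np.linalg.solve(XtX / sigma2 + np.diag(alpha), Xty / sigma2)
        # Mode of alpha_j | beta_j: the conditional is Gamma(a + 1/2, b + beta_j^2 / 2),
        # whose mode is (a - 1/2) / (b + beta_j^2 / 2) when a + 1/2 > 1.
        alpha = (a - 0.5) / (b + 0.5 * beta ** 2)
    return beta, alpha
```

Coefficients whose estimated precisions become very large are shrunk essentially to zero, the RVM-style route to sparsity; in the full procedure the hyper-parameter would be chosen by empirical Bayes rather than fixed.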
220

Design and Analysis of Screening Experiments Assuming Effect Sparsity

Edwards, David Joseph 01 August 2008
Many initial experiments for industrial and engineering applications employ screening designs to determine which of possibly many factors are significant. These screening designs are usually highly fractionated factorials or Plackett-Burman designs that focus on main effects and provide limited information about interactions. To help simplify the analysis of these experiments, it is customary to assume that only a few of the effects are actually important; this assumption is known as 'effect sparsity'. This dissertation explores both design and analysis aspects of screening experiments under effect sparsity. In 1989, Russell Lenth proposed a method for analyzing unreplicated factorials that has become popular due to its simplicity and satisfactory power relative to alternative methods. We propose and illustrate the use of p-values, estimated by simulation, for Lenth t-statistics; this approach is recommended for its versatility. Whereas tabulated critical values are restricted to the case of uncorrelated estimates, we illustrate the use of p-values for both orthogonal and nonorthogonal designs. For cases with limited replication, we suggest computing t-statistics and p-values using an estimator that combines the pure-error mean square with a modified Lenth pseudo standard error. Supersaturated designs (SSDs) examine more factors than available runs. SSDs were introduced to handle situations in which a large number of factors are of interest but runs are expensive or time-consuming. We begin by assessing the null-model performance of SSDs when using all-subsets and forward-selection regression, highlighting the propensity of model selection criteria to overfit. We subsequently propose a strategy for analyzing SSDs that combines all-subsets regression and permutation tests, and illustrate the methods with several examples. In contrast to the usual sequential nature of response surface methodology (RSM), recent literature has proposed performing both screening and response surface exploration with a single three-level design, an approach named "one-step RSM". We discuss and illustrate two shortcomings of current one-step RSM designs and analyses. We then propose a new class of three-level designs, together with an analysis strategy unique to these designs, that addresses these shortcomings and helps the user judge factor importance appropriately. We illustrate the designs and analysis with simulated and real data.
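The simulation-based p-values for Lenth t-statistics mentioned above can be sketched briefly. The Python code below assumes an unreplicated two-level design with independent, homoscedastic effect estimates under the null; it is a minimal illustration, not the dissertation's implementation (which also treats nonorthogonal designs and the combined pure-error estimator).

```python
import numpy as np

def lenth_pse(effects):
    """Lenth's (1989) pseudo standard error for a vector of effect estimates."""
    abs_e = np.abs(np.asarray(effects, dtype=float))
    s0 = 1.5 * np.median(abs_e)
    # Trim estimates that look active, then re-estimate the scale from the rest.
    return 1.5 * np.median(abs_e[abs_e < 2.5 * s0])

def lenth_pvalues(effects, n_sim=10_000, seed=None):
    """p-values for Lenth t-statistics, estimated by simulating inert effects."""
    rng = np.random.default_rng(seed)
    effects = np.asarray(effects, dtype=float)
    m = effects.size
    t_obs = np.abs(effects / lenth_pse(effects))
    # Simulate m inert (standard normal) effects per replicate and form their t-statistics.
    null = rng.standard_normal((n_sim, m))
    pse_null = np.apply_along_axis(lenth_pse, 1, null)
    t_null = np.abs(null / pse_null[:, None])
    # Individual p-value: the fraction of simulated |t| at least as large as each observed |t|.
    return np.array([(t_null >= t).mean() for t in t_obs])
```

Because the null reference distribution is generated by simulation rather than read from a table, the approach adapts naturally to other settings, for example nonorthogonal designs, by simulating effect estimates with the appropriate correlation structure.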
