
ACCURACY AND RELIABILITY OF JOB EVALUATION

HAHN, DAVID C. January 1985
This study investigated several factors that could influence the accuracy and reliability of job evaluation ratings. Two of these factors were training and amount of information. The subjects rated a series of 23 jobs on various dimensions. The results indicate that training had little effect on job ratings. Amount of information, however, had a consistent effect on the results. Subjects who were presented with greater amounts of information were generally more reliable and accurate. Procedures proposed by Cronbach (1955) and Jackson (1972) were used to measure accuracy. In addition, three rating biases (halo, leniency, and restriction of range) were conceptualized and operationalized in terms of accuracy. The results did vary somewhat depending on the measure being used. Implications concerning the study of accuracy are discussed.
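The three rating biases named above have common textbook operationalizations that can be sketched in a few lines. The index definitions below are illustrative, not necessarily the ones used in the dissertation; they assume a jobs-by-dimensions matrix of one rater's scores and a matching matrix of target ("true") scores:

```python
import numpy as np

def rating_biases(ratings, true_scores):
    """Illustrative bias indices for a jobs x dimensions rating matrix.

    leniency: mean elevation of ratings above the target scores.
    halo: average correlation between dimensions across jobs
          (high values suggest one overall impression drives all ratings).
    range_restriction: ratio of rating spread to target-score spread
          (values below 1 indicate compressed use of the scale).
    """
    ratings = np.asarray(ratings, dtype=float)
    true_scores = np.asarray(true_scores, dtype=float)

    leniency = float(np.mean(ratings - true_scores))

    corr = np.corrcoef(ratings, rowvar=False)   # dimension x dimension
    off_diag = corr[~np.eye(corr.shape[0], dtype=bool)]
    halo = float(np.mean(off_diag))

    range_restriction = float(np.std(ratings) / np.std(true_scores))
    return leniency, halo, range_restriction
```

Defining the biases against target scores, as here, is what lets them be "conceptualized and operationalized in terms of accuracy" rather than as purely distributional properties of the ratings.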

THE EFFECT OF THE DOES-NOT-APPLY RATING AND A COMPARISON OF ITEM- AND DIMENSION-LEVEL PAQ INTERRATER RELIABILITY: A MONTE CARLO STUDY (JOB ANALYSIS)

BLUNT, JANET H. January 1986
Recent research with the PAQ that investigated the ability of job-naive raters to make PAQ ratings based on limited job information found average interrater reliabilities for item ratings in the .40 range (Jones, Main, Butler, & Johnson, 1982). While admitting that this represented generally low agreement among raters, Jones et al. deemed these data adequate because the corresponding dimension score reliabilities averaged in the "acceptable" range of .60. The argument proposes that translating item ratings to dimension scores negates the necessity of obtaining reliable job analysis data at the item level. This study took issue with this position and empirically investigated the relationship between PAQ item and dimension reliability. Random data were generated to simulate PAQ item ratings for 1000 pairs of raters in each of four conditions of data generation. In each condition, a true profile was generated and the items evidencing agreement on the rating of Does Not Apply (DNA) were identified. A pair of simulated ratings was generated that held the DNA items constant and varied the reliability of the remaining responses. Each condition generated data that reflected different levels of reliability on the non-DNA items. Interrater reliability coefficients (Pearson r's) were calculated for these simulated item data and the corresponding dimension scores. Results indicated that, even with random data, average reliability coefficients for item-level data could be found in the .40 range; in addition, an average dimension score reliability in the .60 range was found when the true reliability for the items that were actually rated was .30. It was also found that as the number of DNA agreements increased, so did the item reliability, but the dimension reliability was essentially unaffected. Furthermore, as the number of DNA agreements increased and the item reliability increased, the reliability of the items not exhibiting DNA agreement was unchanged.
Thus, reliability estimates that included a large number of DNA agreements tended to overestimate the reliability of the non-DNA ratings. It was concluded that reliability estimates, for both items and dimensions, of the magnitude reported by Jones et al. are inadequate, especially when the influence of the DNA rating is taken into consideration.
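The core of the data-generation idea can be sketched as a small Monte Carlo: hold a block of agreed Does-Not-Apply items constant at zero, fill the remaining items with independent random ratings (zero true agreement), and compute the item-level Pearson r. The item count of 187 echoes the PAQ's element count, but the DNA proportion, rating scale, and pair count below are arbitrary illustrative choices, not the study's parameters:

```python
import random

def pearson(x, y):
    """Pearson correlation between two equal-length rating vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

def simulate_pair(n_items=187, n_dna=100, rng=random):
    """One simulated rater pair: n_dna items agreed Does Not Apply
    (coded 0), the rest rated independently at random on a 1-5 scale,
    so the non-DNA ratings carry zero true agreement."""
    r1 = [0] * n_dna + [rng.randint(1, 5) for _ in range(n_items - n_dna)]
    r2 = [0] * n_dna + [rng.randint(1, 5) for _ in range(n_items - n_dna)]
    return pearson(r1, r2)

random.seed(1)
# Average item-level reliability over many simulated pairs: substantial,
# even though the non-DNA ratings are pure noise. The shared zeros alone
# manufacture the agreement, which is the study's central point.
avg_r = sum(simulate_pair() for _ in range(500)) / 500
```

Dropping `n_dna` to 0 sends the average correlation to roughly zero, which is the contrast behind the conclusion that DNA agreements inflate item-level reliability estimates.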

CAN REALISTIC JOB DESCRIPTION INFORMATION AND PRACTICE ENABLE NAIVE RATERS TO PROVIDE POSITION ANALYSIS QUESTIONNAIRE (PAQ) RATINGS COMPARABLE TO THOSE OF EXPERTS?

FRIEDMAN, LEE January 1986
Jones, Main, Butler, and Johnson (1982) stated that job-naive raters provided with only narrative job descriptions can produce valid and reliable Position Analysis Questionnaire (PAQ) ratings. This implies that traditional time- and labor-intensive methods of collecting job analysis information (e.g., interviews, direct observation) are not necessary in order to accurately complete the PAQ. However, PAQ ratings in the Jones et al. study were not validated against an external standard, thereby making the unambiguous interpretation of their results impossible. To determine the convergent validity of the Jones et al. approach, we provided job-naive raters with varying amounts of job descriptive information and, in some cases, prior practice rating the job with another job analysis instrument; PAQ ratings were validated against those of job analysts who were also job content experts. None of the reduced job descriptive information conditions, or practice, enabled job-naive raters to obtain either acceptable levels of convergent validity with experts or high interrater reliability.

ASSESSING A MULTI-PHASE APPROACH TO PERSONNEL SELECTION: AN EXAMINATION OF THE INTERRELATIONSHIPS AMONG FOUR PHASES OF A SELECTION SYSTEM (CENTER)

PHILLIPS, AMANDA PEEK January 1986
The present study provided a field test of the interrelationships among four phases of a selection system. The four phases included the pre-interview evaluation, the interview, the post-interview evaluation, and an assessment center exercise. Dipboye's (1982) social process model of the interview was employed as the theoretical framework with which to examine the first three phases of the selection system. The interrelationships among the first three phases and the assessment exercise were examined by assessing the increment in prediction of the results on the exercise obtained from using the pre-interview and interview phases of the study. Subjects were 34 interviewers and 164 applicants for the position of account executive, drawn from 19 branch offices of a large financial services corporation. Interviewers reviewed the applicants' application materials and then completed two questionnaires, one prior to the interview and the other following the interview. The job applicants completed a questionnaire following the interview. The questionnaires served as the primary method of data collection. The study tested six propositions based on the social process model and found, overall, good support for them. Three sub-models of the overall model were also tested using the technique of structural equation modeling, and they demonstrated good fit to the data. When the interrelationships among the first three phases and the assessment exercise were assessed, very little variability in any of the results of the exercise was explained by information gathered at either the pre-interview or interview phases or by both of the phases considered together. Based on the results of the present study, I conclude that the social process model provides a promising theoretical framework for explaining the interrelationships among the first three phases of the present organization's selection system.
There are a number of practical implications of the model for the way in which interviewers conduct interviews, and these implications are discussed. I also conclude that the assessment exercise appears to provide information about the applicants that is unique from that gained in the earlier phases of the selection process. The implications of this conclusion for the future use of advanced assessment procedures are discussed.
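The "increment in prediction" analysis is, in essence, the classic hierarchical-regression comparison: R-squared from the pre-interview phase alone versus R-squared after adding the interview phase, with the difference quantifying the interview's unique contribution. A minimal sketch with simulated data (the predictor scores and effect sizes below are invented, not the study's):

```python
import numpy as np

def r_squared(X, y):
    """R^2 from ordinary least squares with an intercept term."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1 - ss_res / ss_tot

rng = np.random.default_rng(0)
n = 164                          # applicants, as in the study
pre = rng.normal(size=n)         # hypothetical pre-interview evaluation score
interview = rng.normal(size=n)   # hypothetical interview rating
# Hypothetical assessment-exercise result: driven mostly by the
# pre-interview score, barely by the interview.
exercise = 0.5 * pre + 0.1 * interview + rng.normal(size=n)

r2_pre = r_squared(pre.reshape(-1, 1), exercise)
r2_both = r_squared(np.column_stack([pre, interview]), exercise)
delta_r2 = r2_both - r2_pre      # incremental variance explained
```

A small `delta_r2` is the pattern the abstract reports: the exercise results were largely unexplained by the earlier phases, singly or together.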

ISSUES REGARDING FAKEABILITY AND THE MANAGERIAL POTENTIAL SCALE OF THE CALIFORNIA PSYCHOLOGICAL INVENTORY (SELECTION, DISSIMULATION, PLACEMENT, TRANSPARENCY, PERSONALITY)

HOLMES, CHRISTOPHER WELLS January 1986
The narrow gap III-V semiconductors, InAs/AlSb/GaSb and InSb, exhibit an array of extreme physical properties, from the lightest effective mass and largest nonparabolicity of III-V semiconductors to heterostructure conduction band offsets ranging from -0.15 to +2.0 eV. In this work, I present three spectroscopic techniques which exploit these unusual properties to provide new insight into the physics of these materials. First, my measurement of cyclotron resonance in InAs/AlSb and InSb/AlInSb quantum wells was the first spectroscopic application of a new laser, the THz quantum cascade laser. The physical properties mentioned above put these materials into an experimentally accessible range, and InAs's high room temperature mobility and low temperature carrier density enabled us to explore a large temperature range. Previous investigations of other materials in limited temperature ranges had suggested what we confirmed: the cyclotron resonance effective mass increases with temperature, contrary to theoretical expectations. Second, we applied time resolved cyclotron resonance to InSb quantum wells for the first time. Because of InSb's large effective g-factor and nonparabolicity, time resolved cyclotron resonance enabled us to monitor the carrier relaxation and recombination from each Landau- and Zeeman-quantized state directly in time. This unprecedented level of detail could be extended to longer times to probe spin-flip relaxation, a significant parasitic process in quantum computation. Finally, I measured intersubband absorption in narrow InAs/AlSb quantum wells with widths from 10.5 to 1.8 nm. I observed the highest energy intersubband resonance in InAs/AlSb quantum wells: 650 meV at 77 K in a 1.8 nm well. 
I also performed detailed measurements of the temperature dependence of intersubband absorption and confirmed the correlation between the integrated intensity of intersubband absorption and the carrier distribution inferred from Shubnikov-de Haas and Hall measurements. Because of InAs/AlSb intersubband transitions' large accessible energy and temperature robustness, they are ideal candidates for resonant nonlinear optics. In particular, I discuss the potential of InAs/AlSb double quantum wells as a compact, room temperature, and coherent THz source. Such a source could revolutionize chemical sensing by providing convenient access to the strong fundamental vibrational fingerprints which all molecules have in the THz, potentially transforming applications from medicine to the military.
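Cyclotron resonance ties the observable directly to the effective mass: the resonance condition is omega_c = e*B/m*, so a resonance at frequency f in field B yields m* = e*B/(2*pi*f). A quick sketch with illustrative numbers (a 3 THz source and the nominal InAs band-edge mass of about 0.023 electron masses; these values are generic, not taken from this work):

```python
import math

E_CHARGE = 1.602176634e-19      # elementary charge, C
M_ELECTRON = 9.1093837015e-31   # free-electron mass, kg

def effective_mass(f_hz, b_tesla):
    """Effective mass (in units of the free-electron mass) extracted
    from a cyclotron resonance at frequency f_hz in field b_tesla."""
    m_star = E_CHARGE * b_tesla / (2 * math.pi * f_hz)
    return m_star / M_ELECTRON

def resonance_field(f_hz, m_ratio):
    """Field at which carriers of mass m_ratio * m_e resonate with a
    fixed-frequency source, the geometry of a swept-field experiment."""
    return 2 * math.pi * f_hz * m_ratio * M_ELECTRON / E_CHARGE

# A 3 THz quantum cascade laser and a ~0.023 m_e carrier put the
# resonance in an easily reachable field range of a few tesla, which is
# what makes these narrow-gap materials experimentally convenient.
b = resonance_field(3.0e12, 0.023)
```

With a fixed-frequency THz laser, sweeping B through the resonance and reading off the peak field is the standard route to the temperature-dependent mass discussed above.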

Value estimation for software development processes

Wang, Zhihua, 1970- January 2004
The management of software development processes is a continual challenge facing software development organizations. Previous studies used "flexible models" and empirical methods to optimize software development processes. In this thesis, the expected payoff is used to quantitatively evaluate processes. Payoff can be defined as the value of a team member's action, and the expected payoff weights that value by the probability of taking the action. The mathematical models of a waterfall process and two flexible processes are evaluated in terms of total maximum expected payoff. The results show which process is more valuable under which conditions.
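The evaluation criterion described above can be sketched in a few lines: the expected payoff of a process phase sums, over the actions a team member might take, the probability of each action times the value of its payoff. The action sets and numbers below are hypothetical, purely to show the comparison:

```python
def expected_payoff(actions):
    """Expected payoff of a process phase: sum over possible
    team-member actions of (probability of action) * (payoff value)."""
    return sum(p * v for p, v in actions)

# Hypothetical illustration (numbers are not from the thesis): a rigid
# waterfall step admits one deterministic action, while a flexible step
# sometimes permits a higher-value action such as early rework.
waterfall = [(1.0, 10.0)]
flexible = [(0.7, 10.0), (0.3, 14.0)]

better = max(("waterfall", expected_payoff(waterfall)),
             ("flexible", expected_payoff(flexible)),
             key=lambda kv: kv[1])
```

Varying the probabilities and payoff values in such a model is what lets the comparison say under which conditions one process dominates the other.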

Value optimization for engineering tasks

Zhang, Xiao Qi January 2012
Intense competition drives the widespread application of lean value in many industries. Value identification and product delivery are challenging due to the varied concerns of stakeholders, the large number of disparate tasks, and the complex resource-allocation process. The overall goal of this research, then, is to develop a value-focused optimization process adopting an enterprise perspective by investigating value identification, decision support, and resource allocation. First, a multiple-attribute model is proposed to identify value covering all the important aspects of the decision objectives. Then, the large number of decision makers drives the development of a decision support method to determine value in an efficient and egalitarian way. Finally, the defined value is incorporated into a resource allocation procedure to optimize the value realized from limited resources. The research was validated through testing at an aerospace company.
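Two of the pieces, attribute-based value and value-driven resource allocation, can be caricatured in a short sketch. The weighted-sum value model and the greedy value-per-cost allocation below are generic stand-ins for illustration, not the methods developed in the thesis, and all task names, weights, and costs are invented:

```python
def multi_attribute_value(scores, weights):
    """Weighted-sum multi-attribute value of a task: each attribute
    score (normalized to [0, 1]) weighted by its agreed importance."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(s * w for s, w in zip(scores, weights))

def allocate(tasks, budget):
    """Greedy resource allocation: fund tasks in decreasing
    value-per-cost order until the budget runs out (a simple knapsack
    heuristic, not an exact optimizer)."""
    funded = []
    for name, value, cost in sorted(tasks, key=lambda t: t[1] / t[2],
                                    reverse=True):
        if cost <= budget:
            funded.append(name)
            budget -= cost
    return funded

# Hypothetical attribute weights, e.g. performance, schedule, risk.
weights = [0.5, 0.3, 0.2]
tasks = [
    ("A", multi_attribute_value([0.9, 0.4, 0.8], weights), 3.0),
    ("B", multi_attribute_value([0.2, 0.9, 0.5], weights), 1.0),
    ("C", multi_attribute_value([0.6, 0.6, 0.6], weights), 2.0),
]
plan = allocate(tasks, budget=4.0)
```

The decision-support step in the thesis sits between these two pieces: it is how a large group of decision makers agrees on the weights in the first place.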

Manufacturing execution systems integration and intelligence

Hadjimichael, Basil January 2005
In order to survive in today's competitive manufacturing markets, manufacturing systems need to adapt at an ever-increasing pace to incorporate new technology which can lower the cost of production while maintaining quality and delivery schedules. The task becomes even more challenging in the quest to use a common approach across different manufacturing plants and ever-evolving manufacturing processes within specific plants. This thesis introduces a reference architecture that enables such changes between plants and updates within plants, using the paradigm of Manufacturing Execution Systems (MES). An MES architecture developed by the National Institute of Standards and Technology (NIST) is used as the standard reference architecture. Its flexibility and scalability are demonstrated in a case study of a specific steel melt-shop plant, in which the standard framework is specialized by re-labeling standard data and modules to suit the melt process of a generic steel plant. Since steel plants face difficult scheduling and disturbance-handling problems, specific intelligent algorithms are developed to deal with these issues by integrating some of the control into the MES. Conclusions as to the success of the algorithms, along with supporting data and recommendations for their further use, are also included.
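As a flavor of the dispatch-rule scheduling an MES module might apply to a melt-shop work queue, the sketch below sequences heats by earliest due date and reports each one's tardiness. This specific rule is a generic textbook illustration, not the thesis's algorithm, and the heat data are invented:

```python
def edd_schedule(jobs):
    """Earliest-due-date dispatching: sequence jobs by due date, track
    cumulative finish times, and report tardiness per job."""
    t = 0.0
    schedule = []
    for name, proc_time, due in sorted(jobs, key=lambda j: j[2]):
        t += proc_time
        # (job, finish time, tardiness = lateness clipped at zero)
        schedule.append((name, t, max(0.0, t - due)))
    return schedule

# Hypothetical heats: (name, processing time, due date), in hours.
heats = [("H1", 2.0, 5.0), ("H2", 1.0, 2.0), ("H3", 3.0, 9.0)]
plan = edd_schedule(heats)
```

Disturbance handling in an MES amounts to re-running such a rule whenever a delay or equipment event invalidates the current sequence, which is one motivation for pulling scheduling control into the MES layer.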

Quantitative assessment of product value and change risk analysis in early design process

Oduncuoglu, Arman January 2011
Many products that we see in our daily life are designed through modifications to existing products. The ever-changing trends in current markets, along with customers' rising demands for quality, require many companies to make frequent changes to create new products. Due to the challenges in product modification, many companies have adopted a strategy of adaptive design to create new designs by incrementally improving existing ones. This thesis develops a decision support system which helps product development managers to assess project performance metrics, such as development effort, development time, product cost and revenue, customer satisfaction, profit margin, and risk. The proposed model is a specialized calculator which integrates house of quality (HOQ), the functional analysis system technique (FAST), risk assessment, product complexity analysis, and change propagation analysis to provide an overview of the design process from product attributes and design risk to cost and effort. The assessment of different design solutions is performed by comparing the obtained performance metrics with those from the original design. The system allows the recalculation of these performance metrics when engineering change occurs during the creation of new design solutions. The system then provides an estimate of the change in required resources and expected benefits through comparative analysis. Through these means, the proposed tool aims to help project managers identify an optimal design solution. The main goal of the proposed model is to increase product knowledge in the early stages of design to support project managers and design engineers in their decision-making process. This is achieved through the visualization of the effects of engineering changes. In this thesis, the details of the proposed decision support system (DSS) are described and illustrated with a simple example of a thermoflask.
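The HOQ roll-up at the core of such a calculator is a weighted matrix product: customer-attribute importance weights propagated through the relationship matrix yield importance scores for the engineering characteristics. A minimal sketch with a hypothetical thermoflask-style example (the attributes, weights, and matrix entries are invented for illustration):

```python
def hoq_scores(weights, relationship):
    """House-of-quality roll-up: customer-attribute importance weights
    (rows) propagated through the relationship matrix (columns are
    engineering characteristics; entries typically 0/1/3/9) to rank
    the engineering characteristics."""
    n_cols = len(relationship[0])
    return [sum(w * row[j] for w, row in zip(weights, relationship))
            for j in range(n_cols)]

# Hypothetical thermoflask example: customer attributes are "keeps
# drinks hot" and "light to carry"; engineering characteristics are
# wall insulation, body material, and seal design.
weights = [0.7, 0.3]
relationship = [
    [9, 1, 3],   # keeps drinks hot
    [1, 9, 0],   # light to carry
]
scores = hoq_scores(weights, relationship)
```

Re-running the roll-up after an engineering change alters a row or column is one simple way such a calculator can expose the downstream effect of a change on the design priorities.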

Continuous approval methods for engineered-to-order projects

Bhuiyan, Farina. January 1996
This thesis studies project management issues applicable to organizations that undertake complex engineered-to-order projects. Traditional approaches to management methods for planning and control of projects are examined, and potential problem areas are identified. An alternative approach, continuous approval methods, is developed, and the potential for improved productivity is demonstrated. The acquisition of defence systems in the Department of National Defence (DND) is used to provide examples of complex engineered-to-order projects. The DND study showed how continuous approval methods improved DND's existing engineered-to-order processes with a reduction in both delivery time and cost. Although the study focused on one organization, the results are applicable to the management of any engineered-to-order project which undergoes traditional report-review-approve cycles.
