  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
241

Standardisation of Cost Accounting for Cost-Benchmarking

Günther, Edeltraud, Schill, Oliver, Schuh, Heiko 26 September 2001 (has links) (PDF)
This study presents the standardisation method in benchmarking at a conceptual level, intended as the basis for transfer into a wide range of practical applications; the field example demonstrated provides additional support and shows that the concept is both feasible and necessary. The necessity lies in identifying, discussing and resolving, in line with the goals of the intended benchmarking, those aspects of the planning and analysis phases that underlie the standardisations. In some circumstances the method may simply sensitise the participating benchmarking partners to the potential significance of standardisations; a closer analysis may equally show that the options in cost accounting are already applied largely uniformly, so that on materiality grounds the standardisations are not carried out. Although this study examines only monetary quantities such as costs, the need for standardisation also arises for non-monetary quantities, particularly when benchmarking is extended to further objectives such as quality, the environment and time. Non-monetary quantities likewise involve sometimes very complex questions of evaluation that leave room for subjective judgement; the basic question of how to evaluate environmental influences or quality aspects illustrates this issue. Since standardisation cannot eliminate the subjectivity involved in fixing options by standards, it may be useful in some cases to examine the influence that alternative standards produce. This applies particularly to the influencing-factor parameter, where the Hoechster-Spinne can serve as an analytical tool to illustrate the sensitivity of the quantity calculated from that influencing factor.
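One standardisation target the abstract names is the set of options in cost accounting. As a purely hypothetical illustration (the depreciation methods, cost figures and parameter values below are invented for this sketch, not taken from the study's field example), two benchmarking partners owning identical machines but applying different depreciation options report very different first-year costs:

```python
# Hypothetical illustration: the same machine cost benchmarked under two
# depreciation standards. All figures are invented for the sketch.

def straight_line(cost, salvage, years):
    """Equal annual depreciation over the useful life."""
    return [(cost - salvage) / years] * years

def declining_balance(cost, rate, years):
    """Fixed-percentage depreciation on the remaining book value."""
    charges, book = [], cost
    for _ in range(years):
        charge = book * rate
        charges.append(charge)
        book -= charge
    return charges

# Two comparison partners, identical machines, different accounting options
sl = straight_line(cost=100_000, salvage=10_000, years=5)
db = declining_balance(cost=100_000, rate=0.4, years=5)

# First-year cost differs by more than a factor of two purely through the
# accounting option, which is why a standardisation step is needed before
# the partners' costs are compared.
print(sl[0], round(db[0]))
```

Examining how the comparison changes under alternative standards, as the abstract suggests, amounts to re-running such a calculation with each admissible option.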
242

Standardisation of Cost Accounting for Cost-Benchmarking: Also presented at the 23rd Congress of the European Accounting Association in Munich, Germany on March 29-31, 2000

Günther, Edeltraud, Schill, Oliver, Schuh, Heiko 26 September 2001 (has links)
243

Financial performance comparison for ABC Farm

Newkirk, Kevin J. January 1900 (has links)
Master of Agribusiness / Department of Agricultural Economics / Michael Langemeier / This thesis had two objectives. One objective was to compare one northeast Kansas farm's financial performance from 2002 through 2011 to various groups of farms participating in the Kansas Farm Management Association (KFMA) during the same period. The second objective was to compare the crop acreage growth trends of the same northeast Kansas farm from 2002 through 2011 to the same groups of farms participating in the KFMA. In this thesis the northeast Kansas farm was referred to as ABC Farm. The purpose of this thesis was to provide ABC Farm's owners and management with information that could be used to formulate long-term goals for ABC Farm and to help identify strategies for achieving those goals. ABC Farm's 10-year financial performance was compared to six different KFMA member groups using 12 different financial measures or ratios. The KFMA groups included all NE region farms, NE region farms in the highest value of farm production (VFP) category, STATE irrigated crop farms, NE region farms in the highest net farm income quartile, NE region farms in the highest crop acreage category, and NE region farms in the lowest adjusted total expense ratio quartile. The 12 financial measures or ratios included VFP, net farm income, adjusted total expense ratio, operating profit margin ratio, asset turnover ratio, percent return on assets, VFP per worker, total crop acres farmed, crop machinery investment per crop acre, crop machinery cost per crop acre, current ratio, and debt to asset ratio. ABC Farm's 10-year average financial performance was better than the 10-year average of any KFMA group for most financial measures. ABC Farm's VFP, net farm income, operating profit margin ratio, VFP per worker, total crop acres, and current ratio were all higher than any KFMA group. 
ABC Farm's adjusted total expense ratio, crop machinery cost per crop acre, and debt to asset ratio were also lower than those of the various KFMA comparison groups. ABC Farm did not compare favorably on some of the financial measures: its average crop machinery investment per crop acre was higher than every group's, its average asset turnover ratio was lower than every group's, and its average return on assets was lower than that of all but one group, all NE region farms.
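As a rough sketch of how three of the measures above are computed (the formulas follow common farm-management usage and the figures are invented; the exact KFMA definitions may differ):

```python
# Minimal sketch of three farm financial measures; all inputs are invented
# example figures, not ABC Farm or KFMA data.

def asset_turnover(vfp, total_assets):
    # Value of farm production (VFP) generated per dollar of assets
    return vfp / total_assets

def operating_profit_margin(net_farm_income, unpaid_labor, interest, vfp):
    # A common formulation: (NFI + interest - unpaid operator labor) / VFP
    return (net_farm_income + interest - unpaid_labor) / vfp

def debt_to_asset(total_debt, total_assets):
    # Share of assets financed by debt
    return total_debt / total_assets

vfp, assets, nfi = 1_200_000, 4_000_000, 250_000
print(round(asset_turnover(vfp, assets), 3))
print(round(operating_profit_margin(nfi, 60_000, 40_000, vfp), 3))
print(round(debt_to_asset(900_000, assets), 3))
```

A farm like ABC with a high profit margin but a low asset turnover ratio is earning well per dollar of production while holding comparatively many assets per dollar of production, which is consistent with its high machinery investment per acre.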
244

Étude de faisabilité, projet complexe et benchmarking le cas de la Plateforme Logistique Agroalimentaire de Longueuil

Bhérer, Jean-Pierre January 2009 (has links)
The feasibility study in which I participated ran from November 2005 to March 2007. It concerned a project to establish an intermodal terminal in Longueuil, and was the culmination of several years of sustained work by the project's promoters. The agri-food industry, with more than 400,000 jobs, plays a major role in Quebec's economy. According to several experts, its future is threatened by the globalisation of markets. For the promoters, an intermodal logistics platform combining road and rail would help the industry control its costs and face the challenges arising in its environment: food safety and bioterrorism, the saturation of transport infrastructure, labour shortages in road haulage, and the pollution problems associated with that mode of transport. In this context, the managerial problem concerned the conditions that must be met to implement and operate the intermodal terminal successfully. While the literature on logistics platforms is particularly broad, the main themes of my research are project management and benchmarking, both analysed through the specific lens of key success factors, which constitute the research object. The objective of the research is to assess the contribution benchmarking can make to the feasibility analysis of the project. Project management comprises two elements: the management and the project. The literature review concludes that the critical success factors discussed by authors relate almost exclusively to managing the project, to the neglect of success factors associated with the project itself.
Moreover, very little research considers benchmarking as a means of identifying project success factors, that is, as a generator of ideas capable of setting complex, major and innovative projects on a trajectory toward success. The literature review also revealed a glaring lack of empirical studies on both project management and benchmarking, which legitimises my research, a case study conducted in a real-world setting. A mainly inductive logic was used. Two studies were carried out: a benchmarking study using a semi-structured interview guide, and an organisational feasibility study combining a semi-structured interview guide with a questionnaire survey, thereby pairing quantitative and qualitative methods. The benchmarking of the logistics sites judged most successful in Europe and the United States brought out 12 key success factors specific to this type of organisation. The organisational feasibility study, conducted with 45 executives from Quebec's private, associative and governmental sectors, positioned the agri-food industry against these key factors. A comparison of the 10 project-management success factors recognised in the literature with the 12 project success factors identified through benchmarking showed that the two groups are distinct and must be combined to ensure project success. Building on this finding about the strategic role of project success factors, a model situates benchmarking within project management.
Subsequently, a model of the project's strategic planning is presented in order to develop an effective tool that can serve projects of the same or a different nature, provided they meet the definition of complex, major and innovative projects that I have proposed. In that sense the tool is not only transferable but also generalisable. Drawing on the information obtained and the evolution of the intermodal terminal project, the author sketches a theory of success factors, seeking to open the door to deeper reflection on complex, major and innovative projects in order to improve their success rate. Finally, the originality of this research rests on three elements: the use of benchmarking alongside an organisational feasibility study; the novelty of scientific research on the relationship between benchmarking and the feasibility study of complex, major and innovative projects; and the immediate applicability of the tools developed.
245

The advantages and cost effectiveness of database improvement methods

Alkandari, Abdulaziz January 2002 (has links)
Relational databases have proved inadequate for supporting new classes of applications, and as a consequence a number of new approaches have been taken (Blaha 1998), (Harrington 2000). The most salient alternatives are denormalisation and conversion to an object-oriented database (Douglas 1997). Denormalisation can provide better performance but has deficiencies with respect to data modelling. Object-oriented databases can provide increased performance efficiency without those data-modelling deficiencies (Blaha 2000). Although various benchmark tests have been reported, none of them has compared normalised, object-oriented and denormalised databases. This research shows that a non-normalised database for data containing type code complexity would be normalised in the process of conversion to an object-oriented database. This helps to correct badly organised data and so gives the performance benefits of denormalisation while improving data modelling. The costs of conversion from relational to object-oriented databases were also examined. Costs were based on published benchmark tests, a benchmark carried out during this study, and case studies. The benchmark tests were based on an engineering database benchmark; engineering problems such as computer-aided design and manufacturing have much to gain from conversion to object-oriented databases. Costs were calculated for coding and development, and also for operation. It was found that conversion to an object-oriented database was not usually cost-effective, as many of the performance benefits could be achieved by the far cheaper process of denormalisation, by using the performance-improving facilities provided by many relational database systems, such as indexing or partitioning, or by simply upgrading the system hardware.
It is concluded, therefore, that while object-oriented databases are a better alternative for databases built from scratch, the conversion of a legacy relational database to an object-oriented database is not necessarily cost-effective.
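The denormalisation trade-off the study quantifies can be sketched with a small in-memory example: the same question answered by a join over normalised tables and by a flat, denormalised table. The table layouts and data are invented for this illustration and are not the study's engineering benchmark:

```python
# Sketch of the normalised-vs-denormalised trade-off using sqlite3.
# All table names and data are invented for the illustration.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Normalised form: orders reference customers by key
cur.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, cust_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO customer VALUES (?, ?)", [(i, f"c{i}") for i in range(1000)])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 1000, float(i)) for i in range(10_000)])

# Denormalised form: the customer name is repeated on every order row
cur.execute("CREATE TABLE orders_flat (id INTEGER PRIMARY KEY, cust_name TEXT, amount REAL)")
cur.executemany("INSERT INTO orders_flat VALUES (?, ?, ?)",
                [(i, f"c{i % 1000}", float(i)) for i in range(10_000)])

# Same question, two shapes: the flat table avoids the join at the price
# of redundant data, which is exactly the trade-off being benchmarked.
joined = cur.execute(
    "SELECT COUNT(*) FROM orders o JOIN customer c ON o.cust_id = c.id "
    "WHERE c.name = 'c7'").fetchone()[0]
flat = cur.execute(
    "SELECT COUNT(*) FROM orders_flat WHERE cust_name = 'c7'").fetchone()[0]
assert joined == flat == 10  # 10,000 orders spread evenly over 1,000 customers
```

An index on `orders_flat.cust_name` (or on `orders.cust_id`) illustrates the study's further point that relational facilities such as indexing can recover much of the performance gap without restructuring at all.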
246

Managing large energy and mineral resources (EMR) projects in challenging environments

Chanmeka, Arpamart 01 June 2010 (has links)
The viability of energy and mineral resources (EMR) construction projects is contingent on the state of the world economy. Oil sands projects in Alberta, Canada exemplify large EMR projects that are highly sensitive to fluctuations in the world market. Alberta EMR projects are constrained by high fixed production costs and are widely recognized as among the most challenging construction projects to deliver successfully, owing to extreme weather, remote locations and limited labor availability, among other factors. As many studies have shown, these hardships strain the industry's ability to execute work efficiently, resulting in declining productivity and mounting cost and schedule overruns. To enhance the competitiveness of Alberta EMR projects, project teams are therefore seeking management strategies that improve project performance and productivity by countering Alberta's uniquely challenging environment. The main purpose of this research is to develop industry-wide benchmarking tailored to the specific constraints and challenges of Alberta. The results support quantitative assessment and identify the root causes of poor project performance and ineffective field productivity in heavy-industry capital projects. Customized metrics produced from data collected through a web-based survey instrument were used to assess project performance along the following dimensions: cost, schedule, change, rework, safety, engineering and construction productivity, and construction practices. The system enables the industry to measure project performance more accurately and obtain meaningful comparisons while establishing credible norms specific to Alberta projects. Data analysis was conducted to identify the root causes of performance problems.
The analysis of Alberta projects substantiated lessons from previous studies and created an improved awareness of the ability of Alberta-based companies to manage their unique projects. The investigation also compared Alberta-based projects with U.S. projects to point out differences in project processes and management strategies under different environments. The relative impact of factors affecting construction productivity was identified and validated by input from industry experts. The findings help improve the work processes used by companies developing projects in Alberta.
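Benchmarking systems of this kind commonly normalise outcomes into dimensionless growth metrics so that projects of different sizes can be compared. A minimal sketch with invented figures (these are standard construction-benchmarking ratios, not the study's own metric definitions):

```python
# Standard normalised outcome metrics used in construction benchmarking.
# All project figures below are invented examples.

def cost_growth(actual_cost, budgeted_cost):
    # Fractional overrun: 0.10 means the project finished 10% over budget
    return (actual_cost - budgeted_cost) / budgeted_cost

def schedule_growth(actual_duration, planned_duration):
    # Fractional slip relative to the planned duration
    return (actual_duration - planned_duration) / planned_duration

projects = [
    {"budget": 120.0, "actual": 150.0, "plan_months": 24, "actual_months": 30},
    {"budget": 80.0,  "actual": 82.0,  "plan_months": 18, "actual_months": 18},
]
for p in projects:
    print(round(cost_growth(p["actual"], p["budget"]), 3),
          round(schedule_growth(p["actual_months"], p["plan_months"]), 3))
```

Because both metrics are ratios, a CAD 150M overrun on a large project and a small slip on a modest one land on the same scale, which is what makes industry-wide norms meaningful.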
247

A benchmarking model for harmonic distortion in a power system / Johnny Rudolph

Rudolph, Johnny January 2011 (has links)
The modern power system is loaded with sophisticated energy-conversion technologies such as solid-state converters. With the rapid advance of semiconductor technology, power electronics have provided new devices that are highly efficient and reliable. These devices are inherently non-linear, causing the current to deviate from sinusoidal conditions, a phenomenon known as harmonic current distortion. Multiple consumers are connected to the utility at the point of common coupling; harmonic currents injected by various solid-state loads are transmitted into the distribution system and can lead to voltage distortion. Harmonic distortion, one aspect of power quality, is undesirable in a power system: elevated distortion levels can cause additional heating, increased power losses and even failure of sensitive equipment. Utility companies like Eskom operate power quality monitors at various points in their distribution systems; measurements are taken at a single point of delivery at set intervals and stored in a database. Measurements at individual points cannot by themselves describe distortion patterns across the whole distribution system, so this data must be analysed and translated into useful managerial information. The aim of this project is to develop a benchmarking methodology that provides the supply industry with the information needed to manage harmonic distortion in a distribution system effectively. The methodology implements distortion indexes set forth by the Electrical Power Research Institute [3], which describe distortion levels both qualitatively and quantitatively. Harmonic measurements from the past two years, obtained from Eskom's database, are used to test the methodology by benchmarking the North-West Province distribution network [40].
The proposed methodology also aims to help institutions such as NERSA establish a reliable power quality management system. / Thesis (M.Ing. (Nuclear Engineering))--North-West University, Potchefstroom Campus, 2012
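One of the standard distortion indexes a methodology like this can build on is total harmonic distortion (THD), the ratio of harmonic to fundamental content. A minimal sketch; the voltage readings below are invented example values, not Eskom measurements:

```python
# Total harmonic distortion from per-harmonic RMS magnitudes.
# Readings are invented example values.
import math

def thd(harmonic_magnitudes):
    """THD = sqrt(sum of squared harmonics, h >= 2) / fundamental (h = 1)."""
    fundamental, *harmonics = harmonic_magnitudes
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Per-harmonic RMS voltages: fundamental first, then 2nd, 3rd, ... harmonic
readings = [230.0, 1.2, 9.2, 0.8, 4.6]  # volts
print(f"THD = {thd(readings) * 100:.2f}%")
```

A benchmarking system would compute such an index per monitoring point and per interval, then aggregate the results to describe distortion patterns across the network.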
248

Plan de desarrollo estratégico para planetario de la Universidad de Santiago de Chile

Montaldo Dibarrart, Gustavo January 2015 (has links)
Ingeniero Civil Industrial / Since 1986, the Planetarium of the Universidad de Santiago has been an iconic centre of scientific and cultural outreach in Chile. Today, in a country that has changed, this non-profit foundation needs to modernise: to discover new opportunities, improve its internal management and develop its capabilities. In alignment with the recent multi-million digital renovation that brought in Carl Zeiss projectors, the management commissioned this thesis: a strategic development plan for fulfilling the Planetarium's mission. Its general objective is to produce a strategy consistent with the foundation's charter that increases attendance at its services, a vital task given that only 19% of hall capacity is filled. The following methodology was used to achieve that objective. First, the organisation's current situation and value chain were characterised; among other findings, positive cash flows in the latest year barely reached CLP 88 million and several spaces are underused. This grounded the redefinition of the foundational statements, an explicit objective of this work, officially updating the mission to: "To create surprising experiences that inspire audiences of all ages to value astronomy, science and culture." Then, to gain perspective and search for opportunities, a thorough analysis of the external environment was carried out. It revealed, among other things, a public that uses digital media to buy and to stay informed, and an attractive outlook for diversifying funding through government grants and tax incentives for private contributions.
A benchmarking of best practices at world-class planetariums, with New York's Hayden Planetarium as an ally, together with an analysis of strengths, weaknesses, opportunities and threats, yielded a synthesis of strategic alternatives. The resulting strategy is oriented toward growing and retaining audiences while safeguarding a high-quality service. Along these lines, the strategy map articulates objectives such as expanding the offering, a turn toward customer satisfaction, and diversifying revenue sources. Key processes such as operational excellence and the identification of innovation opportunities are prioritised, supported by mechanisms that strengthen organisational culture and identification with the mission. Targeted staff training and an information-management model will facilitate timely decision-making and the necessary coordination of tasks within these processes. Indicators and targets complete the balanced scorecard. Far from imposing rigidity, it proposes systematising the strategic analysis needed to advance toward the mission and to raise average hall occupancy to 63%, doubling the current audience within three years. The plan will be implemented at the Planetarium starting in March 2015.
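The headline target, doubling the audience within three years, implies a sustained annual growth rate that can be computed directly (a back-of-the-envelope check, not a figure stated in the plan itself):

```python
# The plan targets doubling the audience within three years; the sustained
# annual growth rate this implies is a simple compound-growth computation.

def annual_growth_to_double(years):
    # Solve (1 + r) ** years == 2 for r
    return 2 ** (1 / years) - 1

r = annual_growth_to_double(3)
print(f"required audience growth: {r:.1%} per year")  # about 26% per year
```

Compounded over three years, roughly 26% annual growth doubles attendance, which is the rate the plan's target implicitly demands of its audience-expansion objectives.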
249

Enhancing the Accuracy of Synthetic File System Benchmarks

Farhat, Salam 01 January 2017 (has links)
File system benchmarking plays an essential part in assessing the file system’s performance. It is especially difficult to measure and study the file system’s performance as it deals with several layers of hardware and software. Furthermore, different systems have different workload characteristics so while a file system may be optimized based on one given workload it might not perform optimally based on other types of workloads. Thus, it is imperative that the file system under study be examined with a workload equivalent to its production workload to ensure that it is optimized according to its usage. The most widely used benchmarking method is synthetic benchmarking due to its ease of use and flexibility. The flexibility of synthetic benchmarks allows system designers to produce a variety of different workloads that will provide insight on how the file system will perform under slightly different conditions. The downside of synthetic workloads is that they produce generic workloads that do not have the same characteristics as production workloads. For instance, synthetic benchmarks do not take into consideration the effects of the cache that can greatly impact the performance of the underlying file system. In addition, they do not model the variation in a given workload. This can lead to file systems not optimally designed for their usage. This work enhanced synthetic workload generation methods by taking into consideration how the file system operations are satisfied by the lower level function calls. In addition, this work modeled the variations of the workload’s footprint when present. The first step in the methodology was to run a given workload and trace it by a tool called tracefs. The collected traces contained data on the file system operations and the lower level function calls that satisfied these operations. Then the trace was divided into chunks sufficiently small enough to consider the workload characteristics of that chunk to be uniform. 
Then the configuration file that modeled each chunk was generated and supplied to a synthetic workload generator tool created by this work, called FileRunner. The workload definition for each chunk allowed FileRunner to generate a synthetic workload that produced the same workload footprint as the corresponding segment in the original workload; in other words, the synthetic workload exercised the lower-level function calls in the same way as the original workload. Furthermore, FileRunner generated a synthetic workload for each specified segment in the order in which they appeared in the trace, resulting in a final workload that mimicked the variation present in the original. The results indicated that the methodology can create a workload whose throughput is within 10% of the original and whose operation latencies, with the exception of the create latencies, fall within the allowable 10% difference, and in some cases within the 15% maximum allowable difference. The work was able to accurately model the I/O footprint: in some cases the difference was negligible, and in the worst case it was 2.49%.
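The chunking step described above can be sketched as follows: split a trace into fixed-size segments and summarise each segment's operation mix, so that the workload's variation is preserved segment by segment. The trace representation and chunk size here are invented stand-ins; tracefs and FileRunner use their own formats:

```python
# Sketch of per-chunk workload profiling over a file system trace.
# The (operation, lower-level call) pair format is an invented stand-in
# for a tracefs trace; chunk size is an invented parameter.
from collections import Counter

def chunk_profile(trace, chunk_size):
    """Yield, per chunk, the relative frequency of each file system operation."""
    for start in range(0, len(trace), chunk_size):
        chunk = trace[start:start + chunk_size]
        counts = Counter(op for op, _ in chunk)
        total = len(chunk)
        yield {op: n / total for op, n in counts.items()}

trace = [("read", "page_cache_hit")] * 6 + [("write", "sync_io")] * 2 + \
        [("create", "metadata_io")] * 2 + [("read", "disk_io")] * 10

profiles = list(chunk_profile(trace, chunk_size=10))
print(profiles[0])  # first chunk: mixed reads, writes and creates
print(profiles[1])  # second chunk: reads only
```

Each per-chunk profile would then drive one segment of the synthetic workload, replayed in trace order, which is how segment-ordered generation reproduces the original workload's variation over time.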
250

Informationsarkitektur för användarbehov : en användarcentrerad analys av Oatly.com / Information Architecture for User Needs : a user-centered analysis of Oatly.com

Olin, Paulina January 2016 (has links)
The purpose of this bachelor thesis is to examine the usability of Oatly.com, using qualitative as well as quantitative methods within user-centered design and information architecture, in order to propose ways of developing and improving it. By analyzing the content of the website's home page and comparing it to the home pages of similar websites using competitive benchmarking, suggestions are made on how to improve the home page, along with results from contextual inquiries carried out with participants representing the company's target group. The methods used are applied from a usability perspective and are focused on the needs and preferences of the target group. The conclusion reached in this thesis suggests that the overall design of Oatly.com is well thought out and adjusted to appeal to the desired target group. However, suggestions are made on how the website can be adjusted and improved in order to appeal to the target audience, as well as ways for Oatly to continue the assessment and analysis of how to continually evolve and improve the website. The suggestions for development of the website include adding a search function, adding significant content to the product and contact pages, adding content which explains to the user how the company's products can be utilized, as well as changing the current utilization of the slider on the home page.
