21

Implementing total productive maintenance : driving forces and obstacles

Lycke, Liselott January 2000 (has links)
The global marketplace is highly competitive, and organisations that want to survive long-term have to continuously improve, change and adapt in response to market demands. Improvements in a company's performance should focus on cost cutting, increasing productivity levels, quality and guaranteeing deliveries in order to satisfy customers. Total Productive Maintenance (TPM) is one method that can be used to achieve these goals. TPM is an approach to equipment management that involves employees from both production and maintenance departments. Its purpose is to eliminate major production losses by introducing a program of continuous and systematic improvements to production equipment. TPM should be developed and expanded to embrace the whole organisation, and all employees should be involved in the process as members of improvement teams. This thesis describes the development of TPM and the TPM implementation process. The research is focused on the implementation process of TPM. The author has had the opportunity of both monitoring and steering a company through part of its TPM implementation program and has conducted a longitudinal study. The implementation process takes several years, and this thesis focuses on the initial three years of the process. This study demonstrates that driving forces, obstacles and difficulties are often dependent on the organisation, its managers and the individual employees. It also shows that the TPM implementation process has many similarities to the implementation of other improvement concepts. The analysis of these findings forms the basis for recommendations and guidance for organisations that intend to implement TPM. / Godkänd; 2000; 20070318 (ysko)
22

Benefits from TQM for organisational performance

Eriksson, Henrik January 2002 (has links)
Total Quality Management (TQM) is sometimes considered a management system in continuous change, consisting of values, methodologies and tools, the aim of which is to increase external and internal customer satisfaction with a reduced amount of resources. Whether TQM improves the performance of companies has been discussed for several years. One way to work with TQM and its values, methodologies and tools is to apply for and work with a quality award. Today, there are international, national, regional, branch-wise and in-company quality awards. The purpose of this thesis is to evaluate whether, and describe how, working with quality awards affects the performance of companies. The thesis consists of an extended summary and three appended papers on this subject, each one with a different aim and methodology. Two of the papers study the benefits from in-company quality awards for the performance of units, and one paper studies the financial performance of quality award recipients compared with competitors and branch indices. The main conclusion of the thesis, which strengthens earlier published results, is that working with quality awards affects financial performance positively if companies successfully implement TQM, which is the case for quality award recipients. Moreover, the results of this thesis do not provide strong evidence that the performance of units which have worked with in-company quality awards, but have not yet successfully implemented TQM, is affected by this work. However, such units perceive that working with in-company quality awards has positive effects on customers as well as employees. / Godkänd; 2002; 20070222 (ysko)
23

Test Framework Quality Assurance: Augmenting Agile Processes with Safety Standards

Thörn, Jonathan January 2020 (has links)
Quality of embedded systems is often demonstrated by performed tests and guaranteed by the quality of the tools used to perform them. Test automation is important in agile development, and test frameworks can be considered mission-critical. Thus, it is important to ensure the quality of the tools used for quality assurance. This thesis explores how industries with agile processes can learn from safety-related development with plan-driven processes to increase test framework quality. Safety standards often rely on plan-driven processes, focused on discipline in a long-term perspective, with substantial documentation and extensive upfront plans and designs. Agile approaches instead focus on quick adaptation, where software evolves, undergoes continuous improvement and is delivered incrementally. A case study was performed as an industry collaboration. A literature study extracted approaches from articles and safety standards. Analysis and processing resulted in candidate solutions, principles and practices, iteratively refined for general applicability and the industrial context. Insights on implications and perceived industrial value resulted from a focus group, with qualitative and quantitative data collected through moderated group discussions and complementary activities. Finally, this thesis proposes guidelines intended to be generally applicable, with a suggested augmented agile process of sequential "mini V-models" inherently controlled by Definitions of Done. A case-specific set of proposed guidelines extends the suggestion while embracing insights from the focus group. Also identified was the importance of perceiving the framework as a tool-chain and not a single tool, where interaction sequences and intermediate results can be identified and utilized for analysis and applicable measures. Future work could refine the proposed guidelines with an industrial dynamic validation, and could also extend the literature study and expand the focus group to cover diverse contexts and industrial perspectives.
24

Combined Digital Holography and Speckle Correlation for Rapid Shape Evaluation

Khodadad, Davood January 2014 (has links)
In the manufacturing industry there is a high demand for online quality control to minimize the risk of incorrectly produced objects. Conventional contact measurement methods are usually slow and invasive, meaning that they cannot be used for soft materials or for complex shapes without influencing the controlled parts. In contrast, interferometry and digital holography in combination with computers have become faster, more reliable and highly accurate as an alternative non-contact technique for industrial shape evaluation. In digital holography, access to the complex wave field and the possibility to numerically reconstruct holograms in different planes introduce a new degree of flexibility to optical metrology. With digital holography, high-resolution and precise three-dimensional (3D) images of the manufactured parts can be generated. This technique can also be used to capture data in a single exposure, which is important when doing measurements in a disturbed environment. The aim of this thesis is to perform online process control of free-form manufactured objects by measuring the shape and comparing it to the CAD model. To do this, a new technique to measure surface gradients and shape, based on single-shot dual-wavelength digital holography and image correlation of speckle displacements, is demonstrated. Based on an analytical relation between phase gradients and speckle displacements, it is shown that the shape and position of an object are retrieved uniquely, without the unwrapping problems that usually appear in dual-wavelength holography. The method is first demonstrated using continuous-wave laser light from two temperature-controlled laser diodes operating at 640 nm. Further, a specially designed dual-core diode-pumped fiber laser that produces pulsed light with wavelengths close to 1030 nm is used. One significant problem when using the dual-wavelength single-shot approach is that phase ambiguities are built into the system and need to be corrected. An automatic calibration scheme is therefore required. The intrinsic flexibility of digital holography gives a possibility to compensate for these aberrations and to remove errors fully numerically, without mechanical movements. In this thesis I present a calibration method which allows single-shot online shape evaluation in a disturbed environment. It is shown that phase maps and speckle displacements can be recovered free of chromatic aberrations. This is the first time that a single-shot dual-wavelength calibration is reported in which a criterion is defined to make the procedure automatic. The results of the presented work experimentally verify that single-shot dual-wavelength digital holography and numerically generated speckle images can be used together with digital speckle correlation to retrieve and evaluate the object shape. The proposed method is also robust to large phase gradients and large movements within the intensity patterns. The advantage of the approach is that, by using speckle displacements, the shape measurement can be done even when the synthetic wavelength is outside the dynamic range of the height variation of the object.
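As a rough numerical illustration of why the synthetic wavelength matters in dual-wavelength holography (this sketch is not taken from the thesis; the 0.5 nm wavelength separation is an assumed, illustrative value), the short Python snippet below computes the synthetic wavelength for two closely spaced wavelengths around 640 nm and the textbook rule-of-thumb unambiguous height range of roughly half that value.

# Dual-wavelength digital holography: two slightly different wavelengths give a
# synthetic (beat) wavelength Lambda = lambda_1 * lambda_2 / |lambda_1 - lambda_2|,
# which sets the unambiguous range of the combined phase map (roughly Lambda / 2).
lambda_1 = 640.0e-9   # first diode wavelength in metres (the thesis mentions diodes near 640 nm)
lambda_2 = 640.5e-9   # second wavelength; the 0.5 nm separation is an assumed, illustrative value

synthetic = lambda_1 * lambda_2 / abs(lambda_1 - lambda_2)

print(f"Synthetic wavelength:     {synthetic * 1e6:.0f} um")
print(f"Unambiguous height range: {synthetic / 2 * 1e6:.0f} um")

A smaller wavelength separation gives a longer synthetic wavelength and thus a larger unambiguous range, at the price of lower height sensitivity, which is why the combination with speckle correlation described in the abstract is attractive.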
25

Applications of the PEXE-concept for maintenance policies and proportional hazards models

Westberg, Ulf January 1994 (has links)
No description available.
26

Process capability studies in theory and practice

Deleryd, Mats January 1996 (has links)
The existence of variation has been a major problem within industry since the early days of the industrial revolution, and perhaps even earlier. The fact that two parts will never be identical forces every organisation to find a strategy for how to master variation. Process capability studies, a method designed to judge whether a process is capable or not, often play an important part in such a strategy. The concept of process capability studies has received both positive and negative criticism during the last decade. For instance, the supporters of process capability studies emphasise the importance of using the method to identify improvement priorities to be focused on in the overall improvement process within an organisation. However, as with all methods, process capability studies have their limitations. Actually, it is not principally the method as such that has been criticised, but rather the measures of capability used when conducting process capability studies, the so-called process capability indices. All existing process capability indices have some weaknesses; even the most sophisticated indices have relatively poor statistical properties, which might lead the user to make incorrect decisions even if most theoretical aspects of how to conduct process capability studies are known by the user. The use of process capability indices is, for instance, partly based on the assumption that the process output is normally distributed, a condition that is often not fulfilled in practice, where it is common that the process output is more or less skewed. This thesis focuses on process capability studies in both theory and practice. In part 1 of the thesis, some theoretical aspects of how to conduct process capability studies are identified and the adherence to these aspects within Swedish industry is investigated. This study reveals that there are certain gaps between how process capability studies are supposed to be conducted according to theory and the way they actually are carried out in practice. The study also tries to explain why these gaps exist, by analysing common obstacles when implementing and conducting process capability studies. In part 2, a simulation study focusing on the effects of skewness on estimates of some process capability indices belonging to the family of indices named C is presented. The effects of skewness are studied in three different cases: one incapable case, one just capable case and one very capable case. In all cases, four lognormal distributions with different skewness are used. The results from the simulation study indicate that the effect of skewness is relatively systematic, and there is therefore some hope that future investigations might use these results when formulating a practical solution to the problem of how to use process capability indices when the monitored process has a skewed output. Finally, the results are summarised and discussed, and some suggestions for future research are given.
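For readers unfamiliar with capability indices, the minimal Python sketch below (with made-up data and specification limits, not taken from the thesis) estimates the two most common indices, Cp and Cpk. Note that both estimates rest on the normality assumption whose violation through skewness is exactly what the thesis investigates.

import numpy as np

# Illustrative data and specification limits (assumed, not from the thesis).
rng = np.random.default_rng(1)
x = rng.normal(loc=10.05, scale=0.08, size=200)   # measured process output
LSL, USL = 9.8, 10.2                              # lower / upper specification limits

mu, sigma = x.mean(), x.std(ddof=1)

Cp = (USL - LSL) / (6 * sigma)                    # potential capability: spread only
Cpk = min(USL - mu, mu - LSL) / (3 * sigma)       # actual capability: spread and centring

print(f"Cp  = {Cp:.2f}")
print(f"Cpk = {Cpk:.2f}")   # Cpk < Cp signals an off-centre process

If the output is skewed rather than normal, these point estimates can be misleading, which is the practical problem the simulation study in part 2 addresses.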
27

Reliability-centred maintenance : identification of management and organisational aspects when introducing RCM

Backlund, Fredrik January 1999 (has links)
Increased demands on productivity, quality and cost-efficiency are pushing manufacturing trends towards increased complexity and a higher degree of process automation. A major breakdown in such a manufacturing system may cause severe damage to productivity, the environment and personnel. Using risk assessment to identify severe risks within a plant, in combination with different maintenance strategies, is one course of action for prioritising the maintenance activities needed. A methodology for executing such risk-based maintenance is reliability-centred maintenance, RCM. There are several benefits generated from RCM, for example improved safety and maintenance cost-effectiveness. However, several companies have problems making it work. The problems that occur often lie within management and organisational (M&O) aspects, such as lack of communication and management support. Some M&O aspects of importance when introducing improvement methods such as TQM and TPM are similar to the ones valid for RCM. However, the fact that RCM is used in a more technical environment seems to overshadow the effects that M&O aspects really bring about when introducing it. That is probably the main reason why obstacles occur when introducing RCM. A structured step model has been developed, focusing on the preparation and planning activities when introducing RCM. / Godkänd; 1999; 20070403 (ysko)
28

Quality management for sustainable health: methodologies, values and practices taken from Swedish organizations

Bäckström, Ingela January 2006 (has links)
In many Western countries today, not least in Sweden, a lot of organizations have great problems with sickness absence. The costs connected to the high rates of sickness absence have also risen to alarming levels. Healthy co-workers and healthy organizations are obvious goals for many leaders, but this is not always easy to establish. Work practices and leadership that are beneficial to co-worker health are thus vital to identify. Studies have shown relationships between company-wide implementation of quality programs and improved co-worker satisfaction along with low co-worker turnover; in other words, co-worker health along with improved customer satisfaction and financial results. Despite the great problems concerning sickness absence, there are organizations that have been awarded prizes for excellence in leadership, internal partnership, working environment, and profitability. The overall purposes of the research described in this thesis are to examine and describe how management and leadership can establish sustainable health among the co-workers and to examine how leadership for sustainable health is related to Quality Management. The in-depth purpose is to examine which aspects within the values derived from the quality movement primarily influence the co-workers' perceived health. The results presented can be described in three parts and come from four case studies carried out in five different organizations. Three of the organizations have received awards for establishing a good working environment, good financial results, and low sick leave among their co-workers; the fourth received an award for the successful implementation of quality programs. The first part consists of results from case studies in three different organizations and describes, with practical examples, how organizations can work to achieve sustainable health among their co-workers. The results are methodologies, values and organizational structure which it is considered possible for other organizations to adopt in their efforts to achieve good working conditions resulting in less sick leave. The second part is an attempt to investigate whether leadership for sustainable health is related to Quality Management. Methodologies, leadership values, organizational structure, and general values found in organizations which have achieved sustainable health are analyzed in the light of Deming's 14 points, and a correlation is indicated. Correlation is also found between the TQM values and the co-workers' perception of their health. The third part examines which of the aspects within the values derived from the quality movement influence the co-workers' perceived health. The results show significant correlation between the values and the co-workers' perception of their health. Aspects found within the value "Top management commitment" were named Empathy, Presence and Communication, Integrity, and Continuity. Within the value "Let everybody be committed", the aspects Development, Influence and Being informed were found. These aspects are described in more detail and also in one model per value. The result implies that the TQM values "Top management commitment", "Improve continuously", "Let everybody be committed" and "Focus on customers" are important for achieving healthy organizations and sustainable health among co-workers. / Godkänd; 2006; 20061206 (pafi)
29

Contributions to the use of designed experiments in continuous processes : a study of blast furnace experiments

Vanhatalo, Erik January 2007 (has links)
Design of Experiments (DoE) contains techniques, such as factorial designs, that help experimenters maximize the information output from conducted experiments and minimize the amount of experimental work required to reach statistically significant results. The use of DoE in industrial processes is frequently and thoroughly described in the literature. However, continuous processes in industry, frequently found in, for example, the mining and steel industries, highlight special issues that are typically not addressed in the DoE literature. The purpose of this research is to contribute to an increased knowledge of the use of DoE in continuous processes, and it aims to investigate whether factorial designs and other existing techniques in the DoE field are effective tools in continuous processes as well. Two studies have been performed. The focus of the first study, a case study of an industrial blast furnace operation, is to explore the potential of using factorial designs in a continuous process and to develop an effective analysis procedure for experiments in a continuous process. The first study includes, for example, interviews, experiments, and large elements of action research. The focus of the second study is to explore how a priori process knowledge can be used to increase the analysis sensitivity for unreplicated experiments. The second study includes a meta-study of experiments from the literature as well as an experiment. The results show that it is possible to use factorial designs in a continuous process, even though it is not straightforward and special considerations by the experimenter will be required. For example, the dynamic nature of continuous processes affects the minimum time required for each run in an experiment, since a transient time period is needed between runs to allow the experimental treatments to reach full effect in the process. Therefore, the use of split-plot designs is recommended, since it can be hard to completely randomize the experimental run order. It is also found that process control during the experiment may be unavoidable in continuous processes. Thus, developing a process control strategy during the planning phase is found to be an important experimental success factor. Furthermore, the results indicate that the multitude of cross-correlated response variables typical of continuous processes can be problematic during the planning phase of the experiment. The many and cross-correlated response variables are also a reason why multivariate statistical techniques, such as principal component analysis, can make an important contribution during the analysis. Moreover, a priori process knowledge is confirmed to have a positive effect on analysis sensitivity for unreplicated experiments. Since experimental effects in continuous processes can be expected to be small compared to noise, a priori process knowledge can also make a valuable contribution during the analysis of experiments in continuous processes. Furthermore, activities like coordination of people, information and communication, as well as logistics planning, are found to be important parts of the experimental effort in continuous processes. / Design of experiments provides methods and tools, for example factorial designs, that help the experimenter maximize the information output from experiments while minimizing the amount of resources required to reach statistically significant results.
The use of designed experiments in industrial processes is described frequently and thoroughly in the literature, but continuous production processes, often found in, for example, the mining and steel industries, involve issues that are normally not covered in the DoE literature. The purpose of the research in this thesis is to contribute to an increased knowledge of the use of designed experiments in continuous processes by investigating whether factorial designs and other tools in the DoE field are effective in continuous processes as well. Two studies were performed to achieve this purpose. The first study is a case study at an industrial blast furnace plant. Here, through interviews, experiments and action research, the potential of using factorial designs in a continuous process is investigated and an analysis methodology for experiments in continuous processes is developed. The second study investigates, through a meta-study of experiments from the literature and an experiment, how a priori process knowledge can be used to increase the sensitivity in the analysis of unreplicated experiments. The results show that it is possible to use factorial designs in a continuous process, but that this requires special considerations by the experimenter. For example, the dynamic character of continuous processes affects the minimum time required for each run, since a transition time is needed between runs for the changes made to reach full effect in the process. It can therefore be difficult to use a completely randomized run order. Instead, the use of so-called split-plot designs, where restrictions on the randomization order are made, is recommended. Furthermore, process control is often unavoidable while the experiments are running, which makes it important to formulate a control strategy in advance in order to minimize the influence of the control actions on the experimental results. The results also show that the large number of cross-correlated response variables, which are common in continuous processes, can create problems during the planning of the experiments. The many and cross-correlated response variables are also a reason why multivariate statistical tools, such as principal component analysis, can be valuable aids during the analysis. Furthermore, the results show that using a priori process knowledge during the analysis has a positive effect on the analysis results for unreplicated experiments. Since the effects of changes in continuous processes are often expected to be small compared to noise, a priori process knowledge can provide a valuable contribution in the analysis of experiments in continuous processes. The results also show the importance of, for example, coordination of personnel, information, communication and logistics planning in order to succeed with an experiment in a continuous process. / ISRN: LTU-LIC--07/66--SE
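As a minimal, hypothetical illustration of the kind of factorial design discussed above (the factor names and response values are invented and are not the actual blast furnace experiments), the Python sketch below builds a 2^3 full factorial design in coded units and estimates each main effect as the difference between the high- and low-level response means.

import itertools
import numpy as np

# A 2^3 full factorial design in coded units (-1 = low, +1 = high).
# The factor names are hypothetical, blast-furnace-flavoured placeholders,
# not the factors used in the thesis experiments.
factors = ["blast_temperature", "moisture_addition", "coal_injection_rate"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# One made-up response value per run, listed in standard run order.
y = np.array([78.2, 79.1, 80.4, 82.0, 77.5, 78.9, 81.2, 83.5])

# Main effect of each factor = mean response at the high level
# minus mean response at the low level.
for j, name in enumerate(factors):
    effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
    print(f"{name:22s} main effect: {effect:+.2f}")

In a continuous process, as the abstract notes, this run order would typically have to be restricted (for example with a split-plot structure) because some factor changes take a long time to reach full effect.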
30

A working model for interval-planned maintenance : Planning and cost calculation of maintenance work

Lind, Elias, Västerbo, Erik January 2021 (has links)
This study was carried out at a client company in the food industry. The study was conducted because one of the machines in the client's production line has recurring breakdowns. The machine is the last step in the packaging process and is called a tape presser. It attaches a strip of tape to the packages, both to seal the package during the manufacturing process and to allow consumers to reseal the package during use. The machine's unplanned downtime accounts for 2.4 percent of the total production time, and an average corrective maintenance job on the tape presser takes two hours. When the study was carried out there was no clear plan for how preventive maintenance on the machine should be performed in order to reduce the breakdowns that occur. The purpose of this study is to reduce the downtime caused by breakdowns at the tape presser, and the goal of the study is therefore to present a working model for planning and cost calculation of maintenance work. Different methods were used to fulfil the goal and purpose of the study. The first step was to collect data through observations, interviews and the client's database. Based on the collected data, an Ishikawa diagram was then created showing the sources of failure behind the breakdowns. After that, a pairwise comparison of the failure sources was performed in order to rank the failures according to how severe the consequences of a breakdown are. To validate the ranking of the failure sources, an FMEA was carried out alongside the Ishikawa diagram. After the failure sources had been ranked, they were analysed in a Weibull model to obtain an interval for preventive maintenance. To view the maintenance work from a financial perspective, calculations were performed to determine which type of maintenance work should be carried out. The methods were summarised into a working model that can be used to investigate and implement maintenance on equipment that lacks condition-based monitoring. The working model consists of an instruction for how the model is to be used and spreadsheets for cost and interval calculations. The steps in the model were carried out in this study to ensure that they are feasible and can be applied to a machine without condition-based monitoring. The failure dates of the failure sources were ambiguous, since the maintenance reports from the client were very deficient and hard to interpret. It was not always possible to tell how the maintenance had been performed or what it had been performed on. As a result, the credibility of the results of the Ishikawa diagram, the FMEA and the interval calculations is low. The FMEA, the Ishikawa diagram and the pairwise comparison are not included in the final working model, since they were redundant.
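To make the interval calculation concrete, the Python sketch below shows one common way of choosing a preventive maintenance interval from Weibull failure data, the classic age-replacement cost model. The shape and scale parameters and the cost figures are assumed, illustrative values, and the calculation is not claimed to reproduce the spreadsheets in the thesis.

import numpy as np

# Hypothetical Weibull parameters for one failure source (illustrative only):
# shape beta > 1 indicates wear-out behaviour, scale eta is in operating hours.
beta, eta = 2.0, 400.0
cost_preventive = 1500.0   # assumed cost of a planned maintenance action
cost_corrective = 9000.0   # assumed cost of a breakdown repair, including downtime

def reliability(t):
    # Weibull survival function R(t) = exp(-(t / eta)**beta)
    return np.exp(-(t / eta) ** beta)

def cost_rate(T, n=2000):
    # Expected cost per operating hour under an age-replacement policy:
    # replace preventively at age T, or correctively at failure, whichever comes first.
    t = np.linspace(0.0, T, n)
    R = reliability(t)
    mean_cycle_length = np.sum((R[:-1] + R[1:]) / 2.0 * np.diff(t))  # trapezoidal integral of R
    mean_cycle_cost = cost_preventive * R[-1] + cost_corrective * (1.0 - R[-1])
    return mean_cycle_cost / mean_cycle_length

# Evaluate a grid of candidate intervals and pick the cheapest one.
candidates = np.linspace(50.0, 1000.0, 96)
rates = np.array([cost_rate(T) for T in candidates])
best = candidates[rates.argmin()]
print(f"Suggested preventive maintenance interval: about {best:.0f} operating hours")
print(f"Expected cost rate at that interval: {rates.min():.2f} per operating hour")

The same structure applies per failure source: fit Weibull parameters from failure dates, plug in preventive and corrective costs, and compare the resulting cost rates to decide whether interval-based preventive maintenance is worthwhile.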
