About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
21

Understanding the effects of different levels of product monitoring on maintenance operations : a simulation approach

Alabdulkarim, Abdullah A. January 2013 (has links)
The move towards integrating products and services has accelerated significantly. As a result, business models such as Product Service Systems (PSS) have been developed. PSS emphasises selling the use of a product rather than the product itself: product ownership remains with the manufacturer/supplier, and customers are provided with a capable, available product for their use. Under a PSS contract, manufacturers/suppliers are penalised for any downtime of their product. This puts pressure on service providers (maintenance teams) to assure the availability of products in use, and the pressure increases when products are scattered across remote customer locations. Authors have argued that different levels of product monitoring should be applied so that service providers can monitor their products remotely and perform maintenance accordingly, claiming that adopting these monitoring levels will increase product performance. However, this claim rests on reasoning rather than experimental or empirical evidence. Further experimental research is therefore required to observe the effect of such monitoring levels on the maintenance operations system as a whole, including product location, different types of failure, labour skills and locations, travel times, spare parts inventory, etc. In the literature, monitoring levels are classified as Reactive, Diagnostics, and Prognostics. This research aims to better understand and evaluate the complex maintenance operations of a product in use under different product monitoring strategies using a Discrete Event Simulation (DES) approach. The suitability of DES over other techniques is discussed; DES proves well suited to providing a better understanding of the effect of product monitoring levels on the wider maintenance system. 
The requirements for simulating a complex maintenance operation are identified and documented. Two approaches are applied to gather these generic requirements: first, a literature review identifies the requirements for modelling complex maintenance operations; this is followed by interviews with academics and industrial practitioners to capture requirements not found in the literature. From these, a generic conceptual model is assembled. Simulation modules are built in the Witness software package to represent the different product monitoring levels (Reactive, Diagnostics, and Prognostics) and are linked with resources (e.g. labour, tools, and spare parts). To allow such complex maintenance systems to be built easily and rapidly from these modules, an Excel interface is developed, named Product Monitoring Levels Simulation (PMLS). To validate the PMLS tool, three industrial case studies are presented and different experiments are carried out to better understand the effect of product monitoring levels on complex maintenance operations. Face-to-face validation with the case companies is conducted, followed by an expert validation workshop. This work presents a novel DES approach developed to support maintenance operations decision makers in selecting the appropriate product monitoring level for their particular operation. The approach provides numerical evidence and shows that a higher product monitoring level does not always guarantee higher product availability.
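The core trade-off this abstract describes, reactive repair versus prognostic pre-emptive maintenance, can be sketched as a toy discrete event simulation. This is an illustrative model with invented parameters, not the thesis' Witness/PMLS tool; it assumes a single machine, exponentially distributed times to failure, and a perfect prognosis.

```python
import random

def simulate(monitoring="reactive", horizon=10_000.0, seed=1):
    """Toy single-machine simulation of availability under two monitoring levels.

    reactive:   a failure is only discovered when it happens, so downtime
                includes crew travel plus repair.
    prognostic: an (assumed perfect) prognosis triggers servicing just before
                failure, so only the planned maintenance time is lost.
    All times below are invented, illustrative figures.
    """
    rng = random.Random(seed)
    mttf, travel, repair, planned = 200.0, 30.0, 20.0, 10.0
    t, downtime = 0.0, 0.0
    while t < horizon:
        t += rng.expovariate(1.0 / mttf)   # time until the next failure
        if t >= horizon:
            break
        if monitoring == "reactive":
            downtime += travel + repair    # crew travels, then repairs
            t += travel + repair
        else:                              # prognostic: pre-emptive servicing
            downtime += planned
            t += planned
    return 1.0 - downtime / horizon

avail_reactive = simulate("reactive")
avail_prognostic = simulate("prognostic")
```

With these assumed figures the prognostic policy wins, but shrinking the travel and repair penalties can narrow or reverse the gap, which is exactly the operation-specific outcome the thesis argues must be tested rather than presumed.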
22

Simulation and optimisation of industrial steam reformers : development of models for both primary and secondary steam reformers and implementation of optimisation to improve both the performance of existing equipment and the design of future equipment

Dunn, Austin James January 2004 (has links)
Traditionally the reactor is recognised as the 'heart' of a chemical process system, and hence this part of the system usually receives quite detailed attention. Steam reforming, however, is unusual: because its reaction products are 'building blocks' for other processes, it is generally perceived as a 'utility', and attention is drawn instead towards the 'main' reaction processes of the system. Additionally, as a 'mature' process, steam reforming is often treated as sufficiently well defined for the requirements of the overall chemical process. For both primary and secondary steam reformers, several models of varying complexity were developed, allowing assessment of issues raised about previous models and of model improvements, drawing on advances in modelling that have not only widened the possible scope of simulations but also increased confidence in simulation results. Despite the complex nature of steam reforming systems, a surprisingly simple model is shown to perform well; however, to improve on existing designs and maximise the capability of current ones, more complex models are shown to be required. After model development, the natural course is optimisation. This is a powerful tool that must be used carefully, as significant issues remain around its employment. Despite these concerns, some simple optimisation cases demonstrated the potential of the models developed in this work and, although not exhaustive, showed the benefits of optimisation.
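The 'surprisingly simple model' level can be illustrated with the barest possible reformer model: an isothermal plug-flow tube with an assumed first-order consumption of the feed, integrated by explicit Euler. The rate law and rate constant are placeholders for illustration, not values from the thesis; real reformer models couple reversible kinetics, heat transfer, and diffusion.

```python
def simple_reformer_conversion(k=3.0, length=10.0, n_steps=1000):
    """Simplest possible 1-D plug-flow model: isothermal, first-order
    consumption of the feed along the tube (assumed, illustrative rate law).

    dX/dz = (k / L) * (1 - X),  X(0) = 0, integrated by explicit Euler;
    the exact solution at the outlet is 1 - exp(-k).
    """
    dz = length / n_steps
    x = 0.0
    for _ in range(n_steps):
        x += dz * (k / length) * (1.0 - x)
    return x

conversion = simple_reformer_conversion()   # close to 1 - exp(-3) ~ 0.95
```

Even this crude level of model reproduces the qualitative outlet behaviour; the thesis' point is that improving on existing designs requires considerably more detail than this.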
23

Explicitation et modélisation des connaissances de conduite de changement à la SNCF : vers une gestion des connaissances pré-réfléchies / Eliciting and modelling change management knowledge in the French National Railways Company (SNCF): towards the management of pre-reflective knowledge

Remillieux, Anne 07 July 2010 (has links)
This thesis deals with the externalisation of change management knowledge at the SNCF (the French national railway company) for the purpose of sharing: elicitation, modelling and formalisation. We consider not only the 'implicit' dimension of this know-how (what is not stated) but also, above all, its 'pre-reflective' dimension (what is not immediately conscious to the subject who uses it). This approach led us to investigate a 'pre-reflective knowledge engineering'. Currently, most knowledge management researchers and practitioners prefer to solve the problem of sharing tacit knowledge by socialisation rather than by externalisation. However, there has been research on knowledge elicitation, in particular the 'explicitation interview' techniques (Vermersch), that deserves to be applied to this problem. Our research went through the following stages: a review of the literature on change management, knowledge management and pre-reflective knowledge; a description of the change management knowledge system at the SNCF; the design of an ontology to represent this knowledge, whether explicit, implicit or pre-reflective, in the conceptual graph formalism; the design and development of a server for sharing this knowledge; and finally the elicitation and formalisation of the know-how involved in leading participative working parties. Methodologically, this study enabled us to develop a method for building an ontology from pre-reflective knowledge, as well as an adaptation of the explicitation interview to a knowledge management context.
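As a minimal illustration of the conceptual-graph formalism mentioned above, knowledge can be reduced to typed concepts linked by directed relations and queried over those links. The concept and relation names below are invented for illustration and are not drawn from the SNCF ontology.

```python
# A conceptual graph reduced to its simplest form: typed concepts linked by
# directed relations, stored as (concept, relation, concept) triples.
# All names here are hypothetical, not taken from the thesis' ontology.
graph = {
    ("ChangeAgent: facilitator", "agent_of", "Act: lead_working_party"),
    ("Act: lead_working_party", "object", "Group: working_party"),
    ("Act: lead_working_party", "manner", "Knowledge: pre_reflective"),
}

def related(concept, relation):
    """Return every concept reachable from `concept` via `relation`."""
    return {o for (s, r, o) in graph if s == concept and r == relation}

targets = related("Act: lead_working_party", "object")
```

A real conceptual-graph server adds concept-type and relation-type hierarchies and projection-based querying; the triple store above only shows the shape of the representation.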
24

Towards an Integrated Framework for Quality and Information Security Management in Small Companies

Große, Christine January 2016 (has links)
This master's thesis elaborates on the construction of an integrated framework for the simultaneous initiation of quality management and information security management within micro and small enterprises. Called QISMO, the model collection consists of three parts: (1) a holistic framework that structures a shared understanding among key stakeholders of the relevant relations and dependencies, (2) a reference process model visualising the entire process with its related activities, and (3) a lifecycle model illustrating the process loop and clarifying specific phases within it. The study offers an analysis of alternative approaches, resulting in premises and requirements adapted to micro and small enterprises. Furthermore, major barriers to improving quality and information security management in micro and small enterprises are identified, including miscalculation of risks, lack of competence, and absence of structured processes. Aside from valuable insights for the further development of enhanced training programmes, the study contributes a comprehensive analysis of standards and good practices within the field of IT governance. Moreover, it shares a concrete reference process model adapted to the preconditions of micro and small enterprises, preconditions that are established throughout the study. Together, these contributions provide a basis for the further improvement of business processes and their related models, both in practice and in research.
25

Characterisation of uncured carbon fibre composites

Erland, Samuel January 2017 (has links)
The weight-saving benefits of carbon fibre composites have been keenly adopted by civil aviation, with over 50% of the weight of modern designs coming from carbon fibre components. The rapid rise in demand for this material has led to the development of fully automated manufacturing techniques, improving production rate and repeatability of manufacture. However, this rapid development, combined with industry's constant drive for higher manufacturing rates, can result in the formation of critical defects in the more complicated structural components. Manufacturing complex aeronautical structures from carbon fibre raises a number of interesting mechanical problems. Forcing a multi-layered laminate to conform to a curved geometry requires individual layers to move relative to one another in order to relieve various forming-induced stresses. If the layers are constrained, the dissipation of these stresses through interply shear is prevented and a wide range of defects can occur, compromising the integrity of the final component. One of the most important of these is fibre wrinkling, which is effectively the buckling of one or more layers within an uncured laminate. The buckle produces a localised change in fibre orientation, which can cause a significant knockdown in part strength. A large amount of research has been conducted on carbon fibre in its cured state, when it exists as elastic fibres in an elastic matrix. Manufacturing, however, occurs while the material is uncured, with modern processes typically using fibres pre-impregnated with resin in order to reduce void content and aid fibre placement. A ply of uncured material therefore consists of stiff elastic fibres suspended in a very weak liquid viscoelastic material, whose properties are hugely influenced by temperature and rate of deformation. 
This thesis builds a better understanding of the mechanics involved in forming, using a series of characterisation techniques developed by drawing on methods in the literature. Part of the process involves fitting a one-dimensional viscoelasto-plastic model to experimental test data in order to represent the material response when shearing two plies about their interface. This model shows the response to be dominated by the viscoelastic resin at low temperatures, before becoming frictional and fibre-dominated at higher temperatures. In terms of optimum formability, a region exists in the transition from viscous to frictional behaviour at which resistance to forming is minimised. From this data alone, optimum forming parameters such as rate of deformation, pressure and temperature can be suggested for the material being used, along with design parameters such as stacking sequence. Another important characteristic which must be understood when considering ply wrinkling is the bending stiffness of uncured prepreg, both as a single ply and when combined into a small laminate. A wrinkle is in effect the buckling of a single ply or a small number of plies within a laminate; by understanding the bending stiffness and the process-induced loading, we can begin to predict whether wrinkles are likely to occur for a particular manufacturing regime. To assess bending stiffness, a modified Dynamic Mechanical Analysis process is employed, replacing the standard engineer's bending theory calculations with a Timoshenko element to capture the large degree of intraply shear experienced in the bending of uncured prepreg. Finally, a small laminate-scale demonstrator is considered, in which a 24-ply laminate is consolidated into a female tool in such a way as to induce maximum shear strain between the plies, so that the optimum forming parameters predicted by the characterisation tests might be validated. 
A simple energy minimisation model is used to predict the variation in consolidation strain around the part due to resistance to shear, using material parameters from the model describing the inter-ply shear test data. These parameters also inform a novel modelling technique, developed in parallel with this thesis, which is validated against the experimental results and shows how the characterisation techniques can be used to advance simulation methods aimed at reducing the development time for new carbon fibre components. This work provides a set of tests and methodologies for the accurate characterisation of the behaviour of uncured carbon fibre during forming. The models developed alongside these tests allow for a detailed interrogation of the results, providing valuable insight into the mechanics behind the observed material behaviour and enabling informed decisions about the forming process so that the occurrence of defects might be minimised. The primary aim has been to provide a set of vital input parameters for the novel, complex process modelling techniques under development; this has been achieved and validated experimentally.
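The viscous-to-frictional transition described above, with its formability optimum, can be sketched with invented functional forms: a viscous term that decays with temperature as the resin thins, plus a frictional, fibre-dominated term that grows as resin lubrication is lost. The constants and functional forms are illustrative only, not the fitted parameters of the thesis model.

```python
import math

def shear_resistance(temp_c, rate=0.1, pressure=0.1):
    """Illustrative inter-ply shear resistance (hypothetical constants):
    a viscous, resin-dominated term that falls with temperature, plus a
    frictional, fibre-dominated term that rises with temperature."""
    viscous = 50.0 * math.exp(-temp_c / 10.0) * rate
    frictional = 0.8 * (1.0 - math.exp(-temp_c / 60.0)) * pressure
    return viscous + frictional

# Grid-search the forming temperature that minimises resistance to shear;
# the minimum sits in the viscous-to-frictional transition region.
temps = [t / 2 for t in range(20, 241)]        # 10 .. 120 C, half-degree grid
t_opt = min(temps, key=shear_resistance)
```

The interior minimum only appears because one term falls while the other rises with temperature, which is the qualitative behaviour the fitted thesis model captures from test data.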
26

Designing secure business processes from organisational goal models

Argyropoulos, Nikolaos January 2018 (has links)
Business processes are essential instruments for coordinating organisational activities in order to produce value in the form of products and services. Information security is an important non-functional characteristic of business processes due to the sensitive data exchanged between their participants, so potential security shortfalls can severely impact organisational reputation and customer trust and cause compliance issues. Nevertheless, despite its importance, security is often considered a technical concern and treated as an afterthought during the design of information systems and the business processes they support. Considering security during the early design stages of information systems is highly beneficial. Goal-oriented security requirements engineering approaches can contribute to the early elicitation of system requirements at a high level of abstraction and capture the organisational context and rationale behind design choices. Aligning such requirements with process activities at the operational level improves the traceability between system models of different abstraction levels and leads to more robust and context-aware operationalisations of security. Therefore, there needs to be a well-defined and verifiable interconnection between a system's security requirements and its business process models. This work introduces a framework for the design of secure business process models. It uses security-oriented goal models as its starting point to capture a socio-technical view of the system-to-be and its security requirements during the early design stages. Concept mappings and model transformation rules are also introduced as a structured way of extracting business process skeletons from such goal models, in order to facilitate the alignment between the two levels of abstraction. 
The extracted business process skeletons are refined into complete business process models through the use of a set of security patterns, which standardise proven solutions to recurring security problems. Finally, the framework also offers security verification of the produced process models through the introduction of security-related attributes and model checking algorithms. This work is evaluated: (i) through individual evaluation of its components via their application to real-life systems, (ii) through a workshop-based modelling exercise in which participants used and evaluated parts of the framework, and (iii) through a case study from the public administration domain in which the overall framework was applied in cooperation with stakeholders of the studied system. The evaluation indicated that the developed framework provides a structured approach which supports stakeholders in designing and evaluating secure business process models.
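The security verification step can be illustrated with a minimal model-checking-style reachability analysis over a toy process model. The property checked (data must be encrypted before it is sent) and all task names are invented; the framework's actual security attributes and algorithms are more elaborate.

```python
from collections import deque

# Toy process model: task -> successor tasks. One branch sends data without
# encrypting it first, which the check below should flag.
process = {
    "start": ["collect"],
    "collect": ["encrypt", "send_data"],   # the unencrypted branch is the flaw
    "encrypt": ["send_data"],
    "send_data": ["end"],
    "end": [],
}

def violates_confidentiality(model):
    """BFS over (task, encrypted-yet?) states; returns True if 'send_data'
    is reachable while the payload is still unencrypted."""
    seen, queue = set(), deque([("start", False)])
    while queue:
        task, enc = queue.popleft()
        if (task, enc) in seen:
            continue
        seen.add((task, enc))
        if task == "send_data" and not enc:
            return True
        for nxt in model.get(task, []):
            queue.append((nxt, enc or task == "encrypt"))
    return False

flaw_found = violates_confidentiality(process)
```

Removing the unencrypted branch (routing "collect" only to "encrypt") makes the check pass, mirroring how a security pattern repairs the extracted skeleton before re-verification.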
27

Defining a Formalized Representation for Information Demand

Idiahi, Innocent January 2011 (has links)
Information demand is part of comprehensive business logistics, encompassing the logistics of information. The demand for information has provided a unifying framework for different needs in enterprise modelling. The problems organisations face relating to information flow and distribution have led to the development of various frameworks for analysing information demand, guided by a set of rules, methods and even a unified representation. This thesis defines a specification for an enterprise Information Demand Context model using XPDL as the language of construction. It gives reasons why XPDL was preferred for such a representation and shows how the constructs of the notation are mapped to their associated XPDL specifications, so that in defining a representation we are also defining its meta-model. The resulting specification is presented in such a way that it gives a flexible, logical and well-defined structure.
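A minimal sketch of the notation-to-XPDL mapping idea, using Python's standard XML tooling: element names follow XPDL conventions (Package, WorkflowProcess, Activities), but namespaces and most mandatory attributes are omitted, and the demand-context structure shown is invented for illustration rather than taken from the thesis.

```python
import xml.etree.ElementTree as ET

def demand_context_to_xpdl(context):
    """Map a toy information-demand context (a role and the information items
    it needs) onto a skeletal XPDL-style package. Simplified: no namespaces,
    no schema-mandated attributes."""
    pkg = ET.Element("Package", Id="demand-context")
    proc = ET.SubElement(pkg, "WorkflowProcess", Id=context["role"])
    acts = ET.SubElement(proc, "Activities")
    for item in context["needs"]:
        ET.SubElement(acts, "Activity", Id=f"provide-{item}", Name=item)
    return ET.tostring(pkg, encoding="unicode")

xml_out = demand_context_to_xpdl(
    {"role": "purchasing-clerk", "needs": ["order-status", "supplier-terms"]}
)
```

Because each notation construct maps to a fixed XPDL element shape, the mapping function doubles as an executable record of the meta-model, which is the point the abstract makes about defining a representation and its meta-model together.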
28

Model Refinement and Reduction for the Nitroxide-Mediated Radical Polymerization of Styrene with Applications on the Model-Based Design of Experiments

Hazlett, Mark Daniel 21 September 2012 (has links)
Polystyrene (PS) is an important commodity polymer. In its most commonly used form, PS is a high molecular weight linear polymer, typically produced through free-radical polymerization, a well understood and robust process. This process produces a high molecular weight, clear thermoplastic that is hard and rigid and has good thermal and melt flow properties for use in moldings, extrusions and films. However, polystyrene produced through the free-radical process has a very broad molecular weight distribution, which can lead to poor performance in some applications. Nitroxide-mediated radical polymerization (NMRP) can instead synthesize materials with a much more consistently defined molecular architecture and lower polydispersity than other methods. NMRP involves radical polymerization in the presence of a nitroxide mediator, usually a stable radical that can bind to and disable the growing polymer chain. This "ties up" some of the free radicals, forming a dynamic equilibrium between active and dormant species through a reversible coupling process. NMRP can be conducted through one of two different processes: (1) the bimolecular process, initiated with a conventional peroxide initiator (e.g. BPO) in the presence of a stable nitroxide radical (e.g. TEMPO) that can reversibly bind with the growing polymer radical chain, and (2) the unimolecular process, in which a nitroxyl ether is introduced to the system and then degrades to create both the initiator and mediator radicals. Based on previous research in the group, which included experimental investigations of both unimolecular and bimolecular NMRP under various conditions, it was possible to build on an earlier model and arrive at an improved, detailed mechanistic model. 
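The dynamic equilibrium between active and dormant species can be sketched as a pair of toy rate equations integrated by explicit Euler. The rate constants and concentrations are illustrative orders of magnitude only, not the fitted values of the thesis model.

```python
def nmrp_capping(kc=1.0e6, kd=1.0e-3, p0=1.0e-6, n0=1.0e-5,
                 dt=1.0e-4, steps=200_000):
    """Toy rate equations for the reversible coupling at the heart of NMRP:

        P* (propagating radical) + N* (nitroxide)  <->  D (dormant chain)

    d[D]/dt = kc[P][N] - kd[D], with [P] and [N] consumed accordingly.
    Constants are hypothetical orders of magnitude; integration is plain
    explicit Euler over 20 s of simulated time."""
    p, n, d = p0, n0, 0.0
    for _ in range(steps):
        net = kc * p * n - kd * d      # forward capping minus dissociation
        p -= net * dt
        n -= net * dt
        d += net * dt
    return p, n, d

p_end, n_end, d_end = nmrp_capping()
```

With a nitroxide excess, nearly all propagating radicals end up dormant at equilibrium, which is the mechanism that narrows the molecular weight distribution relative to conventional free-radical polymerization.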
Additionally, certain parameters in the model were seen to have little impact on overall model performance, suggesting that their removal would be appropriate and would reduce the complexity of the model. Model predictions were compared with experimental data both from within the group and from the general literature, and trends were verified. Further work was done on the development of an additionally reduced model, and on testing these different levels of model complexity against data. The aim of this analysis was to develop a model that captures the key process responses in a simple, easy to implement manner with accuracy comparable to the complete models. Due to its lower complexity, this substantially reduced model would be a much more likely candidate for use in on-line applications. These different model levels were then applied to model-based D-optimal design of experiments, with results compared to those generated by a parallel Bayesian design project conducted within the group. Additional work used a different optimality criterion, targeted at reducing the parameter correlation that can arise in D-optimal designs. Finally, conclusions and recommendations for future work were made, including a detailed explanation of how a model similar to the ones described in this thesis could be used in the optimal selection of sensors and design of experiments.
29

Behavioural profiles : a relational approach to behaviour consistency

Weidlich, Matthias January 2011 (has links)
Business Process Management (BPM) emerged as a means to control, analyse, and optimise business operations. Conceptual models are of central importance for BPM. Most prominently, process models define the behaviour that is performed to achieve a business value. In essence, a process model is a mapping of properties of the original business process to the model, created for a purpose. Different modelling purposes, therefore, result in different models of a business process. Against this background, the misalignment of process models often observed in the field of BPM is no surprise. Even if the same business scenario is considered, models created for strategic decision making differ in content significantly from models created for process automation. Despite their differences, process models that refer to the same business process should be consistent, i.e., free of contradictions. Apparently, there is a trade-off between strictness of a notion of consistency and appropriateness of process models serving different purposes. Existing work on consistency analysis builds upon behaviour equivalences and hierarchical refinements between process models. Hence, these approaches are computationally hard and do not offer the flexibility to gradually relax consistency requirements towards a certain setting. This thesis presents a framework for the analysis of behaviour consistency that takes a fundamentally different approach. As a first step, an alignment between corresponding elements of related process models is constructed. Then, this thesis conducts behavioural analysis grounded on a relational abstraction of the behaviour of a process model, its behavioural profile. Different variants of these profiles are proposed, along with efficient computation techniques for a broad class of process models. Using behavioural profiles, consistency of an alignment between process models is judged by different notions and measures. 
The consistency measures are also adjusted to assess the conformance of process logs that capture the observed execution of a process. Further, this thesis proposes various complementary techniques to support consistency management: it elaborates on how to implement consistent change propagation between process models, addresses the exploration of behavioural commonalities and differences, and proposes a model synthesis for behavioural profiles.
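A behavioural profile can be approximated from execution traces in a few lines: for each pair of tasks, record which orders are observed and classify the pair as strict order, exclusive, or interleaving. This log-based toy mirrors the thesis' relational abstraction but is not its model-based computation; the task names are invented.

```python
from itertools import combinations

def behavioural_profile(traces):
    """Derive a behavioural profile from execution traces. For tasks a, b:
    'strict order' if one order is ever observed but never the other,
    'interleaving' if both orders occur, 'exclusive' if a and b never
    co-occur in a trace."""
    weak = set()    # (a, b) present => a occurs before b in some trace
    tasks = set()
    for trace in traces:
        tasks.update(trace)
        for i, a in enumerate(trace):
            for b in trace[i + 1:]:
                weak.add((a, b))
    profile = {}
    for a, b in combinations(sorted(tasks), 2):
        ab, ba = (a, b) in weak, (b, a) in weak
        if ab and ba:
            profile[(a, b)] = "interleaving"
        elif ab or ba:
            profile[(a, b)] = "strict order"
        else:
            profile[(a, b)] = "exclusive"
    return profile

log = [["register", "check", "ship"],
       ["register", "ship", "check"],
       ["register", "reject"]]
profile = behavioural_profile(log)
```

Because the relations are computed pairwise, two profiles can be compared element by element, which is what makes the gradual, measure-based notions of consistency in the thesis cheap to evaluate compared with behaviour equivalences.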
