1. The investigation of a method to generate conformal lattice structures for additive manufacturing
Brennan-Craddock, James (January 2011)
Additive manufacturing (AM) allows a geometric complexity in products not seen in conventional manufacturing. This geometric freedom facilitates the design and fabrication of conformal hierarchical structures: entire parts, or regions of a part, can be populated with lattice structure designed to exhibit properties that differ from the solid material used in fabrication. Current computer aided design (CAD) software used to design products is not suitable for generating lattice structure models. Although lattices are conceptually simple, the memory required to store a virtual CAD model of one is prohibitively high. Conventional CAD software defines geometry through boundary representation (B-rep): shapes are described by the connectivity of faces, edges and vertices. While useful for representing accurate models of complex shape, the sheer quantity of individual surfaces required to represent each of the relatively simple struts that comprise a lattice structure ensures that memory limits are soon reached. Additionally, the conventional data flow from CAD to manufactured part is arduous, involving several conversions between file formats. As well as lengthening the process, each conversion risks introducing geometric errors that must be fixed before manufacture. A method was developed specifically to generate large arrays of lattice structures, based on a general voxel modelling method identified in the literature review. The method is much less sensitive to geometric complexity than conventional methods and thus facilitates the design of considerably more complex structures. The ability to grade structure designs across regions of a part (termed 'functional grading') was also investigated, as was a method to retain connectivity between boundary struts of a conformal structure. In addition, the method streamlines the data flow from design to manufacture: earlier steps of the data conversion process are bypassed entirely. Because voxel models define boundaries with discrete, stepped blocks, the effect of the modelling method on the surface roughness of produced parts was also investigated; the effect of this stepping on surface roughness was found to be minimal. This thesis concludes with suggestions for further work to improve the efficiency, capability and usability of the conformal structure method developed in this work.
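To make the voxel idea concrete, here is a minimal Python sketch. It illustrates the general voxel modelling approach rather than the author's implementation; the grid size, cell pitch and strut radius are assumed values chosen for demonstration. It fills a fixed-size boolean voxel grid with a simple cubic strut lattice:

```python
# Minimal illustrative sketch (not the thesis's implementation): fill a boolean
# voxel grid with a simple cubic strut lattice. Grid size, cell pitch and strut
# radius are assumed values chosen for demonstration.
import numpy as np

def voxel_lattice(grid=(64, 64, 64), pitch=16, strut_radius=1.5):
    """Return a boolean voxel array containing a simple cubic strut lattice."""
    x, y, z = np.indices(grid)
    # Distance (in voxels) from the nearest unit-cell boundary along each axis.
    dx = np.minimum(x % pitch, pitch - x % pitch)
    dy = np.minimum(y % pitch, pitch - y % pitch)
    dz = np.minimum(z % pitch, pitch - z % pitch)
    # A strut runs parallel to an axis wherever the other two distances are small.
    along_z = np.sqrt(dx**2 + dy**2) <= strut_radius
    along_x = np.sqrt(dy**2 + dz**2) <= strut_radius
    along_y = np.sqrt(dz**2 + dx**2) <= strut_radius
    return along_x | along_y | along_z

voxels = voxel_lattice()
print(f"solid volume fraction: {voxels.mean():.3f}")
```

The memory cost is one value per voxel regardless of how many struts the grid contains, which is the property that lets voxel methods scale to large lattice arrays where B-rep surface counts become prohibitive.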
2. Batch-to-batch iterative learning control of a fed-batch fermentation process
Jewaratnam, Jegalakshimi (January 2013)
Recently, iterative learning control (ILC) has been used in the run-to-run control of batch processes to directly update the control trajectory. The basic idea of ILC is to update the control trajectory for a new batch run using information from previous batch runs, so that the output trajectory converges asymptotically to the desired reference trajectory. The control policy update is calculated using models linearised around the nominal reference process input and output trajectories, typically identified using multiple linear regression (MLR), partial least squares (PLS) regression, or principal component regression (PCR). ILC has been shown to be a promising method for addressing model-plant mismatch and unknown disturbances. This work presents several improvements to the batch-to-batch ILC strategy, with applications to a simulated fed-batch fermentation process. To enhance the reliability of ILC, model prediction confidence is incorporated in the ILC optimisation objective function, so that wide model prediction confidence bounds are penalised and unreliable control policy updates are avoided; this method proved very effective for suitably selected confidence-bound penalty factors. To further improve the performance of ILC, averaged reference trajectories and sliding-window techniques were introduced. To reduce the influence of measurement noise, the control policy is updated using the average input and output trajectories of the past few batches rather than just the immediately previous batch. The linearised models are re-identified using a sliding window of past batches, in which the earliest batch is removed as the newest batch is added to the model identification data set. The effects of the various parameters were investigated for the MLR, PCR and PLS methods, and the technique significantly improved control performance. In model-based ILC the weighting matrices Q and R in the objective function have a significant impact on control performance; therefore, adaptive weighting parameters were investigated for batch-to-batch ILC with updated models, and significant improvements in performance stability were observed for all three methods. All three of the suggested techniques delivered improvements in stability, reliability and/or convergence speed. To further investigate the versatility of ILC, the above techniques were combined, and the results are discussed in this thesis.
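As a sketch of the quadratic-objective update such a scheme typically uses (the exact formulation, the plant model and the weights here are illustrative assumptions, not the thesis's fermentation simulation), a norm-optimal batch-to-batch ILC step in Python might look like this:

```python
# Illustrative sketch of a norm-optimal batch-to-batch ILC update, assuming a
# linearised batch model y_k = G @ u_k around the nominal trajectories. The
# "plant", the deliberate model mismatch and the weights Q, R are all invented.
import numpy as np

def ilc_update(u_k, y_k, y_ref, G, Q, R):
    """Minimise e'Qe + du'R du with e = y_ref - (y_k + G du), giving
    du = (G'QG + R)^-1 G'Q (y_ref - y_k)."""
    du = np.linalg.solve(G.T @ Q @ G + R, G.T @ Q @ (y_ref - y_k))
    return u_k + du

rng = np.random.default_rng(0)
n = 20                                              # trajectory length
G_true = np.tril(rng.uniform(0.5, 1.0, (n, n)))     # causal "plant"
G_model = 1.2 * G_true                              # model-plant mismatch
Q, R = np.eye(n), 0.1 * np.eye(n)
y_ref = np.linspace(0.0, 1.0, n)

u = np.zeros(n)
for batch in range(15):
    y = G_true @ u                                  # run one "batch"
    u = ilc_update(u, y, y_ref, G_model, Q, R)      # learn between batches
print(f"tracking error after 15 batches: {np.linalg.norm(y_ref - G_true @ u):.2e}")
```

Penalising wide model prediction confidence bounds, as this work proposes, amounts to adding a further term to the same objective so that updates computed from poorly identified models are damped.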
3. Fault diagnosis for industrial processes using multivariate statistical process control
Lieftucht, D. (January 2006)
No description available.
4. Multifunctional new product development
Turnbull, Kenneth James (January 2005)
The New Product Development (NPD) process in manufacturing industry, together with the application of multi-functional teams in the process, has been well studied in the extant literature. Tools and techniques used to assist project teams in NPD have also been investigated in detail. However, many of the claims for the effectiveness of tools such as Rapid Prototyping (RP) and techniques such as Failure Mode and Effects Analysis (FMEA) are anecdotal in nature, lacking empirical evidence, or promoted by authors with a commercial interest in the subject. Therefore, to provide more empirical data as part of the objectives of this research, case studies were conducted over a period of 12 years in companies such as Flymo, Kenwood, and Domnick Hunter. Key Performance Indicators (KPIs) were selected for the case studies to provide a rich source of quantitative and qualitative data, from which some of the root causes of NPD problems were identified. A common problem identified was project delay following late changes to the specification and the product engineering. It was clear, however, that not all of the changes had a negative impact on a project; indeed, some teamwork studies encourage changes to improve product value and quality. A 'penalty weighting' model quantifying the 'impact' of changes with respect to any benefits was developed to identify the most cost-effective period for teamwork studies and to provide an efficiency profile for each project (see the sketch below). A strategic business approach for Rapid Prototyping activities was also presented, together with a 'sub-group' methodology to encourage innovation and reduce 'front end' delays. Appropriate project management control documentation was developed for the NPD teams to support the control of various KPIs, including product deliverables, product costs, capital spend and launch timing.
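The abstract does not spell out the penalty-weighting model, so the following Python sketch is a purely hypothetical illustration of the general idea: weighting the cost of a design change by how late in the project it lands, and netting off its benefit, so that changes can be compared across project phases. All phase weights and figures are invented.

```python
# Purely hypothetical illustration of a penalty-weighting idea: the later a
# design change occurs, the more heavily its cost is weighted, and any benefit
# (improved value or quality) is netted off. Phase weights and figures are
# invented and are NOT the thesis's actual model.
PHASE_WEIGHT = {"concept": 1.0, "design": 2.0, "tooling": 5.0, "launch": 10.0}

def weighted_impact(cost: float, benefit: float, phase: str) -> float:
    """Net penalty of a change: phase-weighted cost minus its benefit."""
    return PHASE_WEIGHT[phase] * cost - benefit

changes = [
    ("relocate switch", 2.0, 5.0, "concept"),   # cheap early change, clear benefit
    ("new blade guard", 3.0, 4.0, "design"),
    ("motor respec",    4.0, 3.0, "tooling"),   # late change, benefit outweighed
]
for name, cost, benefit, phase in changes:
    print(f"{name:18s} ({phase:7s}) net impact = {weighted_impact(cost, benefit, phase):+.1f}")
```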
5. Investigation into the effect of fixturing systems on the design of condition monitoring for machining operations
Abbas, J. K. (January 2013)
Global market competition has drawn manufacturers' attention to automating manufacturing processes using condition monitoring systems. These systems have been used to improve product quality, eliminate inspection, and enhance manufacturing productivity. Fixtures are essential devices in machining processes, holding the tool or workpiece, and they therefore have a direct influence on the stability of the cutting tool. Tool and fixturing faults thus play an important part in the inaccuracy of machining processes, causing deterioration of surface roughness. For these reasons, and given the limited work in this domain, this thesis develops an experimental investigation to evaluate the effect of fixturing quality on the design of condition monitoring systems. The proposed monitoring system implements multiple sensors and signal processing methods able to analyse the sensory information and make an appropriate decision. Several sensors, namely force, vibration, acoustic emission, eddy current, power, strain and sound, are combined with a newly suggested approach, named Taylor's Equation Induced Pattern (TIP), and neural networks to detect tool wear and tool breakage. The monitoring system is also evaluated to provide data showing the effect of fixturing quality: surface roughness of the workpiece was measured and compared with the sensitivity of the monitoring system, which reflects the state of the tool and fixturing conditions. A novel approach, termed ASPSF (Automated Sensor and Signal Processing Selection for Fixturing), was implemented to select the most sensitive sensors and signal processing methods, the aim being to reduce the number of sensors needed in the overall system and so reduce its cost, as sketched below. Automated detection methods (Principal Component Analysis (PCA), fuzzy logic, and correlation coefficients) were implemented to prove the capability of the approach, and cost reduction was achieved by removing the least utilised sensors without degrading the performance of the condition monitoring system. The results prove that ASPSF is capable of detecting the effect of fixturing quality on the design of the condition monitoring system and the trend in surface roughness. Consequently, the findings of this thesis show that a change in fixturing quality can have a significant effect on the design and behaviour of the condition monitoring system; a condition monitoring design process will therefore be needed regularly for every machine, to compensate for changes in its characteristics.
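The sensor selection step can be pictured with a small Python sketch. This is a simplified stand-in for the ASPSF approach: the sensor names, features and data below are invented, and the thesis's actual method also draws on PCA and fuzzy logic detectors.

```python
# Illustrative sketch of one ASPSF-style idea: rank candidate sensors by how
# strongly a signal feature correlates with a measured condition indicator
# (here, surface roughness), then drop the least useful sensors to cut cost.
# Sensor names and synthetic data are assumptions for demonstration only.
import numpy as np

rng = np.random.default_rng(1)
sensors = ["force", "vibration", "acoustic_emission", "power", "sound"]
n_cuts = 50
roughness = np.linspace(1.0, 4.0, n_cuts) + rng.normal(0, 0.1, n_cuts)  # Ra

# One RMS-style feature per sensor per cut; some track roughness, some don't.
features = {
    "force": 2.0 * roughness + rng.normal(0, 0.3, n_cuts),
    "vibration": roughness**1.5 + rng.normal(0, 0.5, n_cuts),
    "acoustic_emission": 0.5 * roughness + rng.normal(0, 1.5, n_cuts),
    "power": rng.normal(5, 1, n_cuts),              # insensitive sensor
    "sound": 0.3 * roughness + rng.normal(0, 1.0, n_cuts),
}

ranking = sorted(sensors,
                 key=lambda s: abs(np.corrcoef(features[s], roughness)[0, 1]),
                 reverse=True)
for s in ranking:
    r = np.corrcoef(features[s], roughness)[0, 1]
    print(f"{s:18s} |r| = {abs(r):.2f}")
# The least-correlated sensors are candidates for removal from the system.
```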
6. Non-contact free-form shape measurement for coordinate measuring machines
Huddart, Yvonne R. (January 2010)
Precision measurement of manufactured parts commonly uses contact measurement methods: a probe mounted on a Coordinate Measuring Machine (CMM) touches the surface of the part, and the probe's tip position is recorded at each contact. Recently, devices have been developed that continuously scan the probe tip across the surface, allowing points to be measured more quickly. Contact measurement is accurate and fast for shapes that are easily parameterised, such as a sphere or a plane, but is slow and requires considerable user input for more general objects such as those with free-form surfaces. Phase-stepping fringe projection and photogrammetry are common non-contact shape measurement methods. Photogrammetry builds a 3D model of feature points from images of an object taken from multiple perspectives. In phase-stepping fringe projection, a series of sinusoidal patterns, with a phase shift between each, is projected towards an object; a camera records a corresponding series of images, and the phase of the pattern at each imaged point is calculated and converted to a 3D representation of the object's surface. Techniques combining phase-stepping fringe projection and photogrammetry were developed and are described here. The eventual aim is to develop an optical probe for a CMM to enable non-contact measurement of objects in an industrial setting. For the CMM to report its position accurately, the probe must be small, light and robust. The methods currently used to provide a phase shift require either an accurately calibrated translation stage to move an internal component, or a programmable projector; neither implementation can practically be mounted on a CMM, owing to size and weight limits or the delicate parts required. A CMM probe consisting of a single camera and a fringe projector was therefore developed. The fringe projector projects a fixed fringe pattern, and phase steps are created by moving the CMM-mounted probe, taking advantage of the geometry of the fringe projection system. New techniques to calculate phase from phase-stepped images created by relative motion of probe and object are proposed, mathematically modelled, and tested experimentally, and novel techniques for absolute measurement of surfaces by viewing an object from different perspectives are developed. A prototype probe is used to demonstrate measurements of a variety of objects.
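For reference, the standard four-step phase calculation that underlies such systems is compact enough to sketch in Python. This is the textbook algorithm, not necessarily the exact variant developed in the thesis, which derives its phase steps from probe motion rather than a calibrated stage:

```python
# Textbook four-step phase-stepping calculation: four fringe images with
# 90-degree phase steps, I_n = A + B*cos(phi + n*pi/2) for n = 0..3, give
# phi = atan2(I4 - I2, I1 - I3). Synthetic data below verifies the recovery.
import numpy as np

def four_step_phase(I1, I2, I3, I4):
    """Recover the wrapped phase map from four 90-degree phase-stepped images."""
    return np.arctan2(I4 - I2, I1 - I3)

phi_true = np.linspace(-np.pi + 0.01, np.pi - 0.01, 200)  # known test phase
A, B = 0.5, 0.4                                           # background, modulation
frames = [A + B * np.cos(phi_true + n * np.pi / 2) for n in range(4)]
phi = four_step_phase(*frames)
print(f"max phase error: {np.abs(phi - phi_true).max():.2e} rad")
```

Unwrapping the recovered phase and triangulating it against the projector-camera geometry then yields the 3D surface.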
7. Structural and metric design of a class of finishing robots
Koloszko, Daniela M. (January 2002)
No description available.
8. CARDINAL-Vanilla: immune system inspired prioritisation and distribution of security information for industrial networks
Scully, Peter Matthew David (January 2016)
This thesis makes advances in Distributed Self-Healing Security Systems (DSHSS): a tool for defending industrial automation computer networks, such as critical national infrastructure (CNI), Industrial Control Systems (ICS) and Supervisory Control and Data Acquisition (SCADA) systems, against the problems caused by computer malware and cyber attacks (Ch.2). Based on principles drawn from a holistic view of the biological immune system (Ch.3) and recent Artificial Immune System (AIS) research (Ch.4), CARDINAL-Vanilla, a self-healing and collaborative host-based security architecture, has been designed (Ch.5) for application to real-world ICS and SCADA networks. Using a novel evaluation framework for DSHSS (Ch.6), the architecture has been assessed by 'immunisation rate' scores, which measure self-healing performance and resource feasibility, under rigorous experimental conditions on virtual and real-world enterprise networks. These empirical studies (Ch.7 and Ch.8) found that the CARDINAL-Vanilla module dispatch algorithm scores significantly worse than a near-optimal engineered dispatch algorithm; however, the difference in performance scores is unimportant for real-life application, and the engineered dispatch algorithm cannot in any case be applied directly to real-world systems. The dispatch heuristics of CARDINAL-Vanilla would be attractive to apply if the multi-agent system architecture's memory overhead could be reduced and modelled as approximate mathematical equations. A comprehensive theoretical DSHSS architecture (Ch.9), addressing the cyber attack problems identified (Ch.2) using a reinforcement classification pipeline, role switching and additional hardware integrated into industrial networks, is proposed as the future theme for DSHSS.
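As a purely hypothetical illustration of an 'immunisation rate' style score (the thesis's evaluation framework in Ch.6 may define it differently; the hosts, healing times and time budget below are invented):

```python
# Hypothetical sketch of an immunisation-rate metric: the fraction of networked
# hosts that have self-healed (received curative security information) within a
# time budget after an attack is first detected. Values are invented and do not
# come from the thesis's experiments.
heal_times = {  # host -> seconds from first detection to self-heal (None = never)
    "hmi-01": 4.0, "plc-gw-02": 9.5, "historian": 21.0,
    "eng-ws-01": None, "scada-srv": 12.5,
}

def immunisation_rate(times, budget_s=30.0):
    healed = sum(1 for t in times.values() if t is not None and t <= budget_s)
    return healed / len(times)

print(f"immunisation rate within 30 s: {immunisation_rate(heal_times):.0%}")
```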
9. Suppliers versus lead users: examining collaboration in mass customisation
Al-Zu’b, Zu’bi Mohammad (January 2008)
Mass customisation has been hailed as the manufacturing paradigm of the future, and has accordingly received much academic interest. Nevertheless, in the light of the number of reported failures of mass customisation ventures, it is important to gain a better understanding of the ways in which mass customisation performance may be enhanced. This thesis explores the use of collaboration in product development processes as a means of increasing mass customisation operational performance. The two collaborative partners of interest are suppliers and lead users, a specialised subset of users. The effects of lead users on the product development processes of mass customisation have not previously been evaluated, nor has their value been compared with that of suppliers. Accordingly, the aim of this study is to investigate the relative effects of collaborating with suppliers and lead users in the product development processes on mass customisation. This is achieved by measuring mass customisation operational performance in terms of four attributes derived from the literature: development cost, development time, customer influence and product scope. Hierarchical regression analysis of survey data collected from two hundred and fifty-one UK consumer products manufacturers revealed a significant positive relationship between lead user collaboration and all four mass customisation operational performance attributes, while supplier collaboration was found to positively affect three of the four attributes, the exception being customer influence. In addition, the analysis revealed that lead user collaboration had a greater effect on operational performance than supplier collaboration. These results give scholars and manufacturers a valuable indication of the importance of lead users in the product development processes of mass customisation.
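The hierarchical regression logic can be sketched in Python with statsmodels. The variable names, the single control variable and the synthetic data below are assumptions for illustration, not the study's survey instrument; only the sample size comes from the abstract.

```python
# Illustrative sketch of hierarchical (block-entry) regression: enter control
# variables first, then add the collaboration predictors and inspect the
# increment in explained variance. Synthetic data; names are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 251                                   # sample size reported in the abstract
firm_size = rng.normal(0, 1, n)           # control block (hypothetical)
supplier_collab = rng.normal(0, 1, n)     # predictor block
lead_user_collab = rng.normal(0, 1, n)
# Hypothetical outcome: one operational performance attribute.
performance = (0.2 * firm_size + 0.3 * supplier_collab
               + 0.5 * lead_user_collab + rng.normal(0, 1, n))

step1 = sm.OLS(performance, sm.add_constant(np.column_stack([firm_size]))).fit()
X2 = sm.add_constant(np.column_stack([firm_size, supplier_collab, lead_user_collab]))
step2 = sm.OLS(performance, X2).fit()
print(f"R2 step 1 (controls only):  {step1.rsquared:.3f}")
print(f"R2 step 2 (+collaboration): {step2.rsquared:.3f}")
print(f"Delta R2 from collaboration: {step2.rsquared - step1.rsquared:.3f}")
```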
10. Analysis of manufacturing operations using knowledge-enriched aggregate process planning
Bramall, David Graham (January 2006)
Knowledge-Enriched Aggregate Process Planning is concerned with the problem of supporting agile design and manufacture by making process planning feedback integral to the design function. A novel Digital Enterprise Technology framework (Maropoulos 2003) provides the technical context and is the basis for integrating the methods with existing technologies for enterprise-wide product development. The work is based on the assertion that, to assure success when developing new products, the technical and qualitative evaluation of process plans must be carried out as early as possible. An intelligent exploration methodology is presented for the technical evaluation of the many alternative manufacturing options that are feasible during the conceptual and embodiment design phases. 'Data resistant' aggregate product, process and resource models are the foundation of these planning methods; from the low-level attributes of these models, aggregate methods were created to generate suitable alternative process plans and to estimate Quality, Cost and Delivery (QCD). Reliance on QCD metrics alone, however, neglects the tacit knowledge that people use to make everyday decisions and to express professional judgement in design. Hence, the research also advances the core aggregate planning theories by developing knowledge-enrichment methods for measuring and analysing qualitative factors as an additional indicator of manufacturing performance, which can be used to compute the potential of a process plan. Applying these methods allows the designer to make a comparative estimation of manufacturability for design alternatives. Ultimately, this research should translate into significant reductions in both design costs and product development time, and create synergy between the product design and the manufacturing system that will be used to make it. The efficacy of the methodology was proved through the development of an experimental computer system (called CAPABLE Space), which used real industrial data from a leading UK satellite manufacturer to validate the industrial benefits and promote the commercial exploitation of the research.
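One way to picture the 'potential' computation is as a weighted blend of the QCD estimates with the elicited qualitative factor. The Python sketch below is a hypothetical illustration only; the candidate plans, weights and scores are invented, not CAPABLE Space's actual scheme.

```python
# Hypothetical sketch: rank alternative aggregate process plans by a combined
# score of quantitative QCD estimates and a qualitative knowledge factor, in
# the spirit of the "plan potential" idea. All plans and values are invented.
from dataclasses import dataclass

@dataclass
class ProcessPlan:
    name: str
    quality: float    # estimated conformance, 0..1 (higher is better)
    cost: float       # relative cost, 0..1 (lower is better)
    delivery: float   # relative lead time, 0..1 (lower is better)
    knowledge: float  # elicited qualitative judgement, 0..1 (higher is better)

def potential(p: ProcessPlan, w=(0.3, 0.25, 0.25, 0.2)) -> float:
    """Weighted score; cost and delivery are inverted so higher is better."""
    wq, wc, wd, wk = w
    return wq * p.quality + wc * (1 - p.cost) + wd * (1 - p.delivery) + wk * p.knowledge

plans = [
    ProcessPlan("machine-from-solid", quality=0.9, cost=0.7, delivery=0.6, knowledge=0.8),
    ProcessPlan("cast-then-finish",   quality=0.8, cost=0.4, delivery=0.5, knowledge=0.6),
    ProcessPlan("fabricate-weldment", quality=0.7, cost=0.3, delivery=0.4, knowledge=0.5),
]
for p in sorted(plans, key=potential, reverse=True):
    print(f"{p.name:20s} potential = {potential(p):.2f}")
```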