41 |
In-mold coating of thermoplastic and composite parts: microfluidics and rheology
Aramphongphun, Chuckaphun, 13 March 2006 (has links)
No description available.
|
42 |
Investigating the Process-Structure-Property Relationships in Vat Photopolymerization to Enable Fabrication of Performance Polymers
Meenakshisundaram, Viswanath, 07 January 2021 (has links)
The use of vat photopolymerization (VP) in large-scale industrial manufacturing is limited by poor scalability and a limited catalogue of engineering polymers. The scalability challenge stems from an inherent process paradox: feature resolution, part size, and manufacturing throughput cannot be maximized simultaneously in standard VP platforms. In addition, VP's inability to process viscous, high-molecular-weight engineering polymers restricts the VP materials catalogue. To address these limitations, the research presented in this work was conducted in two stages: (1) development and modeling of new VP platforms to address the scalability and viscosity challenges, and (2) investigation of the influence of the new processes on the cured polymer network structure and mechanical properties.
First, a scanning mask projection vat photopolymerization (S-MPVP) system was developed to address the scalability limitations of VP systems. The process paradox was resolved by scanning the mask projection device across the resin surface while simultaneously projecting the layer as a movie. Using the actual projected pixel irradiance distribution, a process model was developed to capture the interaction between the projected pixels and the resin and to predict the resulting cure profile with an error of 2.9%. The S-MPVP model was then extended to heterogeneous, UV-scattering resins (i.e., UV-curable polymer colloids). Using computer vision, the scattering of incident UV radiation on the resin surface was captured and used to predict scattering-compensated printing parameters (bitmap pattern, exposure time, scanning speed). The resulting reverse-curing model was used to successfully fabricate complex features from photocurable SBR latex with XY errors < 1.3%.
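The abstract does not reproduce the S-MPVP process model itself; as a hedged illustration of the kind of exposure-to-cure-depth calculation such a model rests on, the Python sketch below evaluates the classical Jacobs working-curve relation Cd = Dp ln(E/Ec) with hypothetical resin parameters; the irradiance and dwell-time values are invented.

import math

def cure_depth(exposure_mj_cm2, dp_um=150.0, ec_mj_cm2=10.0):
    # Jacobs working curve: cured depth grows with the log of areal exposure.
    if exposure_mj_cm2 <= ec_mj_cm2:
        return 0.0  # below the gelation threshold no solid forms
    return dp_um * math.log(exposure_mj_cm2 / ec_mj_cm2)

# Exposure delivered by a scanned projector pixel: irradiance times dwell time.
irradiance_mw_cm2 = 5.0   # hypothetical pixel irradiance
dwell_time_s = 4.0        # set by scan speed and projected-movie frame rate
E = irradiance_mw_cm2 * dwell_time_s
print(f"cure depth ~ {cure_depth(E):.0f} um for E = {E:.0f} mJ/cm^2")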
To address the low manufacturing throughput of VP systems, a recoat-less, volumetric-curing VP system, Free-surface movie mask projection (FreeMMaP), was developed; it fabricates parts by continuously irradiating the resin surface with a movie composed of different gray-scaled bitmap images. The effect of cumulative exposure on the cure profile (X, Y, and Z dimensions) was investigated and used to develop an iterative gray-scaling algorithm that generates a combination of gray-scaled bitmap images and exposure times resulting in accurate volumetric curing (errors in the XY plane and Z axis below 5% and 3%, respectively). Results of this work demonstrate that eliminating the recoating step increased manufacturing speed by a factor of 8.05 and enabled high-resolution fabrication with highly viscous resins and soft gels.
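The published gray-scaling algorithm is not detailed in this abstract; the following Python sketch is only a rough illustration of an iterative gray-scale correction loop of that flavor, using a placeholder Jacobs-type cure-depth predictor, an invented proportional gain, and made-up target depths.

import numpy as np

def predict_cure_depth(gray, exposure_time, dp=150.0, ec=10.0, peak_irradiance=5.0):
    # Cumulative exposure scales with the pixel gray level (0..1).
    exposure = peak_irradiance * exposure_time * gray
    safe = np.maximum(exposure, ec)                    # avoid log(0) below threshold
    return np.where(exposure > ec, dp * np.log(safe / ec), 0.0)

def grayscale_correction(target_depth, exposure_time, n_iter=200, gain=3e-4):
    gray = np.full_like(target_depth, 0.5)             # start from mid-gray
    for _ in range(n_iter):
        error = target_depth - predict_cure_depth(gray, exposure_time)
        gray = np.clip(gray + gain * error, 0.0, 1.0)  # proportional correction
    return gray

target = np.array([[0.0, 100.0], [200.0, 300.0]])      # desired cure depths (um)
print(np.round(grayscale_correction(target, exposure_time=60.0), 2))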
Then, highly viscous resins were made processable in VP systems by using elevated processing temperatures to lower resin viscosity. New characterization techniques were developed to determine the threshold printing temperature and time that prevent the onset of thermally induced polymerization. The effect of printing temperature on curing, cured polymer structure, cured polymer mechanical properties, and printable aspect ratio was also investigated using diacrylate and dimethacrylate resins. The results revealed that increasing the printing temperature improved crosslink density, tensile strength, and printability. However, the presence of hydroxyl groups on the resin backbone caused deterioration of crosslink density, mechanical properties, and curing behavior at elevated printing temperatures.
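As a hedged illustration of why heating helps (hypothetical parameters, not the resins studied here), the Python sketch below evaluates an Arrhenius-type viscosity-temperature law, eta(T) = A exp(Ea/(R T)), to show how far a viscous resin must be heated to fall below a printable-viscosity ceiling of a few Pa.s.

import math

R = 8.314          # J/(mol K)
Ea = 45_000.0      # hypothetical flow activation energy, J/mol
eta_ref = 30.0     # hypothetical viscosity at 25 C, Pa.s
T_ref = 298.15

A = eta_ref / math.exp(Ea / (R * T_ref))   # pre-exponential factor from the reference point

def viscosity(t_celsius):
    T = t_celsius + 273.15
    return A * math.exp(Ea / (R * T))

for t in (25, 50, 75, 100):
    print(f"{t:>3d} C -> {viscosity(t):6.2f} Pa.s")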
Finally, the lack of a systematic, constraint-based approach to resin design was addressed by using the results of the earlier process-structure-property explorations to create an intuitive framework for resin screening and design. Key screening parameters (such as UV absorptivity and plateau storage modulus) and design parameters (such as photoinitiator, polymer, and UV-blocker concentrations) were identified, and methods to optimize them to meet the desired printability metrics were demonstrated through case studies.
Most work in vat photopolymerization deals with either materials development or process development and modeling. This dissertation sits at the intersection of process development and materials development, giving it a unique perspective for exploring the interdependency of machine and material. The process models, machines, and techniques used in this work to make a material printable will serve as a guide for chemists and engineers working on the next generation of vat photopolymerization machines and materials. / Doctor of Philosophy / Vat photopolymerization (VP) is a polymer-based additive manufacturing platform that uses UV light to cure a photosensitive polymer into the desired shape. While parts fabricated via VP exhibit excellent surface finish and high feature resolution, their use in commercial manufacturing is limited by poor scalability for large-scale production and a limited selection of engineering materials. This work focuses on the development of new VP platforms and process models and on the investigation of process-structure-property relationships to mitigate these limitations and enable the fabrication of performance polymers.
The first section of the dissertation presents the development of two new VP platforms to address the limitations in scalability. The Scanning Mask Projection Vat Photopolymerization (S-MPVP) platform was developed to fabricate large-area parts with high-resolution features, and the Free-surface movie mask projection (FreeMMaP) VP platform was developed to enable high-speed, recoat-less, volumetric fabrication of 3D objects. Computer-vision-based models were developed to investigate the influence of these new processes on the resultant cure shape and dimensional accuracy. Process models were designed and demonstrated that can: (1) predict the cure profile for given input printing parameters (error < 3%), (2) predict the printing parameters (exposure time, bitmap gray-scaling) required for accurate part fabrication in homogeneous and UV-scattering resins, and (3) generate gray-scaled bitmap images that induce volumetric curing inside the resin (dimensional accuracy of 97% in the Z axis and 95% in the XY plane).
The second portion of this work presents the use of high-temperature VP to enable processing of high-viscosity resins and expansion of the materials catalogue. New methods to characterize a resin's thermal stability are developed, and techniques to determine the printing temperature and time that prevent the onset of thermally induced polymerization are demonstrated. Parts were fabricated at different printing temperatures, and the influence of printing temperature on the resultant mechanical properties and polymer network structure was studied. The results indicate that elevated printing temperatures can be used to alter the final mechanical properties of the printed part and to improve the printability of high-resolution, slender features.
Finally, the results of the process-structure-property investigations conducted in this work
were used to guide the development of a resin design framework that highlights the parameters,
metrics, and methods required to (1) identify printable resin formulations, and (2)
tune printable formulations for optimal photocuring. Elements of this framework were then
combined into an intuitive flowchart to serve as a design tool for chemists and engineers.
|
43 |
Business Process Modeling: Process Events and States / Modelování business procesů: události a stavy procesu
Svatoš, Oleg, January 2005 (has links)
This thesis focuses on the modeling of business processes, which is very sensitive to the correct capturing of process details characterized as process events and states. At the beginning of the analysis, process events and states are classified into three types: activity-related, object-related, and time-related. Each type is analyzed in detail, and the states and transitions that form the lifecycle of each type of process state are formulated. Contemporary process modeling languages, from the very popular to the relatively less known, are then discussed, each representing a slightly different approach to process modeling. The analysis of process events and states shows that these languages cover the defined lifecycles only partially. Three popular process modeling languages are put through a test case based on the Czech regulation of the building process, which allows a review of their ability to capture process events, including how they cope with their only partial support. Based on the analysis of process events and states and on the unsatisfactory results of the contemporary process modeling languages in the test case, a new process modeling language is introduced which, as demonstrated, captures many of the process events and states in the test case in a much simpler and more precise way than the three reviewed languages.
|
44 |
Process analysis of department of management and real estate management of supermarket company / Procesní analýza oddělení řízení a správy nemovitostí obchodního řetězce
Vozáb, Václav, January 2012 (has links)
This thesis deals with the realization of a process analysis of the department of management and real estate management of a supermarket company. The goal of this work, addressed in the practical part, is to map the processes in the form of business process diagrams with descriptions, identify potential inconsistencies, and, where appropriate, recommend a revision of the current documents. The theoretical part explains the concepts of process management, process, process analysis, and the procedure for process mapping. Basic methods and standards for process modeling are also described and evaluated.
|
45 |
Standardy modelování a řízení podnikových procesů / Business Process Modeling Standards
Klička, Lukáš, January 2010 (has links)
Business process modeling plays an important role in the documentation and analysis of organizational processes and in the specification of requirements for information systems. Currently, there are many standards for process modeling, each representing a different approach to modeling and providing different possibilities for creating models. Thus, the same model created in different modeling languages differs in expressiveness depending on the standard. The goal of this thesis is to evaluate the effectiveness of selected business process modeling standards. The effectiveness of the standards is evaluated on two levels: first, in terms of their semantics, and second, in terms of their syntax. The semantic evaluation is based on comparing the standards with a business process metamodel created within the Opensoul project; this model defines a set of basic elements and their associations at a basic level of process modeling. The syntactic evaluation is based on comparing the standards with a framework proposed by Moody and Hillegersberg. The results of this comparison should show which standard best supports the defined criteria. The evaluation framework is applied to the following process modeling standards: BPMN, EPC, and IDEF3. Based on the results obtained, their strengths and weaknesses are discussed. The main contribution of this thesis lies both in the analysis of the element sets of the selected process modeling standards and in the comparison of the standards according to the evaluation framework.
|
46 |
Hodnocení nástrojů pro procesní modelování / Evaluation of tools for process modeling
Engeová, Andrea, January 2017 (has links)
This diploma thesis deals with the evaluation of tools used for process modeling. The aim is to provide a method for selecting the most appropriate tool for given requirements. This goal is achieved through several subgoals, which include an analysis of process modeling tools, the design of selection criteria, the calculation of their weights, the design and construction of a tool-selection method in Excel, and the creation of case studies that demonstrate the principles of tool selection. The beginning of the thesis characterizes the area of process management, its history, current trends, and related definitions. The next chapter describes the notations used for process modeling. The largest part of that chapter is devoted to Business Process Model and Notation (BPMN), describing all of its elements together with their graphical representation; the remainder covers the event-driven process chain (EPC) notation and the Unified Modeling Language (UML). The fifth chapter is dedicated to a description of tools for process modeling and includes a detailed characterization of each tool. Based on the individual tools and the literature, the following chapter proposes the criteria by which the tools are evaluated; the weights of the proposed criteria are calculated using the Fuller method. The chapter concludes with an evaluation of the previously described tools and a description of how the tool-selection method was built in Excel. The work ends with case studies that illustrate the procedure for selecting a tool.
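As an illustration of the Fuller (pairwise-comparison) weighting mentioned above, the Python sketch below derives criteria weights from an invented preference table; the criteria names and preferences are hypothetical, and one win is added to every criterion so that none ends up with zero weight.

from itertools import combinations

criteria = ["price", "BPMN support", "process repository", "simulation"]

# For each pair, record which criterion the evaluator prefers.
preferred = {
    ("price", "BPMN support"): "BPMN support",
    ("price", "process repository"): "price",
    ("price", "simulation"): "simulation",
    ("BPMN support", "process repository"): "BPMN support",
    ("BPMN support", "simulation"): "BPMN support",
    ("process repository", "simulation"): "simulation",
}

wins = {c: 1 for c in criteria}            # +1 avoids zero weights
for pair in combinations(criteria, 2):
    wins[preferred[pair]] += 1

total = sum(wins.values())
for c in criteria:
    print(f"{c:>20s}: weight = {wins[c] / total:.2f}")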
|
47 |
Evaluating IT investments: a business process simulation approach
Silva Molina, Enrique, January 2003 (has links)
Information technology (IT) is becoming the primary factor determining the survival of most organizations. The different types of systems and the wide range of objectives suggest that diverse evaluation methods are needed. There is a critical need for a new approach to managing IT investments, and solving the information paradox should be a business imperative for all managers today.
Evaluating IT investments introduces different types of problems that investment in traditional assets does not consider. The focal point shifts from measuring hard, quantifiable benefits that appear on a firm's income statement to measuring soft, diffuse, and qualitative impact.
The decision to acquire new information technology poses a number of serious evaluation problems for managers because the available methods for IT investment evaluation are mostly static and do not consider dynamism in decision-making processes.
Common problems with the methods for evaluating IT investments are related to their inability to take account of the full range of potential benefits. There is a gap between theory and practice in relation to the use of any method for making decisions and for continuous evaluation of IT investments.
This thesis presents a new approach to evaluating the benefits of IT investments in a dynamic way, consisting of a combination of dynamic information workflow models and business process simulation techniques. The proposed approach gives managers and organizations the possibility of implementing other models for measuring different metrics and aspects of IT investments.
A dynamic information workflow model of an electric utility and simulation experiments are presented in order to show how the proposed approach is applied. The performance measure selected for the experiments was efficiency, characterized by the following performance indicators: cycle time, resource utilization, and activity costs. Empirical data was collected from case studies of different utilities in Central American countries.
Key words: Business Process Modeling and Simulation, Evaluating IT Investments, Dynamic Information Workflow Model, Electric Utilities.
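As a hedged illustration of the kind of business process simulation experiment described above (not the author's utility workflow model), the Python sketch below simulates one clerical activity served by a single resource and reports the three performance indicators named in the abstract; the arrival rate, service time, and cost figure are invented.

import random

random.seed(42)
n_cases, mean_arrival_h, mean_service_h, cost_per_hour = 500, 1.0, 0.8, 35.0

clock_free = 0.0      # time at which the resource next becomes free
busy_time = 0.0
cycle_times = []
arrival = 0.0
for _ in range(n_cases):
    arrival += random.expovariate(1.0 / mean_arrival_h)   # next case arrives
    start = max(arrival, clock_free)                      # queue if resource is busy
    service = random.expovariate(1.0 / mean_service_h)
    clock_free = start + service
    busy_time += service
    cycle_times.append(clock_free - arrival)              # waiting plus processing

print(f"mean cycle time     : {sum(cycle_times) / n_cases:.2f} h")
print(f"resource utilization: {busy_time / clock_free:.0%}")
print(f"activity cost/case  : {cost_per_hour * busy_time / n_cases:.2f}")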
|
48 |
Particle generation for geometallurgical process modeling
Koch, Pierre-Henri, January 2017 (has links)
A geometallurgical model is the combination of a spatial model representing an ore deposit and a process model representing the comminution and concentration steps in beneficiation. The process model itself usually consists of several unit models. Each of these unit models operates at a given level of detail in material characterization, from bulk chemical elements, elements by size, bulk minerals, and minerals by size down to the liberation level, which introduces particles as the basic entity for simulation (Paper 1). In current state-of-the-art process simulation, few unit models are defined at the particle level because such models are complex to design at a more fundamental level of detail, liberation data is hard to measure accurately, and large computational power is required to process the many particles in a flow sheet. The computational cost is a consequence of the intrinsic complexity of the unit models, while mineral liberation data depends on the quality of the sampling and polishing, the settings and stability of the instrument, and the processing of the data. This study introduces new tools to simulate a population of mineral particles based on intrinsic characteristics of the feed ore. Features are extracted at the meso-textural level (drill cores) (Paper 2) and related to the micro-textures before and after breakage (Paper 3). The result is a population of mineral particles stored in a file format that can be imported into process simulation software. The results show that the approach is relevant and can be generalized towards new characterization methods. The theory of image representation, analysis, and ore texture simulation is briefly introduced and linked to 1-point, 2-point, and multiple-point methods from spatial statistics. A breakage mechanism is presented as a cellular automaton. Experimental data and examples are taken from a copper-gold deposit with a chalcopyrite flotation circuit and an iron ore deposit with a magnetic separation process. This study is part of a larger research program, PREP (Primary resource efficiency by enhanced prediction). / PREP
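As a toy illustration of a cellular-automaton breakage mechanism (not the model developed in the thesis), the Python sketch below lets a crack advance preferentially along the phase boundaries of a synthetic two-mineral texture and then labels the surviving fragments as progeny particles; the texture, rule, and probabilities are invented.

import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
# Synthetic two-phase meso-texture: smoothed noise thresholded into grains.
texture = (ndimage.gaussian_filter(rng.random((60, 60)), 3) > 0.5).astype(int)

# A cell lies on a phase boundary if any 4-neighbour carries the other phase.
pad = np.pad(texture, 1, mode="edge")
boundary = np.zeros_like(texture, dtype=bool)
for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
    boundary |= pad[1 + dy:61 + dy, 1 + dx:61 + dx] != texture

# CA rule: a cracked cell cracks each uncracked neighbour with a probability
# that is higher on a phase boundary than inside a grain.
cracked = np.zeros_like(texture, dtype=bool)
cracked[0, :] = True                        # crack initiates along the top edge
p_boundary, p_bulk = 0.8, 0.15
for _ in range(40):                         # CA time steps
    front = ndimage.binary_dilation(cracked) & ~cracked
    p = np.where(boundary, p_boundary, p_bulk)
    cracked |= front & (rng.random(texture.shape) < p)

# Intact material split by the cracks becomes the progeny particles.
particles, n = ndimage.label(~cracked)
sizes = np.bincount(particles.ravel())[1:]
print(f"{n} particles, mean size {sizes.mean():.1f} cells")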
|
49 |
[en] INFORMATION QUALITY IMPROVEMENT IN PETROBRAS MARITIME TRANSPORT: ANALYSIS AND PROPOSALS / [pt] MELHORIA DA QUALIDADE DA INFORMAÇÃO NO TRANSPORTE MARÍTIMO DA PETROBRAS: ANÁLISE E PROPOSIÇÕES
EDUARDO LADEIRA AVILA, 22 October 2008 (has links)
[pt] For a company such as Petrobras, maritime transport is of great importance in its logistics operations supplying the national and international markets. The information generated by this mode is used in the coordination of planning and the control of operations, hence the need to guarantee its quality. This work presents a practical application of the concepts of Process Engineering and of the Thinking Process of the Theory of Constraints through a case study focused on the information flow in maritime transport. The objective is to present what to change, what to change to, and how to change the reality of this company and improve the performance of its processes. / [en] For a company like Petrobras, maritime transportation is of great importance in its supply logistics operations for the national and international markets. The information generated by this mode is used in planning coordination and operations control, hence the need to guarantee its quality. The work presents a practical application of the concepts of Process Engineering and of the Theory of Constraints Thinking Process through a case study focused on the information flow of maritime transport. The objective is to present what to change, what to change to, and how to change the reality of this company and to improve the performance of its processes.
|
50 |
Modelagem matemática de um processo industrial de produção de cloro e soda por eletrólise de salmoura visando sua otimização. / Mathematical modeling of an industrial process for chlorine and caustic manufacturing using brine electrolysis aiming at its optimization.
De Jardin Júnior, Roberto Nicolas, 14 September 2006 (has links)
The present work involves the development of a mathematical model of an industrial process for producing chlorine and caustic soda from brine, aiming at its optimization in terms of production efficiency and the costs of electrical energy and steam consumption. The study covered two process steps: electrolysis and concentration of NaOH liquor by evaporation. For the electrolysis unit, no phenomenological models suitable for simulating the process were found in the literature; for this reason, empirical models based on three-layer feedforward neural networks were developed from industrial operating data. For the evaporation unit, an energy balance suitable for estimating steam consumption was developed; however, due to the lack of models for predicting the equilibrium relations of the system, the phenomenological model was replaced by a three-layer feedforward neural network model for this unit as well. To fit the models, a database was assembled from operating data of the Carbocloro S.A. Indústrias Químicas plant, located in Cubatão-SP, analyzed by means of multivariate statistical techniques in order to detect and eliminate gross errors and outliers and to identify correlations between variables and different operating regimes of the chlor-alkali plant. The models fitted for the different electrolysis cell circuits, as well as for the evaporation step, showed good agreement with the operating data, which made it possible to use them to simulate the operation of the electrolytic cell and evaporation units in the industrial diaphragm-cell chlor-alkali process. The neural-network-based mathematical model was used in process optimization studies to maximize the financial gain of the industrial unit for a given operating condition. / The present work involves the development of a mathematical model of an industrial chlorine and sodium hydroxide production plant, aiming at the optimization of production efficiency and of the costs of electrical energy and steam consumption. Two process steps were considered in the study: electrolysis and NaOH-liquor concentration by evaporation. Since no adequate models are reported in the literature for simulating electrolysis-based processes like the one considered, empirical models for the different types of electrolysis cells were developed by fitting neural networks to data from industrial operation; in this case, feedforward neural networks containing three neuron layers were fitted to the data. The raw data obtained from industrial operation at the Carbocloro plant, in Cubatão-SP, were first treated by means of multivariate statistical techniques with the purpose of detecting and eliminating data containing gross errors and outliers, as well as identifying correlations among variables and different operational regimes of the industrial plant. Although material and energy balances for the evaporation step were initially adopted, this approach could not be used in simulations due to the lack of valid models to predict liquid-vapor equilibria for the specific system; thus, a neural network model was also fitted to data from operation of the evaporation step. Fitting of the neural network models resulted in good agreement between model predictions and measured values of the output variables, which enabled their use in simulation studies of the electrolysis and evaporation process steps. The neural-network-based mathematical model was utilized in process optimization studies aiming at the best financial gain under given operational conditions.
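As a hedged illustration of the three-layer feedforward models described in both abstracts (synthetic data and invented variable names, not the Carbocloro data), the Python sketch below trains such a network, one hidden tanh layer between inputs and a linear output, by plain gradient descent on a made-up mapping from operating conditions to a plant response.

import numpy as np

rng = np.random.default_rng(1)
X = rng.random((200, 3))                       # normalised operating variables
y = (1.5 * X[:, :1] + 0.5 * X[:, 1:2] ** 2 - 0.3 * X[:, 2:3]
     + 0.02 * rng.standard_normal((200, 1)))   # synthetic plant response

n_hidden, lr = 8, 0.1
W1 = rng.standard_normal((3, n_hidden)) * 0.5; b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, 1)) * 0.5; b2 = np.zeros(1)

for epoch in range(3000):
    H = np.tanh(X @ W1 + b1)                   # hidden layer
    y_hat = H @ W2 + b2                        # linear output layer
    err = y_hat - y
    # Backpropagate the mean-squared-error gradient.
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print("training MSE:", float((err ** 2).mean()))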
|