Spelling suggestions: "subject:"stressprocess design"" "subject:"dprofprocess design""
41. Processing of expandable thermoplastic/thermoset syntactic foam. Hong, Yifeng. 21 September 2015.
While hollow glass microspheres are commonly used in syntactic foam, their abrasive and brittle nature usually results in poor processability and degrades foam performance. A number of attempts have therefore been made in industry to replace hollow glass microspheres with polymeric foamed microspheres. Among the many candidates, syntactic foam filled with expandable thermoplastic (ETP) microspheres has shown high potential to become a novel class of engineering materials, especially for lightweight structural applications. However, conventional processing techniques for syntactic foam suffer from difficulties such as high processing viscosity, low loading of foam fillers, and ineffective microsphere expansion.
To address these issues, this thesis work developed a microwave expansion process for producing thermoset-matrix syntactic foam containing thermoplastic foam beads. In this process, unexpanded ETP microspheres are foamed directly in the uncured thermoset matrix via microwave heating. Expandable polystyrene (EPS) microspheres and epoxy resin were chosen as a model material system. The resin viscosity and the specific microwave energy were found to be the two primary control parameters determining the process window. Mechanical characterization showed that the syntactic foam can outperform the neat polymer in lightweight structural applications and was effectively toughened by the foamed EPS.
Furthermore, the microwave expansion process was found to be capable of molding syntactic foam parts of relatively sophisticated geometry with smooth surfaces.
In order to broaden its impact, the microwave expansion process was extended to produce composite EPS foam. This process converts an expandable suspension into a composite foam with a honeycomb-like barrier structure. The suspension viscosity was found to strongly influence the foam morphology. Mechanical tests showed that the barrier structure can considerably improve the mechanical performance of the composite foam. Fire-retardation tests demonstrated that the barrier structure can effectively block the fire path into the foam, suppress toxic smoke generation, and maintain the structural integrity of the foam.
To optimize the microwave expansion process, a general formulation was developed to model the EPS expansion. A semi-analytical solution was first obtained for the case of a single bubble expanding in an infinite matrix. The dimensionless bubble radius and pressure are defined and found to be exponential functions of the dimensionless expansion time. The semi-analytical solution qualitatively predicts the radial expansion of an EPS microsphere observed in a real-time experiment. For quantitative accuracy, a numerical solution was obtained for a model that couples the nucleation and expansion of multiple bubbles in a finite matrix; the results show that the numerical solution quantitatively predicts the radial expansion of EPS. A parameter sensitivity study was performed to examine the effect of each parameter on the expansion process.
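As a rough illustration (a standard form from the bubble-growth literature, not necessarily the exact formulation used in this work), viscosity-dominated growth of a single gas bubble in an infinite Newtonian matrix is often modeled as below, and it yields the exponential behavior noted above:

```latex
% Single bubble of radius R growing in an infinite Newtonian matrix of
% viscosity \eta, driven by gas pressure p_g against ambient pressure
% p_\infty and surface tension \sigma (illustrative, standard form):
\[
  \frac{dR}{dt} \;=\; \frac{R}{4\eta}\left(p_g - p_\infty - \frac{2\sigma}{R}\right)
\]
% With dimensionless radius \bar{R} = R/R_0 and dimensionless time
% \bar{t} = (p_g - p_\infty)\,t/(4\eta), neglecting the surface-tension
% term gives exponential growth of the bubble radius:
\[
  \bar{R}(\bar{t}) \;\approx\; e^{\,\bar{t}}
\]
```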
42. Advancing Water Management through Methods to Assess Environmental Flow Needs and Improve Stakeholder Engagement. Mott Lacroix, Kelly E. January 2015.
Adequate water for ecosystems and humans is at a premium as the global population increases and the climate changes. Coping with these impacts requires tools to improve water governance and water management through legal or policy mechanisms. Water governance generates laws, policies, and rulings; water management implements them through management decisions. A key concern of water governance is balancing human and ecosystem water needs. Effective governance that promotes sustainable use of water resources to maintain ecosystem integrity is challenging: many regions do not have sufficient resources for water management, or water for ecosystems is not protected under traditional legal mechanisms. The challenge of improving water governance for ecosystems is therefore twofold. First, there is a need to provide resources that build the capacity of water managers to allocate water to ecosystems. Second, mechanisms are required to translate environmental flow needs into policy or practice. This research provides methods to advance water management by: 1) assessing environmental flow needs through creation of a geospatial database and 2) improving stakeholder engagement through lessons learned from three multi-year stakeholder engagement processes. Appendix A describes the current understanding of the link between hydrology and riparian and aquatic ecosystems in Arizona through a synthesis of environmental flow needs. The synthesized information, stored in a geospatial database, can be used by water managers to determine the water needed to maintain riparian and aquatic habitats. A review of 121 studies reveals very few analyses of surface water and groundwater requirements for intermittent or ephemeral river systems, and only limited generalizable data for aquatic species. The database can be used to identify critical geographic and topical knowledge gaps, and it serves as a single place for water and land managers to assess and use the most current research to inform management decisions. Appendix B provides an empirical example of engagement to promote social learning as a way to preserve water for the environment when the law does not protect environmental flows. Through 43 focus groups with 226 individuals representing a diversity of interests, we determined that there was common ground on concerns about water conservation, cooperation, financial incentives, and multiple benefits of water use. Through this engagement process, we found that identifying and then building common ground requires attention to details, including the process of analyzing qualitative data and the methods for displaying complex information, which are not frequently discussed in the social learning or stakeholder engagement literature. Appendix C presents a framework for designing effective stakeholder engagement based on the experiences of the University of Arizona Water Resources Research Center in three separate processes that engaged water experts. The proposed framework provides an iterative and flexible approach centered on a bridging organization that can bring people together and keep the engagement process moving forward. As illustrated through an evaluation of the three projects, the framework provides for inclusivity, interactivity, and flexibility in engagement through guidance by a steering committee and iterative examination of the water resource management problem.
While further assessment is necessary, the framework appears general enough to apply across projects at three different scales and with three separate sets of goals, yet detailed enough to provide a tangible approach that could aid other processes whose goal is implementing and evaluating expert engagement to solve complex problems and promote social learning. Previous studies on water governance have focused predominantly on identifying current problems with governance; however, because humans play an important role in shaping the global water cycle, the time has come to focus on solutions. Furthering water management solutions requires a better understanding of the tools needed to manage water for ecosystems and of effective methods for co-producing knowledge and encouraging social learning. This research provides a regional example of approaches to advance water management, using a tool to assess environmental flow needs and frameworks for promoting common ground and social learning in stakeholder engagement.
43. A Logic-Based Methodology for Business Process Analysis and Design: Linking Business Policies to Workflow Models. Wang, Jiannan. January 2006.
Today, organizations often need to modify their business processes to cope with changes in the environment, such as mergers and acquisitions, new government regulations, and new customer demands. Most organizations also have a set of business policies defining the way they conduct their business. Although there has been extensive research on process analysis and design, how to systematically extract workflow models from business policies has not been studied, leaving a missing link between the specification of business policies and the modeling of business processes. Given that process changes are often determined by executives and managers at the policy level, this missing link often leads to inefficient and inaccurate implementation of process changes by business analysts and process designers. We refer to this problem as the policy mismatch problem in business process management. For organizations with large-scale business processes and a large number of business policies, solving the policy mismatch problem is difficult and challenging. In this dissertation, we attempt to provide a formal link between business policies and workflow models by proposing a logic-based methodology for process analysis and design. In particular, we first propose a Policy-driven Process Design (PPD) methodology that formalizes the procedure of extracting workflow models from business policies: narrative process policies are parsed into precise information on the various workflow components, and a set of process design rules and algorithms generates workflow models from that information. We also develop a logic-based process modeling language named Unified Predicate Language (UPL). UPL can represent all workflow components in a single logic format and provides analytical capability via logic inference and query. We demonstrate UPL's expressive power and analytical ability by applying it to process design and process change analysis; in particular, we use UPL to define and classify process change anomalies and develop algorithms to verify and enforce process consistency. The Policy-driven Process Design methodology, Unified Predicate Language, and process change analysis approach contribute to business process management research by providing a formal methodology for resolving the policy mismatch problem.
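As a loose, hypothetical illustration of the logic-based idea (this is not UPL's actual syntax; the task names and the policy are invented), a workflow can be encoded as precedence facts and checked against a policy by simple inference:

```python
# Hypothetical workflow encoded as logic-style facts (not UPL's actual syntax).
tasks = {"check_credit", "approve_order", "ship_goods"}
precedes = {("check_credit", "approve_order"), ("approve_order", "ship_goods")}
# A business policy stated as a required precedence between two tasks.
policy = [("check_credit", "ship_goods")]

def reachable(a, b, edges):
    """Infer whether a precedence path exists from a to b (transitive closure)."""
    frontier, seen = [a], set()
    while frontier:
        node = frontier.pop()
        if node == b:
            return True
        seen.add(node)
        frontier += [y for (x, y) in edges if x == node and y not in seen]
    return False

# Consistency check: every policy-required precedence must hold in the model.
violations = [(a, b) for (a, b) in policy if not reachable(a, b, precedes)]
print("policy violations:", violations)  # [] means the workflow satisfies the policy
```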
44. Bridging the gap: finding the processes to adapt a repository-based knowledge management system to the knowledge intense sales organization at IBM Nordic. Backman, David; Åkerfeldt, Jonas. January 2007.
For knowledge-intense companies, the knowledge and skills of employees are far more valuable than any physical assets. To share knowledge among employees, some companies launch knowledge sharing initiatives that aim to spread best practices and increase employee expertise. These initiatives are often supported by technical systems, repositories, which store the information to be shared. This report discusses how the value of such a repository, a Wiki containing reference cases of SOA projects at IBM Nordic, can be increased by processes that better connect it to the organization. To this end, seven employees at IBM Nordic were interviewed: four sales people, the main user group of the Wiki; two members of the SOA Acceleration Team, the group responsible for the Wiki; and one employee at Learning and Knowledge, IBM's internal department for organization-wide knowledge management and education. The answers were analyzed using a framework built from academic theory, consisting of four areas of requirements for the processes connecting the Wiki to the organization. The analysis showed that for IBM the most important area to manage is enablement, followed by governance, motivation, and finally content. The report concludes with recommendations for five processes to connect the Wiki to the organization. Internal selling informs the sales people that the Wiki exists and how they are to use it. Ensure search engine compatibility makes sure the sales people can find the contents of the Wiki via the intranet search engines. The process for adding a new case description ensures that new case descriptions added to the Wiki are consistent and contain the right kind of information. By validating the case information added to the Wiki, the acceleration team verifies that the information is correct, increasing its credibility. In the last process, ensure information congruence, the members of the acceleration team update the guidelines on what information to collect, and the information in the Wiki, as the information needs of the sales people change. This is done on a regular basis and ensures that the information collected and stored is actually useful.
45. Service recommendation for individual and process use. Nguyen, Ngoc Chan. 13 December 2012.
Web services have been developed as an attractive paradigm for publishing, discovering, and consuming services. They are loosely coupled applications that can run alone or be composed to create new value-added services. They can be consumed as individual services, which provide a unique interface to receive inputs and return outputs, or as components integrated into business processes. We call the first consumption case individual use and the second business process use. The need for specific tools to assist consumers in these two cases has motivated much research in both academia and industry. On the one hand, many service portals and service crawlers have been developed to help users search for and invoke Web services for individual use. However, current approaches mainly take into account the explicit knowledge presented by service descriptions; they make recommendations without considering data that reflect user interest, and they may require additional information from users. On the other hand, mechanisms have been developed to search for similar business process models or to use reference models, assisting process analysts in business process design. However, these mechanisms are labor-intensive, error-prone, time-consuming, and may confuse business analysts. In our work, we aim to facilitate service consumption for individual use and business process use by means of recommendation techniques. We target recommending to users services that are close to their interests, and recommending to business analysts services that are relevant to an ongoing business process design. To recommend services for individual use, we exploit the user's usage data, which reflect the user's interests, and apply well-known collaborative filtering techniques; we propose five algorithms and develop a web-based application that allows users to use services. To recommend services for business process use, we take into account the relations between services in business processes and target recommending relevant services for selected positions in a business process. We define the neighborhood context of a service and make recommendations based on neighborhood context matching. In addition, we develop a query language that allows business analysts to formally express constraints to filter services, and we propose an approach to extract a service's neighborhood context from business process logs. Finally, we develop three applications to validate our approach. Experiments on the data collected by our applications and on two large public datasets show that our approach is feasible and accurate and performs well in real use cases.
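As a minimal sketch of the collaborative-filtering idea for individual use (a generic user-based variant with hypothetical usage data; the five algorithms proposed in the thesis are not reproduced here):

```python
import numpy as np

# Hypothetical usage matrix: rows = users, columns = services,
# entries = how often each user invoked each service.
usage = np.array([
    [5, 0, 3, 0],
    [4, 1, 0, 0],
    [0, 2, 0, 5],
    [0, 3, 1, 4],
], dtype=float)

def cosine_sim(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def recommend(user, k=2):
    """Score the services `user` has not used, from the k most similar users."""
    sims = np.array([cosine_sim(usage[user], usage[u]) if u != user else -1.0
                     for u in range(usage.shape[0])])
    neighbors = np.argsort(sims)[-k:]            # indices of the top-k similar users
    scores = sims[neighbors] @ usage[neighbors]  # similarity-weighted usage
    scores[usage[user] > 0] = -np.inf            # mask services already used
    return int(np.argmax(scores))

print(recommend(user=0))  # index of the service recommended to user 0
```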
46. Reconfigurable Logic Architectures based on Disruptive Technologies. Gaillardon, Pierre-Emmanuel. 15 September 2011.
For the last four decades, the semiconductor industry has experienced exponential growth. According to the ITRS, as we advance into the era of nanotechnology, traditional CMOS electronics is reaching its physical and economic limits. The main objective of this thesis is to explore the novel design opportunities that emerging technologies offer for reconfigurable architectures. On the one hand, the thesis focuses on the traditional FPGA architecture scheme and surveys structural improvements brought by disruptive technologies. Because memories and routing structures occupy the major part of an FPGA's total area and are the main limit on performance, 3-D integration appears to be a good candidate for embedding all of this circuitry into the metal layers. Configuration and routing circuits based on back-end-compatible resistive memories, a monolithic 3-D process flow, and a prospective vertical-FET process flow are introduced and assessed within a complete architectural context. On the other hand, the thesis presents novel architectural schemes for ultra-fine-grain computing. The size of the logic elements can be reduced thanks to inherent properties of the technologies, such as the crossbar organization or the controllable polarity of carbon electronics. Given the granularity of the logic elements, specific fixed and incomplete interconnection topologies are required to avoid the large overhead of a configurable interconnection pattern. To evaluate the potential of this new architectural scheme, a specific benchmarking flow is presented to explore the ultra-fine-grain architectural design space.
47. Towards meaningful spaces: reclaiming cultural context to its inhabitants in GCC cities through the conceptual phase of urban design process. Al Ghareebi, Abdul Azeez. January 2016.
The motivation for this thesis is the realization that public open urban spaces falling between areas of urban planning and construction in GCC cities lack cultural impact and meaningful representation. To aid the reclamation of meaningful spaces, this study demonstrates methods for providing a framework of cultural-context design during the conceptual phase of the urban planning process. In doing so, the emphasis falls on both public users' and professionals' opinions and perceptions regarding cultural elements, physical experience, and the design processes that take place within Gulf Cooperation Council (GCC) cities. The understanding and framework for levels of comprehension and awareness of cultural implementation derive from the theory of the Circuit of Culture, resulting in an epistemological contemplation of both the existing situation and the inter-relationship between culture and design. Meanwhile, the study of the elements that define the Circuit of Culture begins to form a strong cultural reference point for the future improvement of the definition of a strong identity for in-between spaces within the urban context. The elements of this theory are the five aspects of Production, Identity, Regulation, Consumption, and Representation. The implementation of these cultural elements is drawn in reference to the Double Diamond design model, with its four aspects of discovery, definition, development, and delivery, where the first two stages make up the conceptual phase and the latter two the implementation phase of design. Therefore, while the thesis stresses the problem of the cultural identity of in-between and vacuum spaces in GCC cities, it proposes the Circuit of Culture as a method for reinforcing thorough cultural consideration within the conceptual phase of design (discovery and definition). The thesis seeks to develop the process of design integration through an evolving methodological approach for studying the cultural development and cultural concerns of urban spaces that fall in between well-planned schemes. In this work, cultural identity and cultural representation are of fundamental importance.
48. An integrated framework for developing generic modular reconfigurable platforms for micro manufacturing and its implementation. Sun, Xizhi. January 2009.
The continuing trends of miniaturisation, mass customisation, globalisation and wide use of the Internet have a great impact on manufacturing in the 21st century. Micro manufacturing will play an increasingly important role in bridging the gap between traditional precision manufacturing and emerging technologies such as MEMS/NEMS. The key requirements for micro manufacturing in this context are hybrid manufacturing capability, modularity, reconfigurability, adaptability and energy/resource efficiency. Existing design approaches tend to have narrow scope and are largely limited to individual manufacturing processes and applications. The above requirements demand a fundamentally new approach to future applications of micro manufacturing so as to obtain producibility, predictability and productivity across full process chains and value chains. A novel generic modular reconfigurable platform (GMRP) is proposed in this context. The proposed GMRP offers hybrid manufacturing capabilities, modularity, reconfigurability and adaptability, both as an individual machine tool and as a micro manufacturing system, and provides a cost-effective solution to high-value micro manufacturing in an agile, responsive and mass-customised manner. An integrated framework has been developed to assist the design of GMRPs, given their complexity. The framework incorporates a theoretical GMRP model, a design support system and extension interfaces. The GMRP model covers the relevant micro manufacturing processes and machine tool elements. The design support system includes a user-friendly interface and a design engine for the design process and design evaluation, together with a scalable design knowledge base and database. The functionality of the framework can also be extended through the design support system interface, the GMRP interface and the application interface, i.e. by linking to external hardware and/or software modules. The design support system provides a number of tools for the analysis and evaluation of design solutions. Kinematic simulation of machine tools can be performed using the Virtual Reality toolbox in Matlab. A module has also been developed for multiscale modelling, simulation and results analysis in Matlab; a number of different cutting parameters can be studied and the machining performance subsequently evaluated using this module. Mathematical models for a non-traditional micro manufacturing process, micro EDM, have been developed, with the simulation performed using FEA. Various design theories and methodologies have been studied, and axiomatic design theory has been selected for its power and simplicity; it has been applied in the conceptual design of the GMRP and its design support system. The implementation of the design support system is carried out using Matlab, Java and XML technologies. The proposed GMRP and framework have been evaluated through case studies and experimental results.
49. Understanding design impact: a new framework for understanding the potential of design and enhancing future professional practice. Green, Stephen. January 2016.
Design is widely recognised as an important driver of economic performance. However, the value of design has proved resistant to quantification despite research attention since the early 1980s: correlation between design investment and impact has been demonstrated, but not causation. There is considerable interest from policy and professional bodies in what is described here as 'Design Impact'. Impact can be measured, for example, by return on investment, increases in profitability or cost reductions. However, this only crudely captures the economic impact of a design ingredient; increasingly, social and environmental impacts are also of interest. The design profession sees better articulation of design impact as a means to increase its influence. The context has been explored through a series of descriptive and prescriptive studies, including analysis of 45 DBA Design Effectiveness Award case studies and 304 undergraduate design projects from two institutions over a three-year period, together with interviews and workshops with senior design professionals and design academics. A new Understanding Design Impact framework is the overall outcome and contribution to knowledge from the work. It bridges between theory and practice and is a powerful basis for placing consideration of design impact at the heart of design activity. A design impact ontology has been developed as a robust foundation for the framework, resolving issues with underlying concepts. An initial version of this ontology is published in The Design Journal and is claimed as a supporting contribution to new knowledge. So too are new ontological classifications of factors with considerable influence on design impact: Design Influences and Authority, and Motivation and Path. These provide fresh perspectives and are worthy of further research consideration. A number of routes are identified for the further development and dissemination of the framework.
50. Machine Learning and Probabilistic Design Framework for Laser Powder Bed Fusion Process. Lingbin Meng. 13 May 2020.
<p>There has been increasing demand
for 3D printed metals from aerospace & defense and automotive end-use
industries, due to their low manufacturing cost, and reduction in lead times. Although the significant
advancement in metal 3D printing promises to revolutionize industry, it is
constrained by a widespread problem: the cracks and other defects in the metal
3D printed parts. In this work, two major causes of defects in the laser
power bed fusion (L-PBF) process are focused: keyhole mode and spattering phenomena.
Both defect sources are highly dependent to the processing parameters. Although
extensive efforts have been made on experiments and computational models to
improve the quality of printed parts, the high experimental costs and large
computational intensity still limit their effect on the optimization of the
processing parameters. In addition, the uncertainty in the design process
further limits the accuracy of these models.</p><p>The aim of this thesis is to
develop a probabilistic design framework for reliability-based design in the
L-PBF process. The modeling framework spans physical models, machine learning
models, and probabilistic models. First, the keyhole mode and spattering
phenomena are simulated by physical models including computational fluid dynamics
(CFD) and smoothed particle hydrodynamics (SPH) methods, respectively. Then,
the data acquired by the physical models serve as the training data for machine
learning models, which are used as surrogates to alleviate the high
computational cost of physical models. Finally, the feasible design region is
computed by probabilistic models such as Monte Carlo simulation (MCS) and the
first order reliability method (FORM). The feasible design region can be used
assessing a satisfactory reliability not lower than the specified reliability
level.</p><p>The developed Gaussian process (GP) based machine learning
model is capable of predicting the remelted depth of single tracks, as a
function of combined laser power and laser scan speed in the L-PBF process. The
GP model is trained by both simulation and experimental data from the
literature. The mean absolute prediction
error magnified by the GP model is only 0.6 μm for a powder bed with
layer thickness of 30 μm,
suggesting the adequacy of the GP model. Then, the process design maps of two
metals, 316L and 17-4 PH stainless steel, are developed using the trained
model. The normalized enthalpy criterion of identifying keyhole mode is
evaluated for both stainless steels. For 316L, the result suggests that the
criterion should be related to the powder
layer thickness. For 17-4 PH, the criterion should be revised to
.</p><p>Moreover, a new and efficient
Moreover, a new and efficient probabilistic method for reliability analysis is developed in this work. It provides a solution to quality inconsistency due to uncertainty in the L-PBF process. The method determines a feasible region of the design space for given design requirements at specified reliability levels: if a design point falls into the feasible region, the design requirement will be satisfied with a probability higher than or equal to the specified reliability. Since the problem involves inverse reliability analysis, which requires calling the direct reliability analysis repeatedly, directly using MCS is computationally intractable, especially for a high reliability requirement. In this work, a new algorithm is developed to integrate MCS and FORM: it finds an initial feasible region quickly by FORM and then updates it with higher accuracy by MCS. The method is applied to several case studies in which the normalized enthalpy criterion is used as the design requirement. The feasible regions of the normalized enthalpy criterion are obtained as contours with respect to laser power and laser scan speed at different reliability levels, accounting for uncertainty in seven processing and material parameters. The results show that the proposed method dramatically reduces the computational cost while maintaining high accuracy. This work provides guidance for process design with a required reliability.
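As a sketch of the MCS half of this procedure (the FORM step and the thesis's exact parameter distributions are omitted; the normalized-enthalpy expression below is a generic form from the L-PBF literature, and every number, including the keyhole threshold, is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def normalized_enthalpy(P, v, A, h_s, D, sigma):
    # Generic normalized-enthalpy expression from the L-PBF literature:
    # dH/h_s = A * P / (h_s * sqrt(pi * D * v * sigma^3)).
    return A * P / (h_s * np.sqrt(np.pi * D * v * sigma**3))

def mcs_reliability(P, v, threshold=30.0, n=200_000):
    """Estimate P(dH/h_s <= threshold) at design point (P, v) by Monte Carlo,
    with illustrative normal distributions on the uncertain parameters."""
    A = rng.normal(0.4, 0.02, n)           # absorptivity [-]
    h_s = rng.normal(1.2e9, 0.05e9, n)     # enthalpy at melting [J/m^3]
    D = rng.normal(5e-6, 0.3e-6, n)        # thermal diffusivity [m^2/s]
    sigma = rng.normal(50e-6, 2e-6, n)     # beam radius [m]
    dh = normalized_enthalpy(P, v, A, h_s, D, sigma)
    return float(np.mean(dh <= threshold))

# The design point (150 W, 1.5 m/s) is feasible at, say, 99% reliability
# only if the estimated probability is at least 0.99.
print(mcs_reliability(P=150.0, v=1.5))
```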
The developed SPH model is used to simulate the spattering phenomenon in the L-PBF process, overcoming limitations of commercial CFD packages, which cannot capture the phase change and particle-sticking phenomena commonly seen during spattering. The SPH model couples heat transfer, particle motion, and phase change; the sticking phenomenon observed in experiments is successfully reproduced by the model under a similar scenario.

In summary, the modeling framework developed in this thesis can serve as a comprehensive tool for reliability-based design in the L-PBF process. The work is also helpful for applying machine learning models in the additive manufacturing field.