  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
201

Reengineering a process model generalisation tool

Sarrazin, Pierre, 1971-. January 1996
A large organization that has many projects to manage may want to build a model that gives an overview of the common and variable parts of its projects' processes. This activity is called process model generalisation. The McGill Software Engineering Laboratory has developed a technique and a tool to achieve this. The work described in this thesis consisted of reengineering the tool to give it a longer life expectancy and to make it part of a future client-server architecture suitable for developing a suite of process management tools. The tool was successfully reengineered, and the exercise helped the laboratory refine the architecture. Some lessons about software maintenance were also learned.
202

Geometric and computational aspects of manufacturing processes

Bose, Prosenjit. January 1994
Two fundamental questions arise in the manufacturing industry for every type of manufacturing process: (1) Given an object, can it be built using a particular process? (2) Given that an object can be built using a particular process, what is the best way to construct it? The latter question gives rise to many different problems depending on how "best" is qualified. We address these problems for two complementary categories of manufacturing processes: rapid prototyping systems and casting processes. Our method is to first define a geometric model of the process in question and then answer the question on that model.

In the category of rapid prototyping systems, we concentrate on stereolithography, which is emerging as one of the most popular rapid prototyping systems. We model stereolithography geometrically and then study the class of objects that admit a construction in this model. For the objects that admit a construction, we find the orientations that allow the object to be built.

In the category of casting processes, we concentrate on gravity casting and injection molding. We first model the process and its components geometrically. We then characterize and recognize the objects that can be formed using a re-usable two-part cast. Given that a cast of an object can be formed, we determine a suitable location for the pin gate, the point from which liquid is poured or injected into the mold. Finally, we compute an orientation of the mold that ensures a complete fill and minimizes the number of venting holes for molds used in gravity casting processes.
203

Environment support for business modelling : concepts, architecture and implementation

Shen, Xijin, 1966-. January 1994
The goal of business modelling is the design, analysis, and simulation of an enterprise's architectural structures and their information technology components. To comprehensively support business modelling, an appropriate modelling environment with adequate visualization mechanisms is required. Such an environment may handle model information in a flexible, yet expressive way and support substitution, documentation, validation and dynamic analysis of models as well as model visualization and alternative representations.

We have developed a business modelling approach which is based on the formalism of extended, colored Petri nets. To support and validate our approach, we have engineered the Macrotec environment. Macrotec meets a list of requirements we have identified as crucial for the quality of a comprehensive modelling environment. It is conceived as a set of various tools which are seamlessly integrated. Our experience with Macrotec suggests that our concepts and environment substantially facilitate business modelling.
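The colored Petri net formalism underlying this modelling approach can be illustrated with a minimal sketch. This is not Macrotec's actual implementation; the class, place names, and token colors below are illustrative assumptions.

```python
# Minimal sketch of a colored Petri net step, in the spirit of the
# formalism described above. All names are illustrative, not Macrotec's.
from collections import Counter

class ColoredPetriNet:
    def __init__(self, marking):
        # marking: place name -> list of colored tokens currently held
        self.marking = {p: Counter(toks) for p, toks in marking.items()}

    def enabled(self, transition):
        # A transition is enabled when every input place holds the
        # required colored token and the guard accepts the binding.
        name, inputs, outputs, guard = transition
        return all(self.marking[p][c] >= 1 for p, c in inputs.items()) \
            and guard(inputs)

    def fire(self, transition):
        # Consume input tokens, produce output tokens.
        name, inputs, outputs, guard = transition
        assert self.enabled(transition), f"transition {name} not enabled"
        for p, c in inputs.items():
            self.marking[p][c] -= 1
        for p, c in outputs.items():
            self.marking[p][c] += 1

# Toy business process: an order token moves from 'received' to 'approved'.
net = ColoredPetriNet({"received": ["order-A"], "approved": []})
approve = ("approve",
           {"received": "order-A"},   # consumed
           {"approved": "order-A"},   # produced
           lambda binding: True)      # trivial guard
net.fire(approve)
print(net.marking["approved"]["order-A"])  # 1
```

Dynamic analysis of a model then amounts to exploring which transitions are enabled in each reachable marking.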
204

Dynamic optimization of job allocation using constant job-mix stages and priority factors

Massicotte, Paul. January 1997
The dynamic optimization proposed in this work uses a linear programming technique to maximize the quantity of job orders processed on the machines at each constant job-mix stage. Priority factors guide the sequential allocation of partial and complete setups by ranking job/machine combinations in order of processing performance and capacity to meet due dates.

The job allocation is governed by an algorithm which constitutes the backbone of the dispatching software that was developed and used to solve the examples presented in this thesis.

Priority factors offer an effective means of guiding the selection of setups by ranking job/machine combinations by processing speed and capacity to meet due times. Priority factors become more important as setup times increase relative to the quantities to be manufactured and the processing times. They also guide the allocation of partial setups on the machines to speed up job completion.

The job allocation system has substantial potential for providing a shorter makespan than the Shortest Operating Time methodology by increasing machine utilization.
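The ranking role of priority factors can be sketched as follows. The scoring formula here is a hypothetical stand-in, not the thesis's actual factors: it combines processing speed with slack against the due date, and deprioritizes job/machine pairs that cannot meet their due times.

```python
# Hedged sketch of priority-factor ranking for job/machine allocation.
# The scoring formula and all data are illustrative assumptions.

def priority(job, machine, now=0.0):
    """Higher score = allocate first."""
    ptime = job["qty"] / machine["rate"]   # processing time on this machine
    slack = job["due"] - (now + ptime)     # time to spare before the due date
    if slack < 0:
        return -ptime                      # cannot meet due date: rank last
    # Reward fast processing and urgency (small remaining slack).
    return 1.0 / ptime + 1.0 / (slack + 1.0)

jobs = [
    {"name": "J1", "qty": 100, "due": 8.0},
    {"name": "J2", "qty": 40,  "due": 3.0},
]
machines = [{"name": "M1", "rate": 20.0}, {"name": "M2", "rate": 10.0}]

# Rank every job/machine combination, best first.
combos = sorted(
    ((priority(j, m), j["name"], m["name"]) for j in jobs for m in machines),
    reverse=True,
)
best = combos[0]   # highest-priority allocation: J2 on the fast machine M1
```

A dispatcher would allocate the top-ranked pair, update machine availability, and re-rank, which is the sequential allocation the abstract describes.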
205

The Localization of Free-Form Surfaces by the Maximization of the Shared Extent

Geisler, Jeannette. 31 March 2015
Feedback, such as an inspection of a part, is a key step in the design and manufacture of complex products. It determines where a product or manufacturing process should be re-evaluated to conform to design specifications. The inspection of a part is characteristically accomplished by comparing the CAD model to the measurements of a manufactured part. This is simple for parts that contain a commonality: a central axis, a plane on a flat side, the center of a sphere, etc. When a part does not share a commonality, as with free-form surfaces, the comparison analysis becomes complex.

This complexity arises in determining how every point on a manufactured part corresponds to every point on a design model. Whenever one coordinate system is shifted, the correspondence can be lost and must be re-evaluated, creating an iteration. The demand for many iterations protracts the process and thwarts optimization. It is also challenging to mathematically determine which points should be compared to one another. Is the selected point optimal for comparison? Is a higher resolution of points needed? This problem of how the coordinate systems of the CAD model and the measured part can be aligned is termed localization and is extensively researched [1]. Currently, most algorithms use a line or surface fitting technique that minimizes the sum of the squared errors, drawing upon Gunnarsson and Prinz's original idea [2]. Such nonlinear approaches may converge to local minima, yielding false solutions. Additionally, a solution may not be optimal due to the averaging of outliers in the data.

This thesis proposes a methodology that automatically aligns the coordinate systems of free-form CAD models to collected manufactured measurements, with resiliency to outliers of the fit and to false solutions given by local minima, by maximizing the shared extent depending on dimension. To perform this, data from the manufactured surface and design surface are polygonized and compared until geometrically similar. Then, the overlapping or intersecting extent is calculated depending on the dimension and maximized using a heuristic approach, particle swarm optimization. At the maximum shared extent, the two coordinate systems should be aligned in the optimal position. In this work, only two-dimensional free-form curves are used to determine whether the maximization of the shared extent results in an optimal solution, reducing complexity from three dimensions. Results validated the approach, showing that the manufactured curve was aligned to the design, as measured by the sum of the squared errors. The method was also found to resist outliers, demonstrated by the tight alignment of consistently sloped areas while not necessarily aligning to peak and valley features. The error observed is mainly due to inaccurate polygon geometry between the curves rather than to the maximization of the shared area.
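The optimization step can be sketched with a deliberately simplified objective. Instead of polygonized free-form curves, the shared extent here is the overlap area of two unit discs, which has a closed form; a standard particle swarm searches the translation that maximizes it. The swarm parameters are illustrative assumptions, not the thesis's settings.

```python
# Hedged sketch: particle swarm optimization (PSO) maximizing a
# "shared extent" objective. The overlap of two unit discs stands in
# for the polygonized-curve overlap; all parameters are illustrative.
import math, random

def disc_overlap(d):
    """Area shared by two unit discs whose centres are distance d apart."""
    if d >= 2.0:
        return 0.0
    return 2.0 * math.acos(d / 2.0) - (d / 2.0) * math.sqrt(4.0 - d * d)

def objective(tx, ty):
    # Fixed disc at the origin; movable disc translated by (tx, ty).
    return disc_overlap(math.hypot(tx, ty))

def pso(n=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-3, 3), rng.uniform(-3, 3)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    pval = [objective(*p) for p in pos]
    g = max(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(n):
            for k in range(2):
                # Velocity: inertia + pull toward personal and global bests.
                vel[i][k] = (w * vel[i][k]
                             + c1 * rng.random() * (pbest[i][k] - pos[i][k])
                             + c2 * rng.random() * (gbest[k] - pos[i][k]))
                pos[i][k] += vel[i][k]
            v = objective(*pos[i])
            if v > pval[i]:
                pval[i], pbest[i] = v, pos[i][:]
                if v > gval:
                    gval, gbest = v, pos[i][:]
    return gbest, gval

best, val = pso()
# The optimum is (0, 0), where the shared extent equals pi.
```

In the thesis's setting, the search variables would also include rotation, and the objective would be the overlap of the polygonized curves rather than a closed-form area.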
206

The effect of job performance aids on quality assurance

Fosshage, Erik. 25 October 2014
Job performance aids (JPAs) have been studied for many decades in a variety of disciplines and for many different types of tasks, yet this is the first known research experiment using JPAs in a quality assurance (QA) context. The objective of this thesis was to assess whether a JPA has an effect on the performance of a QA observer performing the concurrent dual verification technique for a basic assembly task. The JPA used in this study was a simple checklist, and its design borrows heavily from prior research on task analysis and other human factors principles. The assembly task and QA construct of concurrent dual verification are consistent with those of a high-consequence manufacturing environment. Results showed that the JPA had only a limited effect on QA performance in the context of this experiment. However, there were three important and unexpected findings that may draw interest from a variety of practitioners. First, a novel testing methodology sensitive enough to measure the effects of a JPA on performance was created. Second, the discovery that there are different probabilities of detection for different types of error in a QA context may be the most far-reaching result. Third, these results highlight the limitations of concurrent dual verification as a control against defects. It is hoped that both the methodology and results of this study provide an effective baseline from which to launch future research activities.
207

Materials price risk mitigation in construction projects

Al-Zarrad, Mohammad Ammar. 7 November 2014
Construction materials cost estimation is considered one of the most important tasks in developing a project budget. Using material hedging to mitigate the risk of material price volatility is a new concept for construction companies.

This thesis matched material hedging with the fuel hedging application utilized by airlines. The weather hedging process was used as a precedent for applying material hedging in the construction industry. This thesis developed a model that provides step-by-step guidance for applying material hedging in the construction industry. Further, this thesis matched its model with the model presented by Macdonald (2013) and provided a lower level of detail to support actual implementation of material hedging.

Future work in this area could investigate the cost of material hedging to decide whether the hedging application is feasible. The validity and reliability of the model presented by this thesis should also be investigated.
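The payoff logic behind a material hedge, by analogy with airline fuel hedging, can be shown with simple arithmetic. The prices, quantities, and the assumption that futures prices move in step with spot prices are all hypothetical, and this is not the thesis's model.

```python
# Illustrative arithmetic for a long futures hedge on a construction
# material, analogous to airline fuel hedging. All figures are
# hypothetical.

def hedged_cost(qty, spot_at_purchase, futures_entry, futures_exit):
    """Effective material cost when the buyer holds a long futures
    position sized to the purchase quantity: gains on the futures
    offset increases in the physical purchase price."""
    physical_cost = qty * spot_at_purchase
    futures_gain = qty * (futures_exit - futures_entry)
    return physical_cost - futures_gain

# Suppose steel rises from $700 to $800/ton between bid and purchase,
# and futures prices move in step with the spot price.
unhedged = 100 * 800                        # 80,000: full price increase
hedged = hedged_cost(100, 800, 700, 800)    # 70,000: increase offset
```

The hedge locks in a cost near the price at bid time, which is why the thesis frames it as a mitigation for price volatility between estimate and procurement.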
208

Equipment for testing X-ray methods for on-line texture measurement and plasticity prediction

Blandford, Peter. January 1997
Equipment has been developed to test x-ray sensor designs suitable for texture analysis and plasticity prediction of rolled sheet in the production environment. A general method has been developed to optimize the complete sensor design by minimizing the RMS error of the sensor's texture and property estimates for the sheet. A versatile reflection x-ray diffraction sensor has been designed and built for the laboratory to test how accurately the optimized designs classify the quality of commercial rolled product. Experimental testing with the laboratory sensor was performed on two sets of specimens: (1) a set of cold-rolled and annealed, interstitial-free steels; and (2) a set of hot-rolled, 3004-alloy aluminum sheets. Tests performed on the aluminum specimens showed that an optimized design sampling the mid-plane texture is able to classify the quality of the set's members; an optimized design sampling the surface texture can also perform the same classification. Tests performed on the steel specimens showed that an optimized design sampling the mid-plane texture can predict the average r-value, r̄, and the 2-fold anisotropy measure, (r90 − r0)/2, with an accuracy of s = 0.10 and 0.03, respectively; an optimized design sampling the surface texture of the sheet can predict the same plasticity measures with an accuracy of s = 0.18 and 0.06. As a whole, the work has demonstrated the following: (1) workable reflection sensors can be developed for certain industrial applications; however, optimized transmission sensors are generally superior because the mid-plane texture leads to more accurate predictions; (2) statistically reliable predictions of the bulk texture and plasticity of commercial sheet can be made using specialized sensor designs that measure 10 to 50 pole densities; and (3) the general method developed to optimize the sensor design significantly reduces the sensor error, by as much as 140% over the unoptimized designs.
209

The Workers' Leadership Role in a Lean Transformation

Wilson, Jane T. 25 January 2014
This research explored the challenges that New Hampshire manufacturers faced in introducing and sustaining improvement initiatives, and the social dimension of this issue.

Using an ethnographic methodology, participant-observation was employed to examine the culture of a New Hampshire manufacturer. Through active participation in work groups, regular assignments, and meetings, an understanding was gained of the impact the culture had on the implementation of a set of improvement techniques known as lean manufacturing. Observations illuminated the leadership roles that both management and workers play. Nine themes described the lived experience and the social forces that guided that experience: subgroups, lack of overall teamwork, a dominant clique, lack of management interaction, focus on the past, workers' social lives, individual cooperation, pride in workmanship, and lack of process ownership. The effect of these nine themes on the lean initiatives was explored. Finally, the study extrapolated the findings to provide recommendations on successful practices for sustaining lean improvements.
210

Agent based human behavior modeling: A knowledge engineering based systems methodology for integrating social science frameworks for modeling agents with cognition, personality and culture

Bharathy, Gnana K. Unknown Date
Thesis (Ph.D.), University of Pennsylvania, 2006. UMI AAI3246140. Adviser: Barry Silverman. Source: Dissertation Abstracts International, Volume 67-12, Section B, page 7352.
