  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Process development for high powered amplifier Au/Sn eutectic die attach via vacuum furnace

Blanden, Zachary F. 05 January 2017 (has links)
This research was conducted to develop and qualify a vacuum die attach process for GaAs semiconductor monolithic microwave integrated circuit (MMIC) devices. Research was done to understand the causes of voiding and the effects of voiding levels on device performance and reliability. A simultaneous investigation was conducted to qualify vacuum attach as a methodology by which minimal voiding levels could be achieved. After an initial vacuum-attach trial was completed to verify the methodology, internal accept/reject criteria were developed to qualify die attach interfaces. A dual-phase attachment methodology was created to minimize tolerance stacking, resulting in more consistent component placement. MATLAB image-processing code was developed to quantify the voiding levels against the accept/reject criteria. Statistical methodologies were employed to troubleshoot root causes of special-cause variation in initial attachment failures. A design of experiments was conducted testing three factors, each at two levels (process gas [Gas A, Gas B], leaking chamber [yes, no], and carrier supplier [Supplier A, Supplier B]). The DOE identified the process gas and its interaction with the carrier supplier as significant. Further investigation of the carriers identified plating contamination, leaving the process gas as the primary factor of interest. A secondary experiment focusing on process gas identified no statistical difference between Gas A′ and Gas B (Gas A′ indicating a high-purity form of Gas A). With this information, Gas A′ was selected as the process gas. A total of 56 attachment interfaces were then produced, yielding 0.7485% voiding on average and following a Weibull distribution (shape = 1.04171, scale = 0.75967), with zero rejections. The process's consistently minimal voiding levels were deemed a success, and the process was released to production.
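The abstract mentions MATLAB image-processing code that quantifies voiding against accept/reject criteria; that code is not reproduced here. The following is a minimal, hedged Python sketch of the same general idea (Otsu thresholding of an X-ray image of the die attach interface and computing the void-area fraction). The file name and the 2% accept limit are illustrative assumptions, not values from the thesis.

    # Minimal void-quantification sketch (illustrative, not the thesis's MATLAB code).
    # Assumes a grayscale X-ray image in which voids appear brighter than solder.
    import numpy as np
    from skimage import io, filters

    image = io.imread("die_attach_xray.png", as_gray=True)    # hypothetical file name
    threshold = filters.threshold_otsu(image)                  # global Otsu threshold
    void_mask = image > threshold                              # bright pixels treated as voids
    void_fraction = 100.0 * void_mask.sum() / void_mask.size   # percent of interface area

    ACCEPT_LIMIT = 2.0  # illustrative accept/reject limit in percent
    print(f"voiding = {void_fraction:.3f}% -> "
          f"{'ACCEPT' if void_fraction <= ACCEPT_LIMIT else 'REJECT'}")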
72

IMPROVING LABORATORY ANIMAL CARE: A BEHAVIOR MANAGEMENT INTERVENTION

Unknown Date (has links)
Behavior management of employee performance is becoming more prevalent; however, few systems have been implemented in service-industry settings. This experiment compared two behavior management systems for improving one critical service, providing adequate care to laboratory animals. One utilized goal setting + performance feedback and the other, self-recording + goal setting + feedback. It was hypothesized that both systems would increase performance, but the latter would be the more effective of the two because prompts indicating those tasks that were to be performed could not be ignored. / Laboratory Animal Resources (LAR) at The Florida State University was the site of the experiment, and the technicians who performed the animal-care services were the subjects. Behavioral criteria for task completion and a system that accurately measured performance of tasks were developed. Following a baseline period, goal setting + feedback was implemented in one animal-housing unit and goal setting + feedback + self-recording was implemented in the other in a multiple-baseline design. / Throughout the study, the experimenter assessed percent completion of daily scheduled tasks. On a weekly basis, LAR management inspected the units, assessed task completion, and assigned performance ratings based on the findings. Feedback to the technicians about management's findings was delivered weekly. / Baseline results showed that approximately 39 percent of tasks in one unit and 55 percent in the other were performed as scheduled, and no areas received "satisfactory" ratings. The intervention in each unit increased daily task completion to about 80 percent, and approximately 50 percent of the areas in both units received "satisfactory" ratings each week. / These results indicated that either behavior management system significantly increased employee performance; however, the similarity of results indicated that self-recording did not enhance performance above goal setting + feedback alone. Employees who self-recorded reached peak performance faster than those who did not; thus, self-recording may assist in the acquisition of behavior. Performance was maintained, however, after self-recording was no longer required. / Source: Dissertation Abstracts International, Volume: 43-04, Section: B, page: 1289. / Thesis (Ph.D.)--The Florida State University, 1982.
73

Performance of Control Charts for Weibull Processes

Unknown Date (has links)
Statistical Process Control (SPC) is a statistical method for monitoring the variability of processes. Process variation can be categorized as common cause or special cause. Common causes are the natural or expected sources of variation in the process. The presence of a special cause indicates that the process is not in a state of statistical control. The SPC methodology dictates that a search should be initiated when a special cause is detected. This thesis addresses the set-up of a magnitude-robust control chart and CUSUM charts for detecting changes in Weibull processes. The research includes a comparison of the average run length (ARL) performance of the control charts. / A Thesis submitted to the Department of Industrial and Manufacturing Engineering in partial fulfillment of the requirements for the degree of Master of Science. / Degree Awarded: Spring Semester, 2009. / Date of Defense: October 31, 2008. / Statistical Process Control, Weibull Distribution, Magnitude Robust Control Chart, CUSUM Chart, ARL, Maximum Likelihood Estimates, Maximum Likelihood Ratio Test / Includes bibliographical references. / Joseph J. Pignatiello, Jr., Professor Directing Thesis; Samuel A. Awoniyi, Committee Member; Arda Vanli, Committee Member; Okenwa Okoli, Committee Member.
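As a hedged illustration of the kind of ARL comparison described above (not the thesis's charts or parameter values), the Python sketch below estimates the in-control average run length of a one-sided tabular CUSUM applied to observations from a Weibull process by Monte Carlo simulation. The Weibull shape/scale values and the chart constants k and h are placeholders.

    # Monte Carlo estimate of in-control ARL for a one-sided (upper) tabular CUSUM
    # monitoring individual Weibull observations. All parameter values are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    shape, scale = 1.5, 1.0          # in-control Weibull parameters (placeholders)
    k, h = 1.1, 4.0                  # CUSUM reference value and decision interval (placeholders)

    def run_length(max_n=100_000):
        c = 0.0
        for n in range(1, max_n + 1):
            x = scale * rng.weibull(shape)    # one Weibull observation
            c = max(0.0, c + x - k)           # tabular CUSUM recursion
            if c > h:
                return n                      # chart signals at sample n
        return max_n

    arl = np.mean([run_length() for _ in range(2_000)])
    print(f"estimated in-control ARL ~ {arl:.1f}")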
74

Dimension Variation Prediction and Control for Composites

Unknown Date (has links)
This dissertation presents a systematic study of dimension variation prediction and control for polymer matrix fiber-reinforced composites. A dimension variation model was developed for process simulation based on thermal stress analysis and finite element analysis (FEA). This model was validated against experimental data, analytical solutions, and data from the literature. Using the FEA-based dimension variation model, the deformations of typical composite structures were studied and a regression-based dimension variation model was developed. The regression-based dimension variation model can significantly reduce computation time and provide a quick design guide for composite products with reduced dimension variations. By introducing the material modification coefficient, this comprehensive model can handle various fiber/resin types and stacking sequences. It eliminates the complicated, time-consuming finite element meshing and material parameter definition process. Deformation compensation through tooling design was investigated using the FEA-based and regression-based dimension variation models. The structural tree method (STM) was developed to compute the assembly deformation from the deformations of individual components, as well as the deformation of general-shape composite components. The STM enables rapid dimension variation analysis/synthesis for complex composite assemblies with the regression-based dimension variation model. Using the STM and the regression-based dimension variation model, design optimization and tolerance analysis/synthesis were conducted. The exploratory work presented in this research provides a foundation for developing practical and proactive dimension control techniques for composite products. / A Dissertation submitted to the Department of Industrial Engineering in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Degree Awarded: Summer Semester, 2003. / Date of Defense: July 7, 2003. / Composites / Includes bibliographical references. / Chuck Zhang, Professor Directing Dissertation; George Buzyna, Outside Committee Member; Zhiyong Liang, Committee Member; Okenwa Okoli, Committee Member; Ben Wang, Committee Member.
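The regression-based dimension variation model described above is built from FEA results so that design queries do not require re-running the FEA. As a hedged illustration only (synthetic data, a made-up predictor, not the dissertation's model), the sketch below fits a polynomial regression surrogate mapping a geometric parameter to a predicted deformation and then uses it for a quick design query.

    # Sketch of a regression surrogate for FEA-predicted deformation (synthetic data only).
    import numpy as np
    from numpy.polynomial import polynomial as P

    # Hypothetical FEA runs: flange angle (degrees) vs. predicted spring-in (degrees).
    angle = np.array([30.0, 45.0, 60.0, 75.0, 90.0])
    spring_in = np.array([0.62, 0.95, 1.31, 1.58, 1.90])

    coeffs = P.polyfit(angle, spring_in, deg=2)     # quadratic surrogate of the FEA response
    predict = lambda a: P.polyval(a, coeffs)        # quick design-guide query, no FEA re-run

    print(f"predicted spring-in at 50 deg: {predict(50.0):.2f} deg")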
75

A STANDARDIZED AVIATION PILOT TRAINING PROGRAM

Unknown Date (has links)
The purpose of the study was to develop a standardized curriculum for pilot training which can be used in response to the ever-increasing demand for pilots with a baccalaureate. The demand is partly in response to the military's curtailment of training of pilots for other than their own needs. The demise of the G.I. Bill for flight training has increased demand for pilot training in higher education. / Precedent for standardization exists, especially in the Civilian Pilot Training Program (CPTP) in World War II. The expense and complexity of pilot training today demand optimum use of resources. / The many curricular organizational models which exist indicate the necessity for organization. Development of a list of elements is but one step in the organizational model used. / The four groups represented within the panel were Air Carriers, the Air Force, the Navy and General Aviation. The differences exhibited as groups were generally attributable to differences in mission. The Delphi survey technique was used to elicit this information. / Programs reviewed were the U.S. Air Force, the U.S. Navy, Purdue University, the University of Illinois, the University of South Alabama and the Federal Aviation Administration. Of the 756 elements listed as being taught by any of the programs, 94.6% were considered Mandatory by a panel of twelve experts in aviation education. Not Recommended were 0.2% of the elements. / A standardized aviation pilot training program at the pre-specialization level which will satisfy the needs of all sectors of aviation is possible. Organizations which teach pilots do not teach significantly different elements through the pre-specialization level. / Source: Dissertation Abstracts International, Volume: 43-12, Section: A, page: 3830. / Thesis (Educat.D.)--The Florida State University, 1983.
76

Hierarchy Generation for Designing Assembly System for Product with Complex Liaison and Sub-Assembly Branches

Unknown Date (has links)
Manufacturers need to deploy their assembly systems in a timely manner to cope with expedited product development. The design of such responsive assembly systems consists of the generation of assembly/subassembly operations and their hierarchies, operation-machine assignment, selection of machine types and quantities, and the material flow among machines. Exploration of all feasible solutions for the assembly operations and their hierarchical relationships is vital to the optimization of system designs. This research developed a theoretical framework based on a recursive algorithm to automatically and efficiently generate all feasible and non-redundant assembly hierarchies, and investigated its impact on assembly system designs. The research then discussed potential applications of the recursive framework in system optimization, including joint determination of optimal assembly operations, operation-machine assignment, machine types and quantities, and the material flows among machines. The work was also extended to the optimization of assembly systems for products with complex liaison relations and product families. / A Thesis submitted to the Department of Industrial and Manufacturing Engineering in partial fulfillment of the requirements for the degree of Master of Science. / Fall Semester 2015. / November 16, 2015. / assembly hierarchy, assembly system, generation algorithm, product family, system optimization / Includes bibliographical references. / Hui Wang, Professor Directing Thesis; Okenwa Okoli, Committee Member; Arda Vanli, Committee Member.
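The recursive generation framework is only summarized in the abstract. As a hedged sketch of the general idea (not the thesis's algorithm, which also accounts for liaison constraints), the Python function below enumerates all non-redundant binary assembly hierarchies for a small part set by recursively splitting it into two sub-assemblies, anchoring each split on a fixed part so mirror-image duplicates are never generated.

    # Sketch: enumerate binary assembly hierarchies (sub-assembly trees) for a part set.
    # Illustrative only; the thesis's recursive framework also handles liaison relations.
    from itertools import combinations

    def hierarchies(parts):
        parts = sorted(parts)
        if len(parts) == 1:
            return [parts[0]]                      # a single part is a leaf
        first, rest = parts[0], parts[1:]
        trees = []
        # every split is represented once by keeping `first` in the left sub-assembly
        for r in range(len(rest)):                 # left side never absorbs all remaining parts
            for combo in combinations(rest, r):
                left = [first, *combo]
                right = [p for p in rest if p not in combo]
                for lt in hierarchies(left):
                    for rt in hierarchies(right):
                        trees.append((lt, rt))     # unordered pair of sub-assemblies
        return trees

    print(len(hierarchies(["A", "B", "C", "D"])))  # 15 distinct hierarchies for 4 parts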
77

Image Segmentation for Extracting Nanoparticles

Unknown Date (has links)
With the advent of nanotechnology, nanomaterials have drastically improved our lives in a very short span of time. The more we can tap into this resource, the more we can change our lives for the better. All applications of nanomaterials depend on how well we can synthesize nanoparticles of the desired shape and size, as these determine the properties and thereby the functionality of the nanomaterials. Therefore, this report focuses on extracting the shapes of nanoparticles from electron microscope images via image segmentation more accurately and more efficiently. By developing an automated image segmentation procedure, we can systematically determine the contours of an assortment of nanoparticles from electron microscope images, reducing data examination and interpretation time substantially. As a result, defects in the nanomaterials can be reduced drastically by providing automated updates to the parameters controlling the production of nanomaterials. The report proposes new image segmentation techniques that work very effectively in extracting nanoparticles from electron microscope images. These techniques are realized by imparting new features to the Sliding Band Filter (SBF) method, yielding the Gradient Band Filter (GBF), and by amalgamating the GBF with the Active Contour Without Edges method, followed by fine tuning of μ (a positive parameter in the Mumford-Shah functional). The incremental improvement in the performance (in terms of computation time, accuracy and false positives) of extracting nanoparticles is portrayed by comparing image segmentation by SBF versus GBF, followed by comparing Active Contour Without Edges versus Active Contour Without Edges fused with the Gradient Band Filter (ACGBF). In addition, we compare the performance of a new technique called the Variance Method for fine tuning the value of μ against fine tuning of μ based on ground truth, followed by gauging the improvement in the performance of image segmentation by ACGBF with a fine-tuned value of μ over ACGBF with an arbitrary value of μ. / A Thesis submitted to the Department of Industrial & Manufacturing Engineering in partial fulfillment of the requirements for the degree of Master of Science. / Fall Semester 2015. / November 09, 2015. / Active Contours, Image Segmentation, Nanoparticles, Sliding Band Filter / Includes bibliographical references. / Chiwoo Park, Professor Directing Thesis; Abhishek Shrivastava, Committee Member; Tao Liu, Committee Member; Adrian Barbu, Committee Member.
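The Active Contour Without Edges model referred to above is commonly known as Chan-Vese segmentation. Purely as a hedged illustration of that baseline method and of the role of μ (not of the GBF/ACGBF techniques developed in the thesis), the sketch below segments a grayscale micrograph with scikit-image's chan_vese; the file name and the μ value are placeholders.

    # Baseline Chan-Vese ("active contour without edges") segmentation sketch.
    # Illustrative only; the thesis builds GBF/ACGBF on top of this kind of model.
    from skimage import io, img_as_float
    from skimage.segmentation import chan_vese

    image = img_as_float(io.imread("nanoparticles_tem.png", as_gray=True))  # hypothetical file
    # mu penalizes contour length: larger mu -> smoother, coarser particle boundaries.
    segmentation = chan_vese(image, mu=0.25)
    print(f"foreground fraction: {segmentation.mean():.3f}")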
78

PROMOTING SAFETY BELT USE AMONG STATE EMPLOYEES: THE EFFECTS OF PROMPTING, STIMULUS CONTROL, AND A RESPONSE-COST INTERVENTION (RESTRAINT, BEHAVIORAL, STATE GOVERNMENT, WORKER'S COMPENSATION LAW, LOSS PREVENTION)

Unknown Date (has links)
A program which attempted to increase automobile seat belt use among state government employees through prompting, stimulus control, and a response-cost intervention was evaluated. A multiple-baseline design was used to assess the effects of dashboard stickers and signature sheets which informed the occupants of state-owned vehicles of the rule/law requiring seat belt use and the consequence of a 25 percent reduction in benefits for non-compliance if the driver was involved in an accident. A third agency was included in the study to assess the effects of dashboard stickers alone. The results indicate that seat belt use did significantly increase during the intervention phase in all three agencies and was maintained at consistently high levels throughout the duration of the study, which was five months in the longest condition. Agency-1 and Agency-2 (Stickers + Signature Sheets) achieved relative increases of 527% and 500% over baseline, respectively. Agency-3 (Stickers Only) achieved an increase of 392% over baseline. No generalization to private vehicle use was observed. Arguments in favor of the maintenance of the effects, a cost-effectiveness analysis, contributions to the field, and directions for future research are presented. / Source: Dissertation Abstracts International, Volume: 46-01, Section: B, page: 0337. / Thesis (Ph.D.)--The Florida State University, 1984.
79

Supply Prepositioning for Disaster Management

Unknown Date (has links)
This thesis studies two-stage stochastic optimization methods for supply prepositioning in hurricane relief logistics. The first stage determines where to preposition supplies and how much to preposition at each location. The second stage decides the amount of supplies distributed from supply centers to demand centers. The methods proposed are (I) a method to minimize the expected total cost and (II) a method to minimize the variance of the total cost that accounts for the uncertainties in the parameters of the expected-cost model. For method II, a Bayesian model and a robust stochastic programming solution approach are proposed; in this approach the cost function parameters are treated as uncertain random variables. We propose a Mixed Integer Programming model that can be solved efficiently; the resulting linear and nonlinear integer programming problems are solved using the CPLEX and FILMINT solvers, respectively. A computational case study comprising real-world hurricane scenarios is conducted to illustrate how the proposed methods work on a practical problem. A buffer zone is specified so that commodities are sent only up to a certain distance. Estimation of hurricane landfall probabilities and the effect of cost uncertainty on prepositioning decisions are also considered. / A Thesis submitted to the Department of Industrial and Manufacturing Engineering in partial fulfillment of the requirements for the degree of Master of Science. / Spring Semester 2018. / April 18, 2018. / bayesian analysis, disaster relief, inventory management, optimization, stochastic programming / Includes bibliographical references. / Arda Vanli, Professor Directing Thesis; Hui Wang, Committee Member; Chiwoo Park, Committee Member; Eren Erman Ozguven, Committee Member.
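To make the two-stage structure concrete, here is a hedged, minimal Python/PuLP sketch of an expected-cost prepositioning model in the spirit of method (I): stage 1 opens sites and stocks supplies, stage 2 ships under each landfall scenario. All site names, costs, demands, and probabilities are made-up placeholders, and the open-source CBC solver stands in for CPLEX/FILMINT; this is not the thesis's model.

    # Two-stage stochastic prepositioning sketch (hypothetical data, expected-cost objective).
    import pulp

    supply_sites = ["S1", "S2"]
    demand_sites = ["D1", "D2"]
    scenarios = {"hit_east": 0.4, "hit_west": 0.6}          # illustrative landfall probabilities
    demand = {("hit_east", "D1"): 80, ("hit_east", "D2"): 20,
              ("hit_west", "D1"): 30, ("hit_west", "D2"): 90}
    fixed_cost = {"S1": 100.0, "S2": 120.0}                  # cost of opening a site
    hold_cost = {"S1": 2.0, "S2": 1.5}                       # per-unit prepositioning cost
    ship_cost = {("S1", "D1"): 1.0, ("S1", "D2"): 3.0,
                 ("S2", "D1"): 2.5, ("S2", "D2"): 1.2}
    shortage_penalty, capacity = 50.0, 500.0

    prob = pulp.LpProblem("preposition", pulp.LpMinimize)
    open_site = pulp.LpVariable.dicts("open", supply_sites, cat="Binary")          # stage 1
    stock = pulp.LpVariable.dicts("stock", supply_sites, lowBound=0)               # stage 1
    ship = pulp.LpVariable.dicts(
        "ship", [(w, i, j) for w in scenarios for i in supply_sites for j in demand_sites],
        lowBound=0)                                                                # stage 2
    short = pulp.LpVariable.dicts(
        "short", [(w, j) for w in scenarios for j in demand_sites], lowBound=0)

    prob += (pulp.lpSum(fixed_cost[i] * open_site[i] + hold_cost[i] * stock[i]
                        for i in supply_sites)
             + pulp.lpSum(scenarios[w] * ship_cost[i, j] * ship[w, i, j]
                          for w in scenarios for i in supply_sites for j in demand_sites)
             + pulp.lpSum(scenarios[w] * shortage_penalty * short[w, j]
                          for w in scenarios for j in demand_sites))

    for i in supply_sites:
        prob += stock[i] <= capacity * open_site[i]            # stock only at opened sites
    for w in scenarios:
        for i in supply_sites:
            prob += pulp.lpSum(ship[w, i, j] for j in demand_sites) <= stock[i]
        for j in demand_sites:
            prob += pulp.lpSum(ship[w, i, j] for i in supply_sites) + short[w, j] >= demand[w, j]

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print({i: (open_site[i].value(), stock[i].value()) for i in supply_sites})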
80

Sparsity-Regularized Learning for Nano-Metrology

Unknown Date (has links)
The key objective of nanomaterial metrology is to extract relevant information on nano-structure for quantitatively correlating structure-property relationships with functionality. Historic improvements in instrumentation platforms have enabled comprehensive capture of the information stream both globally and locally. For example, impressive progress in scanning transmission electron microscopy (STEM) has provided access to vibrational spectroscopic signals such as atomically resolved electron energy loss spectroscopy (EELS) and, most recently, ptychography. This is particularly pertinent in the scanning probe microscopy (SPM) community, which has seen a rapidly growing trend towards simultaneous capture of multiple imaging channels and increasing data sizes. Meanwhile, signal processing analysis has remained the same, depending on simple physics models. This approach by definition ignores the material behaviors associated with deviations from simple physics models, which hence require more complex dynamic models. Introduction of such models, in turn, can lead to spurious growth of free parameters, potential overfitting, etc. To derive the signal analysis pathways necessitated by the large, complex datasets generated by progress in instrumentation hardware, here we propose data-physics inference driven approaches for high-veracity and information-rich nanomaterial metrology. Mathematically, we found structural sparsity regularizations extremely useful; these are explained in the corresponding applications in later chapters. In a nutshell, we overview the following contributions: 1. We proposed a physics-infused semi-parametric regression approach for estimating the size distribution of nanoparticles from DLS measurements, yielding more details of the size distribution than the traditional methodology. Our methodology expands the DLS capability of characterizing heterogeneously shaped nanoparticles. 2. We proposed a two-level structural sparsity regularized regression model and correspondingly developed a variant of the group orthogonal matching pursuit algorithm for simultaneously estimating global periodic structure and detecting local outlier structures in noisy STEM images. We believe this is an important step toward automatic phase identification. 3. We developed and implemented a universal real-time image reconstruction algorithm for rapid and sparse STEM scans for non-invasive and high-dynamic-range imaging. We built and open-sourced a systematic platform that fundamentally pushes the evolution of STEM for both imaging and e-beam-based atom-by-atom fabrication, forming a marriage between the imaging and manipulation modes via intelligent and adaptive responses to the real-time material evolution. / A Dissertation submitted to the Department of Industrial and Manufacturing Engineering in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Summer Semester 2018. / July 6, 2018. / Includes bibliographical references. / Chiwoo Park, Professor Directing Dissertation; Anuj Srivastava, University Representative; Zhiyong Liang, Committee Member; Arda Vanli, Committee Member.
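Contribution 2 relies on a group orthogonal matching pursuit variant. As a rough, hedged illustration of plain sparsity-regularized regression (standard orthogonal matching pursuit, not the two-level structural variant developed in the dissertation), the sketch below recovers a sparse coefficient vector from noisy linear measurements with scikit-learn; all data are synthetic.

    # Plain orthogonal matching pursuit on synthetic data (illustrative stand-in for the
    # dissertation's two-level structural sparsity / group OMP approach).
    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    rng = np.random.default_rng(1)
    n_samples, n_features, n_nonzero = 100, 50, 5

    X = rng.standard_normal((n_samples, n_features))
    true_coef = np.zeros(n_features)
    true_coef[rng.choice(n_features, n_nonzero, replace=False)] = rng.standard_normal(n_nonzero)
    y = X @ true_coef + 0.05 * rng.standard_normal(n_samples)   # noisy measurements

    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero).fit(X, y)
    print("recovered support:", np.flatnonzero(omp.coef_),
          "true support:", np.flatnonzero(true_coef))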
