151

Design, Analytics and Quality Assurance for Emerging Personalized Clinical Diagnostics Based on Next-Gen Sequencing

January 2014 (has links)
abstract: Major advancements in biology and medicine have been realized during recent decades, including massively parallel sequencing, which allows researchers to collect millions or billions of short reads from a DNA or RNA sample. This capability opens the door to a renaissance in personalized medicine if effectively deployed. Three projects that address major and necessary advancements in massively parallel sequencing are included in this dissertation. The first study involves a pair of algorithms to verify patient identity based on single nucleotide polymorphisms (SNPs). In brief, we developed a method that allows de novo construction of sample relationships, e.g., which ones are from the same individuals and which are from different individuals. We also developed a method to confirm the hypothesis that a tumor came from a known individual. The second study derives an algorithm to multiplex multiple Polymerase Chain Reaction (PCR) reactions while minimizing interference between reactions that would compromise results. PCR is a powerful technique that amplifies pre-determined regions of DNA and is often used to selectively amplify DNA and RNA targets that are destined for sequencing. It is highly desirable to multiplex reactions to save on reagent and assay setup costs and to equalize the effect of minor handling issues across gene targets. Our solution involves a binary integer program that minimizes events that are likely to cause interference between PCR reactions. The third study involves design and analysis methods required to analyze gene expression and copy number results against a reference range in a clinical setting for guiding patient treatments. Our goal is to determine which events are present in a given tumor specimen. These events may be mutations, DNA copy number changes, or RNA expression changes. All three techniques are being used in major research and diagnostic projects for their intended purpose at the time of writing this manuscript.
The SNP matching solution has been selected by The Cancer Genome Atlas to determine sample identity. Paradigm Diagnostics, Viomics and International Genomics Consortium utilize the PCR multiplexing technique to multiplex various types of PCR reactions on multi-million dollar projects. The reference range-based normalization method is used by Paradigm Diagnostics to analyze results from every patient. / Dissertation/Thesis / Ph.D. Industrial Engineering 2014
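The multiplexing objective described above can be illustrated with a toy combinatorial search. The dissertation formulates a binary integer program; the sketch below instead brute-forces the same objective on a hypothetical interference matrix (all reaction indices and weights are invented for illustration), which makes the cost being minimized explicit:

```python
from itertools import product

# Hypothetical pairwise interference scores between four PCR reactions
# (e.g., primer-dimer likelihood); higher means worse to pool together.
interference = {
    (0, 1): 5, (0, 2): 1, (0, 3): 2,
    (1, 2): 2, (1, 3): 1, (2, 3): 4,
}

def pool_cost(assignment):
    """Sum interference over reaction pairs assigned to the same pool."""
    return sum(w for (i, j), w in interference.items()
               if assignment[i] == assignment[j])

# Enumerate all assignments of 4 reactions to 2 pools; a BIP solver
# handles the same objective at realistic problem sizes.
best = min(product(range(2), repeat=4), key=pool_cost)
print(best, pool_cost(best))
```

Here the optimum separates the two worst-interfering pairs, (0, 1) and (2, 3), into different pools.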
152

Surgical Instrument Reprocessing in a Hospital Setting Analyzed with Statistical Process Control and Data Mining Techniques

January 2014 (has links)
abstract: In a healthcare setting, the Sterile Processing Department (SPD) provides ancillary services to the Operating Room (OR), Emergency Room, Labor & Delivery, and off-site clinics. SPD's function is to reprocess reusable surgical instruments and return them to their home departments. The management of surgical instruments and medical devices can impact patient safety and hospital revenue. Any time instrumentation or devices are not available or are not fit for use, patient safety and revenue can be negatively impacted. One step of the instrument reprocessing cycle is sterilization. Steam sterilization is the sterilization method used for the majority of surgical instruments and is preferred to immediate use steam sterilization (IUSS) because terminally sterilized items can be stored until needed. IUSS items must be used promptly and cannot be stored for later use. IUSS is intended for emergency situations and not as a regular course of action. Unfortunately, IUSS is used to compensate for inadequate inventory levels, scheduling conflicts, and miscommunications. If IUSS is viewed as an adverse event, then monitoring IUSS incidences can help healthcare organizations meet patient safety goals and financial goals along with aiding in process improvement efforts. This work recommends applying statistical process control methods to IUSS incidents and illustrates the use of control charts for IUSS occurrences through a case study and analysis of the control charts for data from a health care provider. Furthermore, this work considers the application of data mining methods to IUSS occurrences and presents a representative example of applying data mining to IUSS occurrences. This extends the application of statistical process control and data mining in healthcare applications. / Dissertation/Thesis / M.S. Industrial Engineering 2014
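Monitoring IUSS occurrences with a control chart, as the abstract proposes, can be sketched with a standard p-chart (proportion of cases requiring IUSS per period). The monthly counts below are hypothetical, not from the thesis:

```python
import math

# Hypothetical monthly data: total surgical cases and IUSS occurrences.
cases = [400, 380, 420, 410, 395, 405]
iuss = [12, 9, 18, 11, 10, 30]

p_bar = sum(iuss) / sum(cases)  # overall IUSS proportion (center line)

def p_chart_limits(n, p=p_bar):
    """3-sigma control limits for a p-chart with subgroup size n."""
    sigma = math.sqrt(p * (1 - p) / n)
    return max(0.0, p - 3 * sigma), p + 3 * sigma

# Flag months whose IUSS proportion falls outside the control limits,
# signaling a special cause worth investigating.
signals = [i for i, (n, x) in enumerate(zip(cases, iuss))
           if not (p_chart_limits(n)[0] <= x / n <= p_chart_limits(n)[1])]
print(signals)
```

With these numbers, only the last month (30 IUSS events in 405 cases) exceeds its upper control limit.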
153

Demonstrating Set-Based Design Techniques- A UAV Case Study

Small, Colin 15 May 2018 (has links)
The Department of Defense (DoD) and Engineered Resilient Systems (ERS) community seek to improve decision making in the Analysis of Alternatives (AoA) process by incorporating resilience and leveraging the capabilities of model-based engineering (MBE) early in the design process. Traditional tradespace exploration utilizing Point-Based Design (PBD) often converges quickly on a solution, with subsequent engineering changes to modify the design. However, this process can lead to a suboptimal solution if an incorrect initial solution is chosen. Enabled by MBE, Set-Based Design (SBD) considers sets of all possible solutions and enables down-selecting possibilities to converge on a final solution. Using a US Army Armament Research, Development, and Engineering Center case study and an open source Excel® add-in called SIPmath, this research develops an integrated MBE case study demonstration that uses physics models to simultaneously generate numerous designs and map them into the value and cost tradespace, allowing for tradespace exploration and SBD. In addition, this research explores incorporating resilience quantification and uncertainty into SBD.
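The SBD down-selection described above can be sketched as keeping the non-dominated set in a (cost, value) tradespace instead of committing to one point design. The design alternatives below are hypothetical placeholders, not from the case study:

```python
# Hypothetical UAV design alternatives: (name, cost, value).
# Lower cost and higher value are better; SBD retains the whole
# non-dominated set for later convergence rather than one point design.
designs = [
    ("A", 10.0, 0.60), ("B", 12.0, 0.80),
    ("C", 9.0, 0.55), ("D", 15.0, 0.78),
    ("E", 11.0, 0.70),
]

def dominated(d, others):
    """d is dominated if some other design is no worse on both axes
    and strictly better on at least one."""
    _, c, v = d
    return any(o is not d and o[1] <= c and o[2] >= v
               and (o[1] < c or o[2] > v)
               for o in others)

pareto = [d for d in designs if not dominated(d, designs)]
print([name for name, _, _ in pareto])
```

Design D is dropped because B costs less and delivers more value; the surviving set A, B, C, E is carried forward.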
154

Labor Skills in the Maintenance Department for Industry 4.0

Marzullo, Tomas 17 May 2018 (has links)
Industry 4.0 is changing the manufacturing environment with its cyber-physical infrastructure to support and help increase production performance. The cyber-physical infrastructure brings new technologies such as the Internet of Things, big data, cloud computing, and machine learning using advanced algorithms. To preserve asset performance under this new order, industrial maintenance departments need to be prepared. This study aims to understand the impact of Industry 4.0 on the skills required within industrial maintenance departments. A survey of industrial maintenance professionals finds that the majority of training comes from internal sources and that many of the information systems used for training are out of date or do not exist. The results of this study show that Industry 4.0 will impact the maintenance department and that a Change Management process should be put in place to accomplish this transition smoothly.
155

Low vision, stimulus encoding and information processing: a characterization of performance of partially sighted users on computer-based tasks

Dixon, Max A. 23 July 1998 (has links)
This study focuses on the characterization of partially sighted users' performance within a graphical user interface environment. Participants, ranging in visual abilities from fully sighted (FSU) with no visual impairments to partially sighted (PSU) with limited visual abilities, participated in computer-based search and select tasks. It is shown that visual search strategies employed by both PSU and FSU within a graphical user interface can be described by Steinberg's (1969) Additive Factor Model. In addition, selection strategies, measured by mouse movement times, are linearly related and highly correlated to the Index of Difficulty as explained by Fitts' Law. This is the first study of its kind that links the physiology of partial vision to behaviors and strategies exhibited during psychomotor task performance. These results can enable system interface designers to effectively design and accommodate the wide range of visual capabilities of today's growing population of computer users.
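The Fitts' Law relationship the abstract relies on is easy to make concrete: the Index of Difficulty grows logarithmically with target distance and shrinks with target width, and movement time is modeled as linear in that index. The target geometries and coefficients below are hypothetical:

```python
import math

def index_of_difficulty(distance, width):
    """Classic Fitts' law Index of Difficulty: ID = log2(2D / W), in bits."""
    return math.log2(2 * distance / width)

# Hypothetical targets: (distance to target center, target width), in pixels.
targets = [(100, 50), (200, 25), (400, 10)]
ids = [index_of_difficulty(d, w) for d, w in targets]

# Fitts' law predicts movement time MT = a + b * ID; the coefficients
# a (intercept, s) and b (s/bit) are fit per user population.
a, b = 0.1, 0.15  # hypothetical coefficients
times = [a + b * i for i in ids]
print(ids)
```

Halving the target width or doubling the distance each adds exactly one bit of difficulty, which is why movement times plot as a straight line against ID.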
156

The effects of semantic and syntactic instruction on user performance and satisfaction in search user interface design

Bandos, Jennifer M. 20 November 2003 (has links)
The design of interfaces to facilitate user search has become critical for search engines, e-commerce sites, and intranets. This study investigated the use of targeted instructional hints to improve search by measuring the quantitative effects on users' performance and satisfaction. The effects of syntactic, semantic, and exemplar search hints on user behavior were evaluated in an empirical investigation using naturalistic scenarios. Combining the three search hint components, each with two levels of intensity, in a factorial design generated eight search engine interfaces. Eighty participants took part in the study, each completing six realistic search tasks. Results revealed that the inclusion of search hints improved user effectiveness, efficiency, and confidence when using the search interfaces, but with complex interactions that require specific guidelines for search interface designers. These design guidelines will allow search designers to create more effective interfaces for a variety of search applications.
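The 2×2×2 factorial design mentioned above (three hint components, two intensity levels each, yielding eight interfaces) can be enumerated directly; the level labels here are hypothetical:

```python
from itertools import product

# Three hint components at two intensity levels each: a 2x2x2 full
# factorial design, i.e., eight search interface conditions.
factors = {
    "syntactic": ["low", "high"],
    "semantic": ["low", "high"],
    "exemplar": ["low", "high"],
}
conditions = [dict(zip(factors, levels))
              for levels in product(*factors.values())]
print(len(conditions))  # 8 interfaces
```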
157

A definition and measure of workflow modularity

Chin, Dawn-Marie 30 March 2005 (has links)
The purpose of this study is to define and measure workflow modularity. There is an increasing need for organizations to implement processes that can be easily configured to offer distinctive capabilities compared to the competition. The concept of modularity provides the foundation for organizations to design flexible processes. The Event-Driven Process Chain (EPC) approach is used to model an example workflow for illustration. Based on the model of atomic tasks, rules are developed to guide the creation of modules with high cohesion among tasks within a module and loose coupling between modules. Matrices of atomic task interdependencies are developed, and tasks are then clustered based on interdependence strengths. The main deliverable is a mathematical model for defining and analyzing a modular workflow to enable the creation of flexible workflow processes. The modularization model represents task relationships in a way that maximizes cohesion within modules and minimizes coupling between modules, while also minimizing workflow time.
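The clustering step described above (grouping atomic tasks by interdependence strength) can be sketched with a simple threshold-based union of strongly coupled task pairs. This is a crude stand-in for the thesis's cohesion/coupling rules, and the strength matrix is hypothetical:

```python
# Hypothetical symmetric interdependence strengths between five atomic
# tasks; pairs at or above the threshold are merged into one module.
strength = {
    (0, 1): 0.9, (0, 2): 0.2, (1, 2): 0.8,
    (2, 3): 0.1, (3, 4): 0.7, (1, 4): 0.15,
}
THRESHOLD = 0.5

parent = list(range(5))  # union-find forest over task ids

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path compression
        i = parent[i]
    return i

for (i, j), s in strength.items():
    if s >= THRESHOLD:
        parent[find(i)] = find(j)  # union strongly coupled tasks

modules = {}
for task in range(5):
    modules.setdefault(find(task), []).append(task)
print(sorted(modules.values()))
```

Tasks 0, 1, 2 form one cohesive module and tasks 3, 4 another; the weak cross-links (0.1 to 0.2) become the loose inter-module coupling.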
158

A simulation-based decision support system for evaluating operating policies in an emergency room facility

Alvarez, Adriana M. 11 March 1999 (has links)
Increased pressure to control costs and increased competition have prompted health care managers to look for tools to effectively operate their institutions. This research sought a framework for the development of a Simulation-Based Decision Support System (SB-DSS) to evaluate operating policies. A prototype of this SB-DSS was developed. It incorporates a simulation model that uses real or simulated data. ER decisions have been categorized and, for each one, an implementation plan has been devised. Several issues of integrating heterogeneous tools have been addressed. The prototype revealed that simulation can truly be used in this environment in a timely fashion because the simulation model has been complemented with a series of decision-making routines. These routines use a hierarchical approach to organize the various scenarios under which the model may run and to partially reconfigure the ARENA model at run time. Hence, the SB-DSS tailors its responses to each node in the hierarchy.
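The thesis builds its SB-DSS on an ARENA model; the core idea of simulating an ER policy can be sketched much more minimally as a single-server queue with random arrivals and service, which is the kind of scenario such a model would evaluate. Everything here (one physician, exponential times, the parameter values) is a simplifying assumption for illustration:

```python
import random

def simulate_er(n_patients, mean_interarrival, mean_service, seed=42):
    """Single-server ER sketch: exponential interarrival and service
    times, first-come-first-served; returns per-patient waiting times."""
    rng = random.Random(seed)
    clock = 0.0        # current arrival time
    server_free = 0.0  # time the (single) physician becomes free
    waits = []
    for _ in range(n_patients):
        clock += rng.expovariate(1.0 / mean_interarrival)
        start = max(clock, server_free)      # wait if physician is busy
        waits.append(start - clock)
        server_free = start + rng.expovariate(1.0 / mean_service)
    return waits

# Compare an operating policy's load: mean wait under these parameters.
waits = simulate_er(200, mean_interarrival=10.0, mean_service=8.0)
print(sum(waits) / len(waits))
```

A decision-support layer would run many such scenarios (staffing levels, triage rules) and compare the resulting wait-time distributions.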
159

Predicting employee compliance with safety regulations, factoring risk perception

Diaz, Yenny Farinas 21 November 2000 (has links)
The purpose of this research was to develop a methodology that would evaluate employees' personality traits, demographic characteristics, and workplace parameters to predict safety compliance along with the moderating effect of risk perception. One hundred twenty-five employees of a manufacturing facility were given questionnaires to gather their demographic and perception information. Surveys were also used to measure their personality characteristics, and periodic observations were recorded to document employees' safety compliance. A significant correlation was found between compliance and the worker's perception of management's commitment to safety (r = 0.27, p < 0.01), as well as with gender (r = -0.19, p < 0.05). Females showed a significantly higher average compliance (78%) than males (69%). These findings demonstrated the value of developing a model to predict safety behavior that would assist companies in maintaining a safe work environment, preventing accidents, ensuring compliance, and reducing associated costs.
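The correlation coefficients reported above (e.g., r = 0.27) are Pearson product-moment correlations; computing one is straightforward. The score data below are hypothetical placeholders, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores: perceived management commitment to safety
# vs. observed compliance rate for six employees.
commitment = [3.1, 4.0, 2.5, 4.5, 3.8, 2.0]
compliance = [0.70, 0.78, 0.66, 0.85, 0.74, 0.60]
print(round(pearson_r(commitment, compliance), 2))
```

Significance testing (the reported p < 0.01) would additionally compare r against its sampling distribution for the given sample size.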
160

Robot planning for automated burn debridement

Nwodoh, Thomas Anayochukwu 01 January 1996 (has links)
A basic tenet in the treatment of burns is the removal of the burned dead tissue, which acts as a haven for harmful bacteria. Contemporary techniques for the removal of burned skin tissue involve a tedious process of serial skin excisions in order to leave behind only viable skin tissue. This results in significant risk to the patient as it is accompanied by marked blood loss. Recently, laser therapy has been used for burned tissue excision. However, such laser surgery is presently manually controlled, tedious to perform, and virtually impossible to accomplish bloodlessly due to the inability to control laser depth penetration accurately by hand. This research develops the robot plan for a system that automatically debrides burned dead tissue on burn victims using a high energy laser for the ablation of the burned tissue. The automated robotic system consists of: a robot whose end effector is equipped with a laser head from which the laser beam emanates and a vision system that is used to acquire the 3-D coordinates of some points on the body surface; 3-D surface modelling routines for generating the surface model of the treatment area; and control and interface hardware and software for control and integration of all the system components. The entire process of automated burn debridement is achieved in two phases: an initial survey phase, during which a model of the treatment area on the skin is built and used to plan an appropriate trajectory for the robot, and the subsequent treatment phase, during which the laser surgery is performed. During the survey phase, the vision system employs a camera to acquire points on the surface of the patient's body by capturing the contour traced by a plane of light generated by a low power laser distinct from the treatment laser. The camera's image is then processed.
Selected points on the camera's 2-D image frame are used as input to a process that generates 3-D body surface points as the intersection of the plane of light and the line of sight between the camera's image point and the body surface point. The acquired surface points are then used to generate a computational model of the treatment area using the Non-Uniform Rational B-Splines (NURBS) surface modelling technique. The constructed NURBS surface model is used to generate a treatment plan for the execution of the treatment procedure. During the treatment phase, the robot plan for ablating the dead tissue is generated. To achieve this, the burned area is first defined on the model. Second, based on the shape of the burned area, treatment patterns are generated to cover the entire area to be treated. Third, based on the treatment patterns, the trajectory to be followed by the laser head to accomplish complete debridement of the dead tissue is generated. Fourth, with the end effector's point-interpolated trajectory necessary for effective treatment obtained, the robot plans the motions necessary for laser ablation of the dead tissue without an over- or under-cut. Accomplishing these steps leads to the generation of a plan for effective dead tissue ablation.
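The survey-phase triangulation described above (intersecting the camera's line of sight with the plane of laser light) reduces to basic vector algebra. The sketch below uses a hypothetical camera pose and light-plane geometry; the thesis's calibrated setup would supply these quantities:

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect the camera's line of sight (origin + t * direction) with
    the plane of laser light given by a point on it and its normal.
    Returns the 3-D surface point, or None if the ray is parallel."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-12:
        return None  # line of sight parallel to the light plane
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    return tuple(o + t * d for o, d in zip(origin, direction))

# Hypothetical setup: camera at the origin looking roughly along +z,
# light plane z = 5 (normal along z); the sight line hits it at z = 5.
point = ray_plane_intersection((0, 0, 0), (0.2, 0.1, 1.0), (0, 0, 5), (0, 0, 1))
print(point)
```

Each selected 2-D image point defines one such sight-line direction, so repeating this intersection over the image yields the cloud of 3-D surface points fed to the NURBS fit.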
