261
A Systems Approach to Assessing, Interpreting and Applying Human Error Mishap Data to Mitigate Risk of Future Incidents in a Space Exploration Ground Processing Operations Environment
Alexander, Tiffaney, 01 January 2016
Research results have shown that more than half of aviation, aerospace, and aeronautics mishaps/incidents are attributed to human error. Although many existing incident report systems have been beneficial for identifying engineering failures, most are not designed around a theoretical framework of human error and thus fail to address the core issues and causes of mishaps. It is therefore imperative to develop a human error assessment framework to identify these causes. This research focused on identifying the causes of human error and the leading contributors to historical Launch Vehicle Ground Processing Operations mishaps, based on past mishaps, near mishaps, and close calls. Three hypotheses were discussed. The first addressed the impact that Human Factors Analysis and Classification System (HFACS) contributing factors (unsafe acts of operators, preconditions for unsafe acts, unsafe supervision, and/or organizational influences) have on human error events (i.e., mishaps, close calls, incidents, or accidents) in NASA Ground Processing Operations. The second focused on determining whether the HFACS framework conceptual model is a viable analysis and classification system for both latent and active underlying contributors and causes of human error in ground processing operations. The third focused on determining whether a model developed with the Human Error Assessment and Reduction Technique (HEART) could be used as a tool to estimate the probability of human error occurrence in ground processing operations. A model to analyze and classify contributing factors to mishaps or incidents and to generate predicted Human Error Probabilities (HEPs) of future occurrence was developed using the HEART and HFACS tools. The research methodology was applied retrospectively to six Ground Processing Operations (GPO) scenarios and 30 years of launch-vehicle-related mishap data.
Surveys were used to provide Subject Matter Experts' (SMEs) subjective assessments of the impact Error Producing Conditions (EPCs) had on specific tasks. In this research, a binary logistic regression model was generated that identified the four most significant contributing HFACS human error factors. This model provided predicted probabilities of future mishap occurrence when these contributing factors are present. The results showed that the HEART and HFACS methods, when modified, can be used as an analysis tool to identify contributing factors, quantify their impact on human error events, and predict the potential probability of future human error occurrence. The methodology and framework were validated through consistency checks and comparison with other related research. This research also contributed a methodology for other space operations and similarly complex operations to follow. Future research should broaden the scope to explore and identify other existing models of human error management systems to integrate into complex space systems beyond what was conducted in this research.
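The core idea of the abstract above, predicting a human error probability from the presence of HFACS contributing-factor tiers via binary logistic regression, can be sketched as follows. The data, factor weights, and fitted probabilities here are synthetic illustrations, not the study's actual 30-year mishap dataset or its fitted model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 400
# Binary indicators for the four HFACS tiers (1 = factor present in event):
# unsafe acts, preconditions, unsafe supervision, organizational influences
X = rng.integers(0, 2, size=(n, 4))
# Assumed latent relationship, invented purely for illustration
logits = -2.0 + X @ np.array([1.2, 0.9, 0.7, 1.5])
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)
# Predicted Human Error Probability (HEP) with all vs. no factor tiers present
hep_all = model.predict_proba([[1, 1, 1, 1]])[0, 1]
hep_none = model.predict_proba([[0, 0, 0, 0]])[0, 1]
print(f"HEP (all factors): {hep_all:.2f}, HEP (no factors): {hep_none:.2f}")
```

The fitted model plays the role described in the abstract: given which contributing factors are present in a scenario, it returns a predicted probability of a future human error event.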
262
Quality by Design Procedure for Continuous Pharmaceutical Manufacturing: An Integrated Flowsheet Model Approach
Vezina, Ashley, 01 January 2017
Pharmaceutical manufacturing is crucial to global healthcare and requires a higher, more consistent level of quality than almost any other industry. Yet traditional pharmaceutical batch manufacturing has remained largely unchanged in the last fifty years due to high R&D costs, shorter patent durations, and regulatory uncertainty. This has led regulatory bodies to promote modernization of manufacturing processes toward continuous pharmaceutical manufacturing (CPM) by introducing new methodologies, including quality by design (QbD), design space, and process analytical technology (PAT). This represents a shift away from the traditional pharmaceutical manufacturing way of thinking toward a risk-based approach that promotes increased product and process knowledge through a data-rich environment. While both the literature and regulatory bodies acknowledge the need for modernization, manufacturers have been slow to modernize due to uncertainty and lack of confidence in the application of these methodologies. This paper describes the current applications of QbD principles and the current regulatory environment, leveraging regulatory guidelines and CPM literature to identify gaps in the literature. To help close the gap between QbD theory and QbD application, a QbD algorithm for CPM using an integrated flowsheet model is also developed and analyzed. This will help increase manufacturing confidence in CPM by providing answers to questions about the CPM business case, applications of QbD tools, process validation and sensitivity, and process and equipment characteristics. An integrated flowsheet model will aid in decision-making and process optimization, breaking away from the ex silico methods extensively covered in the literature.
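A toy sketch of what "integrated flowsheet model" means in the abstract above: unit operations chained so that each step's output stream feeds the next, letting a critical quality attribute be tracked end-to-end and checked against a design-space window. All rates, yields, and acceptance limits below are invented for illustration, not validated process data.

```python
def blend(api_kg_h, excipient_kg_h):
    """Continuous blending: merge API and excipient feeds into one stream."""
    total = api_kg_h + excipient_kg_h
    return {"rate": total, "api_fraction": api_kg_h / total}

def granulate(stream, loss_fraction=0.02):
    """Granulation step with an assumed 2% material loss."""
    return {"rate": stream["rate"] * (1 - loss_fraction),
            "api_fraction": stream["api_fraction"]}

def compress(stream, tablet_mass_g=0.5):
    """Tablet press: convert mass flow into tablet throughput and dose."""
    tablets_per_h = stream["rate"] * 1000 / tablet_mass_g
    api_per_tablet_mg = tablet_mass_g * 1000 * stream["api_fraction"]
    return tablets_per_h, api_per_tablet_mg

stream = blend(api_kg_h=2.0, excipient_kg_h=18.0)
stream = granulate(stream)
tablets_per_h, api_mg = compress(stream)
in_design_space = 45.0 <= api_mg <= 55.0   # assumed acceptance window
print(f"{tablets_per_h:.0f} tablets/h, {api_mg:.1f} mg API each, "
      f"in design space: {in_design_space}")
```

Because the flowsheet is one connected model, changing an upstream input (e.g., the API feed rate) immediately propagates to the downstream quality attribute, which is the property the QbD algorithm exploits.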
263
A Study of Nonlinear Dynamics of EEG Responses to Simulated Unmanned Vehicle Tasks
Xu, Ziqing, 01 May 2019
In the contemporary world, mental workload rises as technology evolves and task demands become overwhelming. The operators of a system are often required to complete tasks of higher complexity within a shorter period of time. Continuous operation under a high level of mental workload can be a major source of risk and human error, putting the operator in a hazardous working environment. It is therefore necessary to monitor and assess mental workload. In this study, an unmanned vehicle operation with visual detection tasks was investigated by means of nonlinear analysis of EEG time series. Nonlinear analysis is considered more advantageous than traditional power spectrum analysis of EEG because it better captures the nature of EEG data and human performance, a process subject to constant change. By examining the nonlinear dynamics of EEG, a deeper understanding of brain activity is more likely to be obtained. The objective of this study was to investigate mental workload under different task levels through the examination of brain activity via the nonlinear dynamics of EEG time series in simulated unmanned ground vehicle visual detection tasks. The experiment was conducted by the team led by Dr. Lauren Reinerman Jones at the Institute for Simulation & Training, University of Central Florida. One hundred fifty subjects participated in the experiment, completing four visual detection task scenarios: (1) change detection, (2) threat detection, (3) dual task with different change detection task rates, and (4) dual task with different threat detection task rates. EEG was recorded from nine channels while the subjects performed the tasks. This study developed a massive data processing program to calculate the largest Lyapunov exponent and correlation dimension of the EEG data, as well as a Python program for performing the 0-1 test on the EEG data.
The results of this study verified the existence of chaotic dynamics in EEG time series, revealed changes in brain activity as task demand changed at a more detailed level, and yielded new insights from the psychophysiological mental workload measurements used in the preliminary study. The results also supported the hypothesis that EEG data exhibit changes in the level of nonlinearity corresponding to different task levels. The nonlinear analysis of EEG time series data was able to discriminate changes in brain activity derived from changes in task load. All nonlinear dynamics analysis techniques used in this study were able to detect differences in the nonlinearity of EEG among task levels, as well as between the single-task and dual-task scenarios.
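One of the techniques named above, the 0-1 test for chaos (Gottwald and Melbourne), can be sketched compactly: project the series onto translation variables p and q, compute the mean square displacement M(n), and measure its growth rate K, which is near 1 for chaotic series and near 0 for regular ones. This is a minimal single-frequency version applied to toy signals (a logistic map and a sinusoid), not the dissertation's EEG pipeline; production versions average K over many random values of c.

```python
import numpy as np

def zero_one_test(phi, c=1.7, ncut=None):
    """Basic 0-1 test: K near 1 suggests chaos, K near 0 a regular signal."""
    N = len(phi)
    j = np.arange(1, N + 1)
    p = np.cumsum(phi * np.cos(j * c))   # translation variable p
    q = np.cumsum(phi * np.sin(j * c))   # translation variable q
    if ncut is None:
        ncut = N // 10
    # Mean square displacement of (p, q) as a function of lag n
    M = np.array([np.mean((p[n:] - p[:-n]) ** 2 + (q[n:] - q[:-n]) ** 2)
                  for n in range(1, ncut + 1)])
    # K: correlation of M(n) with n (linear growth => chaos)
    return np.corrcoef(np.arange(1, ncut + 1), M)[0, 1]

# Chaotic test series: logistic map at r = 3.99
x = np.empty(2000); x[0] = 0.3
for i in range(1999):
    x[i + 1] = 3.99 * x[i] * (1 - x[i])
K_chaotic = zero_one_test(x)

# Regular test series: a plain sinusoid
K_regular = zero_one_test(np.sin(0.2 * np.arange(2000)))
print(f"K chaotic ~ {K_chaotic:.2f}, K regular ~ {K_regular:.2f}")
```

Run on EEG epochs, the same K statistic is what lets the study label each channel and task level as more or less chaotic.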
264
A Methodology for Data-Driven Decision-Making in Last Mile Delivery Operations
Gutierrez Franco, Edgar, 01 January 2019
Across all industries, from manufacturing to services, decision-makers must deal day to day with the outcomes of past and current decisions that affect their business. Last-mile delivery is the term used in supply chain management to describe the movement of goods from a hub to final destinations. This research proposes a methodology that supports decision-making for the execution of last-mile delivery operations in a supply chain. The methodology offers diverse, hybrid, and complementary techniques (e.g., optimization, simulation, machine learning, and geographic information systems) for understanding last-mile delivery operations through data-driven decision-making, and the hybrid modeling may enable better warning systems to support the delivery stage of a supply chain. The methodology proposes self-learning procedures to iteratively test and adjust the gaps between expected and real performance. It supports the process of making effective decisions promptly: optimization, simulation, and machine learning models are used to support execution processes and adjust plans according to changes in conditions, circumstances, and critical factors. The research is applied in two case studies. The first, in maritime logistics, discusses the decision process for selecting the types of vessels and routes to deliver petroleum from ships to villages. The second, in city logistics, analyzes a network of stakeholders during the city distribution process, showing the potential benefits of the methodology, especially in metropolitan areas. Potential applications of this system will leverage growing technological trends (e.g., machine learning in supply chain management and logistics, the internet of things). The main research impact is the design and implementation of a methodology that can support real-time decisions and adjust last-mile operations depending on the circumstances.
The methodology allows decisions to be made under varying conditions of stakeholder behavior, such as the patterns of vehicle drivers and customers, as well as locations and traffic. Its main benefit is the ability to predict future scenarios and plan strategies for the most likely situations in last-mile delivery, which helps determine and support the accurate calculation of performance indicators. The research provides a unified methodology in which different solution approaches can be used in a synchronized form, allowing researchers and other interested parties to see the connections between techniques. With this research, it was possible to bring advanced technologies into routing practices and algorithms to decrease operating cost and to leverage offline and online information from connected sensors to support decisions.
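The routing side of the optimization component above can be illustrated with the simplest construction heuristic: a single vehicle leaves the hub, always visits the nearest unserved stop next, and returns. The coordinates are random stand-ins, and a real last-mile engine would layer time windows, traffic data, and improvement moves on top of a seed route like this.

```python
import math
import random

def nearest_neighbour_route(depot, stops):
    """Greedy seed route for one vehicle: always drive to the closest stop."""
    route, remaining, current = [depot], list(stops), depot
    while remaining:
        nxt = min(remaining, key=lambda s: math.dist(current, s))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    route.append(depot)          # return to the hub
    return route

def route_length(route):
    return sum(math.dist(a, b) for a, b in zip(route, route[1:]))

random.seed(7)
depot = (0.0, 0.0)
stops = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(12)]
route = nearest_neighbour_route(depot, stops)
print(f"{len(route) - 2} stops, route length {route_length(route):.1f}")
```

In the self-learning loop the methodology describes, the realized travel times for a route like this would be fed back to adjust the distance estimates before the next planning cycle.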
265
The Challenges and Barriers Facing Successful Lean Implementation in the Qatari Manufacturing Organizations
Alyousef, Abdulwahab, 01 January 2019
The success of Toyota, the inventor of lean systems, has drawn the attention of academics, researchers, and executives in recent decades. However, existing lean research is largely focused on developed economies in Europe, the Americas, and Southeast Asia, with little focus on the Arab world in general and Qatar specifically. Manufacturing organizations and Small and Medium-sized Enterprises (SMEs) have recently emerged in Qatar, and these were the focus of this research. Despite many attempts to implement lean in organizations, a large number have failed. This research studies the attempts of Qatari manufacturing organizations to implement lean. The main goal of the thesis is to shed light on the current state of awareness of lean in the Qatari manufacturing industry and SMEs and on the potential barriers and challenges that prevent the successful adoption of lean. To do so, an online survey was designed using the SurveyMonkey tool, and organizations were asked to participate. Numerical and statistical results from the survey are discussed, along with Pareto charts and two hypothesis tests. The findings indicate that there is no difference between ISO-certified and non-ISO-certified organizations in lean implementation. Further, most manufacturing organizations in Qatar have some level of awareness of lean concepts, tools, and practices. The results also show that organizations are not implementing lean concepts and are not using quality management practices effectively. The main obstacles that manufacturing organizations in Qatar face when implementing lean are "Lack of Employee Engagement" and "Unsupportive Organization Culture". It is noted, however, that the research has some limitations, such as most participating organizations coming from specific sectors.
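The "no difference between ISO-certified and non-ISO-certified organizations" finding is the kind of hypothesis one standard test can check: a chi-square test of independence on a 2x2 table of certification status versus lean implementation. The counts below are invented for illustration, not the survey's actual responses.

```python
from scipy.stats import chi2_contingency

# Rows: ISO certified / not certified; columns: implemented lean / did not.
# Counts are hypothetical, chosen only to illustrate the test.
table = [[14, 16],
         [12, 18]]
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p_value:.3f}")
# A large p-value means we fail to reject the null of no difference,
# which is the direction of the finding reported above.
```

The same machinery applies to the survey's second hypothesis test; only the contingency table changes.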
266
The Relationship between Agile Project Management and Project Success Outcomes
Bergmann, Thomas, 01 January 2018
Agile project management (APM) has recently emerged as a new approach to managing complex projects. Some experts believe that APM will become the standard project management approach used in the 21st century. However, thus far, the role of agility in project management has not been widely investigated. In the recent past, the concept of agility has mainly been applied to software development projects. The literature on agility is still in its early stages, and further research needs to be conducted in new project management domains. This study is intended to determine the impact of the adoption of APM on project success as perceived by project managers. This investigative approach can be applied to any project domain. In addition, the influencing effects of project complexity on the results of projects are analyzed. Through an analysis of the existing literature, critical success factors and success criteria are identified to develop a model that can be used to assess current APM practice. The research questions are answered by means of an empirical study that collected data using an online survey that was distributed to project managers located across the United States. Confirmatory factor analysis and structural equation modeling are performed to gauge the validity of the proposed research model. The study results show a significant positive relationship between APM and project success. Furthermore, a weak negative association is identified between project complexity and project success, suggesting a need for further research into and refinement of the project complexity construct. Finally, the results reveal an apparent need for additional education and certification in the field of project management, which are expected to lead to an increased use of agile approaches to project management in the future.
267
Modeling the innovation ecosystem and development of a dynamic innovation index
Schoen, Henriette, 01 August 2018
The topic of innovation currently generates a tremendous amount of interest around the world. Innovation is considered an essential part of the solution to creating more jobs and improving the socio-economic conditions of many countries around the globe. Innovation comes about through the existence of many interrelated solutions to socio-economic problems in an extensively interconnected network, which create value for each other. Such a complex creativity and innovation value-creating network is here called an Innovation Ecosystem (IE). The main objective of this dissertation research is to improve the current understanding of the IE by developing a simulation model that uses a broad set of relevant static and dynamic variables and incorporates the principles of system dynamics (SD). The proposed model, named the IECO-model, is based on the relationships between 91 variables and the combined influences of 43 parameters. Available data for 32 countries, representing a full span of GDP worldwide, were used to study the level of innovation in each of these countries. The result of the developed IECO-model is a novel ranking of the level of innovation through a dynamic innovation index (DII). The DII is a new tool to evaluate the innovation and entrepreneurship level of a given country in the context of the global economy. Its most significant differentiator from other existing innovation indices is that the DII focuses more on entrepreneurship qualities in 19 of the 43 parameters, examining cultural values and belief systems, the social context, existing entrepreneurial culture, innovation attitudes, and the mentality of each of the considered countries. According to the DII-based ranking, the ten most innovative countries in the world are 1. Switzerland, 2. USA, 3. Finland, 4. Netherlands, 5. Iceland, 6. Sweden, 7. Germany, 8. Denmark, 9. United Kingdom, and 10. Austria.
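A minimal sketch of the system-dynamics machinery the IECO-model builds on: a single innovation "stock" fed by an idea-generation inflow (scaled by an entrepreneurial-climate parameter) and drained by an obsolescence outflow, integrated with Euler steps. The variable names and parameter values are illustrative inventions, not the dissertation's 91 variables and 43 calibrated parameters.

```python
def simulate(idea_rate=5.0, climate=0.8, obsolescence=0.1,
             dt=0.25, years=20.0):
    """Euler-integrate one stock-and-flow loop of an innovation ecosystem."""
    stock, t, history = 0.0, 0.0, []
    while t < years:
        inflow = idea_rate * climate        # idea generation flow
        outflow = obsolescence * stock      # knowledge decay flow
        stock += (inflow - outflow) * dt    # Euler step for the stock
        t += dt
        history.append(stock)
    return history

trajectory = simulate()
print(f"final innovation stock: {trajectory[-1]:.1f}")
# The stock approaches the equilibrium idea_rate * climate / obsolescence = 40
```

In a full SD model, dozens of such stocks and flows are coupled, and an index like the DII is computed from the simulated trajectories rather than from a single stock.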
268
An Engineering Analytics Based Framework for Computational Advertising Systems
Chen, Mengmeng, 01 January 2018
Engineering analytics is a multifaceted landscape with a diversity of analytics tools that come from emerging fields such as big data, machine learning, and traditional operations research. Industrial engineering is capable of optimizing complex processes and systems using elements of engineering analytics together with traditional components such as total quality management. This dissertation demonstrates that industrial engineering using engineering analytics can optimize the emerging area of computational advertising. The key is to know the different fields well and make the right selections: one must first understand the flow of the complex computational advertising application, then, based on the characteristics of each step, map the appropriate field of engineering analytics or traditional industrial engineering to it, build the apparatus, and apply it to the problem in question. This dissertation consists of four research papers addressing the development of a framework to tame the complexity of computational advertising and improve its usage efficiency from an advertiser's viewpoint. The new framework and its respective systems architecture combine support vector machines, recurrent neural networks, deep learning neural networks, traditional neural networks, game theory/auction theory with generative adversarial networks, and web engineering to optimize the computational advertising bidding process and achieve a higher rate of return. The system is validated with an actual case study involving commercial providers such as Google AdWords and an advertiser's budget of several million dollars.
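The auction-theory component of the bidding process above can be sketched with the textbook mechanism underlying many keyword auctions: a sealed-bid second-price auction, where the highest bidder wins but pays the runner-up's bid, making truthful bidding a dominant strategy. The bid values are made up, and real ad exchanges add quality scores, reserves, and generalized (multi-slot) variants.

```python
import random

def second_price_auction(bids):
    """Return (winner index, price paid) for a sealed-bid second-price auction."""
    order = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
    winner, runner_up = order[0], order[1]
    return winner, bids[runner_up]

random.seed(1)
# Hypothetical per-click bids from five competing advertisers
bids = [round(random.uniform(0.5, 3.0), 2) for _ in range(5)]
winner, price = second_price_auction(bids)
print(f"bids={bids}, winner=advertiser {winner}, pays {price}")
```

The framework's learning models then sit on top of a mechanism like this, estimating what competing bids will be so the advertiser's own bid maximizes return under a budget.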
269
Hybrid Life Cycle Sustainability Assessment-based Multi-objective Optimization: A Case for Sustainable Fleet Mix
Sen, Burak, 01 May 2019
Sustainable transportation includes not only switching from conventional energy sources to alternative fuels, but also diverging from private vehicle use and shifting to alternative transportation modes. As part of this shift, utilizing alternative fuels in public transportation operations supports sustainable transportation in full. Given their implications for air quality and the sustainable movement of people, transit buses, which provide the primary public transportation service, are considered an ideal domain for the deployment of alternative fuels. An input-output (IO) model is developed based on the Eora database, a detailed IO database consisting of national IO tables. Using the Eora-based IO model, this study quantifies and assesses the environmental, economic, and social impacts of alternative fuel buses in Atlanta, GA, and Miami, FL, based on six macro-level sustainability indicators. The life cycle sustainability performance of these buses is then compared to that of a diesel transit bus, and a regional comparison is carried out between the two U.S. metropolitan areas. Based on these results, a multi-objective optimization model is constructed to find an optimal transit bus fleet for the studied U.S. regions. It was found that a transit fleet composed of diesel buses operating in Atlanta has 30% more global warming potential than such a fleet operating in Miami. The same bus fleet operating in Atlanta incurs a life cycle cost (LCC) more than double that of the fleet operating in Miami. The study presents a way in which transit agencies can strategize their transition to a sustainable bus fleet.
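A stripped-down version of the fleet-mix optimization above, posed as a weighted-sum linear program over two bus types: minimize a blend of normalized life cycle cost and global warming potential subject to a required fleet size and an assumed infrastructure cap on alternative-fuel buses. All coefficients and the cap are invented, not the study's LCC or emission factors.

```python
from scipy.optimize import linprog

# Per-bus objective coefficients, order: [diesel, alternative-fuel]
cost      = [1.0, 1.5]   # normalized life cycle cost
emissions = [2.0, 0.8]   # normalized global warming potential
w_cost, w_env = 0.5, 0.5
c = [w_cost * cost[i] + w_env * emissions[i] for i in range(2)]

# Fleet must total 100 buses; assume infrastructure limits alt-fuel to 60
res = linprog(c,
              A_eq=[[1, 1]], b_eq=[100],
              A_ub=[[0, 1]], b_ub=[60],
              bounds=[(0, None), (0, None)])
diesel, alt = res.x
print(f"optimal mix: {diesel:.0f} diesel, {alt:.0f} alternative-fuel buses")
```

Sweeping the weights (w_cost, w_env) traces out the trade-off frontier between cost and emissions, which is how a weighted-sum formulation approximates the multi-objective problem.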
270
An Agent Based Model to Assess Crew Temporal Variability During U.S. Navy Shipboard Operations
Muhs, Kevin, 01 May 2018
Understanding the factors that affect human performance variability, as well as their temporal impacts, is an essential element in fully integrating and designing complex, adaptive environments. This understanding is particularly necessary for high-stakes, time-critical routines such as those performed during nuclear reactor, air traffic control, and military operations. Over the last three decades, significant efforts have emerged to demonstrate and apply a host of techniques, including discrete event simulation, Bayesian belief networks, neural networks, and a multitude of existing software applications, to provide relevant assessments of human task performance and temporal variability. The objective of this research was to design and develop a novel Agent Based Modeling and Simulation (ABMS) methodology to generate a timeline of work and assess the impacts of crew temporal variability during U.S. Navy small boat defense operations in littoral waters. The developed ABMS methodology included human performance models for six crew members (agents) as well as a threat craft, and incorporated varying levels of crew capability and task support. AnyLogic ABMS software was used to simultaneously provide detailed measures of individual sailor performance and of system-level emergent behavior. The methodology and models were adapted and built to ensure extensibility across a broad range of U.S. Navy shipboard operations. Application of the developed ABMS methodology effectively demonstrated a way to visualize and quantify the impacts and uncertainties of human temporal variability on both workload and crew effectiveness during U.S. Navy shipboard operations.
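The core ABM idea above, agents with differing capability levels producing variable task times, and crew-level timing emerging from their interaction, can be sketched without AnyLogic. The six-sailor crew, capability values, and task counts below are hypothetical, and a real model would add the threat craft, task support levels, and richer scheduling.

```python
import random

class Sailor:
    """Watchstander agent whose task time varies with individual capability."""
    def __init__(self, name, capability):
        self.name, self.capability = name, capability

    def task_time(self, base=10.0):
        # Higher capability compresses task time; the uniform noise is a
        # stand-in for within-person temporal variability
        return base / self.capability * random.uniform(0.8, 1.2)

def run_drill(crew, n_tasks=20):
    """Total time for the crew to clear a shared queue of defense tasks."""
    clock = [0.0] * len(crew)            # each sailor's busy-until time
    for _ in range(n_tasks):
        i = clock.index(min(clock))      # next free sailor takes the task
        clock[i] += crew[i].task_time()
    return max(clock)

random.seed(3)
crew = [Sailor(f"S{i}", cap) for i, cap in
        enumerate([0.8, 1.0, 1.0, 1.1, 1.2, 1.3])]
times = [run_drill(crew) for _ in range(100)]
print(f"mean drill time {sum(times) / len(times):.1f}, "
      f"spread {max(times) - min(times):.1f} time units")
```

Repeating the drill many times, as in the last lines, is what turns individual variability into a distribution of crew-level completion times, the quantity the research visualizes.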