321

Enabling cyber-physical system for manufacturing systems using augmented reality

Beigveder Durante, Pablo January 2023 (has links)
This project focuses on addressing challenges faced by manufacturing lines, such as complexity and the need for flexibility, through the integration of Augmented Reality (AR), Internet of Things (IoT), and Big Data technologies. The objective is to develop a framework that enhances the efficiency, flexibility, and sustainability of manufacturing processes in the context of Industry 4.0. The project involves the design and implementation of an artifact solution using the UNITY platform. The solution enables users to remotely control and monitor a manufacturing line in real time through an AR interface. By leveraging IoT devices and sensors, real-time data is collected from the production line, providing valuable insights into performance, maintenance needs, and resource optimization. The collected data is processed and analyzed using Big Data techniques, enabling predictive maintenance, quality control, and optimization of manufacturing processes. The outcomes of this project provide valuable insights into the potential of AR, IoT, and Big Data technologies in revolutionizing the manufacturing industry. The artifact solution serves as a proof of concept, demonstrating the feasibility and benefits of adopting these technologies for sustainable manufacturing in the context of Industry 4.0. Future research and development can build upon this work to further refine and scale the solution for broader industrial applications.
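As a rough illustration of the data-collection side described above, the following Python sketch simulates polling a line sensor and deriving the kind of rolling performance and maintenance indicators an AR overlay could display. The actual artifact was built on the UNITY platform with an AR front end and is not reproduced here; the sensor, sampling window and temperature threshold below are hypothetical.

```python
import random
import statistics
import time
from collections import deque

# Hypothetical sensor: in the actual setup readings would arrive from IoT
# devices on the line and be forwarded to the AR client; here they are
# simulated so the sketch runs stand-alone.
def read_spindle_temperature() -> float:
    return random.gauss(68.0, 1.5)

WINDOW = deque(maxlen=60)          # last 60 samples (about 1 minute at 1 Hz)
MAINTENANCE_THRESHOLD_C = 72.0     # assumed limit, for illustration only

def poll_once() -> dict:
    """Collect one reading and derive indicators an AR overlay could show."""
    WINDOW.append(read_spindle_temperature())
    mean_temp = statistics.fmean(WINDOW)
    return {
        "latest_c": WINDOW[-1],
        "rolling_mean_c": mean_temp,
        "maintenance_alert": mean_temp > MAINTENANCE_THRESHOLD_C,
    }

if __name__ == "__main__":
    for _ in range(5):
        print(poll_once())
        time.sleep(0.1)
```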
322

The Performance, Interoperability and Integration of Distributed Ledger Technologies

Palm, Emanuel January 2019 (has links)
In the wake of the financial crisis of 2008, Bitcoin emerged as a radical new alternative to the fiat currencies of the traditional banking sector. Through the use of a novel kind of probabilistic consensus algorithm, Bitcoin proved it possible to guarantee the integrity of a digital currency by relying on network majority votes instead of trusted institutions. By showing that it was technically feasible to, at least to some extent, replace the entire banking sector with computers, many significant actors started asking what else this new technology could help automate. A subsequent, seemingly inevitable, wave of efforts produced a multitude of new distributed ledger systems, architectures and applications, all somehow attempting to leverage distributed consensus algorithms to replace trusted intermediaries, facilitating value ownership, transfer and regulation. In this thesis, we scrutinize distributed ledger technologies in terms of how they could help facilitate the digitization of contractual cooperation, especially in the context of the supply chain and manufacturing industries. Concretely, we consider them from three distinct technical perspectives: (1) performance, (2) interoperability and (3) integration. Voting systems, with or without probabilistic mechanisms, require significant time and resources to operate, for which reason it becomes relevant to investigate how the costs of running those systems can be mitigated. In particular, we consider how a blockchain, a form of distributed ledger, can be pruned in order to reduce disk-space requirements. Furthermore, no technical system that is part of a larger business is an island; it must be able to interoperate with other systems to maximize the opportunity for automation. For this reason, we also consider how transparent message translation between systems could be facilitated, and we present a formalism for expressing the syntactic structure of message payloads. Finally, we propose a concrete architecture, the Exchange Network, that models contractual interactions as negotiations about token exchanges rather than as function invocations and state machine transitions, which we argue lowers the barrier to compatibility with conventional legal and business practices. Even if no more trusted institutions could be replaced by any forthcoming distributed ledger technologies, we believe that contractual interactions becoming more digital would lead to an increased opportunity for using computers to monitor, assist or even directly participate in the negotiation, management and tracking of business agreements, which we see as more than enough to warrant the cost of further developing the technology. Such computer involvement may not just save time and reduce costs, but could also enable new kinds of computer-driven economies. In the long run, this may enable new levels of resource optimization, not just within large organizations, but also in smaller companies, or even the homes of families and individuals.
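The pruning idea mentioned above can be illustrated with a minimal, generic sketch: each block header commits to its payload through a hash, so old payloads can be discarded while the header chain remains verifiable. This is not the pruning scheme developed in the thesis, only a toy model in Python under that assumption.

```python
import hashlib
from dataclasses import dataclass, field
from typing import Optional

def digest(data: str) -> str:
    return hashlib.sha256(data.encode()).hexdigest()

@dataclass
class Block:
    height: int
    prev_hash: str
    body: Optional[str]          # transaction payload; None once pruned
    body_hash: str = field(init=False)

    def __post_init__(self):
        # Commit to the body so it can be dropped later without breaking the chain.
        self.body_hash = digest(self.body or "")

    def header_hash(self) -> str:
        return digest(f"{self.height}|{self.prev_hash}|{self.body_hash}")

def build_chain(payloads):
    chain, prev = [], "0" * 64
    for i, payload in enumerate(payloads):
        block = Block(height=i, prev_hash=prev, body=payload)
        chain.append(block)
        prev = block.header_hash()
    return chain

def prune(chain, keep_last: int):
    """Drop bodies of old blocks; headers and body commitments remain."""
    for block in chain[:-keep_last]:
        block.body = None

def verify_headers(chain) -> bool:
    prev = "0" * 64
    for block in chain:
        if block.prev_hash != prev:
            return False
        prev = block.header_hash()
    return True

if __name__ == "__main__":
    chain = build_chain([f"tx-batch-{i}" for i in range(100)])
    prune(chain, keep_last=10)
    print("headers still consistent after pruning:", verify_headers(chain))
```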
323

Attribute-based Approaches for Secure Data Sharing in Industry

Chiquito, Alex January 2022 (has links)
The Industry 4.0 revolution relies heavily on data to generate value, innovation, and new services, and to optimize current processes [1]. Technologies such as the Internet of Things (IoT), machine learning, digital twins, and many more depend directly on data to bring value and innovation to both discrete manufacturing and process industries. The origin of data may vary from sensor data to financial statements and even strictly confidential user or business data. In data-driven ecosystems, collaboration between different actors is often needed to provide services such as analytics, logistics, predictive maintenance, process improvement, and more. Data therefore cannot be considered a corporate internal asset only. Hence, data needs to be shared among organizations in a data-driven ecosystem for it to be used as a strategic resource for creating desired values, innovations, or process improvements [2]. When sharing business-critical and sensitive data, access to the data needs to be accurately controlled to prevent leakage to unauthorized users and organizations. Access control is a mechanism to control the actions of users over objects, e.g., reading, writing, and deleting files, accessing data, writing to registers, and so on. This thesis studies one of the latest access control mechanisms, Attribute-Based Access Control (ABAC), for industrial data sharing. ABAC has emerged as an evolution of the widely used Role-Based Access Control: it builds access policies from attributes rather than manually assigned roles or ownerships, enabling expressive, fine-grained access control policies. Furthermore, this thesis presents approaches to implement ABAC in industrial IoT data sharing applications, with a special focus on the manageability and granularity of the attributes and policies. The thesis also studies the implications of outsourced data storage on third-party cloud servers for access control in data sharing, and explores how to integrate cryptographic techniques and paradigms into data access control. In particular, the combination of ABAC and Attribute-Based Encryption (ABE) is investigated to protect privacy over not fully trusted domains. In this, important research gaps are identified. / Arrowhead Tools
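The core ABAC idea, deciding access from attributes of the subject, resource and environment rather than from assigned roles, can be sketched as follows. The policy, attribute names and values are illustrative assumptions, not the thesis' implementation or any particular framework's API.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

Attributes = Dict[str, str]

@dataclass
class Policy:
    """An access rule expressed purely over attributes, not identities."""
    description: str
    condition: Callable[[Attributes, Attributes, Attributes], bool]

# Hypothetical policy: engineers of the owning organization may read sensor
# data, but only from on-site networks (names and attributes are illustrative).
policies: List[Policy] = [
    Policy(
        description="read sensor data within owning organization, on-site only",
        condition=lambda subj, res, env: (
            subj.get("role") == "engineer"
            and subj.get("org") == res.get("owner_org")
            and res.get("type") == "sensor_data"
            and env.get("network") == "on_site"
        ),
    ),
]

def is_access_granted(subject: Attributes, resource: Attributes,
                      environment: Attributes) -> bool:
    """Grant access if any policy's attribute condition is satisfied."""
    return any(p.condition(subject, resource, environment) for p in policies)

if __name__ == "__main__":
    subject = {"role": "engineer", "org": "plant_a"}
    resource = {"type": "sensor_data", "owner_org": "plant_a"}
    print(is_access_granted(subject, resource, {"network": "on_site"}))   # True
    print(is_access_granted(subject, resource, {"network": "remote"}))    # False
```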
324

Machine Learning: Requisitos y nuevas técnicas para la aplicación en entornos industriales e Internet of Things

Barrera, Jose Manuel 23 January 2024 (has links)
With the transition from traditional industry to Industry 4.0, artificial intelligence (AI), the Internet of Things (IoT), Machine Learning (ML) and Data Quality (DQ) take on new dimensions and notable relevance in the industrial field. One of the most requested uses of AI in industry is the pursuit of improved profitability, either by increasing production or by reducing the costs of the industrial process itself. This thesis addresses both objectives. For production optimization, a system is presented for quantifying the energy generated by photovoltaic solar installations. The model is based on Open Data from satellites, IoT sensors and Artificial Neural Networks (ANN). In this way, readers are given the information needed to decide how much to invest in a specific location, depending on the desired energy production. Compared with state-of-the-art proposals, our solution provides an abstraction layer centered on energy production rather than on radiation data, and it can be trained and adapted to different locations using Open Data. To reduce the costs of the industrial process itself, an ML model based on autoencoders is presented that enables Fault Detection and Diagnosis (FDD) and shortens interruptions of the production process. The presented approach exploits the data generated by the industrial process and trains an ML-based architecture, combining several algorithms with autoencoders and sliding windows. The presented solution helps detect faults early and has been tested with real data from an installation with a cogeneration turbine for electricity generation. Moreover, although the example used for our approach is an industrial gas-turbine installation, the approach can be adapted to other FDD problems in other industrial processes that could benefit from the advantages mentioned. However, two additional difficulties were encountered during the project: there is no established methodology for requirements elicitation in ML projects, and there is no adequate and sufficient information on the effect of Data Quality (DQ) deviations on such ML models. Consequently, this thesis presents two additional solutions for these problems. To address requirements elicitation in ML projects, a specialization of the iStar (i*) requirements model is presented. Through this specialization, newcomers and non-experts in the ML field can follow a methodology that guides them along the right path, avoiding invalid models. The i* framework is a popular modeling language for capturing the environment and requirements of a system. However, it is built on a very high layer of abstraction, and its use in a specific field depends heavily on the designer's experience. The presented proposal resolves this: it specializes the i* framework, covering the main gaps between ML and conceptual modeling. Thus, we provide a suitable baseline to follow that captures the requirements and satisfies the many constraints of the ML field. In addition, a question-based guide for applying the proposal is presented, and the described methodology is applied to the gas-turbine project described previously. In this way, its feasibility can be seen, along with how its use filters out invalid designs. Regarding the quantification of the effect of DQ on ML projects, this thesis presents a systematic approach, based on the ISO 25012 standard, to estimate the impact of DQ degradation on different algorithms, in order to quantify that effect on the output of an ML model of a real installation (the cogeneration gas turbine). It should be noted that in a real installation there can be strong time and space constraints, and data cleaning as such is not always possible, so the ML model must cope with such DQ problems. To this end, a methodology has been defined by which the data are progressively contaminated to degrade two DQ characteristics, accuracy and currentness. In summary, the thesis focuses on four points: optimization of an industrial photovoltaic solar generation process, improvement of industrial maintenance through predictive maintenance in a real installation, a methodology for requirements elicitation in an ML project, and the effects of DQ on the outputs of ML models in an industrial environment. Thus, this thesis comprehensively improves various aspects related to ML, IoT and industrial installations. / This Doctoral Thesis has been possible thanks to funding received from various sources. First, grant UAIND18-08A from the Universidad de Alicante, under the title "Técnicas Analíticas en Sistemas IoT". The aim of this grant is the completion of an industrial doctorate supported by agreement LUCENTIALAB2-18Y, which establishes the collaboration under which the thesis is co-funded by the Universidad de Alicante and the company Lucentia Lab S.L. In addition, work on the thesis took place within several projects, which made it possible to submit articles to various conferences and journals. These include the national projects ECLIPSE-UA (RTI2018-094283-B-C32) and AETHER-UA (PID2020-112540RB-C43), funded by the Ministerio de Economía y Empresa and the Ministerio de Ciencia e Innovación respectively, and the regional project BALLADEER (PROMETEO/2021/088), funded by the Generalitat Valenciana.
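The autoencoder-with-sliding-windows approach to fault detection described above can be outlined in a minimal Python sketch: a small network is trained to reconstruct windows of fault-free sensor data, and windows whose reconstruction error exceeds a threshold are flagged. The signals, window width, network size and threshold rule below are assumptions for illustration and do not reproduce the architecture used on the cogeneration turbine data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

def sliding_windows(signal, width):
    """Stack overlapping windows of a multivariate time series into rows."""
    n = len(signal) - width + 1
    return np.array([signal[i:i + width].ravel() for i in range(n)])

# Hypothetical training data: sensor readings from normal (fault-free) operation.
rng = np.random.default_rng(0)
normal = rng.normal(loc=[50.0, 1.2], scale=[0.5, 0.05], size=(500, 2))

width = 10
X_train = sliding_windows(normal, width)
scaler = StandardScaler().fit(X_train)
X_scaled = scaler.transform(X_train)

# A small MLP trained to reproduce its own input acts as an autoencoder;
# the narrow hidden layer forces a compressed representation of normal behaviour.
autoencoder = MLPRegressor(hidden_layer_sizes=(4,), max_iter=2000, random_state=0)
autoencoder.fit(X_scaled, X_scaled)

# Reconstruction error on normal data defines an anomaly threshold.
errors = np.mean((X_scaled - autoencoder.predict(X_scaled)) ** 2, axis=1)
threshold = errors.mean() + 3 * errors.std()

# New windows whose reconstruction error exceeds the threshold are flagged as faults.
faulty = normal.copy()
faulty[-50:, 0] += 5.0  # inject a hypothetical drift in one sensor
X_test = scaler.transform(sliding_windows(faulty, width))
flags = np.mean((X_test - autoencoder.predict(X_test)) ** 2, axis=1) > threshold
print(f"{flags.sum()} of {len(flags)} windows flagged as anomalous")
```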
325

Effective Digitization in Brownfield Factories: A conceptualized model for technology transfer to brownfield production factories through smart factory lab

Gajanan Naik, Harshavardhan January 2024 (has links)
The exploration of Smart Factories and Industry 4.0 technologies has sparked curiosity and interest in the industrial world. The potential of these advancements to revolutionize manufacturing processes, enhance efficiency, and drive innovation is immense. However, there is a gap in research when it comes to the practical implementation of these advanced technologies in real-world production settings, especially in already established factories, so-called brownfield factories. This thesis work was conducted within one such brownfield factory to comprehend the tangible challenges associated with transferring smart technologies. Within this specific company, a laboratory had already been established for testing novel smart technologies in the context of production and logistics. The aim is to test smart technologies in a controlled environment without disrupting the ongoing, profit-generating production processes. This laboratory setup also serves the additional purpose of educating the personnel of traditional production facilities about upcoming smart technologies on the market. The lab showcases the potential of new and emerging technologies in addressing long-standing issues with a fresh perspective, thereby inspiring innovation. The central approach of this thesis revolves around the establishment of a standardized laboratory work process through which smart technology can be tested in a structured way. In this context, an illustrative example of a technology, namely "Virtual Training for Assembly Operators", was chosen as a case study to explore and comprehend the challenges associated with technology transfer. This case study also played a pivotal role in assessing the credibility of the standard technology transfer model formulated within the company. Notably, it was deduced that knowledge and competence are two key obstacles impeding the smooth transfer of technology. Building upon the insights garnered from the case study on virtual training technology and drawing from interviews with engineers and managers employed at the case company, a refined technology transfer process named the "Smart Factory Lab Process" was developed. This process aims to enable the effective transfer of smart technologies, informed by the lessons learned from the practical application of technology in real-world scenarios.
326

The impact of applying participatory design methods in an industry 4.0 environment

Rosenlew, Matilda January 2022 (has links)
Industry 4.0 (I4.0) productions are complex environments driven by production data to make informed decisions affecting the events and items on the production line. This complexity can have a negative effect on the factory workers' adoption rate of the new technology. More specifically, it can lead to the factory workers feeling passive and lacking influence over the tools used. Therefore, new UX methods and increased UX maturity are called for, to better suit the ever-changing environments of I4.0 organizations. To ensure adoption, positive attitudes and intentions regarding user ownership, expertise and knowledge sharing are required. In this thesis project, participatory design (PD) methods are used to evaluate whether PD has a positive effect on such attitudes and intentions toward new tools introduced on the production line. Five participants, employees of the I4.0 company Northvolt, were recruited to take part in a PD workshop to design a human-machine interface (HMI). The participants' attitudes and intentions towards the tool were measured and explored through the PD workshop, surveys and user interviews. The outcome was also compared to the survey results for the tools already in use on the production line. The study resulted in increased positive attitudes and intentions towards user ownership, knowledge sharing and expertise concerning the HMI. Thus, the application of PD in I4.0 environments had an overall positive impact. Researchers are called to assess these effects in the long term, by allowing the participants to use the tool in a practical context over time.
327

Durchgängige Digitalisierung industrieller Abläufe am Beispiel der Modellfabrik der FH Münster

Salewski, Falk, Bodenburg, Sven, Malechka, Tatsiana 03 March 2023 (has links)
Through the scope and complexity of the automation tasks it contains, as well as its construction from industrial components, the Modellfabrik (model factory) of FH Münster enables practice-oriented teaching in current plant automation and in functions beyond it, in the sense of end-to-end digitalization. The distinction used here between horizontal and vertical linkages within end-to-end digitalization is illustrated. Building on experience with the predecessor plant, the innovations of the new model factory built in 2021 are presented. These include, in particular, the modularization of the plant, the implemented safety concept, a web shop with an online configurator, a web visualization of the plant state including energy consumption, and options for virtual commissioning. Furthermore, the current concept for extending horizontal digital integration by incorporating an autonomous mobile robot into the model factory is presented.
329

End-to-end QoS Mapping and Traffic Forwarding in Converged TSN-5G Networks

Satka, Zenepe January 2023 (has links)
The advancement of technology has led to an increase in the demand for ultra-low end-to-end network latency in real-time applications, with a target of below 10 ms. IEEE 802.1 Time-Sensitive Networking (TSN) is a set of standards that supports the required low-latency wired communication with ultra-low jitter for real-time applications. TSN is designed for fixed networks and thus lacks the flexibility of wireless networks. To overcome this limitation and to increase its applicability in different applications, an integration of TSN with other wireless technologies is needed. The fifth generation of cellular networks (5G) supports real-time applications with its Ultra-Reliable Low Latency Communication (URLLC) service. 5G URLLC is designed to meet the stringent timing requirements of these applications, such as providing reliable communication with latencies as low as 1 ms. Seamless integration of TSN and 5G is needed to fully utilize the potential of these technologies in contemporary and future industrial applications. However, achieving the end-to-end Quality of Service (QoS) requirements of a TSN-5G network requires significant effort due to the large dissimilarity between these technologies. This thesis presents a comprehensive and well-structured snapshot of the existing research on TSN-5G integration that identifies gaps in the current research and highlights opportunities for further work in the area. In particular, the thesis identifies that the state of the art lacks an end-to-end mapping of QoS requirements and traffic forwarding mechanisms for a converged TSN-5G network. This lack of knowledge and tool support hampers the utilization of ground-breaking technologies like TSN and 5G. Hence, the thesis develops novel techniques to support the end-to-end QoS mapping and traffic forwarding of a converged TSN-5G network for predictable communication. Furthermore, the thesis presents a translation technique between TSN and 5G with a proof-of-concept implementation in a well-known TSN network simulator. Moreover, a novel QoS mapping algorithm is proposed to support the systematic mapping of QoS characteristics and integration of traffic flows in a converged TSN-5G network. / PROVIDENT
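A table-driven mapping from TSN stream priorities to 5G QoS profiles gives a feel for what such a QoS mapping involves, although the thesis' actual algorithm is not reproduced here. In the Python sketch below, the PCP-to-5QI table, the 50/50 delay-budget split and the stream names are illustrative assumptions rather than normative values.

```python
from dataclasses import dataclass

@dataclass
class TsnStream:
    name: str
    pcp: int               # IEEE 802.1Q Priority Code Point (0-7)
    max_latency_ms: float  # end-to-end latency bound requested by the stream

# Illustrative PCP-to-5QI table (values chosen for the example only; a real
# deployment would derive them from the 3GPP and IEEE specifications and the
# operator's configuration).
PCP_TO_5QI = {
    7: 82,  # highest priority -> delay-critical GBR
    6: 83,
    5: 84,
    4: 85,
    3: 3,
    2: 7,
    1: 9,
    0: 9,   # best effort
}

def map_stream_to_5g(stream: TsnStream) -> dict:
    """Map a TSN stream to a 5G QoS profile for the converged segment."""
    five_qi = PCP_TO_5QI[stream.pcp]
    return {
        "stream": stream.name,
        "5qi": five_qi,
        # Budget left for the 5G segment after reserving part of the bound
        # for the wired TSN hops (an assumed 50/50 split for illustration).
        "packet_delay_budget_ms": stream.max_latency_ms / 2,
    }

if __name__ == "__main__":
    cmd = TsnStream(name="motion_control", pcp=7, max_latency_ms=10.0)
    print(map_stream_to_5g(cmd))
```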
330

Using Machine Learning as a Tool to Improve Train Wheel Overhaul Efficiency

Gert, Oskar January 2020 (has links)
This thesis develops a method for using machine learning in an industrial process. The implementation of this machine learning model aimed to reduce costs and increase the efficiency of train wheel overhaul in partnership with the Austrian Federal Railways (ÖBB). Different machine learning models as well as category encodings were tested to find which performed best on the data set. In addition, differently sized training sets were used to determine whether the size of the training set affected the results. The implementation shows that ÖBB can save money and increase the efficiency of train wheel overhaul by using machine learning, and that continuous training of prediction models is necessary because of variations in the data set.
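A minimal sketch of the kind of comparison described, testing different category encodings with a model under cross-validation, could look as follows in Python with scikit-learn. The data, features and label are synthetic placeholders, not ÖBB's overhaul records, and the encoders and model are examples rather than the ones evaluated in the thesis.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, OrdinalEncoder

# Hypothetical overhaul records: categorical wheel attributes plus a numeric
# mileage feature, with a binary label marking wheels that needed rework.
rng = np.random.default_rng(0)
n = 1000
data = pd.DataFrame({
    "wheel_type": rng.choice(["A", "B", "C"], n),
    "workshop": rng.choice(["linz", "wien", "graz"], n),
    "mileage_1000km": rng.normal(300, 80, n),
})
label = (data["mileage_1000km"] + (data["wheel_type"] == "C") * 100
         + rng.normal(0, 40, n)) > 350

encoders = {
    "one_hot": OneHotEncoder(handle_unknown="ignore"),
    "ordinal": OrdinalEncoder(),
}

# Compare how each category encoding performs with the same classifier.
for name, enc in encoders.items():
    model = Pipeline([
        ("prep", ColumnTransformer(
            [("cat", enc, ["wheel_type", "workshop"])],
            remainder="passthrough")),
        ("clf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ])
    scores = cross_val_score(model, data, label, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```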
