311

Data analysis for predictive maintenance and potential challenges associated with the technology integration of steel industry machines.

Nath, Pradip January 2024 (has links)
This research focuses on data analysis for the system integration and development of a two-stage 2/2 proportional cartridge valve for steel industry machines. Statistical analysis techniques are used to visualize the behavior of the valve signal data, assess the accuracy of the machine data, and apply a statistical feature-extraction model with classification and clustering algorithms to real-time manufacturing data. The thesis reviews the fundamental principles of data analysis, with particular emphasis on the collection, cleansing, and analysis of large volumes of data to derive meaningful insights, and explores the importance of data visualization in presenting intricate research outcomes effectively. The statistical feature model reaches an accuracy of 76 percent on the train and test sets, the best accuracy obtained at this early stage, and the model performs well for the recommended data-automation system of the steel industry. The valve data are analyzed in multiple ways for predictive maintenance (PdM) and condition monitoring of a tube mill production machine: PdM covers the data processing, the behavior patterns of machine data, and the correlation of statistical models that support maintenance decisions and avoid downtime. The data comprise several channels of the steel industry machine, and automated processes are used to combine and relate features and signals in the analysis of the valve data. Using a dataset of sensor data, operation logs, maintenance records, and industrial control data, the predictive model has the potential to yield significant cost savings for the steel industry by preventing unplanned maintenance, while also enhancing the operational safety of machines in the industry.
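The pipeline this abstract describes (window the valve signals, extract statistical features, then classify on a train/test split) can be illustrated as follows. This is a minimal sketch on synthetic data: the window size, feature set, and model are illustrative assumptions, not the thesis's actual code.

```python
# Hedged sketch: statistical feature extraction from valve-signal windows,
# followed by train/test classification. All names and data are illustrative.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def window_features(signal: np.ndarray, win: int = 256) -> pd.DataFrame:
    """Summarize each fixed-length window of a 1-D valve signal
    with simple statistical features (mean, std, min, max, RMS)."""
    rows = []
    for start in range(0, len(signal) - win + 1, win):
        w = signal[start:start + win]
        rows.append({
            "mean": w.mean(), "std": w.std(),
            "min": w.min(), "max": w.max(),
            "rms": np.sqrt(np.mean(w ** 2)),
        })
    return pd.DataFrame(rows)

# Synthetic stand-in for real valve data: healthy vs. degraded signals.
rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, 50 * 256)
degraded = rng.normal(0.3, 1.5, 50 * 256)   # shifted, noisier valve behavior
X = pd.concat([window_features(healthy), window_features(degraded)])
y = np.array([0] * 50 + [1] * 50)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```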
312

Towards the Implementation of Condition-based Maintenance in Continuous Drug Product Manufacturing Systems

Rexonni B Lagare (8707320) 12 December 2023 (has links)
<p dir="ltr">Condition-based maintenance is a proactive maintenance strategy that prevents failures or diminished functionality in process systems through proper monitoring and management of process conditions. Despite being considered a mature maintenance management strategy in various industries, condition-based maintenance remains underutilized in pharmaceutical manufacturing. This situation needs to change, especially as the pharmaceutical industry continues to shift from batch to continuous manufacturing, where the implementation of CBM as a maintenance strategy assumes a greater importance.</p><p dir="ltr">This dissertation focused on addressing the challenges of implementing CBM in a continuous drug product manufacturing system. These challenges stem from the unique aspects of pharmaceutical drug product manufacturing, which includes the peculiar behavior of particulate materials and the evolutionary nature of pharmaceutical process development. The proposed solutions to address these challenges revolve around an innovative framework for the practical development of condition monitoring systems. Overall, this framework enables the incorporation of limited process knowledge in creating condition monitoring systems, which has the desired effect of empowering data-driven machine learning models.</p><p dir="ltr">A key feature of this framework is a formalized method to represent the process condition, which is usually vaguely defined in literature. This representation allows the proper mapping of preexisting condition monitoring systems, and the segmentation of the entire process condition model into smaller modules that have more manageable condition monitoring problems. Because this representation methodology is based on probabilistic graphical modelling, the smaller modules can then be holistically integrated via their probabilistic relationships, allowing the robust operation of the resulting condition monitoring system and the process it monitors.</p><p dir="ltr">Breaking down the process condition model into smaller segments is crucial for introducing novel fault detection capabilities, which enhances model prediction transparency and ensures prediction acceptance by a human operator. In this work, a methodology based on prediction probabilities was introduced for developing condition monitoring systems with novel fault detection capabilities. This approach relies on high-performing machine learning models capable of consistently classifying all the initially known conditions in the fault library with a high degree of certainty. Simplifying the condition monitoring problem through modularization facilitates this, as machine learning models tend to perform better on simpler systems. Performance indices were proposed to evaluate the novel fault detection capabilities of machine learning models, and a formal approach to managing novel faults was introduced.</p><p dir="ltr">Another benefit of modularization is the identification of condition monitoring blind spots. Applying it to the RC led to sensor development projects such as the virtual sensor for measuring granule flowability. This sensor concept was demonstrated successfully by using a data-driven model to predict granule flowability based on size and shape distribution measurements. 
With proper model selection and feature extraction guided by domain expertise, the resulting sensor achieved the best prediction performance reported in literature for granule flowability.</p><p dir="ltr">As a demonstration exercise in examining newly discovered faults, this work investigated a roll compaction phenomenon that is usually concealed from observation due to equipment design. This phenomenon results in the ribbon splitting along its thickness as it comes out of the rolls. In this work, important aspects of ribbon splitting were elucidated, particularly its predictability based on RC parameters and the composition of the powder blend used to form the ribbon. These findings have positive ramifications for the condition monitoring of the RC, as correspondence with industrial practitioners suggests that a split ribbon is desirable in some cases, despite being generally regarded as undesirable in the limited literature available on the subject.</p><p dir="ltr">Finally, this framework was primarily developed for the pharmaceutical dry granulation line, which consists of particle-based systems with a moderate level of complexity. However, it was also demonstrated to be feasible for the Tennessee Eastman Process (TEP), a more complex liquid-gas process system with a greater number of process faults, variables, and unit operations. Applying the framework resulted in machine learning models that yielded one of the best fault detection performances reported in literature for the TEP, while also introducing additional capabilities not yet normally reported in literature, such as fault diagnosis and novel fault detection.</p>
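The prediction-probability approach to novel fault detection described above can be sketched in a few lines. This is a minimal illustration assuming a generic scikit-learn classifier and an illustrative 0.90 certainty threshold; the thesis's actual models, thresholds, and performance indices are not reproduced here.

```python
# Hedged sketch of prediction-probability-based novel fault detection:
# a classifier trained only on known conditions flags a sample as a
# potentially novel fault when its most confident class probability
# falls below a threshold. Threshold and data are illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
# Known conditions: two fault classes in the initial fault library.
X_known = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(4, 1, (100, 4))])
y_known = np.array([0] * 100 + [1] * 100)
clf = GradientBoostingClassifier(random_state=0).fit(X_known, y_known)

def monitor(x: np.ndarray, threshold: float = 0.90) -> str:
    """Return a known-class label, or flag a novel fault if the model
    cannot classify the sample with high certainty."""
    proba = clf.predict_proba(x.reshape(1, -1))[0]
    if proba.max() < threshold:
        return "novel fault (route to human review)"
    return f"known condition {proba.argmax()}"

print(monitor(rng.normal(0, 1, 4)))   # resembles a known condition
print(monitor(rng.normal(2, 1, 4)))   # between clusters -> likely novel
```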
313

A Case Study: Optimising PAP ambulance location with data and travel time analysis

Kurasinski, Lukas, Tan, Jason January 2022 (has links)
Mental health concerns in Sweden have been increasing since the beginning of the 2000s, with Skåne County in southern Sweden showing a slightly higher proportion of reported cases than other regions. To address the growing need for psychiatric healthcare, the health services of the region of Skåne (Region Skåne) have introduced a psychiatric ambulance unit as part of their first responders. The Prehospital Acute Psychiatry (PAP) ambulance is staffed by healthcare professionals trained in mental health issues. The goal of Region Skåne is to reach 90% of the population within 20 minutes and 99% of the population within 35 minutes. This case study aims to provide valuable and useful information to the decision-makers in Region Skåne when placing additional PAP ambulance units. Placement of a PAP ambulance at an ambulance station uses a previously created optimisation model and is based on data and travel time analysis. The data analysis consists of K-Means clustering and linear regression, used to find similarities in the data as well as trends in the number of cases. The travel time analysis and the area and population coverage are based on 20, 35, and 60 minutes of travel time from a station. The travel time depends on road conditions as well as population density when considering ambulance stations for additional PAP ambulances. The Malmö, Helsingborg, and Kristianstad/Hässleholm PAP stations are shown to be optimal choices, due to favourable road conditions and densely populated regions. Ambulances placed at these stations can cover much ground while also being able to attend to a major portion of the population. The data analysis also shows that it is beneficial to place ambulances at these stations, due to an increasing trend of mental illness cases in these areas, which represent a medium to high number of cases relative to the rest of Skåne.
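As a minimal illustration of the two analysis ingredients named in the abstract, K-Means clustering and linear regression, the sketch below groups synthetic case coordinates around three approximate station locations and fits a monthly case-count trend. All coordinates and counts are stand-ins, not Region Skåne data.

```python
# Hedged sketch: K-Means to group case locations, linear regression to
# estimate the trend in case counts over time. Data are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
# Synthetic case coordinates around three approximate urban centres
# (lon, lat, roughly Malmö, Helsingborg, Kristianstad).
centres = np.array([[13.00, 55.60], [12.69, 56.05], [14.16, 56.03]])
points = np.vstack([c + rng.normal(0, 0.05, (60, 2)) for c in centres])
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(points)
print("cluster centres:\n", km.cluster_centers_)

# Trend in monthly case counts (months 0..35), synthetic increasing series.
months = np.arange(36).reshape(-1, 1)
cases = 40 + 0.8 * months.ravel() + rng.normal(0, 4, 36)
trend = LinearRegression().fit(months, cases)
print(f"estimated trend: {trend.coef_[0]:.2f} extra cases per month")
```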
314

Inferencing Gene Regulatory Networks for Drosophila Eye Development Using an Ensemble Machine Learning Approach

Abdul Jawad Mohammed (18437874) 29 April 2024 (has links)
<p dir="ltr">The primary purpose of this thesis is to propose and demonstrate BioGRNsemble, a modular and flexible approach for inferencing gene regulatory networks from RNA-Seq data. Integrating the GENIE3 and GRNBoost2 algorithms, this ensembles-of-ensembles method attempts to balance the outputs of both models through averaging, before providing a trimmed-down gene regulatory network consisting of transcription and target genes. Using a Drosophila Eye Dataset, we were able to successfully test this novel methodology, and our validation analysis using an online database determined over 3500 gene links correctly detected, albeit out of almost 530,000 predictions, leaving plenty of room for improvement in the future.</p>
315

Statistical Machine Learning in Biomedical Engineering

González Cebrián, Alba 15 April 2024 (has links)
This thesis, developed under a research personnel training grant from the Universitat Politècnica de València, aims to propose and apply Statistical Machine Learning methodologies in Biomedical Engineering contexts. This concept seeks to combine machine learning models with the classic understanding and interpretability of statistical reasoning, resulting in technological solutions for biomedical problems that go beyond solely optimizing the predictive performance of models. To achieve this, two main objectives have been outlined, which also structure the document: proposing novel methodologies under the umbrella of Statistical Machine Learning, and applying solutions to real biomedical problems while keeping this philosophy in mind. These objectives have materialized into methodological contributions for simulating outliers and imputing missing data in the presence of outliers, and into applied contributions to real cases for improving healthcare processes, enhancing disease diagnosis and prognosis, and standardizing measurement procedures in biotechnological environments. These contributions are articulated in chapters corresponding to the two main parts mentioned above. Finally, the conclusions and future lines of research close the document, reiterating the main messages of the contributions and of the doctoral thesis as a whole, and laying the groundwork for future lines of inquiry stemming from the work conducted throughout the doctorate. / González Cebrián, A. (2024). Statistical Machine Learning in Biomedical Engineering [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/203529
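As a minimal illustration of the thesis theme of imputing missing data in the presence of outliers (synthetic data; not the methodology proposed in the thesis), the sketch below shows how injected outliers distort a mean-based imputation while a median-based one stays stable.

```python
# Hedged sketch: compare mean- vs. median-based imputation when a few
# extreme outliers are injected into the data. Values are synthetic.
import numpy as np
from sklearn.impute import SimpleImputer

rng = np.random.default_rng(3)
x = rng.normal(10.0, 1.0, 200)
x[:5] = 200.0                                    # simulated outliers
x[rng.choice(200, 20, replace=False)] = np.nan   # missing values
X = x.reshape(-1, 1)

for strategy in ("mean", "median"):
    imp = SimpleImputer(strategy=strategy).fit(X)
    print(f"{strategy} imputation fills with {imp.statistics_[0]:.2f}")
# The mean fill is dragged toward 200 by the outliers; the median stays ~10.
```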
316

CyberWater: An open framework for data and model integration

Ranran Chen (18423792) 03 June 2024 (has links)
<p dir="ltr">Workflow management systems (WMSs) are commonly used to organize/automate sequences of tasks as workflows to accelerate scientific discoveries. During complex workflow modeling, a local interactive workflow environment is desirable, as users usually rely on their rich, local environments for fast prototyping and refinements before they consider using more powerful computing resources.</p><p dir="ltr">This dissertation delves into the innovative development of the CyberWater framework based on Workflow Management Systems (WMSs). Against the backdrop of data-intensive and complex models, CyberWater exemplifies the transition of intricate data into insightful and actionable knowledge and introduces the nuanced architecture of CyberWater, particularly focusing on its adaptation and enhancement from the VisTrails system. It highlights the significance of control and data flow mechanisms and the introduction of new data formats for effective data processing within the CyberWater framework.</p><p dir="ltr">This study presents an in-depth analysis of the design and implementation of Generic Model Agent Toolkits. The discussion centers on template-based component mechanisms and the integration with popular platforms, while emphasizing the toolkit’s ability to facilitate on-demand access to High-Performance Computing resources for large-scale data handling. Besides, the development of an asynchronously controlled workflow within CyberWater is also explored. This innovative approach enhances computational performance by optimizing pipeline-level parallelism and allows for on-demand submissions of HPC jobs, significantly improving the efficiency of data processing.</p><p dir="ltr">A comprehensive methodology for model-driven development and Python code integration within the CyberWater framework and innovative applications of GPT models for automated data retrieval are introduced in this research as well. It examines the implementation of Git Actions for system automation in data retrieval processes and discusses the transformation of raw data into a compatible format, enhancing the adaptability and reliability of the data retrieval component in the adaptive generic model agent toolkit component.</p><p dir="ltr">For the development and maintenance of software within the CyberWater framework, the use of tools like GitHub for version control and outlining automated processes has been applied for software updates and error reporting. Except that, the user data collection also emphasizes the role of the CyberWater Server in these processes.</p><p dir="ltr">In conclusion, this dissertation presents our comprehensive work on the CyberWater framework's advancements, setting new standards in scientific workflow management and demonstrating how technological innovation can significantly elevate the process of scientific discovery.</p>
317

Stylometry: Quantifying Classic Literature for Authorship Attribution - A Machine Learning Approach

Yousif, Jacob, Scarano, Donato January 2024 (has links)
Classic literature is rich, be it linguistically, historically, or culturally, making it valuable for future studies. Consequently, this project chose a set of 48 classic books for stylometric analysis, adopting an approach from related work: dividing the books into text segments, quantifying the resulting segments, and analyzing the quantified values to understand the linguistic attributes of the books. Beyond this, the project conducted classification tasks with two further objectives. In one respect, the study used the quantified values of the text segments for classification tasks with advanced models such as LightGBM and TabNet, to assess how this approach applies to authorship attribution. From another perspective, the study utilized a state-of-the-art model, RoBERTa, for classification tasks on the segmented texts themselves, to evaluate that model's performance in authorship attribution. The results uncovered the characteristics of the books to a reasonable degree. Regarding the authorship attribution tasks, the results suggest that segmenting and quantifying text using stylometric analysis and supervised machine learning algorithms is practical for such tasks, although the approach, while promising, may still require further improvement to achieve optimal performance. Lastly, RoBERTa demonstrated high performance in authorship attribution tasks.
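A minimal sketch of the segment-and-quantify approach is shown below: books are split into fixed-size segments, each segment is reduced to a few stylometric numbers, and a classifier is trained on the quantified segments. The features, toy corpus, and the scikit-learn model standing in for LightGBM/TabNet are illustrative assumptions.

```python
# Hedged sketch: quantify text segments with simple stylometric features,
# then train a classifier on the quantified values.
import re
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def stylometric_features(segment: str) -> list[float]:
    words = re.findall(r"[a-zA-Z']+", segment.lower())
    sentences = [s for s in re.split(r"[.!?]+", segment) if s.strip()]
    return [
        len(words) / max(len(sentences), 1),        # mean sentence length
        len(set(words)) / max(len(words), 1),       # type-token ratio
        sum(w in {"the", "of", "and"} for w in words) / max(len(words), 1),
    ]

def segments(text: str, size: int = 500) -> list[str]:
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

# Toy corpus: two "authors" with different styles (real use: 48 books).
author_a = "The sea was calm. The ship moved slowly. " * 300
author_b = "He wondered, endlessly and without reason, why it mattered! " * 300
X = np.array([stylometric_features(s)
              for s in segments(author_a) + segments(author_b)])
y = np.array([0] * len(segments(author_a)) + [1] * len(segments(author_b)))
clf = GradientBoostingClassifier(random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```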
318

Accessible Real-time Eye-Gaze Tracking For Neurocognitive Health Assessments: A Multimodal Web-based Approach

Tisdale, Daniel C 01 June 2024 (has links) (PDF)
We introduce a novel integration of real-time, predictive eye-gaze tracking models into a multimodal dialogue system tailored for remote health assessments. The system is designed to be highly accessible, requiring only a conventional webcam for video input along with minimal cursor interaction, and it utilizes engaging gaze-based tasks that can be performed directly in a web browser. We have crafted dynamic subsystems that capture high-quality data efficiently and maintain quality through instances of user attrition and incomplete calls. These subsystems are also designed with the foresight to allow future re-analysis using improved predictive models and to enable the creation and training of new eye-gaze tracking datasets. As we explored gaze patterns for various user-performed tasks, we developed generalizable eye-gaze metrics that capture and reflect the distinct gaze trends among different cohorts. Through testing various feature extraction and classification methods, we found promising results that enabled us to effectively classify individuals with Mild Neurocognitive Disorder (MiNCD)/Mild Cognitive Impairment (MCI) in a crowdsourced pilot study (N = 35) with an average accuracy of 0.94 (F1 = 0.83). Although just a beginning, this work represents a first step towards establishing predictive eye-gaze tracking as an accessible and important modality for healthcare applications, with the potential to significantly impact remote screening and monitoring of neurocognitive health.
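The notion of generalizable eye-gaze metrics can be illustrated with a short sketch that reduces a predicted gaze-point time series to summary numbers suitable for a downstream classifier. The threshold, metric definitions, and synthetic trace are assumptions, not the study's actual definitions.

```python
# Hedged sketch: derive summary gaze metrics (fixation ratio, movement
# amplitude, path length) from a predicted gaze-point time series.
import numpy as np

def gaze_metrics(xy: np.ndarray, fix_thresh: float = 0.01) -> dict:
    """xy: (T, 2) normalized screen coordinates sampled at a fixed rate."""
    steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    return {
        "fixation_ratio": float((steps < fix_thresh).mean()),
        "mean_step": float(steps.mean()),
        "path_length": float(steps.sum()),
    }

rng = np.random.default_rng(4)
# Synthetic trace: slow drift (fixation-like) with occasional large,
# saccade-like jumps every 100 samples.
trace = np.cumsum(rng.normal(0, 0.002, (600, 2)), axis=0)
trace[::100] += rng.normal(0, 0.2, (6, 2))
print(gaze_metrics(trace))
```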
319

Development of direct measurement techniques for the in-situ internal alignment of accelerating structures

Galindo Muñoz, Natalia 16 April 2018 (has links)
In the next generation of linear particle accelerators, challenging alignment tolerances are required in the positioning of the components that focus, accelerate and detect the beam over the accelerator length in order to achieve the maximum machine performance. In the case of the Compact Linear Collider (CLIC), accelerating structures, beam position monitors and quadrupole magnets need to be aligned in their supports with respect to their reference axes with an accuracy of 10 um. To reach this objective, the PACMAN (Particle Accelerator Components Metrology and Alignment to the Nanometre Scale) project strives to improve the current alignment accuracy by developing new methods and tools, whose feasibility should be validated using the major CLIC components. This Ph.D. thesis concerns the investigation, development and implementation of a new non-destructive intracavity technique, referred to here as 'the perturbative method', to determine the electromagnetic axes of accelerating structures by means of a stretched wire acting as an alignment reference. Of particular importance is the experimental validation of the method, in a dedicated test bench, on the 5.5 mm iris-mean-aperture CLIC prototype known as the TD24, a structure with complex mechanical features and difficult accessibility. In the first chapter of this thesis, the alignment techniques used in particle accelerators and the novel proposals to be implemented in future linear colliders are introduced, and a detailed description of the PACMAN project is provided. The feasibility study of the method, carried out with extensive electromagnetic field simulations, is described in chapter 2, yielding the theoretical accuracy expected in the measurement of the electromagnetic axes and facilitating the development of a measurement algorithm. The conceptual design, manufacturing and calibration of the automated experimental set-up, integrating the solution developed to measure the electromagnetic axes of the TD24, are covered in chapter 3, where future lines of research and development of the perturbative method are also explored. In chapter 4, the most significant results obtained from an extensive experimental campaign are presented, analysed and compared with simulations. The proof of principle is completed, the measurement algorithm is optimised and the electromagnetic centre of the TD24 is measured with a precision below 1 um and an estimated error below 8.5 um. Finally, in chapter 5, the developments undertaken over the course of this research are summarised, the innovative achievements accomplished within the PACMAN project are listed and their impact is analysed. / Galindo Muñoz, N. (2018). Development of direct measurement techniques for the in-situ internal alignment of accelerating structures [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/100488
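As general background on why a small perturbing object such as a stretched wire can locate the electromagnetic axes (this formula is standard cavity perturbation theory, not an equation quoted from the thesis), Slater's theorem relates the resonant frequency shift to the local fields displaced by the perturbation; mapping the shift as the wire moves therefore maps the field distribution, and the symmetry of the shift identifies the axis.

```latex
% Slater's cavity perturbation theorem: a small perturbing volume
% \Delta V shifts the resonant frequency in proportion to the local
% field energy it displaces, relative to the total stored energy.
\[
  \frac{\Delta\omega}{\omega_0}
  = \frac{\int_{\Delta V}\left(\mu_0\,\lvert\mathbf{H}\rvert^2
          - \varepsilon_0\,\lvert\mathbf{E}\rvert^2\right)\mathrm{d}V}
         {\int_{V}\left(\mu_0\,\lvert\mathbf{H}\rvert^2
          + \varepsilon_0\,\lvert\mathbf{E}\rvert^2\right)\mathrm{d}V}
\]
```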
320

Auditable Computations on (Un)Encrypted Graph-Structured Data

Servio Ernesto Palacios Interiano (8635641) 29 July 2020 (has links)
Graph-structured data is pervasive. Modeling large-scale network-structured datasets requires graph processing and management systems such as graph databases. Further, the analysis of graph-structured data often necessitates bulk downloads/uploads from/to the cloud or edge nodes. Unfortunately, experience has shown that malicious actors can compromise the confidentiality of highly sensitive data stored in the cloud or on shared nodes, even in encrypted form. For particular use cases (multi-modal knowledge graphs, electronic health records, finance) network-structured datasets can be highly sensitive and require auditability, authentication, integrity protection, and privacy-preserving computation in a controlled and trusted environment; the traditional cloud computation model is not suitable for these use cases. Similarly, many modern applications utilize a "shared, replicated database" approach to provide accountability and traceability. Those applications often suffer from significant privacy issues because every node in the network can access a copy of the relevant contract code and data to guarantee the integrity of transactions and reach consensus, even in the presence of malicious actors.

This dissertation proposes breaking from the traditional cloud computation model and instead shipping certified, pre-approved trusted code closer to the data to protect the confidentiality of graph-structured data. Our technique runs in a controlled environment on a trusted data-owner node and provides proof of correct code execution. This computation can be audited in the future and provides the building block for automating a variety of real use cases that require preserving data ownership. This project utilizes trusted execution environments (TEEs) but does not rely solely on the TEE architecture to provide privacy for data and code. We thoughtfully examine the drawbacks of using trusted execution environments in cloud environments, and we analyze the privacy challenges introduced by the use of blockchain technologies to provide accountability and traceability.

First, we propose AGAPECert, an Auditable, Generalized, Automated, Privacy-Enabling Certification framework capable of performing auditable computation on private graph-structured data and reporting real-time aggregate certification status without disclosing the underlying private graph-structured data. AGAPECert utilizes a novel mix of trusted execution environments, blockchain technologies, and a real-time graph-based API standard to provide automated, oblivious, and auditable certification. This dissertation includes the invention of two core concepts that provide accountability, data provenance, and automation for the certification process: Oblivious Smart Contracts and Private Automated Certifications. Second, we contribute an auditable and integrity-preserving graph processing model called AuditGraph.io, which utilizes a unique block-based layout and a multi-modal knowledge graph, potentially improving the access locality, encryption, and integrity of highly sensitive graph-structured data. Third, we contribute a unique data store and compute engine, TruenoDB, that facilitates the analysis and presentation of graph-structured data and offers better throughput than the state of the art. Finally, this dissertation proposes integrity-preserving streaming frameworks at the edge of the network with personalized graph-based object lookup.
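The auditable-computation building block (a receipt that chains digests of inputs, approved code identity, and outputs so a later auditor can verify the sequence) can be sketched as below. In a real deployment, as in AGAPECert, the receipts would be produced and signed inside a TEE and anchored via a ledger; nothing here is AGAPECert's actual API.

```python
# Hedged sketch: tamper-evident, hash-chained computation receipts.
# Each receipt commits to the previous receipt, the code identity, and
# digests of inputs/outputs, without disclosing the private data itself.
import hashlib
import json

def receipt(prev_hash: str, code_id: str, inputs: dict, outputs: dict) -> dict:
    body = {
        "prev": prev_hash,
        "code": code_id,                  # e.g. a hash of pre-approved code
        "in": hashlib.sha256(json.dumps(inputs, sort_keys=True)
                             .encode()).hexdigest(),
        "out": hashlib.sha256(json.dumps(outputs, sort_keys=True)
                              .encode()).hexdigest(),
    }
    body["this"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

r1 = receipt("0" * 64, "cert-code-v1", {"graph": "g1"}, {"status": "pass"})
r2 = receipt(r1["this"], "cert-code-v1", {"graph": "g2"}, {"status": "fail"})
print(r2["prev"] == r1["this"])   # chain verifies: True
```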
