51

Transformace a perzistence XML v relační databázi / XML Transformation and Persistence in Relational Database

Hernych, Radim January 2009 (has links)
This master's thesis deals with the problems of efficient storage and querying of XML documents in a relational database. The first part of the thesis describes XML and related technologies, with emphasis on languages for XML schema definition and XML querying. An XML mapping method based on the Hybrid method and XML Schema is then described. Thereafter, the design and implementation of a native XML database front-end for an object-relational database is described. The designed front-end allows XML documents to be stored in and queried from the underlying database. The last part contains the results and evaluation of performance tests of the implemented system.
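As an aside for readers unfamiliar with XML-to-relational mapping, the sketch below illustrates the general idea of shredding an XML document into a relational table and recovering data with SQL, using only Python's standard library. The single generic element table, the sample document, and the query are illustrative assumptions added for this listing; the thesis's schema-driven Hybrid mapping is more sophisticated than this.

```python
# Minimal sketch: shred an XML document into one generic "element" table and
# answer a simple path query with SQL. Illustrative only; a schema-aware
# mapping would place known elements into typed columns instead.
import sqlite3
import xml.etree.ElementTree as ET

doc = """<library>
  <book><title>XML in RDBMS</title><author>Hernych</author></book>
  <book><title>Query Processing</title><author>Novak</author></book>
</library>"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE element (id INTEGER PRIMARY KEY, parent_id INTEGER, tag TEXT, text TEXT)")

next_id = 0
def shred(node, parent_id=None):
    """Store each element as a row that points back to its parent element."""
    global next_id
    next_id += 1
    node_id = next_id
    conn.execute("INSERT INTO element VALUES (?, ?, ?, ?)",
                 (node_id, parent_id, node.tag, (node.text or "").strip()))
    for child in node:
        shred(child, node_id)

shred(ET.fromstring(doc))

# Equivalent of the path query /library/book/title, expressed as a self-join.
titles = conn.execute("""
    SELECT t.text FROM element b
    JOIN element t ON t.parent_id = b.id AND t.tag = 'title'
    WHERE b.tag = 'book'""").fetchall()
print([row[0] for row in titles])   # ['XML in RDBMS', 'Query Processing']
```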
52

Environmental Implications of Cu-Based Nanoparticles and Biocides Products

Tegenaw, Ayenachew G., Ph.D. January 2019 (has links)
No description available.
53

A Value Proposition for Cloud-Enabled Process Planning

Tollander, Sofia January 2019 (has links)
To stay competitive in today's fast-paced market, manufacturing companies must shorten their time-to-market and decrease their costs by efficiently utilizing their resources. Here, improved software and better software integration throughout the product realization process are considered a key enabler. The aim of this thesis work has been to investigate the current workflow in design and process planning in order to outline a cloud-based application to support these activities. Pains and bottlenecks in these workflows have been identified through interview and field studies at six Swedish manufacturing companies of different sizes, in different industries, and with different operational models. The major areas of improvement were identified, and one of them, the initial activity of understanding customers' needs, was selected for further focus. From receiving a request for quotation from the customer to the acceptance of an order, the following time-consuming activities were recognized: understanding and discussing design intent as well as suggesting possible design changes to improve manufacturability, analyzing and reviewing 2D drawings and 3D models, and developing order quotations. In this thesis work, a mock-up prototype has been put forward. Its intent is to bridge the gap identified by mapping manufacturers' needs and the functionality of available CAD/CAM software against the areas of improvement from the workflow investigations. The proposed solution, as presented in the mock-up prototype, has been validated together with three of the studied companies. In its current state, further improvements and validations are needed. Nevertheless, if further developed, it has the potential to create value within the entire manufacturing value chain. /
To remain competitive in today's fast-moving market, manufacturing companies must shorten their time to market and reduce their costs by using their resources more efficiently. Here, improved software and better software integration throughout the product realization process are considered a key enabler. The goal of this work has been to investigate the current workflow in design and process planning in order to propose a cloud-based application to support these activities. Problems and bottlenecks in these workflows have been identified through interviews and field studies at six Swedish manufacturing companies of different sizes, in different industries, and with different operating models. The most important areas of improvement were identified, of which one, the initial activity of understanding customers' needs, was chosen for further focus. From receiving a request for quotation from the customer to the acceptance of an order, the following time-consuming activities were identified: understanding and discussing design intent and proposing possible design changes to improve manufacturability, analyzing and reviewing 2D drawings and 3D models, and preparing order quotations. In this work, a mock-up prototype has been developed. Its purpose is to bridge a gap identified by mapping the manufacturer's needs and the functionality of available CAD/CAM software against the areas of improvement identified in the workflow investigations. The proposed solution, as presented in the prototype, has been validated together with three of the studied companies. In its current state, further improvements and validations are needed. If further developed, however, it has the potential to create value throughout the entire manufacturing chain.
54

Blockchain-enabled Secure and Trusted Personalized Health Record

Dong, Yibin 20 December 2022 (has links)
Longitudinal personalized electronic health record (LPHR) provides a holistic view of health records for individuals and offers a consistent patient-controlled information system for managing the health care of patients. Except for patients in the Veterans Affairs health care service, however, no LPHR is available for the general population in the U.S. that can integrate patients' existing electronic health records throughout their life of care. Such a gap may be attributed mainly to the fact that existing patients' electronic health records are scattered across multiple health care facilities and are often not shared due to privacy and security concerns from both patients and health care organizations. The main objective of this dissertation is to address these roadblocks by designing a scalable and interoperable LPHR with patient-controlled and mutually trusted security and privacy. Privacy and security are complex problems. Specifically, without a set of access control policies, encryption alone cannot secure patient data due to insider threats. Moreover, in a distributed system like LPHR, a so-called race condition occurs when access control policies are centralized while decision-making processes are localized. We propose a formal definition of secure LPHR and develop a blockchain-enabled next generation access control (BeNGAC) model. The BeNGAC solution focuses on patient-managed secure authorization for access, and NGAC operates in open-access surroundings where users can be centrally known or unknown. We also propose permissioned blockchain technology - Hyperledger Fabric (HF) - to ease the race-condition shortcoming in NGAC, which in return enhances the weak confidentiality protection in HF. Built upon BeNGAC, we further design a blockchain-enabled secure and trusted (BEST) LPHR prototype in which data are stored in a distributed yet decentralized database. The unique feature of the proposed BEST-LPHR is the use of blockchain smart contracts, allowing BeNGAC policies to govern security, privacy, confidentiality, data integrity, scalability, sharing, and auditability. Interoperability is achieved by using a health care data exchange standard called Fast Healthcare Interoperability Resources. We demonstrated the feasibility of the BEST-LPHR design through use case studies. Specifically, a small-scale BEST-LPHR is built as a sharing platform among a patient and health care organizations. In the study setting, patients have been raising additional ethical concerns related to consent and granular control of the LPHR. We engineered a Web-delivered BEST-LPHR sharing platform with patient-controlled consent granularity, security, and privacy realized by BeNGAC. Health organizations holding the patient's electronic health record (EHR) can join the platform with trust based on validation from the patient. The mutual trust is established through a rigorous validation process by both the patient and the built-in HF consensus mechanism. We measured system scalability and showed millisecond-range performance of LPHR permission changes. In this dissertation, we report the BEST-LPHR solution for electronically sharing and managing patients' electronic health records from multiple organizations, focusing on privacy and security concerns. While the proposed BEST-LPHR solution cannot, expectedly, address all problems in LPHR, this prototype aims to increase the EHR adoption rate and reduce LPHR implementation roadblocks.
In the long run, the BEST-LPHR will contribute to improving health care efficiency and the quality of life for many patients. / Doctor of Philosophy / Longitudinal personalized electronic health record (LPHR) provides a holistic view of health records for individuals and offers a consistent patient-controlled information system for managing the health care of patients. Except for patients in the Veterans Affairs health care service, however, no LPHR is available for the general population in the U.S. that can integrate patients' existing electronic health records throughout their life of care. Such a gap may be attributed mainly to the fact that existing patients' electronic health records are scattered across multiple health care facilities and are often not shared due to privacy and security concerns from both patients and health care organizations. The main objective of this dissertation is to address these roadblocks by designing a scalable and interoperable LPHR with patient-controlled and mutually trusted security and privacy. We propose a formal definition of secure LPHR and develop a novel blockchain-enabled next generation access control (BeNGAC) model that can protect the security and privacy of the LPHR. Built upon BeNGAC, we further design a blockchain-enabled secure and trusted (BEST) LPHR prototype in which data are stored in a distributed yet decentralized database. The health records on BEST-LPHR are personalized to the patients with patient-controlled security, privacy, and granular consent. The unique feature of the proposed BEST-LPHR is the use of blockchain technology allowing BeNGAC policies to govern security, privacy, confidentiality, data integrity, scalability, sharing, and auditability. Interoperability is achieved by using a health care data exchange standard. We demonstrated the feasibility of the BEST-LPHR design through use case studies. Specifically, a small-scale BEST-LPHR is built as a sharing platform among a patient and health care organizations. We engineered a Web-delivered BEST-LPHR sharing platform with patient-controlled consent granularity, security, and privacy realized by BeNGAC. Health organizations holding the patient's electronic health record (EHR) can join the platform with trust based on validation from the patient. The mutual trust is established through a rigorous validation process by both the patient and the built-in blockchain consensus mechanism. We measured system scalability and showed millisecond-range performance of LPHR permission changes. In this dissertation, we report the BEST-LPHR solution for electronically sharing and managing patients' electronic health records from multiple organizations, focusing on privacy and security concerns. While the proposed BEST-LPHR solution cannot, expectedly, address all problems in LPHR, this prototype aims to increase the EHR adoption rate and reduce LPHR implementation roadblocks. In the long run, the BEST-LPHR will contribute to improving health care efficiency and the quality of life for many patients.
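To make the access-control idea above concrete, here is a minimal, hypothetical sketch of the kind of attribute-based, deny-by-default decision an NGAC-style policy point makes. The attribute names, policy entries, and class layout are illustrative assumptions only; in BEST-LPHR such patient-managed policies are evaluated and enforced by Hyperledger Fabric smart contracts rather than by in-process Python.

```python
# Hypothetical, minimal NGAC-style check: a patient-managed policy associates
# user attributes with (operation, object attribute) pairs; anything not
# explicitly associated is denied.
from dataclasses import dataclass, field

@dataclass
class Policy:
    associations: dict = field(default_factory=dict)   # user attr -> {(op, obj attr)}

    def grant(self, user_attr, operation, object_attr):
        self.associations.setdefault(user_attr, set()).add((operation, object_attr))

    def decide(self, user_attrs, operation, object_attrs):
        """Permit only if some user attribute is associated with this operation
        on some attribute of the target object (deny by default)."""
        return any((operation, obj) in self.associations.get(user, set())
                   for user in user_attrs for obj in object_attrs)

policy = Policy()
policy.grant("cardiology_staff", "read", "cardiology_record")   # patient-granted consent

print(policy.decide({"cardiology_staff"}, "read",  {"cardiology_record"}))   # True
print(policy.decide({"cardiology_staff"}, "write", {"cardiology_record"}))   # False: no consent
```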
55

Practical Mitigations Against Memory Corruption and Transient Execution Attacks

Ismail, Mohannad Adel Abdelmoniem Ahmed 31 May 2024 (has links)
Memory corruption attacks have existed in C and C++ for more than 30 years, and over the years many defenses have been proposed. In addition, a new class of attacks, Spectre, has emerged that abuses speculative execution to leak secrets and sensitive data through micro-architectural side channels. Many defenses have been proposed to mitigate Spectre as well. However, with every new defense a new attack emerges, and then a new defense is proposed; this is an ongoing cycle between attackers and defenders. Many defenses exist for many different attack avenues, but many suffer from either practicality or effectiveness issues, and security researchers need to balance these compromises. Recently, many hardware vendors, such as Intel and ARM, have recognized the extent of the memory corruption problem and have developed hardware security mechanisms that can be utilized to defend against these attacks. ARM, in particular, has released a mechanism called Pointer Authentication whose main intended use is to protect the integrity of pointers by generating a Pointer Authentication Code (PAC), using a cryptographic hash function as a Message Authentication Code (MAC), and placing it in the top unused bits of a 64-bit pointer. Placing the PAC in the top unused bits of the pointer changes its semantics, and the pointer cannot be used unless it is properly authenticated. Hardware security features such as PAC are merely mechanisms, not full-fledged defenses, and their effectiveness and practicality depend on how they are utilized. Naive use of these features does not alleviate the issues that exist in many state-of-the-art software defenses. A defense that utilizes these hardware security features must be designed with both practicality and effectiveness in mind, and having both is now a realistic possibility with these new hardware security features. This dissertation describes utilizing hardware security features, namely ARM PAC, to build effective and practical defense mechanisms. It first describes my past work, PACTight, a PAC-based defense mechanism that defends against control-flow hijacking attacks. PACTight defines three security properties of a pointer that, if achieved, prevent pointers from being tampered with: 1) unforgeability: a pointer p should always point to its legitimate object; 2) non-copyability: a pointer p can only be used when it is at its specific legitimate location; 3) non-dangling: a pointer p cannot be used after it has been freed. PACTight tightly seals pointers and guarantees that a sealed pointer cannot be forged, copied, or left dangling. PACTight protects all sensitive pointers, which are code pointers and pointers that point to code pointers, completely preventing control-flow hijacking attacks while incurring low performance overhead. In addition, this dissertation proposes Scope-Type Integrity (STI), a new defense policy that enforces that pointers are used in the manner the programmer intended, by utilizing scope, type, and permission information. STI collects information offline about the type, scope, and permission (read/write) of every pointer in the program; this information is then used at runtime to ensure that pointers comply with their intended purpose. This allows STI to defeat advanced pointer attacks, since these attacks typically violate the scope, type, or permission. We present Runtime Scope-Type Integrity (RSTI).
RSTI leverages ARM Pointer Authentication (PA) to generate Pointer Authentication Codes (PACs) based on the information from STI and places these PACs in the top bits of the pointer. At runtime, the PACs are checked to ensure that pointer usage complies with STI. RSTI overcomes two drawbacks of PACTight: 1) PACTight relied on large external metadata for protection, whereas RSTI uses very little metadata; and 2) PACTight protected only a subset of pointers, whereas RSTI protects all pointers in a program. RSTI has broad coverage with relatively low overhead. This dissertation also proposes sPACtre, a novel defense mechanism that aims to prevent Spectre control-flow attacks on existing hardware. sPACtre is an ARM-based defense mechanism that prevents Spectre control-flow attacks by relying on ARM's Pointer Authentication hardware security feature, on annotations added to the program marking the secrets that need to be protected from leakage, and on a dynamic tag-based bounds-checking mechanism for arrays. We show that sPACtre can defend against these attacks. We evaluate sPACtre on a variety of cryptographic libraries with several cryptographic algorithms, as well as a synthetic benchmark, and show that it is efficient and has low performance overhead. Finally, this dissertation outlines a new direction for utilizing hardware security features to protect energy-harvesting devices from checkpoint-recovery errors and malicious attackers. / Doctor of Philosophy / In recent years, cyber-threats against computer systems have become more and more prevalent. In spite of many recent advancements in defenses, these attacks are becoming more threatening, yet many of these defenses are not deployed in the real world because of their high performance overhead. This limited efficiency is not acceptable in practice. In addition, many of these defenses have limited coverage and do not address a wide variety of attacks, which makes the performance tradeoff even less convincing. Thus, there is a need for effective and practical defenses that cover a wide variety of attacks. This dissertation first provides a comprehensive overview of the current state-of-the-art and most dangerous attacks. More specifically, three types of attacks are examined. First, control-flow hijacking attacks, which divert the proper execution of a program to a malicious execution. Second, data-oriented attacks, which leak sensitive data in a program. Third, Spectre attacks, which rely on supposedly hidden processor features to leak sensitive data; these "hidden" features are not entirely hidden. This dissertation explains these attacks in detail, along with the corresponding state-of-the-art defenses that have been proposed by the security research community to mitigate them. The dissertation then discusses effective and practical defense mechanisms that can mitigate these attacks, covering the past work, PACTight, as well as its contributions, RSTI and sPACtre, and presenting the full design, threat model, implementation, security evaluation, and performance evaluation of each of these mechanisms. The dissertation relies on insights derived from the nature of the attacks and on compiler techniques. A compiler is a tool that transforms human-written code into machine code understandable by the computer; it can be modified to make programs more secure.
The past work, PACTight, is a defense mechanism that defends against the first type of attack, control-flow hijacking, by preventing an attacker from abusing specific code in the program to divert the program to a malicious execution. This dissertation then presents RSTI, a new defense mechanism that overcomes the limitations of PACTight and extends it to cover data-oriented attacks, preventing attackers from leaking sensitive data from the program. In addition, this dissertation presents sPACtre, a novel defense mechanism that defends against Spectre attacks and prevents an attacker from abusing a processor's hidden features. Finally, this dissertation briefly discusses a possible future direction to protect a different class of devices, referred to as energy-harvesting devices, from attackers.
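The pointer-authentication scheme described above can be modeled conceptually: a short keyed MAC over the pointer value and a context modifier (for RSTI, scope/type/permission information) is packed into the unused upper bits of a 64-bit pointer and verified before the pointer is used. The sketch below simulates that idea in Python with a truncated HMAC; the key, bit layout, and context string are illustrative assumptions, and real ARM Pointer Authentication uses dedicated instructions and hardware-held keys rather than anything like this.

```python
# Conceptual model of pointer authentication: pack a truncated MAC over
# (pointer, context) into the unused upper bits of a 64-bit pointer, and verify
# it before use. The 48-bit address / 16-bit PAC split is an assumption.
import hmac, hashlib

KEY = b"per-process-secret-key"          # stands in for a hardware PA key
ADDR_BITS = 48
ADDR_MASK = (1 << ADDR_BITS) - 1

def _pac(ptr, context):
    msg = ptr.to_bytes(8, "little") + context.encode()
    return int.from_bytes(hmac.new(KEY, msg, hashlib.sha256).digest()[:2], "little")

def sign(ptr, context):
    """Embed a 16-bit PAC in the top bits of the pointer."""
    ptr &= ADDR_MASK
    return (_pac(ptr, context) << ADDR_BITS) | ptr

def authenticate(signed_ptr, context):
    """Strip and verify the PAC; fail if the pointer was forged or corrupted."""
    ptr, pac = signed_ptr & ADDR_MASK, signed_ptr >> ADDR_BITS
    if pac != _pac(ptr, context):
        raise ValueError("pointer authentication failure")
    return ptr

# RSTI-style context: scope, type, and permission information as the modifier.
ctx = "scope:parse_request|type:void(*)()|perm:rx"
p = sign(0x7FFF_DEAD_BEEF, ctx)
assert authenticate(p, ctx) == 0x7FFF_DEAD_BEEF        # legitimate use succeeds
```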
56

Edge-based blockchain enabled anomaly detection for insider attack prevention in Internet of Things

Tukur, Yusuf M., Thakker, Dhaval, Awan, Irfan U. 31 March 2022 (has links)
Yes / Internet of Things (IoT) platforms are responsible for overall data processing in the IoT system. This ranges from analytics and big data processing to gathering all sensor data over time to analyze and produce long-term trends. However, this comes with a prohibitively high demand for resources such as memory, computing power, and bandwidth, which the highly resource-constrained IoT devices lack, so they must send data to the platforms to achieve efficient operation. This results in poor availability and a risk of data loss due to a single point of failure should the cloud platforms suffer attacks. The integrity of the data can also be compromised by an insider, such as a malicious system administrator, without leaving traces of their actions. To address these issues, we propose in this work an edge-based blockchain-enabled anomaly detection technique to prevent insider attacks in IoT. The technique first employs the power of edge computing to reduce latency and bandwidth requirements by taking processing closer to the IoT nodes, hence improving availability and avoiding a single point of failure. It then leverages aspects of sequence-based anomaly detection, integrating the distributed edge with a blockchain that offers smart contracts to detect and correct abnormalities in incoming sensor data. Evaluation of our technique using real IoT system datasets showed that it achieved the intended purpose remarkably well, while ensuring the integrity and availability of the data, which is critical to IoT success. / Petroleum Technology Development Fund (PTDF) Nigeria, Grant/Award Number: PTDF/ED/PHD/TYM/858/16
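A minimal sketch of one common flavor of sequence-based anomaly detection, of the kind the technique above builds on: discretize sensor readings, learn the set of n-grams observed during a trusted baseline period, and flag incoming windows that contain unseen n-grams. The baseline data, bin width, and window values are illustrative assumptions, and the blockchain/smart-contract enforcement described in the paper is omitted here.

```python
# Sketch of sequence-based anomaly detection at the edge: learn the n-grams of
# discretized readings from a trusted baseline, then flag incoming windows that
# contain n-grams never seen before (e.g. values altered by an insider).

def discretize(values, bin_width=5.0):
    return tuple(int(v // bin_width) for v in values)

def ngrams(seq, n=3):
    return {seq[i:i + n] for i in range(len(seq) - n + 1)}

def train(baseline_readings, n=3):
    return ngrams(discretize(baseline_readings), n)

def is_anomalous(window, model, n=3):
    """A window is anomalous if any of its n-grams was never seen in training."""
    return bool(ngrams(discretize(window), n) - model)

# Nominal temperature readings hover around 21-24 C; a tampered batch jumps to 80 C.
baseline = [21.0, 21.5, 22.0, 23.0, 23.5, 24.0, 23.0, 22.0, 21.5, 22.5] * 5
model = train(baseline)

print(is_anomalous([22.0, 22.5, 23.0, 23.5], model))   # False: consistent with baseline
print(is_anomalous([22.0, 80.0, 81.0, 22.5], model))   # True: injected readings
```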
57

Toward privacy-preserving component certification for metal additive manufacturing

Bappy, Mahathir Mohammad 13 August 2024 (has links) (PDF)
Metal-based additive manufacturing (AM) has emerged as a cutting-edge technology for fabricating complex geometries with high precision. However, the major challenges to the wider adoption of metal AM technologies are process-uncertainty-induced quality issues. Consequently, there is an urgent need for fast and reliable certification techniques for AM components, which can be achieved by leveraging Artificial Intelligence (AI)-enabled modeling. Developing a robust AI-enabled model presents a significant challenge because acquiring diverse, high-volume datasets is costly and time-intensive. In this context, the data-sharing attributes of Manufacturing-as-a-Service (MaaS) platforms can facilitate the development of AI-enabled certification techniques in a collaborative manner. However, sharing process data raises critical concerns about protecting users' intellectual property and privacy, since it contains confidential product design information. To address these challenges, the overarching goal of this research is to investigate how process data and process physics can be leveraged to develop in-situ component certification techniques focusing on data privacy for metal AM systems. This dissertation aims to address the need for novel quality monitoring methodologies by utilizing diverse data sources derived from a range of printed samples. Specifically, the research effort focuses on 1) the use of in-situ thermal history data and ex-situ X-ray computed tomography data to develop a real-time, layer-wise anomaly detection method by analyzing the morphological dynamics of melt pool images; 2) the development of a framework to evaluate the design-information disclosure of various thermal-history-based feature extraction methods for anomaly detection; and 3) the development of a privacy-preserving and utility-aware adaptive AM data de-identification method that takes thermal history data as input.
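As a rough illustration of layer-wise monitoring from melt pool morphology, the sketch below computes two simple shape features from synthetic binary melt pool masks and flags layers whose features deviate strongly from a nominal baseline. The features, threshold, and synthetic masks are illustrative assumptions, not the dissertation's method, which operates on real in-situ thermal history and X-ray CT data.

```python
# Sketch: layer-wise anomaly flagging from melt pool morphology. Each layer is a
# binary melt pool mask; two shape features (area, bounding-box aspect ratio)
# are z-scored against a baseline of nominal layers.
import numpy as np

def melt_pool_features(mask):
    ys, xs = np.nonzero(mask)
    area = xs.size
    aspect = (xs.max() - xs.min() + 1) / (ys.max() - ys.min() + 1)
    return np.array([area, aspect], dtype=float)

def make_mask(width, height, size=64):
    mask = np.zeros((size, size), dtype=bool)
    cy, cx = size // 2, size // 2
    mask[cy - height // 2:cy + height // 2, cx - width // 2:cx + width // 2] = True
    return mask

rng = np.random.default_rng(0)
baseline = np.stack([melt_pool_features(make_mask(10 + rng.integers(0, 3),
                                                  8 + rng.integers(0, 3)))
                     for _ in range(30)])
mu, sigma = baseline.mean(axis=0), baseline.std(axis=0) + 1e-9

def is_anomalous(mask, threshold=4.0):
    z = np.abs((melt_pool_features(mask) - mu) / sigma)
    return bool(z.max() > threshold)

print(is_anomalous(make_mask(11, 9)))   # False: nominal melt pool
print(is_anomalous(make_mask(30, 4)))   # True: overly elongated pool
```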
58

Verification and Validation of an AI-Enabled System

Ibukun Phillips (6622694) 11 November 2024 (has links)
<p dir="ltr">Recent advancements in Machine Learning (ML) algorithms and increasing computational power have driven significant progress in Artificial Intelligence (AI) systems, especially those that leverage ML techniques. These AI-enabled systems incorporate components and data designed to simulate learning and problem-solving, distinguishing them from traditional systems. Despite their widespread application across various industries, certifying AI systems through verification and validation remains a formidable challenge. This difficulty primarily arises from the probabilistic nature of AI and ML components, which leads to unpredictable behaviors.</p><p dir="ltr">This dissertation investigates the verification and validation aspects within the Systems Engineering (SE) lifecycle, utilizing established frameworks and methodologies that support system realization from inception to retirement. It is comprised of three studies focused on applying formal methods, particularly model checking, to enhance the accuracy, value, and trustworthiness of models of engineered systems that use digital twins for modeling the system. The research analyzes digital twin data to understand physical asset behavior more thoroughly by applying both an exploratory method, system identification, and a confirmatory technique, machine learning. This dual approach not only aids in uncovering unknown system dynamics but also enhances the validation process, contributing to a more robust modeling framework.</p><p dir="ltr">The findings provide significant insights into the model-based design of AI-enabled digital twins, equipping systems engineers<del>,</del> and researchers with methods for effectively designing, simulating and modeling complex systems. Ultimately, this work aims to bridge the certification gap in AI-enabled technologies, thereby increasing public trust and facilitating the broader adoption of these innovative systems.</p>
59

A Sociotechnical Systems Analysis of Building Information Modelling (STSaBIM) Implementation in Construction Organisations

Sackey, Enoch January 2014 (has links)
The concept of BIM is nascent but evolving rapidly; thus, its deployment has become the latest shibboleth amongst both academics and practitioners in the construction sector in recent years. Owing to construction clients' buy-in of the BIM concept, the entire industry is encouraged to pursue a vision of changing work practices in line with BIM ideas. Existing research also recognises that the implementation of BIM affects all areas of the construction process, from the design of the building, through the organisation of projects, to the way in which the construction process is executed and how the finished product is maintained. The problem, however, is that existing research on technology utilisation in general, and the BIM literature in particular, has offered limited help to practitioners trying to implement BIM, focusing predominantly on technology-centric views. Not surprisingly, therefore, the current BIM literature emphasises topics such as capability maturity models and anticipated outcomes of BIM rollouts; rarely does the extant literature offer practitioners a cohesive approach to BIM implementation. Such technology-centric views inevitably represent a serious barrier to utilising the inscribed capabilities of BIM. This research is therefore predicated on the need to strengthen BIM implementation theory through monitoring and analysing its implementation in practice. Thus, the focus of this thesis is to carry out a sociotechnical systems (STS) analysis of BIM implementation in construction organisations. The concept of STS accommodates the dualism of the inscribed functions of BIM technologies and the contextual issues in the organisations, and allows for the analysis of their interactive combination in producing the anticipated effect from BIM appropriation. An interpretive research methodology is adopted to study practitioners through a change process involving the implementation of BIM in their work contexts. The study is based on constructivist ontological interpretations of participants. It adopts an abductive research approach, which ensures a back-and-forth movement between the research sites and the theoretical phenomenon, effectively comparing the empirical findings with existing theories to eventually generate new theoretical understanding and knowledge regarding the phenomenon under investigation. A two-stage process is also formulated for the empirical data collection, comprising: 1) an initial exploratory study to help establish the framework for analysing BIM implementation in the construction context; and 2) a case-study approach to provide a context for formulating novel understanding and validation of theory regarding BIM implementation in construction organisations. The analysis and interpretation of the empirical work follow the qualitative content analysis technique to observe and reflect on the results. The findings have shown that BIM implementation demands a complete break from the status quo. Contrary to the prevailing understanding of a top-down approach to BIM utilisation, the study revealed that different organisations with a plethora of visions, expectations and skills combine with artefacts to form or transform BIM practices. The rollout and appropriation of BIM occur when organisations shape sociotechnical systems of institutions, processes and technologies to support certain practices over others.
The study also showed that BIM implementation endures in a causal chain of influences, as different project organisations with their localised BIM ambitions and expectations combine to develop holistic BIM-enabled project visions. Thus, distributed responsibilities for holistic BIM protocols among the different levels of influence are instituted and enforced under binding contractual obligations. The study has illuminated the centrality of both the technical challenges and the sociological factors in shaping BIM deployment in construction. It is also one of the few studies to have produced accounts of BIM deployment that are strongly mediated by the institutional contexts of construction organisations. However, it is acknowledged that the research's focus on qualitative interpretive enquiry does not support hard-and-fast generalisation from specific cases to broader populations or contexts. It is therefore suggested that further quantitative studies, using a much larger sample of BIM-enabled construction organisations, could provide an interesting point of comparison with the conclusions derived from the research findings.
60

Multiplexing Techniques and Design-Automation Tools for FRET-Enabled Optical Computing

Mottaghi, Mohammad January 2014 (has links)
FRET-enabled optical computing is a new computing paradigm that uses the energy of incident photons to perform computation in molecular-scale circuits composed of inter-communicating photoactive molecules. Unlike conventional computing approaches, computation in these circuits does not require any electric current; instead, it relies on the controlled migration of energy in the circuit through a phenomenon called Förster Resonance Energy Transfer (FRET). This, coupled with other unique features of FRET circuits, can enable computing in new domains that are unachievable by conventional semiconductor-based computing, such as in-cell computing or targeted drug delivery. In this thesis, we explore novel FRET-based multiplexing techniques to significantly increase the storage density of optical storage media. Further, we develop analysis algorithms and computer-aided design tools for FRET circuits.
Existing computer-aided design tools for FRET circuits are predominantly ad hoc and specific to particular functionalities. We develop a generic design-automation framework for FRET-circuit optimization that is not limited to any particular functionality. We also show that, within a fixed time budget, the low speed of Monte-Carlo-based FRET-simulation (MCS) algorithms can have a potentially significant negative impact on the quality of the design process; to address this issue, we design and implement a fast FRET-simulation algorithm that is up to several million times faster than existing MCS algorithms. We finally exploit the unique features of FRET-enabled optical computing to develop novel multiplexing techniques that enable orders-of-magnitude higher storage density than conventional optical storage media, such as DVD or Blu-ray. / Dissertation
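For context, the efficiency of a single FRET transfer falls off with the sixth power of the donor-acceptor separation, E = 1 / (1 + (r/R0)^6), which is what makes controlled energy migration, and its Monte Carlo simulation, possible. The toy sketch below runs a naive Monte Carlo estimate of end-to-end transfer along a short fluorophore chain and checks it against the analytic product of per-hop efficiencies; the Förster radius, spacings, and trial count are illustrative assumptions, and this is not the thesis's fast simulation algorithm.

```python
# Toy Monte Carlo of energy migration along a chain of fluorophores: at each gap
# the exciton transfers with probability equal to the Forster efficiency
# E = 1 / (1 + (r/R0)**6), otherwise the trial ends (decay).
import random

R0 = 5.0                        # assumed Forster radius, nm
SPACINGS = [4.0, 5.0, 6.0]      # assumed donor->A1->A2->A3 separations, nm

def fret_efficiency(r, r0=R0):
    return 1.0 / (1.0 + (r / r0) ** 6)

def reaches_end(spacings):
    """Single trial: does the exciton hop across every gap before decaying?"""
    return all(random.random() < fret_efficiency(r) for r in spacings)

def transfer_probability(spacings, trials=100_000):
    return sum(reaches_end(spacings) for _ in range(trials)) / trials

random.seed(42)
print(f"Monte Carlo end-to-end transfer ~ {transfer_probability(SPACINGS):.3f}")

# Analytic check: the product of the per-hop efficiencies.
analytic = 1.0
for r in SPACINGS:
    analytic *= fret_efficiency(r)
print(f"analytic product of efficiencies = {analytic:.3f}")
```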
