1.
A Cloud-based Surveillance and Performance Management Architecture for Community Healthcare. Eze, Benjamin, 03 June 2019
Governments and healthcare providers are under increasing pressure to streamline their processes to reduce operational costs while improving service delivery and quality of care. Systematic performance management of healthcare processes is important to ensure that quality of care goals are being met at all levels of the healthcare ecosystem. The challenge is that measuring these goals requires the aggregation and analysis of large amounts of data from various stakeholders in the healthcare industry. With the lack of interoperability between stakeholders in current healthcare compute and storage infrastructure, as well as the volume of data involved, our ability to measure quality of care across the healthcare system is limited.
Cloud computing is an emerging technology that can help provide the needed interoperability and management of large volumes of data across the entire healthcare system. Cloud computing could be leveraged to integrate heterogeneous healthcare data silos if a regional health authority provided data hosting with appropriate patient identity management and privacy compliance.
This thesis proposes a cloud-based architecture for surveillance and performance management of community healthcare. Our contributions address five critical roadblocks to interoperability in a cloud computing context: infrastructure for surveillance and performance management services, a common data model, a patient identity matching service, an anonymization service, and a privacy compliance model. Our results are validated through a pilot project and two experimental case studies conducted in collaboration with a regional health authority for community care.
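The patient identity matching and anonymization services described above could, for instance, combine deterministic record linkage with keyed pseudonyms, so that records from heterogeneous data silos link on the same patient without exposing identity. The sketch below is illustrative only, not the architecture's actual implementation; the field names and the HMAC-based pseudonymization scheme are assumptions.

```python
import hashlib
import hmac

def matching_key(first_name: str, last_name: str, birth_date: str) -> str:
    """Build a normalized deterministic key for patient record linkage:
    the same person yields the same key regardless of casing or spacing."""
    parts = [first_name.strip().lower(), last_name.strip().lower(), birth_date.strip()]
    return "|".join(parts)

def pseudonymize(key: str, secret: bytes) -> str:
    """Replace the identifying key with a keyed hash (HMAC-SHA256), so the
    hosting authority can link records without storing raw identities."""
    return hmac.new(secret, key.encode("utf-8"), hashlib.sha256).hexdigest()
```

Two silos that normalize to the same matching key then produce the same pseudonym, while the identity itself never leaves the regional authority's key boundary.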
2.
Privacy-aware Use of Accountability Evidence. Reuben, Jenni, January 2017
This thesis deals with the evidence that enables accountability, the privacy risks involved in using it, and a privacy-aware solution to the problem of unauthorized evidence disclosure. Legal means to protect an individual's privacy are anchored in the data protection perspective, i.e., in the responsible collection and use of personal data. Accountability plays a crucial role in such legal privacy frameworks for assuring an individual's privacy. In the European context, the accountability principle is pervasive in the measures mandated by the General Data Protection Regulation. In general, these measures are technically achieved through automated privacy audits. System traces that record system activities are the essential inputs to those automated audits. Nevertheless, the traces that enable accountability are themselves subject to privacy risks, because in most cases they inform about the processing of personal data. Therefore, ensuring the privacy of the accountability traces is as important as ensuring the privacy of the personal data themselves. By and large, however, research involving accountability traces is concerned with storage, interoperability, and analytics challenges rather than with the privacy implications of processing them. This dissertation focuses both on the application of accountability evidence, as in automated privacy audits, and on its privacy-aware use. The overall aim of the thesis is to provide a conceptual understanding of the privacy compliance research domain and to contribute to solutions that promote privacy-aware use of the traces that enable accountability. To address the first part of this objective, a systematic study of the existing body of knowledge on automated privacy compliance is conducted, and the state of the art is conceptualized as taxonomies.
The second part of the objective is accomplished through two results: first, a systematic understanding of the privacy challenges involved in processing system traces is obtained; second, a model for privacy-aware access restrictions is proposed and formalized in order to prevent illegitimate access to the traces. Access to accountability traces such as provenance is required for the automatic fulfillment of accountability obligations, but the traces themselves contain personally identifiable information; this thesis therefore provides a solution that prevents unauthorized access to provenance traces.
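A privacy-aware access restriction of the kind motivated above can be illustrated as a purpose-based check on who may read accountability traces. The roles, purposes, and policy entries below are hypothetical examples, not the thesis's formal access control model:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TraceEntry:
    subject_id: str   # the data subject the provenance record is about
    activity: str     # the recorded processing activity

# Hypothetical policy: (role, purpose) pairs permitted to read traces.
POLICY = {
    ("auditor", "compliance-audit"),
    ("dpo", "compliance-audit"),
    ("dpo", "breach-investigation"),
}

def may_access(role: str, purpose: str) -> bool:
    """Grant access only when both the requester's role and the stated
    purpose of access appear together in the policy."""
    return (role, purpose) in POLICY

def read_traces(traces, role, purpose):
    """Gate every read of the provenance traces behind the policy check."""
    if not may_access(role, purpose):
        raise PermissionError(f"{role} may not read traces for {purpose}")
    return traces
```

The point of the sketch is that access is conditioned on purpose as well as identity, so a legitimate auditor still cannot repurpose the traces outside the audit.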
3.
Complying with the GDPR in the context of continuous integration. Li, Ze Shi, 08 April 2020
The full enforcement of the General Data Protection Regulation (GDPR), which began on May 25, 2018, requires any organization that collects and/or processes personal data of European Union citizens to comply with a series of stringent and comprehensive privacy regulations. Many software organizations struggled to comply with the entirety of the GDPR both in the lead-up to and after the enforcement deadline. Previous studies of the GDPR have primarily focused on its implications for users and organizations, using surveys or interviews. However, there is a dearth of in-depth studies that investigate compliance practices and challenges inside software organizations. In particular, small and medium enterprises are often neglected in these studies, despite representing the majority of organizations in the EU. Furthermore, organizations that practice continuous integration have largely been ignored in studies of GDPR compliance. Using design science methodology, we conducted an in-depth study, over a span of 20 months, of GDPR compliance practices and challenges in collaboration with a small startup. Our first step was to identify our collaborator's business problems. We then iteratively developed two artifacts to address those problems: a set of privacy requirements operationalized from GDPR principles, and an automated tool that tests these GDPR-derived privacy requirements. This design science approach yielded five implications for research and practice regarding ongoing compliance challenges. For instance, our research reveals that GDPR regulations can be partially operationalized and tested through automated means, which is advantageous for achieving long-term compliance. In contrast, more research is needed to create efficient and effective means of disseminating and managing GDPR knowledge among software developers.
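One way an automated tool might test a GDPR-derived privacy requirement in a CI pipeline, for example "persisted logs must not contain personal data", is sketched below. The redaction rule and function names are illustrative assumptions, not the study's actual artifact:

```python
import re

# Naive pattern for one category of personal data (e-mail addresses).
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def scrub_personal_data(log_line: str) -> str:
    """Redact e-mail addresses before a log line is persisted."""
    return EMAIL_RE.sub("[REDACTED]", log_line)

def find_violations(log_lines):
    """CI gate: return every persisted log line that still contains
    an e-mail address; a non-empty result fails the build."""
    return [line for line in log_lines if EMAIL_RE.search(line)]
```

Because the check runs on every integration, a regression that reintroduces personal data into the logs is caught immediately rather than at audit time, which matches the abstract's observation that partial operationalization supports long-term compliance.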
4.
Privacy-Preserving Ontology Publishing for EL Instance Stores. Baader, Franz; Kriegel, Francesco; Nuradiansya, Adrian, 26 June 2020
We make a first step towards adapting an existing approach for privacy-preserving publishing of linked data to Description Logic (DL) ontologies. We consider the case where both the knowledge about individuals and the privacy policies are expressed using concepts of the DL EL, which corresponds to the setting where the ontology is an EL instance store. We introduce the notions of compliance of a concept with a policy and of safety of a concept for a policy, and show how optimal compliant (safe) generalizations of a given EL concept can be computed. In addition, we investigate the complexity of the optimality problem.
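The compliance notion can be made concrete for TBox-free EL concepts: a concept complies with a policy if it is subsumed by none of the policy concepts, i.e., publishing it does not entail any protected concept. The sketch below uses the classic structural subsumption test for EL without a TBox; the encoding of concepts as name sets plus existential restrictions is an assumption for illustration, not the paper's implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ELConcept:
    names: frozenset   # concept names in the top-level conjunction
    existentials: tuple  # pairs (role, ELConcept) for restrictions ∃r.C

def subsumed_by(c: ELConcept, d: ELConcept) -> bool:
    """Structural subsumption for TBox-free EL: c ⊑ d iff every concept name
    of d occurs in c, and every ∃r.D' of d is matched by some ∃r.C' of c
    with C' ⊑ D' (checked recursively)."""
    if not d.names <= c.names:
        return False
    return all(
        any(r == s and subsumed_by(c2, d2) for s, c2 in c.existentials)
        for r, d2 in d.existentials
    )

def compliant(c: ELConcept, policy) -> bool:
    """c complies with the policy if no policy concept subsumes c."""
    return not any(subsumed_by(c, p) for p in policy)
```

For example, a concept asserting Patient ⊓ ∃finding.Cancer would violate a policy forbidding ∃finding.Cancer, while Patient alone would comply; an optimal compliant generalization weakens the concept just enough to restore compliance.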