261
Methods, Policies and Technologies for Compliance-aware Management of Electronic Health Records. Stevovic, Jovan. January 2014.
Medical record sharing across healthcare organisations is fundamental for improving quality of care and reducing assistance costs. However, healthcare organisations still struggle to build cross-organisation data sharing solutions due to strict data protection regulations that vary across states and regions, the availability of a variety of technical standards for medical record sharing, and differences among organisations’ IT infrastructures, which have been built over the years to satisfy organisation-specific needs and requirements. This thesis reports our findings from various research and industrial projects aimed at connecting healthcare organisations. The primary contributions of this dissertation are:
• A methodology and an execution environment to define and execute cross-organisation data sharing processes in compliance with both data protection regulations and organisations’ requirements. The methodology consists of multiple steps that start with the extraction of compliance requirements from regulations and the gathering of business requirements from the involved stakeholders, and end with the definition of data sharing processes and policies that satisfy the collected requirements. The modelling framework that supports the methodology provides users with the modelling tools and guidelines to define the business processes and policies for sharing privacy-sensitive data. The execution framework maps the business processes into actionable operations to manage privacy-sensitive data and data protection policies.
• An event-driven service integration approach to support cross-organisation data sharing. The integration approach focuses on identifying data dependencies among institutions (i.e., data they produce, consume and would like to exchange) in the form of events rather than analysing internal data structures. To support this approach, we propose a privacy-aware event-driven data-sharing protocol and a system architecture based on a combination of the Service-Oriented (SOA) and Event-Driven (EDA) architectural patterns. The data-sharing protocol and the underlying fine-grained access control policies provide control over the access and dissemination of sensitive information among the involved organisations.
• A set of algorithms to detect and prevent access control policy violations in data integration caused by the presence of functional dependencies. In data integration, each source typically specifies its local access control policies and cannot anticipate the functional dependencies among sets of attributes (or any other type of data inference) that can arise when data is integrated. Functional dependencies can allow malicious users to obtain prohibited information by linking multiple queries, thus violating the local policies. To solve such issues, we propose algorithms to identify the sets of queries that can lead to such privacy violations, and further algorithms to derive additional policies that prevent the identified queries from completing and thus avert the violations (see the sketch after this abstract).
We show how the proposed solutions have been applied in practice in building Electronic Health Record and Business Intelligence systems that involve cross-organisation sharing of privacy-sensitive data. The thesis also reports the validation of the proposed technologies with end users and privacy experts, and the lessons learned after deploying an instance of the developed system in a multi-organisation scenario in Trentino, Italy.
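To convey the intuition behind the third contribution, the following is a minimal sketch, with hypothetical attribute, query and policy names, of how functional dependencies can be chased to flag sets of queries that jointly expose a prohibited attribute combination. The thesis algorithms are more sophisticated; this brute-force version only illustrates the idea.

```python
from itertools import chain, combinations

def closure(attrs, fds):
    """Attribute closure of `attrs` under FDs given as (lhs, rhs) set pairs."""
    closed = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= closed and not rhs <= closed:
                closed |= rhs
                changed = True
    return closed

def violating_query_sets(queries, fds, prohibited):
    """Flag every subset of queries whose combined, FD-closed attribute set
    covers a prohibited attribute combination (exponential; a sketch only)."""
    violations = []
    ids = list(queries)
    for r in range(1, len(ids) + 1):
        for subset in combinations(ids, r):
            combined = set(chain.from_iterable(queries[q] for q in subset))
            if any(p <= closure(combined, fds) for p in prohibited):
                violations.append(subset)
    return violations

# Hypothetical example: neither query alone exposes (name, diagnosis),
# but linking them triggers the FD {zip, birthdate} -> {name}.
queries = {"q1": {"zip", "diagnosis"}, "q2": {"zip", "birthdate"}}
fds = [({"zip", "birthdate"}, {"name"})]
prohibited = [{"name", "diagnosis"}]
print(violating_query_sets(queries, fds, prohibited))  # [('q1', 'q2')]
```

A preventive policy, in this reading, would deny whichever query would complete such a set.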
262
Taking care of end-of-life settings. Di Fiore, Angela. January 2018.
End-of-life care concerns medical services dedicated to incurable patients who are living the last year, or months, of their lives. End-of-life patients have complex social, spiritual and medical needs, and they are usually cared for in family environments, such as home or residential care settings. Nowadays, there is an emerging need for technology to support the collaborative care practices that intertwine families and professionals. However, conducting design processes in sensitive contexts like end-of-life environments presents several challenges due to the high social complexity of the field. In this scenario, this thesis explores the realm of end-of-life care in order to inform design processes in sensitive contexts.
This work is based on two different end-of-life contexts: pediatric palliative care and nursing homes. These studies explored two different types of end-of-life services, showing similar care dynamics but different relational assets. The findings presented in this thesis can inform the conduct of design processes in end-of-life settings by presenting the recurring organizational, communication and relational issues; by analyzing the multifaceted role of technology in such contexts, which can be perceived both as an enabler of quality care and as a threat; and by providing methodological insights on how to embrace the stories of our informants while also taking care of the emotional wellbeing of design researchers.
263
Efficient Solving of the Satisfiability Modulo Bit-Vectors Problem and Some Extensions to SMT. Franzen, Per Anders. January 2010.
Decision procedures for expressive logics such as linear arithmetic, bit-vectors, uninterpreted functions, arrays, or combinations of theories are becoming increasingly important in various areas of hardware and software development and verification, such as test pattern generation, equivalence checking, assertion-based verification and model checking.
In particular, the need for bit-precise reasoning is an important target for research into decision procedures. In this thesis we will describe work on creating an efficient decision procedure for Satisfiability Modulo the Theory of fixed-width bit-vectors, and how such a solver can be used in a real-world application.
We will also introduce some extensions of the basic decision procedure allowing for optimisation and for the compact representation of constraints in an SMT solver, showing how these can be succinctly and elegantly described as a theory, allowing the extension to be implemented with minimal changes to SMT solvers.
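As a flavour of the bit-precise reasoning such a decision procedure enables, the sketch below poses a fixed-width bit-vector query through the Z3 Python API. Z3 is only a stand-in here, not the solver developed in the thesis.

```python
# Find a 32-bit value that is a power of two and greater than 1024,
# using bit-level reasoning (x & (x - 1) == 0 clears the lowest set bit).
from z3 import BitVec, Solver, sat

x = BitVec('x', 32)
s = Solver()
s.add(x & (x - 1) == 0)   # at most one bit set
s.add(x > 1024)           # signed comparison in the bit-vector theory
if s.check() == sat:
    print(s.model())      # e.g. [x = 2048]
```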
264
Statistical Model Checking of Web Services. Schivo, Stefano. January 2010.
In recent years, the increasing interest in the service-oriented paradigm has given rise to a series of supporting tools and languages. In particular, COWS (Calculus for Orchestration of Web Services) has attracted the attention of part of the scientific community for its effort in formalising the semantics of the de facto standard Web Services orchestration language WS-BPEL. The purpose of the present work is to provide the tools for representing and evaluating the performance of Web Services modelled through COWS. To this end, a stochastic version of COWS is proposed: such a language allows us to describe the performance of the modelled systems and thus to represent Web Services from both the qualitative and quantitative points of view. In particular, we provide COWS with an extension that retains the polyadic matching mechanism, so the language can still explicitly model the use of session identifiers. The resulting Scows is then equipped with a software tool which allows us to perform model checking effectively without incurring the state-space explosion problem, which would otherwise thwart the computation even when checking relatively small models. To obtain this result, the proposed tool relies on the statistical analysis of simulation traces, which allows us to deal with large state spaces without the need to explore them completely. This improvement in model checking performance comes at the price of accuracy in the answers provided: for this reason, users can trade off speed against accuracy by modifying a series of parameters. To assess the efficiency of the proposed technique, our tool is compared with a number of existing model checkers.
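To give a feel for the statistical side (a generic sketch of statistical model checking, not the tool's actual code), the probability that a property holds can be estimated from sampled simulation traces, with the sample size taken from the Chernoff-Hoeffding bound so that the estimate is within eps of the truth with confidence 1 - delta:

```python
import math, random

def smc_estimate(simulate_trace, holds, eps=0.05, delta=0.01):
    """Monte Carlo estimate of P(property). By the Chernoff-Hoeffding
    bound, n samples suffice for |estimate - truth| < eps with
    probability at least 1 - delta; no state space is ever built."""
    n = math.ceil(math.log(2 / delta) / (2 * eps ** 2))
    return sum(holds(simulate_trace()) for _ in range(n)) / n

# Hypothetical stand-ins for a stochastic COWS simulator and a property:
simulate = lambda: [random.random() for _ in range(10)]
never_saturates = lambda trace: max(trace) < 0.99
print(smc_estimate(simulate, never_saturates))  # tighter eps/delta => more runs
```

Trading eps and delta against the number of simulation runs is exactly the speed-versus-accuracy knob mentioned above.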
265
On the provenance of digital images. Phan, Quoc-Tin. January 2019.
Digital images are nowadays the most commonly used multimedia data, thanks to the mass manufacturing of cheap acquisition devices coupled with the unprecedented popularity of Online Social Networks. As two sides of the same coin, the massive use of digital images has also triggered the development of user-friendly editing tools and intelligent techniques that violate image authenticity. In this respect, digital images are less and less trustworthy, as they are easily modified not only by experts or researchers but also by inexperienced users. Malicious use of images has also been shown to have a tremendous impact on human perception as well as on system reliability. These concerns highlight the importance of verifying image authenticity. In practice, digital images are created, manipulated, and diffused world-wide via many channels. Simply answering the question "Is an image authentic?" appears insufficient. Further steps aimed at understanding the provenance of images, with respect to the acquisition device and the distribution platforms as well as the processing history, must also be taken. This doctoral study contributes solutions to recover digital image provenance under multiple aspects: i) image acquisition device, ii) social network origin, and iii) source-target disambiguation in image copy-move forgery.
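As a hint of how the acquisition-device aspect is commonly approached (a simplified sketch of the standard PRNU idea, with a Gaussian filter standing in for the wavelet denoisers used in practice):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(img):
    """Approximate sensor noise: the image minus a denoised copy."""
    return img - gaussian_filter(img, sigma=1.0)

def camera_fingerprint(images):
    """Average the residuals of many images from one camera to estimate
    its PRNU (photo-response non-uniformity) fingerprint."""
    return np.mean([noise_residual(i) for i in images], axis=0)

def same_camera(img, fingerprint, threshold=0.01):
    """Correlate a query image's residual against a candidate fingerprint;
    the threshold here is purely illustrative."""
    corr = np.corrcoef(noise_residual(img).ravel(), fingerprint.ravel())[0, 1]
    return corr > threshold
```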
266
Mining Human Behaviors: Automated Behavioral Analysis from Small to Big Data. Staiano, Jacopo. January 2014.
This research thesis aims to address complex problems in Human Behavior Understanding from a computational standpoint: to develop novel methods for enabling machines to capture not only what their sensors are perceiving but also how and why the situation they are presented with is evolving in a certain manner. Touching several fields, from Computer Vision to Social Psychology through Natural Language Processing and Data Mining, we will move from more to less constrained scenarios, describing models for automated behavioral analysis in different contexts: from the individual perspective, e.g. a user interacting with technology, to the group perspective, e.g. a brainstorming session; from living labs, e.g. hundreds of people transparently tracked in their everyday life through smartphone sensors, to the World Wide Web.
267
Engineering Law-Compliant Requirements: the Nomos Framework. Siena, Alberto. January 2010.
In modern societies, both business and private life are deeply pervaded by software and information systems.
Using software has extended human capabilities, allowing information to cross physical and ethical barriers.
To guard against the dangers of misuse, governments are increasingly laying down new laws and introducing obligations, rights and responsibilities concerning the use of software.
As a consequence, laws are assuming a steering role in the specification of software requirements, which must be compliant to avoid fines and penalties.
This work proposes a model-based approach to the problem of law compliance of software requirements. It aims at extending state-of-the-art goal-oriented requirements engineering techniques with the capability to argue about compliance through the use and analysis of models. It is based on a language for modelling legal prescriptions. On top of this language, compliance can be defined as a condition that depends on a set of properties. Such a condition is achieved through an iterative modelling process.
Specifically, we investigated the nature of legal prescriptions to capture their conceptual language. From the jurisprudence literature, we adopted a taxonomy of legal concepts, which has been elaborated and translated into a conceptual meta-model. Moreover, this meta-model was integrated with the meta-model of a goal-oriented modelling language for requirements engineering, in order to provide a common legal-intentional meta-model.
Requirements models built with the proposed language consist of graphs, which ultimately can be verified automatically. Compliance then amounts to a set of properties the graph must satisfy.
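As a toy illustration of compliance as a graph property (hypothetical node names, not the actual Nomos notation): one such property could require that every duty extracted from a legal prescription is realized by at least one goal in the requirements model.

```python
# A legal-intentional model reduced to node sets and realization edges.
duties = {"keep_audit_trail", "obtain_consent"}
goals = {"log_all_access", "ask_consent_at_signup"}
realizes = {("log_all_access", "keep_audit_trail"),
            ("ask_consent_at_signup", "obtain_consent")}

def compliant(duties, goals, realizes):
    """Property: every duty is realized by at least one goal."""
    return all(any((g, d) in realizes for g in goals) for d in duties)

print(compliant(duties, goals, realizes))  # True: both duties are covered
```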
The compliance condition gains relevance in two cases: firstly, when a requirements model has already been developed and needs to be reconciled with a set of laws; secondly, when requirements have to be modelled from scratch and need to be compliant. In both cases, compliance results from a design process.
The proposed modelling language, as well as the compliance condition and the corresponding design process, have been applied to two case studies.
The obtained results confirm the validity of the approach, and point out interesting research directions for the future.
268
Exploiting the Volatile Nature of Data and Information in Evolving Repositories and Systems with User Generated Content. Bykau, Siarhei. January 2013.
Modern technological advances have created a plethora of extremely large, highly heterogeneous and distributed datasets that are also highly volatile. This volatile nature makes their understanding, integration and management a challenging task.
One of the first challenging issues is to create the right models, capturing not only the changes that have taken place in the values of the data but also the semantic evolution of the concepts that the data structures represent. Once this information has been captured, the right mechanisms should be put in place to exploit the evolution information in query formulation, reasoning, answering and representation. Additionally, the continuously evolving nature of the data hinders the ability to determine the quality of the data observed at a specific moment, since there is great uncertainty about whether this information will remain as is. Finally, an important task in this context, known as information filtering, is to match a specific piece of information that has recently been updated (or added) in a repository to a user or query at hand.
In this dissertation, we propose a novel framework to model and query data with explicit evolution relationships among concepts. As a query language, we present an expressive evolution-graph traversal language, tested on a number of real-world scenarios: the history of biotechnology, the corporate history of US companies, and others. In turn, to support query evaluation we introduce an algorithm based on finding Steiner trees on graphs, capable of computing answers on the fly while taking the evolution connections among concepts into account.
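To convey the intuition of the query-evaluation step (a generic sketch of the classical greedy Steiner-tree heuristic, not the thesis algorithm), the query's terminal concepts are connected by repeatedly growing shortest paths in the evolution graph:

```python
from collections import deque

def shortest_path(graph, src, targets):
    """BFS path from src to the nearest node in `targets` (assumes one exists)."""
    prev, seen, q = {src: None}, {src}, deque([src])
    while q:
        u = q.popleft()
        if u in targets:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path
        for v in graph.get(u, ()):
            if v not in seen:
                seen.add(v)
                prev[v] = u
                q.append(v)

def steiner_tree_nodes(graph, terminals):
    """Greedy sketch: attach the terminal closest to the tree built so far."""
    terminals = set(terminals)
    tree = {terminals.pop()}
    while terminals:
        best = min((shortest_path(graph, t, tree) for t in terminals), key=len)
        tree |= set(best)
        terminals -= tree
    return tree

# Hypothetical evolution graph: concepts linked by evolution relationships.
g = {"GenTech": ["Biotech"], "Biotech": ["GenTech", "Pharma"],
     "Pharma": ["Biotech", "MergedCo"], "MergedCo": ["Pharma"]}
print(steiner_tree_nodes(g, {"GenTech", "MergedCo"}))  # the connecting chain
```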
To address the problem of data quality in user-generated repositories (e.g. Wikipedia), we present a novel algorithm that detects individual controversies by using the substitutions found in a page's revision history. The algorithm groups the disagreements between users by means of a context, i.e. the surrounding content, and by applying custom filters. In an extensive experimental evaluation we showed that the proposed ideas are highly effective across various sources of controversies.
Finally, we explore the problem of producing recommendations in evolving repositories by focusing on the cold-start problem, i.e. when little or no past information about the users and/or items is given. The dissertation presents a number of novel algorithms that cope with the cold start by leveraging item features, using a k-nearest-neighbour classifier, a Naive Bayes classifier, and the maximum entropy principle. The obtained results enable recommender systems to operate in rapidly updated domains such as news, university courses and social data.
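As a flavour of the feature-based remedy (a minimal sketch using scikit-learn as a stand-in; the thesis develops its own algorithms): an item that nobody has rated yet can still be scored from its content features alone.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# One user's past items, described only by features, with like/dislike labels.
item_features = ["politics election news", "football match report",
                 "election debate coverage", "tennis final highlights"]
liked = [1, 0, 1, 0]

vec = CountVectorizer()
model = MultinomialNB().fit(vec.fit_transform(item_features), liked)

# A brand-new item with no ratings at all: score it purely from features.
new_item = vec.transform(["breaking election news"])
print(model.predict_proba(new_item)[0, 1])  # P(user likes the new item)
```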
269
Broadband Radio Interfaces Design for "4G and Beyond" Cellular Systems in Smart Urban Environments. Rahman, Talha. January 2015.
Broadband, ubiquitous and energy-efficient wireless networking is one of the pillars in the definition of a truly smart urban environment. The latest developments in this field concern the forthcoming LTE-A standard, which will also involve small cell deployment for broadband coverage, yielding increased quality of experience and reduced power consumption. Some open issues related to small cell LTE-A networking for smart city applications are discussed, together with feasible solutions, investigated in terms of robust PHY-layer configurations and fully-wireless backhaul (point-to-point transmission, point-to-multipoint, etc.). One such issue concerns power-constrained uplink transmission, where cooperative multipoint (CoMP) in a small cell network is considered, assuring better quality of service and energy efficiency for the user terminal. Moreover, a novel MIMO detection scheme is conceived for LTE-A applications, based on the MCBEP criterion, suited for size-constrained small base stations and guaranteeing near-optimum performance. A door is opened to upcoming mobile standards by proposing constant envelope techniques in the uplink, providing a flexible tradeoff between spectral and power efficiency for 5th generation applications. A complete wireless backhaul based on millimeter wave (mmWave) technology, for networks of small cells, is considered for its cost effectiveness and operational flexibility. A robust PHY-layer waveform based on space-time MIMO techniques has proven to be the right choice for non-line-of-sight operation, whereas TH-IR UWB techniques provide significant data rates in the line-of-sight case. SDR implementation of advanced wireless strategies is important in order to realize network reconfigurability in future cellular networks, where network functionalities can be changed on the fly.
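To make the constant-envelope point concrete (a generic numpy sketch, not the thesis waveform): a purely phase-modulated signal has a peak-to-average power ratio of exactly 0 dB, which is what buys power-amplifier efficiency at some cost in spectral efficiency.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(0)
phases = rng.uniform(0, 2 * np.pi, 4096)

ce = np.exp(1j * phases)                         # constant envelope: phase only
multicarrier = np.fft.ifft(np.exp(1j * phases))  # an OFDM-like comparison

print(papr_db(ce))            # 0.0 dB: the amplifier can run saturated
print(papr_db(multicarrier))  # several dB: requires amplifier back-off
```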
270
Domain Specific Mashup Platforms as a Service. Soi, Stefano. January 2013.
Since the advent of Web 2.0, Web users have gained more and more power, moving from simple information consumers to producers. In line with this trend, mashup technologies have attracted a lot of attention and research investment since their birth. The big mashup promise was to bring application development to the masses, so that any Web-educated worker, even without IT skills, could implement his/her own situational applications (i.e., relatively small applications meant to address a temporary need of one or a few persons) by exploiting the simple paradigms and visual metaphors provided by mashup tools. A decade after the first mashup tools, though, mashups are not really part of people’s everyday life and remain rather unknown technologies that, with some exceptions, hardly find concrete application in the real world. During our research in this field, our high-level goal was to foster the adoption of mashup technologies by end users. To this end, we identified three main characteristics that mashup technologies must attain to reach the expected diffusion: (i) usefulness and (ii) usability for the end users, and (iii) affordability for the developers of the respective mashup tools (in terms of required skills, time and cost). We identified shortcomings in these respects as the main factors hindering the wide adoption of mashup technologies. Making mashup technologies useful, usable and affordable, therefore, are the three main challenges we addressed in our research work. This work contributes to all three goals: first, by enabling a so-called universal integration paradigm, focussing on the creation of more powerful and complete mashups that allow data, application logic and user interface integration in one single language and tool; then, by introducing and developing domain-specific mashup technologies, able to lower mashups’ complexity and make them usable by domain experts (i.e., end users expert in a given target domain); finally, by realizing a system able to generate domain-specific mashup platforms as a service, basically relieving developers of platform implementation and, therefore, making platform development affordable. This thesis specifically focusses on the last two points, i.e., on the domain-specific mashup approach and on the semi-automatic generation of domain-specific mashup platforms.