181 |
Automating Geospatial RDF Dataset Integration and Enrichment. Sherif, Mohamed Ahmed Mohamed. 12 May 2016
Over the last few years, the Linked Open Data (LOD) Cloud has grown from a mere 12 to more than 10,000 knowledge bases. These knowledge bases come from diverse domains including (but not limited to) publications, life sciences, social networking, government, media and linguistics. Moreover, the LOD Cloud also contains a large number of cross-domain knowledge bases such as DBpedia and Yago2. These knowledge bases are commonly managed in a decentralized fashion and contain partly overlapping information. This architectural choice has led to knowledge pertaining to the same domain being published by independent entities in the LOD Cloud. For example, information on drugs can be found in Diseasome as well as DBpedia and Drugbank. Furthermore, certain knowledge bases such as DBLP have been published by several bodies, which in turn has led to duplicated content in the LOD Cloud. In addition, large amounts of geo-spatial information have been made available with the growth of the heterogeneous Web of Data.
The concurrent publication of knowledge bases containing related information promises to become a phenomenon of increasing importance with the growth of the number of independent data providers. Enabling the joint use of the knowledge bases published by these providers for tasks such as federated queries, cross-ontology question answering and data integration is most commonly tackled by creating links between the resources described within these knowledge bases. Within this thesis, we spur the transition from isolated knowledge bases to enriched Linked Data sets where information can be easily integrated and processed. To achieve this goal, we provide concepts, approaches and use cases that facilitate the integration and enrichment of information with other data types that are already present on the Linked Data Web with a focus on geo-spatial data.
The first challenge that motivates our work is the lack of measures that use geographic data for linking geo-spatial knowledge bases. This is partly due to geo-spatial resources being described by means of vector geometry. In particular, discrepancies in granularity and measurement error across knowledge bases make the selection of appropriate distance measures for geo-spatial resources difficult. We address this challenge by surveying the existing literature for point-set measures that can be used to measure the similarity of vector geometries. We then present and evaluate ten measures derived from this literature on samples of three real knowledge bases.
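The abstract does not name the ten measures, but the Hausdorff distance is a classic point-set measure over vector geometries of exactly this kind; the following minimal sketch (plain Python with toy coordinates, an illustration rather than the thesis's code) shows how such a measure compares two geometries sampled as point sets.

```python
from math import dist  # Euclidean distance between two coordinate tuples (Python 3.8+)

def directed_hausdorff(a, b):
    """Largest distance from any point in a to its nearest point in b."""
    return max(min(dist(p, q) for q in b) for p in a)

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two vector geometries,
    each given as a list of (x, y) points."""
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

# Two polygon boundaries sampled as point sets (toy coordinates)
g1 = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
g2 = [(0.1, 0.0), (1.0, 0.1), (0.9, 1.1)]
print(hausdorff(g1, g2))
```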
The second challenge we address in this thesis is the lack of automatic Link Discovery (LD) approaches capable of dealing with geo-spatial knowledge bases that contain missing and erroneous data. To this end, we present Colibri, an unsupervised approach that discovers links between knowledge bases while improving the quality of their instance data. A Colibri iteration begins by generating links between the knowledge bases. The approach then uses these links to detect resources with probably erroneous or missing information, which is finally corrected or added.
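The toy sketch below illustrates only the link-then-repair loop described above; it is not the actual Colibri algorithm, and the label-based linking and majority-vote repair are invented stand-ins for its learned link specifications and error detection.

```python
from collections import Counter

def link(kbs):
    """Toy linking: group resources across KBs that share a 'label' value."""
    by_label = {}
    for kb_id, kb in enumerate(kbs):
        for res_id, props in kb.items():
            by_label.setdefault(props.get("label"), []).append((kb_id, res_id))
    return [group for group in by_label.values() if len(group) > 1]

def repair(kbs, links):
    """Toy repair: fill missing or deviating values by majority vote
    over each group of linked resources."""
    for group in links:
        for prop in ("population",):              # properties under repair
            votes = Counter(kbs[k][r].get(prop) for k, r in group)
            votes.pop(None, None)                 # ignore missing values
            if votes:
                majority, _ = votes.most_common(1)[0]
                for k, r in group:
                    kbs[k][r][prop] = majority    # correct or add
    return kbs

kbs = [
    {"a1": {"label": "Leipzig", "population": 600000}},
    {"b7": {"label": "Leipzig"}},                      # population missing
    {"c3": {"label": "Leipzig", "population": 600000}},
]
kbs = repair(kbs, link(kbs))       # one Colibri-style iteration
print(kbs[1]["b7"]["population"])  # -> 600000
```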
The third challenge we address is the lack of scalable LD approaches for tackling big geo-spatial knowledge bases. Thus, we present Deterministic Particle-Swarm Optimization (DPSO), a novel load balancing technique for LD on parallel hardware based on particle-swarm optimization. We combine this approach with the Orchid algorithm for geo-spatial linking and evaluate it on real and artificial data sets. The lack of approaches for automatic updating of links of an evolving knowledge base is our fourth challenge. This challenge is addressed in this thesis by the Wombat algorithm. Wombat is a novel approach for the discovery of links between knowledge bases that relies exclusively on positive examples. Wombat is based on generalisation via an upward refinement operator to traverse the space of Link Specifications (LS). We study the theoretical characteristics of Wombat and evaluate it on different benchmark data sets.
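As a rough illustration of learning from positive examples only, the sketch below performs a heavily simplified upward refinement of an atomic link specification: it relaxes a string-similarity threshold step by step until all known matches are covered. The real Wombat operator traverses a far richer space of link specifications; the similarity function and step size here are arbitrary assumptions.

```python
from difflib import SequenceMatcher

positives = [("Berlin", "Berlin"), ("Koeln", "Köln"), ("Munich", "München")]

def similarity(a, b):
    return SequenceMatcher(None, a, b).ratio()

def coverage(threshold):
    """Positive pairs accepted by the atomic LS: sim(s, t) >= threshold."""
    return sum(similarity(s, t) >= threshold for s, t in positives)

threshold = 1.0
# Upward refinement: each step yields a strictly more general specification.
while coverage(threshold) < len(positives) and threshold > 0.0:
    threshold = round(threshold - 0.05, 2)
print(threshold, coverage(threshold))  # most general spec found and its coverage
```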
The last challenge addressed herein is the lack of automatic approaches for geo-spatial knowledge base enrichment. Thus, we propose Deer, a supervised learning approach based on a refinement operator for enriching Resource Description Framework (RDF) data sets. We show how we can use exemplary descriptions of enriched resources to generate accurate enrichment pipelines. We evaluate our approach against manually defined enrichment pipelines and show that our approach can learn accurate pipelines even when provided with a small number of training examples.
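A minimal sketch of the underlying idea follows: an enrichment pipeline is learned greedily from one exemplary input/output pair. The operators, loss function and greedy search below are invented simplifications for illustration, not the Deer refinement operator itself.

```python
# Toy enrichment operators over a resource represented as a dict.
def add_geo(d):   return {**d, "lat": 51.34, "lon": 12.38}
def add_label(d): return {**d, "label": d.get("name", "").title()}
def drop_name(d): return {k: v for k, v in d.items() if k != "name"}

OPERATORS = [add_geo, add_label, drop_name]

def distance(a, b):
    """Toy loss: size of the symmetric difference of the two resources."""
    return len(set(a.items()) ^ set(b.items()))

def learn_pipeline(source, target, max_len=3):
    """Greedily append the operator that brings the working resource
    closest to the exemplary enriched resource."""
    pipeline, current = [], dict(source)
    for _ in range(max_len):
        best = min(OPERATORS, key=lambda op: distance(op(current), target))
        if distance(best(current), target) >= distance(current, target):
            break                       # no operator improves the fit
        pipeline.append(best)
        current = best(current)
    return pipeline

src = {"name": "leipzig"}
tgt = {"label": "Leipzig", "lat": 51.34, "lon": 12.38}
print([f.__name__ for f in learn_pipeline(src, tgt)])
# -> ['add_geo', 'add_label', 'drop_name']
```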
Each of the proposed approaches is implemented and evaluated against the state of the art on real and/or artificial data sets, and each has been published in a peer-reviewed conference or journal paper. Throughout this thesis, we detail the ideas, implementation and evaluation of each approach, discuss it, and present the lessons learned. Finally, we conclude by presenting a set of possible future extensions and use cases for each of the proposed approaches.
|
182 |
Study of Persistent and Flaring Gamma-Ray Emission from Active Galactic Nuclei with the MAGIC Telescopes and Prospects for Future Open Data Formats in Gamma-Ray Astronomy. Nigro, Cosimo. 17 October 2019
Powered by the accretion of matter onto a supermassive black hole, active galactic nuclei constitute the most powerful and persistent sources of radiation in the universe, with emission extending into the gamma-ray domain. The aim of this work is to characterise the mechanisms and sites behind this highly energetic radiation, employing observations of two galaxies at hundreds of GeV conducted with the MAGIC imaging Cherenkov telescopes. The physical interpretation is supported by observations with the Fermi Gamma-ray Space Telescope and by multi-wavelength data. Two peculiar jetted galaxies are studied: PKS 1510-089 and NGC 1275. The first source, monitored by MAGIC since 2012, presents significant emission over tens of observation hours, in what appears to be a low but persistent gamma-ray state. The second source instead showed, in the period between September 2016 and February 2017, a major outburst in its gamma-ray activity, with variability on the order of a few hours and the first emission of TeV photons. The broad-band emission of jetted galaxies is commonly modelled with the radiative processes of a population of electrons accelerated in the jet. While PKS 1510-089 conforms to this scenario, modelling the gamma-ray outburst of NGC 1275 requires placing the acceleration and radiation of the electrons close to the event horizon of the black hole. From both sources studied, it is evident that the combination of data from different instruments critically drives the physical discussion.
Moving towards accessible and interoperable data is becoming a compelling issue for gamma-ray astronomers, and this thesis presents the technical endeavour to produce standardised high-level data for gamma-ray instruments. An example of a future analysis combining uniform high-level data from a gamma-ray satellite and four Cherenkov telescopes is presented. The novel approach proposed performs the data analysis and disseminates the results using only open-source assets.
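As an illustration of what working with such standardised high-level data can look like, the sketch below reads a gamma-ray event list stored as a FITS table with astropy. The file name and column names are assumptions in the spirit of the open event-list specifications this work contributed to, not output of the thesis itself.

```python
from astropy.table import Table

# Read a standardised event list (file name is a placeholder).
events = Table.read("events.fits", hdu="EVENTS")
print(events.colnames)                        # assumed: RA, DEC, ENERGY, TIME, ...
high_energy = events[events["ENERGY"] > 1.0]  # e.g. photons above 1 TeV, if TeV units
print(len(high_energy), "events above threshold")
```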
|
183 |
Making ATLAS Data from CERN Accessible to the General Public: The Development and Evaluation of a Learning Resource in Experimental Particle Physics. Ekelin, Svea; Hagesjö, Louise. January 2017
In 2016, the ATLAS experiment at CERN released data from 100 trillion proton-proton collisions to the general public. In connection with this release, the ATLAS Outreach group has developed several tools for visualizing and analyzing the data, one of which is a histogram analyzer. The focus of this project is to bridge the gap between the general public's knowledge of physics and what is needed to use this histogram analyzer. The project consists of both the development and the evaluation of a learning resource that explains experimental particle physics to a general audience. The learning resource is a website making use of analogies and two perspectives on learning: Variation Theory and Cognitive Load Theory. The evaluation of the website was done through a survey with 10 respondents, focused on whether the analogies and the perspectives on learning helped their understanding. In general, the respondents found the analogies helpful for their learning, and to some degree they found the explanations based on Variation Theory helpful. The implementations of Cognitive Load Theory were considered helpful by the respondents who noticed them, but the majority did not, implying that improvements to the design are needed. The results indicate that analogies and the two perspectives on learning can be helpful for explaining experimental particle physics, but there might be other learning theories more suitable for this purpose.
|
184 |
Framework for digital preservation of electronic government in Ghana. Adu, Kofi Koranteng. January 2015
The global perspective on the digital revolution is one of rapturous approval from information professionals, scholars and practitioners. However, such approval has come at a great cost to memory institutions, for which the preservation of digital information has proved a complex phenomenon. Guided by a multi-method design and underpinned by the triangulation of questionnaires, interviews, observation and document analysis, the study examined the digital preservation of e-government in Ghana. Findings revealed that the creation of databases, digital publications, emails, website information and tweets was often occasioned by the use of ICT and e-government and by the application of legislation and public policies. It observed that these types of digital records were in urgent need of preservation, as most of the ministries and agencies were unable to access their digital records.
While the digital preservation tool LOCKSS (Lots of Copies Keep Stuff Safe) was familiar terrain to the ministries and agencies, there was a marked lack of awareness of digital preservation support organisations and digital preservation standards.
The study identified funding, the level of security and privacy, skills training and technological obsolescence as the key threats to digital preservation. It noted backup strategy, migration, metadata and trusted repositories as the most widely implemented preservation strategies across the ministries and agencies. Cloud computing, refreshing and emulation, on the other hand, were the least implemented strategies used to address the digital preservation challenges.
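Several of the strategies named above (backup, migration, trusted repositories) rest in practice on routine fixity checking; the sketch below shows a generic SHA-256 fixity check, included purely as an illustration of the mechanism, not as a tool used in the study. The file name is a placeholder.

```python
import hashlib
from pathlib import Path

def sha256(path):
    """Stream a file and return its SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

record = Path("record.pdf")        # placeholder digital record
stored_digest = sha256(record)     # stored alongside preservation metadata at ingest

# After each backup or migration cycle, verify the record is unaltered.
assert sha256(record) == stored_digest, "fixity failure: record altered"
```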
The study recommends that the ministries and agencies can address many of the digital preservation challenges if they leverage collaborative and participatory opportunities: using experts from other institutions to share resources and adopting a common protocol through cloud computing and Open Data. It further recommends that the process of developing a digital preservation policy be guided by template documents from other jurisdictions. / Information Science / D. Litt et Phil. (Information Science)
|
185 |
Knowledge acquisition and extraction in the context of poorly documented cultural heritage. Amad, Ashraf. 06 December 2017
The importance of cultural heritage documentation increases in parallel with the risks to which this heritage is exposed, such as wars, uncontrolled urban development, natural disasters, neglect and inappropriate conservation techniques or strategies. In addition, documentation is a fundamental tool for the assessment, conservation, monitoring and management of cultural heritage, one that allows us to estimate its historical, scientific, social and economic value. According to several international institutions dedicated to the preservation of cultural heritage, there is an urgent need to develop and adapt computer solutions to facilitate and support the documentation of poorly documented cultural heritage, especially in developing countries where there is a clear lack of resources. Among these countries, Palestine represents a relevant case study for this problem of under-documented heritage. To address this issue, we propose an approach for the acquisition and extraction of heritage knowledge in a poorly documented context.
We take the Church of the Nativity in Palestine as a case study and put our theoretical approach into practice by developing a platform for the acquisition and extraction of heritage knowledge, built on a framework for cultural heritage documentation. Our solution is based on semantic technologies, which gives us the possibility, from the outset, to provide a rich ontological description, better structuring of the information, a high level of interoperability and better automatic processing (machine readability) without additional effort. Additionally, our approach is evolutionary and reciprocal: the acquisition of knowledge (in structured form) improves the extraction of heritage knowledge from unstructured text, and vice versa. The interaction between the two components of our system, as well as the heritage knowledge itself, therefore develops and improves over time, especially since our system uses manual contributions and expert validation of the automatic results (in both components) to optimise its performance.
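To illustrate the kind of machine-readable ontological description that semantic technologies provide, the sketch below encodes one heritage resource as RDF with the rdflib library. The namespace and property names are invented placeholders, not the thesis's actual ontology.

```python
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/heritage/")  # placeholder namespace

g = Graph()
g.add((EX.ChurchOfTheNativity, RDF.type, EX.HeritageSite))
g.add((EX.ChurchOfTheNativity, EX.locatedIn, Literal("Bethlehem, Palestine")))
g.add((EX.ChurchOfTheNativity, EX.description, Literal("4th-century basilica")))

# Serialise the description in Turtle, an interoperable RDF syntax.
print(g.serialize(format="turtle"))
```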
|
186 |
Dados abertos no governo federal brasileiro: desafios de transparência e interoperabilidade [Open data in the Brazilian federal government: challenges of transparency and interoperability]. Possamai, Ana Júlia. January 2016
This research is dedicated to the study of the critical institutional factors for the adoption of open government data as a reference for the treatment of public information in the Digital Age. Open government data (OGD) are public data published on the Web in an open, structured and logically understandable format, so that anyone can freely access, reuse, modify and redistribute them, for any purpose, subject at most to the requirements to attribute the source and share alike. OGD carry the premise of affecting democracy positively by promoting transparency, since they simplify access to the information necessary for participation and social control. OGD also affect state capacity by enabling integration and interoperability, which allow timely access to data critical for governance and decision-making, and by enabling collaboration between government and society through data reuse in the creation of new services and applications. Despite the benefits publicised by the proponents of OGD, institutional variables modulate the efforts to open up public organisations' data and therefore limit the democratic gains and government effectiveness associated with OGD. Based on historical institutionalism and on the concept of digitally mediated public policy (DMPP), proposed by Jane Fountain, we argue that variables associated with the regulation of access to public information and with the implementation framework of the policy, as well as variables resulting from the path dependence of legacy technological choices, affect the embeddedness of OGD. Due to these institutional factors, an open government data policy will present an incremental development pattern, despite the disruptive character of Digital Age technologies. The investigation employs a qualitative methodology and the method of an exploratory case study of the Brazilian Federal Government's open data policy, launched in 2012 and conducted by the National Open Data Infrastructure (INDA, from its Portuguese acronym). Brazil was the first country to mandate the publication of data in open formats in its Access to Information Act (Law nº 12.527/2011), gathering in the same framework the democratic and technical perspectives of open data. The analysis covers the period from 1988, when the right of access to public information was recognised in the Federal Constitution, to the publication of the first open data plans (PDAs) following Decree nº 8.777/2016, which regulated the opening of Federal Government public data. The research techniques employed include the analysis of documents, activity reports, interview notes and news, in order to reconstruct the trajectories of the institutionalisation of public transparency and interoperability in the Federal Government, the two fields in which OGD operate as DMPP. The investigation identified an incremental development pattern in the Open Data Policy, as part of the extensive agenda-building process of both transparency (under the Office of the Comptroller General of the Union) and digital governance (by the Ministry of Planning, Budget, and Management).
Notwithstanding this incrementalism, some moments of greater advance were identified, particularly the official adoption of the Policy through Decree nº 8.777/2016 and the subsequent publication of open data plans by several federal agencies. By problematising OGD and identifying the critical factors for their adoption, this investigation accomplishes its overall objective of contributing theoretical and practical insights towards more realistic learning and planning of data-opening strategies, scaled according to the incentives and resources available.
|
190 |
How to improve value towards third-party developers: An analysis of the open data platform Trafiklab. Söderman, Anton. January 2016
This thesis studies the open data platform Trafiklab, which provides open access to data regarding public transport in Sweden. The study is from the perspective of third-party developers and deals with the question of how value-creating mechanisms towards them can be improved. It is based on two different surveys and several interviews conducted with third-party developers using Trafiklab. The results show that Trafiklab needs to improve its documentation, communication and initial-use experience, and to change its perspective on, and role towards, the developers using Trafiklab. To improve open data in general, a greater focus on transparency, rather than availability alone, is suggested. / This thesis aims to produce improvement proposals for a specific type of software based on user experience. The area is the open data industry: open data means opening up data so that anyone can access and use it. In the public transport industry, open data is distributed via a platform called Trafiklab, owned jointly by the industry's parties through an organisation named Samtrafiken. Within this area, the thesis tries to answer the question of how Trafiklab can become more valuable to third-party developers, taking a multidisciplinary approach: both investigating what value Trafiklab provides today and examining how its network makes value creation possible. The project consisted of three subprojects that built on each other. First, a pre-study created a foundation for further investigation and presented theory on open data and platforms. A platform is something that creates a base on which other actors can build; one example is Apple's iPhone, on which other actors can build complementary applications. For platforms that provide open data, a balance between control and accessibility is important, something summarised in the theory of "platform boundary resources". In the second subproject, a survey was designed and carried out, based partly on an earlier survey. This led to the third subproject: more in-depth interviews with the survey participants, to obtain a more detailed picture of the situation. The results of the study as a whole are presented as a review of Samtrafiken and Trafiklab, followed by a presentation of third-party developers' opinions of Trafiklab. The developers' comments were fundamentally positive, but there were areas where improvements could be made: documentation, communication, making it simpler to start developing, and Trafiklab's view of third-party developers and of their relationship. These results are then summarised and discussed with the help of the theoretical concepts: a mapping of Trafiklab's platform ecosystem is presented, followed by a more fundamental theoretical discussion that concludes by applying the "platform boundary resource" model and giving improvement proposals. The thesis ends by proposing that the concept of open data should emphasise not only availability but also transparency, both for Trafiklab and for open data in Sweden in general. Some further suggestions for Trafiklab, and for interesting future research in the area, are also given in closing.
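As context for the developer perspective discussed above, a minimal example of how a third party typically consumes an open transport-data API is sketched below. The endpoint, parameters and response fields are hypothetical placeholders, not Trafiklab's actual API surface.

```python
import requests

# Hypothetical departure-board endpoint and API key (placeholders).
BASE = "https://api.example-opendata.se/v1/departures"
params = {"stop_id": "740000001", "key": "YOUR_API_KEY"}

resp = requests.get(BASE, params=params, timeout=10)
resp.raise_for_status()  # fail loudly on HTTP errors

# Print each upcoming departure (field names are assumed for illustration).
for dep in resp.json().get("departures", []):
    print(dep.get("line"), dep.get("direction"), dep.get("expected_time"))
```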
|