1. MagIC as a FAIR Repository for America's Directional Archaeomagnetic Legacy Data. Jones, Shelby A., Blinman, Eric, Tauxe, Lisa, Cox, J. R., Lengyel, Stacey, Sternberg, Robert, Eighmy, Jeffrey, Wolfman, Daniel, DuBois, Robert. 01 October 2021.
Beginning in 1964, an academic lineage of Robert DuBois and his students, Daniel Wolfman and Jeffrey Eighmy, developed dedicated United States-based archaeomagnetic research programs. Collectively, they analyzed over 5,377 archaeomagnetic sites, primarily from North America, dated to less than 2,000 years old. Yet despite their decades of effort, few journal publications resulted. Most of their published results are embedded in archeological reports, often without technical data, which limits the data's accessibility. Furthermore, when published, the results are generally averaged at the site level using statistical conventions different from today's standards, limiting the data's comparability and (re)usability. In 2015, we undertook a salvage archival study to digitize the surviving data and metadata from the scientists' individual estates and emeritus collections. We digitized measurement data from more than 51,000 specimens, reinterpreted them using modern conventions, and uploaded them to the FAIR-adhering magnetic data repository, earthref.org/MagIC. The reinterpreted site-level results from the three laboratories are mutually consistent, permitting the individual data sets to be combined and analyzed as single regional entities. Through incorporation into the MagIC repository, these legacy data are now accessible for incorporation into archaeomagnetic and global magnetic field modeling efforts, critical to understanding Earth's magnetic field variation through time. In the Four Corners region of the United States Southwest, this digitized archive advances the development of a new regional paleosecular variation curve used in archaeomagnetic dating. This project highlights both the value and the complexities of managing legacy data; the many lessons learned set a precedent for future paleomagnetic data recovery efforts.
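For context on the reinterpretation step mentioned above: modern convention is to combine specimen directions into a site mean using Fisher (1953) statistics. The sketch below is a minimal, illustrative implementation of that calculation on invented specimen directions; it is not the project's actual MagIC/PmagPy processing pipeline, and the numbers are hypothetical.

```python
import numpy as np

def fisher_mean(decs_deg, incs_deg, p=0.05):
    """Fisher (1953) mean of paleomagnetic directions.

    decs_deg, incs_deg : declination/inclination in degrees.
    Returns mean declination, mean inclination, precision parameter k,
    and the 95% confidence cone alpha_95 (degrees).
    """
    decs = np.radians(np.asarray(decs_deg, dtype=float))
    incs = np.radians(np.asarray(incs_deg, dtype=float))

    # Direction cosines of each specimen direction (unit vectors).
    x = np.cos(incs) * np.cos(decs)
    y = np.cos(incs) * np.sin(decs)
    z = np.sin(incs)

    n = len(decs)
    rx, ry, rz = x.sum(), y.sum(), z.sum()
    r = np.sqrt(rx**2 + ry**2 + rz**2)   # length of the resultant vector

    mean_dec = np.degrees(np.arctan2(ry, rx)) % 360.0
    mean_inc = np.degrees(np.arcsin(rz / r))

    k = (n - 1) / (n - r)                # Fisher precision parameter
    a95 = np.degrees(np.arccos(1 - ((n - r) / r) *
                               ((1 / p) ** (1 / (n - 1)) - 1)))
    return mean_dec, mean_inc, k, a95

# Hypothetical specimen directions from a single burned feature (illustrative only).
decs = [352.1, 355.4, 349.8, 351.0, 356.2]
incs = [58.3, 61.0, 59.5, 57.8, 60.2]
print(fisher_mean(decs, incs))
```

The precision parameter k and the alpha-95 cone are among the site-level quantities typically reported, which is what makes results from different laboratories directly comparable.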
2. Challenges for Global Ocean Observation of Life in the Sea. Müller, Kankou. January 2024.
Globally sustained observations of the marine ecosystem and biodiversity are crucial to understanding changes in the ocean environment, managing ocean resources, and assessing progress towards internationally agreed targets such as the SDGs. Efforts in the observing community are growing to close the current gap in the collection of ecosystem and biological data by harmonizing and coordinating monitoring activities around the Biological and Ecosystem Essential Ocean Variables (BioEco EOVs). Through a systematic literature review and key informant interviews, this thesis examines the challenges that the implementation of a sustainable and coordinated BioEco ocean observation system may face. It identifies several key areas of challenge for BioEco ocean observation and the specific challenges within each area. The discussion and analysis of the results led to 11 priority recommendations for implementing successful and sustained observations of life in the sea globally:
(1) Communicate the BioEco EOV concept clearly to the global observing community and reach international agreement on standards and best practices for data collection;
(2) Create an overview of the various scattered databases for BioEco ocean data to realize a "world wide web of oceanographic data";
(3) Promote widespread adoption of the FAIR data principles, coupled with the development of strong and adaptive data infrastructures and architectures, to enable data and database interoperability;
(4) Increase capacity in marine science to improve the understanding of large-scale ecological processes and interactions, and thereby the quality of data analysis, enabling better data products that cater to the needs of society and decision-makers;
(5) Implement mechanisms for better coordination, communication, and collaboration across disciplines, institutes, monitoring programs, and geographical scales to promote knowledge exchange and the sharing of resources and capacity;
(6) Unify the fragmented ocean governance framework, implement clear governance structures for global BioEco ocean observation, and harmonize the integration of ocean data into policies and decision-making;
(7) Unify the scattered ocean observation efforts under one transparent system that adapts to user needs and has strong links between its components;
(8) Implement standards and best practices within the system while still encouraging innovation;
(9) Implement sustainable long-term funding mechanisms at all scales while making the observing system more cost-effective and cost-efficient;
(10) Implement continuous capacity development activities for all system components;
(11) Improve the participation of developing nations through targeted capacity development and strong collaboration, including capacity and resource sharing as well as knowledge and technology transfer.
3. Fit for purpose? A metascientific analysis of metabolomics data in public repositories. Spicer, Rachel. January 2019.
Metabolomics is the study of metabolites and metabolic processes. Due to the diversity of structures and polarities of metabolites, no single analytical technique is able to measure the entire metabolome; instead, a varied set of experimental designs and instrumental technologies is used to measure specific portions. This has led to the development of many distinct data analysis and processing methods and software. There is hope that metabolomics can be utilized for clinical applications, in toxicology and to measure the exposome. However, for these applications to be realised, data must be high quality, sufficiently standardised and annotated, and FAIR (Findable, Accessible, Interoperable and Reusable). For this purpose, it is also important that standardised, FAIR software workflows are available. There has also recently been much concern over the reproducibility of scientific research, which FAIR and open data and workflows can help to address. To this end, this thesis aims to assess current practices and standards of sharing data within the field of metabolomics, using metascientific approaches. The types and functions of software for processing and analysing metabolomics data are also assessed. Reporting standards are designed to ensure that the minimum information required to understand and interpret the results of an analysis is reported. In practice, however, reporting standards are often ignored or poorly complied with. Compliance with the biological-context Metabolomics Standards Initiative (MSI) guidelines was therefore examined, in order to investigate their timeliness. The state of open data within the metabolomics community was examined by investigating how much publicly available metabolomics data there is and where it has been deposited. To explore whether journal data sharing policies are driving open metabolomics data, it was also examined which journals publish articles that have their underlying data made open. However, open data alone are not inherently useful: if data are incomplete, lacking in quality or missing crucial metadata, they are not valuable. Conversely, if data are reused, this can demonstrate the worth of public data archiving. Levels of reuse of public metabolomics data were therefore examined. With more than 250 software tools specific to metabolomics, practitioners face a daunting task in selecting the best tools for data collection and analysis. To help educate researchers about what software is available, a taxonomy of metabolomics software tools and a GitHub Pages wiki, which provides extensive details about all included software, have been developed.
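To make the idea of a compliance assessment concrete, here is a deliberately simplified sketch of how metadata completeness might be scored against a minimum-information checklist. The required fields and study records below are hypothetical placeholders, not the actual MSI checklist items or the scripts used in the thesis.

```python
# Hypothetical minimum-information fields, stand-ins for MSI-style checklist items.
REQUIRED_FIELDS = [
    "organism", "sample_type", "extraction_protocol",
    "instrument", "chromatography", "data_processing_software",
]

def compliance_score(study_metadata: dict) -> float:
    """Fraction of required fields that are present and non-empty."""
    present = sum(
        1 for field in REQUIRED_FIELDS
        if str(study_metadata.get(field, "")).strip()
    )
    return present / len(REQUIRED_FIELDS)

# Invented study records for illustration only.
studies = {
    "STUDY001": {"organism": "Homo sapiens", "instrument": "Q Exactive"},
    "STUDY002": {"organism": "Mus musculus", "sample_type": "plasma",
                 "extraction_protocol": "methanol", "instrument": "LTQ",
                 "chromatography": "HILIC",
                 "data_processing_software": "XCMS"},
}

for study_id, meta in studies.items():
    print(f"{study_id}: {compliance_score(meta):.0%} of required fields reported")
```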
4. Wissenswertes rund um Forschungsdaten (Things Worth Knowing about Research Data): 10 November 2020, 10-11 a.m. Kuhnert, Dana, Queitsch, Manuela. 23 November 2020.
Research data generated in the course of research projects are an essential foundation of scholarly work. They are becoming increasingly important in almost all disciplines. Documenting, preserving and providing long-term access to research data strengthens the traceability and quality of scientific research.
In addition, the publication and long-term preservation of research data are in many cases a prerequisite for the funding of research projects by the DFG, the EU and the BMBF.
What exactly are research data? What is meant by the FAIR principles?
Open research data: what advantages do they offer researchers?
Where can research data be archived and published?
What research data services do the UB Freiberg and the Kontaktstelle Forschungsdaten of SLUB/ZiH Dresden offer to researchers at TU Bergakademie Freiberg?
These and other questions are answered by Manuela Queitsch, research data coordinator at SLUB Dresden and team member of the Kontaktstelle Forschungsdaten in Dresden, and Dr. Dana Kuhnert, subject librarian for economics and law at the UB Bergakademie Freiberg.
5. Digital Humanities Day Leipzig (DHDL) 2023. Piontkowitz, Vera, Kretschmer, Uwe, Burghardt, Manuel. 24 January 2024.
The poster series of the Digital Humanities Day Leipzig 2023 (DHDL) presents a multifaceted collection of projects and research from the "big tent" of the digital humanities and vividly illustrates the interdisciplinary connections and the breadth of the field. The contributions come from researchers in Leipzig, the Central German region and beyond. They offer insights into current research projects and demonstrate the application of digital technologies in the humanities. At DHDL 2023, more than 20 groups presented current research projects in a poster session.
6. Towards an Ontology-Based Phenotypic Query Model. Beger, Christoph, Matthies, Franz, Schäfermeier, Ralph, Kirsten, Toralf, Herre, Heinrich, Uciteli, Alexandr. 10 October 2023.
Clinical research based on data from patient or study data management systems plays an important role in transferring basic findings into the daily practices of physicians. To support study recruitment, diagnostic processes, and risk factor evaluation, search queries for such management systems can be used. Typically, the query syntax as well as the underlying data structure vary greatly between different data management systems. This makes it difficult for domain experts (e.g., clinicians) to build and execute search queries. In this work, the Core Ontology of Phenotypes is used as a general model for phenotypic knowledge. This knowledge is required to create search queries that determine and classify individuals (e.g., patients or study participants) whose morphology, function, behaviour, or biochemical and physiological properties meet specific phenotype classes. A specific model describing a set of particular phenotype classes is called a Phenotype Specification Ontology. Such an ontology can be automatically converted to search queries on data management systems. The methods described have already been used successfully in several projects. Using ontologies to model phenotypic knowledge on patient or study data management systems is a viable approach. It allows clinicians to model from a domain perspective without knowing the actual data structure or query language.
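To illustrate the general idea of compiling a declarative phenotype definition into an executable query, consider the sketch below. It is only an illustration: the class representation, field names, and SQL target are simplifying assumptions, not the Core Ontology of Phenotypes or the authors' converter.

```python
from dataclasses import dataclass
from typing import List, Union

# Simplified stand-ins for phenotype classes: a single restriction on an
# observable property, or a boolean composition of other phenotypes.

@dataclass
class Restriction:
    property_name: str   # e.g. a column in the data management system
    operator: str        # one of '<', '<=', '>', '>=', '='
    value: float

@dataclass
class Composite:
    connective: str                              # 'AND' or 'OR'
    parts: List[Union["Composite", Restriction]]

def to_sql(phenotype: Union[Composite, Restriction]) -> str:
    """Compile a phenotype definition into a SQL WHERE fragment."""
    if isinstance(phenotype, Restriction):
        return f"{phenotype.property_name} {phenotype.operator} {phenotype.value}"
    inner = f" {phenotype.connective} ".join(to_sql(p) for p in phenotype.parts)
    return f"({inner})"

# Hypothetical "obesity with elevated blood pressure" phenotype class.
phenotype = Composite("AND", [
    Restriction("bmi", ">=", 30.0),
    Composite("OR", [
        Restriction("systolic_bp", ">=", 140.0),
        Restriction("diastolic_bp", ">=", 90.0),
    ]),
])

print("SELECT patient_id FROM observations WHERE " + to_sql(phenotype))
```

The design point is the same as in the abstract above: the domain expert specifies restrictions and their boolean composition at the conceptual level, and the translation into a concrete query language is fully mechanical.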
7. Digital Humanities Day Leipzig. 06 February 2024.
The Digital Humanities Day Leipzig is an event of the Forum für Digital Humanities Leipzig (FDHL; fdhl.info), held annually since 2017 on the Dies academicus, which aims to foster regional and supra-regional networking among DH practitioners.
8. Moving Data - Sicherung und Weiterentwicklung einer Wissensbasis zu Unruhestifter:innen und weiteren Akteur:innen in und um die DDR-Kirchen (Moving Data: Securing and Further Developing a Knowledge Base on Troublemakers and Other Actors in and around the GDR Churches). Sauer, Philipp, Silomon, Anke, Mühleder, Peter, Goldhahn, Dirk, Naether, Franziska. 11 April 2024.
No description available.
9. Ab Initio Calculations of High-Pressure Properties of Ceramic Materials. Chukwuma Ezenwata (19185754). 23 July 2024.
<p dir="ltr">Electronic structure calculations, specifically plane wave Density functional theory (DFT) in conjunction with data science resources on the nanoHUB platform, are a powerful tool for the study of materials at extreme conditions. nanoHUB’s resources for FAIR (Findable, Accessible, Interoperable and Reproducible) data and workflows accelerate research and sharing of results. With these combined resources we studied the high-pressure properties and stability of various ceramic materials, such as the polytypes of silicon carbides (SiC).</p><p dir="ltr">Silicon carbides are of interest for their exceptional mechanical strength, thermal stability and chemical resistance making them attractive for applications at extreme conditions. To understand the temperature- and pressure-induced phase transitions we studied the electronic structure, phonon dispersion and elastic constants of major synthesizable SiC polytypes as a function of pressure. In addition, we report on the elastic constants as a function of pressure for the main polymorphs. This DFT workflows are published online, not only enhances the reproducibility of findings, but can also accelerates the discovery and development of new material properties.</p>