  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
221

Metadata quality in the cultural heritage sector: stakes, problems and solutions

Van Hooland, Seth 10 March 2009 (has links)
Contrary to the prevailing opinion, new technologies do not always have a positive impact on metadata quality in the cultural heritage sector. After ten years of experience with digitisation projects in our museums, libraries and archives, a critical reflection is more necessary than ever to assess under which conditions such computerisation projects can offer added value for the documentation of our cultural heritage. This reflection is based, among other things, on a set of representative case studies in an international context. To this end, we present and define an original methodological and conceptual framework concerning the impact of technologies on metadata quality. On this basis, we propose and test three innovative operational approaches to improve the quality of the information systems deployed in the cultural sector. / Doctorat en Information et communication / info:eu-repo/semantics/nonPublished
222

Služby nad digitálními archivy v oblasti vědy a výzkumu / Services over digital archives in science and research

Lochman, Martin January 2015 (has links)
The thesis focuses on various information services over digital archives and institutional repositories in the area of science and research. It aims to identify these online services and to introduce, analyse and assess selected examples in detail, emphasising both their quantitative and qualitative features. The work is divided into four chapters. The first defines the basic terminology and outlines the contemporary status of scholarly communication on the Internet. The second introduces a comprehensive list of services together with a suggested typology. The third chapter constitutes the core of the thesis and presents descriptions and analyses of the selected services. The fourth and final part compares their relevant parameters, gives a final assessment and discusses the perspectives of their future development. Keywords: digital archive, institutional repository, scholarly communication, information services, OAI-PMH, metadata harvesting, open access
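The keywords mention OAI-PMH metadata harvesting, the protocol that services of the kind surveyed here typically use to collect records from repositories. As a minimal, hypothetical sketch (the endpoint URL is a placeholder, not a service from the thesis), an OAI-PMH request is just an HTTP GET with a `verb` and optional parameters:

```python
from urllib.parse import urlencode

def oai_request_url(base_url, verb, **kwargs):
    """Compose an OAI-PMH request URL; the protocol is plain HTTP GET
    with query parameters such as verb, metadataPrefix, and from."""
    params = {"verb": verb, **kwargs}
    return f"{base_url}?{urlencode(params)}"

# List Dublin Core records added since a given date.
url = oai_request_url("https://example.org/oai", "ListRecords",
                      metadataPrefix="oai_dc", **{"from": "2015-01-01"})
print(url)
```

A harvester would fetch this URL, parse the XML response, and follow `resumptionToken`s to page through the full record set.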
223

Towards a comprehensive functional layered architecture for the Semantic Web

Gerber, Aurona J. 30 November 2006 (has links)
The Semantic Web, as the foreseen successor of the current Web, is envisioned to be a semantically enriched information space usable by machines or agents that perform sophisticated tasks on behalf of their users. The realisation of the Semantic Web prescribes the development of a comprehensive and functional layered architecture for the increasingly semantically expressive languages that it comprises. A functional architecture is a model specified at an appropriate level of abstraction, identifying system components based on required system functionality, whilst a comprehensive architecture is one founded on established design principles within Software Engineering. Within this study, an argument is formulated for the development of a comprehensive and functional layered architecture through the development of a Semantic Web status model, the extraction of the functions of established Semantic Web technologies, and the development of an evaluation mechanism for layered architectures compiled from design principles and fundamental features of layered architectures. In addition, an initial version of such a comprehensive and functional layered architecture for the Semantic Web is constructed from the building blocks described above, and this architecture is applied to several scenarios to establish its usefulness. In conclusion, based on the evidence collected as a result of the research in this study, it is possible to justify the development of an architectural model, or more specifically, a comprehensive and functional layered architecture for the languages of the Semantic Web. / Computing / PHD (Computer Science)
224

THE APPLICATION OF OBJECT-ORIENTED DATA MANAGEMENT TECHNIQUES TO T&E DATA CHALLENGES

Dawson, Dan 10 1900 (has links)
ITC/USA 2006 Conference Proceedings / The Forty-Second Annual International Telemetering Conference and Technical Exhibition / October 23-26, 2006 / Town and Country Resort & Convention Center, San Diego, California / This paper describes an adaptive data management architecture capable of supporting order-of-magnitude data volume increases without a priori knowledge of data structures. The architecture allows users to generate and maintain data in optimal legacy formats while managing and extracting information with common analysis tools. This paper shows how an object-oriented data management system can manage both data and the knowledge imparted to the data by users.
225

Refining metadata for genre archives : a case of travel archives

Arnold, Leroy Joe 11 1900 (has links)
Information Science / D.Litt. et Phil. (Information Science)
226

MEASUREMENT-CENTRIC DATA MODEL FOR INSTRUMENTATION CONFIGURATION

Malatesta, William, Fink, Clay 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / CTEIP has launched the integrated Network Enhanced Telemetry (iNET) project to foster advances in networking and telemetry technology to meet emerging needs of major test programs. In the past these programs have been constrained by vendor proprietary equipment configuration utilities that force a significant learning curve on the part of instrumentation personnel to understand hardware idiosyncrasies and require significant human interaction and manipulation of data to be exchanged between different components of the end-to-end test system. This paper describes an ongoing effort to develop a measurement-centric data model of airborne data acquisition systems. The motivation for developing such a model is to facilitate hardware and software interoperability and to alleviate the need for vendor-specific knowledge on the part of the instrumentation engineer. This goal is driven by requirements derived from scenarios collected by the iNET program. This approach also holds the promise of decreased human interaction with and manipulation of data to be exchanged between system components.
227

Generic Metadata Handling in Scientific Data Life Cycles

Grunzke, Richard 11 May 2016 (has links) (PDF)
Scientific data life cycles define how data is created, handled, accessed, and analyzed by users. Such data life cycles become increasingly sophisticated as the sciences they serve grow more demanding and complex with the coming advent of exascale data and computing. The overarching data life cycle management background includes multiple abstraction categories: data sources, data and metadata management, computing and workflow management, security, data sinks, and methods to enable utilization. The challenges in this context are manifold. One is to hide this complexity from the user and to enable the seamless use of resources for usability and efficiency. Another is to enable generic metadata management that is not restricted to one use case but can be adapted to further use cases with limited effort. Metadata management is essential for scientists to save time by avoiding the need to manually keep track of data, for example of its content and location. As the number of files grows into the millions, managing data without metadata becomes increasingly difficult. Thus, the solution is to employ metadata management to enable the organization of data based on information about it. Previously, use cases tended to support only highly specific metadata management, or none at all. Now, a generic metadata management concept is available that can be used to efficiently integrate metadata capabilities with use cases. The concept was implemented within the MoSGrid data life cycle, which enables molecular simulations on distributed HPC-enabled data and computing infrastructures. The implementation enables easy-to-use and effective metadata management. Automated extraction, annotation, and indexing of metadata were designed, developed, and integrated, and search capabilities are provided via a seamless user interface. Further analysis runs can be started directly from search results.
A complete evaluation of the concept, both in general and along the example implementation, is presented. In conclusion, the generic metadata management concept advances the state of the art in scientific data life cycle management.
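The extract-index-search pattern the abstract describes can be sketched generically. The following is a hypothetical toy illustration, not MoSGrid's actual implementation: a simple extractor derives keyword metadata from file content, and an inverted index makes files findable by content rather than location.

```python
from collections import defaultdict

def extract_metadata(name, text):
    """Toy extractor: derive simple key-value metadata from file content."""
    return {"name": name,
            "lines": text.count("\n") + 1,
            "keywords": sorted({w.lower() for w in text.split() if len(w) > 6})}

class MetadataIndex:
    """Inverted index mapping keyword -> file names, the core structure
    behind search-driven data organization."""
    def __init__(self):
        self.by_keyword = defaultdict(set)
        self.records = {}

    def add(self, meta):
        self.records[meta["name"]] = meta
        for kw in meta["keywords"]:
            self.by_keyword[kw].add(meta["name"])

    def search(self, keyword):
        return sorted(self.by_keyword.get(keyword.lower(), set()))

index = MetadataIndex()
index.add(extract_metadata("run1.log", "molecular simulation converged after 200 steps"))
index.add(extract_metadata("run2.log", "simulation diverged\nrestart required"))
print(index.search("simulation"))
```

In a real data life cycle the extraction step would be automated per file format and the index held in a search engine rather than in memory, but the shape of the workflow is the same.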
228

Social knowledge creation and emergent digital research infrastructure for early modern studies

Powell, Daniel James 02 May 2016 (has links)
This dissertation examines the creation of innovative scholarly environments, publications, and resources in the context of the social knowledge creation affordances engendered by digital technologies. It draws on theoretical and praxis-oriented work undertaken as part of the Electronic Textual Cultures Laboratory (ETCL), work that sought to model how a socially aware and interconnected domain of scholarly inquiry might operate. It examines and includes two digital projects that provide a way to interrogate the meaning of social knowledge creation as it relates to early modern studies. These digital projects – A Social Edition of the Devonshire Manuscript (BL Add. 17,492) and the Renaissance Knowledge Network – approach the social in three primary ways: as a quality of material textuality, deriving from the editorial theories of D. F. McKenzie and Jerome McGann; as a type of knowledge work that digital technologies can facilitate; and as a function of consciously designed platforms and tools emerging from the digital humanities. In other words, digital humanities practitioners are uniquely placed to take what has until now been customarily an analytical category and enact or embed it in a practical, applied way. The social is simultaneously a theoretical orientation and a way of designing and making digital tools, an act which in turn embeds such a theoretical framework in the material conditions of knowledge production. Digital humanists have sought to explain and often re-contextualise how knowledge work occurs in the humanities; as such, they form a body of scholarship that undergirds and enriches the present discussion around how the basic tasks of humanities work (research, discovery, analysis, publication, editing) might alter in the age of Web 2.0 and 3.0.
Through sustained analysis of A Social Edition of the Devonshire Manuscript (BL Add. 17,492) and the Renaissance Knowledge Network, this dissertation argues that scholarly communication is shifting from a largely individualistic, single-author system of traditional peer-reviewed publication to a broadly collaborative, socially invested ecosystem of peer production and public-facing digital production. Further, it puts forward the idea that the insights gained from these long-term digital humanities projects – the importance of community investment and maintenance in social knowledge projects, of building resources consonant with disciplinary expectations and norms, and of transparency and consultation in project development – are applicable more widely to shifting norms in scholarly communications. These insights and specific examples may change patterns of behaviour that govern how humanities scholars act within a densely interwoven digital humanities. This dissertation is situated at the intersection of digital humanities, early modern studies, and discussions of humanities knowledge infrastructure. In content it reports on and discusses two major digital humanities projects, putting a number of previously peer-reviewed, collaboratively authored publications in conversation with each other and with the field at large. As the introduction discusses, each chapter other than the introduction and conclusion originally stood on its own. Incorporating previously published, peer-reviewed materials from respected journals, as well as grants, white papers, and working group documents, this project represents a departure from the proto-monograph model of dissertation work prevalent in the humanities in the United States and Canada. Each component chapter notes my role as author; for the majority of the included material, I acted as lead author or project manager, coordinating small teams of makers and writers.
In form this means that the following intervenes in discussions surrounding graduate training and professionalization. Instead of taking the form of a cohesive monograph, this project is grounded in four years of theory and practice that closely resemble dissertations produced in the natural sciences. / Graduate
229

Complete Vendor-Neutral Instrumentation Configuration with IHAL and TMATS XML

Hamilton, John, Darr, Timothy, Fernandes, Ronald, Sulewski, Joe, Jones, Charles 10 1900 (has links)
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / Previously, we have presented an approach to achieving standards-based multi-vendor hardware configuration using the Instrumentation Hardware Abstraction Language (IHAL) and an associated Application Programming Interface (API) specification. In this paper, we extend this approach to include support for configuring PCM formats. This capability is an appropriate fit for IHAL since changes to hardware settings can affect the current telemetry format and vice versa. We describe extensions made to the IHAL API in order to support this capability. Additionally, we show how complete instrumentation configurations can be described using an integrated IHAL and TMATS XML. Finally, we describe a demonstration of this capability implemented for data acquisition hardware produced by L-3 Telemetry East.
230

Integration of relational database metadata and XML technology to develop an abstract framework to generate automatic and dynamic web entry forms

Elsheh, Mohammed Mosbah January 2009 (has links)
Developing interactive web application systems requires a large amount of effort in designing the database, the system logic and the user interface. These tasks are expensive and error-prone. Web application systems are accessed and used by many different sets of people with different backgrounds and numerous demands. Meeting these demands requires frequent updating of Web application systems, which is a very costly process. Thus, many attempts have been made to automate, to some degree, the construction of Web user interfaces. Three main directions have been cited for this purpose. The first suggested generating user interfaces from the application's data model. This path was able to generate the static layout of user interfaces, with dynamic behaviour specified programmatically. The second suggested deploying the domain model to generate both the layout of a user interface and its dynamic behaviour. Web applications built on this approach are most useful for domain-specific interfaces with a relatively fixed user dialogue. The last direction adopted the notion of deploying database metadata to develop dynamic user interfaces. Although the notion was quite valuable, its deployment did not present a generic solution for generating a variety of types of dynamic Web user interface targeting several platforms and electronic devices. This thesis follows the latter direction and presents significant improvements on its current deployment. It aims to contribute towards the development of an abstract framework to generate abstract and dynamic Web user interfaces not targeted at any particular domain or platform. To achieve this target, the thesis proposes and evaluates a general notion for implementing a prototype system that uses an internal model (i.e. database metadata) in conjunction with XML technology.
Database metadata is richer than any external model and provides the information needed to build dynamic user interfaces. In addition, XML technology has become the mainstream means of presenting and storing data in an abstract structure. It is widely adopted in the Web development community because of its ability to be transformed into many different formats with little effort. This thesis finds that only Java provides a generalised framework based on database metadata; other programming languages place restrictions on accessing and extracting database metadata from the numerous database management systems. Consequently, Java Servlets and a relational database were used to implement the proposed framework, with Java Database Connectivity (JDBC) bridging the two technologies. The implementation of the proposed approach shows that it is possible and very straightforward to produce different automatic and dynamic Web entry forms that are not targeted at any platform. In addition, this approach can be applied to a particular domain without affecting the main notion or framework architecture. The implemented approach demonstrates a number of advantages over approaches based on external or internal models.
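The thesis implements this with Java Servlets and JDBC; as a language-neutral illustration of the underlying idea, here is a hypothetical sketch using Python's built-in sqlite3 instead (the `author` table and its columns are invented for the example). The form is generated entirely from the database's own catalogue metadata, with no hand-written description of the table:

```python
import sqlite3

# Hypothetical schema standing in for a real application database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE author (id INTEGER PRIMARY KEY, "
             "name TEXT NOT NULL, birth_year INTEGER)")

def generate_entry_form(conn, table):
    """Build an HTML entry form purely from the database's metadata,
    mirroring the internal-model approach described in the abstract."""
    columns = conn.execute(f"PRAGMA table_info({table})").fetchall()
    fields = []
    for cid, name, col_type, notnull, default, pk in columns:
        if pk:  # primary keys are assigned by the database, not entered by users
            continue
        input_type = "number" if col_type.upper() == "INTEGER" else "text"
        required = " required" if notnull else ""
        fields.append(f'  <label>{name}: '
                      f'<input type="{input_type}" name="{name}"{required}></label>')
    return (f'<form action="/insert/{table}" method="post">\n'
            + "\n".join(fields) + "\n</form>")

print(generate_entry_form(conn, "author"))
```

Because the form derives from the catalogue, a schema change (a new column, a changed NOT NULL constraint) is reflected in the generated form automatically, which is the maintenance advantage the thesis claims for the internal-model direction.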
