51

Towards harnessing computational workflow provenance for experiment reporting

Alper, Pinar January 2016 (has links)
We are witnessing the era of Data-Oriented Science, where investigations routinely involve computational data analysis. The research lifecycle has become more elaborate to support the sharing and re-use of scientific data. To establish the veracity of shared data, scientific communities aim to systematise 1) the process of analysing data and 2) the reporting of analyses and results. Scientific workflows are a prominent mechanism for systematising analyses by encoding them as automated processes and documenting process executions with Workflow Provenance. Meanwhile, systematic reporting calls for discipline-specific Experimental Metadata outlining the context of data analysis, such as the source/reference datasets and community resources used, the analytical methods, and their parameter settings. A natural expectation would be that investigations which adopt a systematic, workflow-based approach to analysis are at an advantage at the time of reporting. This premise holds only weakly. While workflow provenance supports the streamlined enactment of analyses and their auditability and verifiability, we conjecture that it makes a limited contribution to reporting. This dissertation focuses on eliciting the apparent disconnect between Workflow Provenance and Experimental Metadata, which we term the provenance gap. We identify complexity, mixed granularity, and genericity as the characteristics of workflow provenance that underlie this gap. In response we develop techniques for provenance abstraction, analysis, and annotation. We argue that workflow provenance is accompanied by implicit information that can be made explicit to inform these techniques. Through empirical evidence we show that workflow steps have common functional characteristics, which we capture in a taxonomy of Workflow Motifs. We show how formally defined Graph Transformations can exploit Motifs to identify causes of complexity in workflows and abstract them into structurally simpler forms. We build on insight from prior research to show how the execution and provenance-collection behaviour of a workflow system can anticipate the granularity characteristics of provenance, and we provide declarative anticipatory rules for the static analysis of workflows of the Taverna system. We observe that scientific context is often available in embedded form within data, and argue that data can be lifted to become metadata by discipline-specific metadata extractors. We outline a framework into which such extractors can be plugged, and provide operators that encapsulate generic procedures for annotating workflow provenance. We implement our techniques with technology-independent provenance models and showcase their benefit using real-world workflows.
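The motif-driven abstraction described in this abstract can be illustrated with a small sketch. The provenance graph, the motif labels, and the node-elimination rule below are hypothetical stand-ins for the dissertation's formally defined graph transformations, assuming a graph in which each node records the workflow step that produced a data item:

```python
# Hypothetical sketch of motif-driven provenance abstraction: steps tagged with
# a "data-preparation" motif (format conversion, filtering, ...) are collapsed
# so that only analytically significant steps remain. Node names, motif labels,
# and the collapse rule are illustrative assumptions, not the thesis's model.
from collections import defaultdict

# provenance graph: edge (a -> b) means step b consumed an output of step a
edges = [("fetch_reference", "convert_format"),
         ("convert_format", "filter_records"),
         ("filter_records", "align_sequences"),
         ("align_sequences", "build_tree")]

# workflow-motif annotation per step (cf. a taxonomy of Workflow Motifs)
motifs = {"fetch_reference": "data-retrieval",
          "convert_format": "data-preparation",
          "filter_records": "data-preparation",
          "align_sequences": "data-analysis",
          "build_tree": "data-analysis"}

def abstract(edges, motifs, hide={"data-preparation"}):
    """Collapse steps whose motif is in `hide`, reconnecting their
    predecessors to their successors (a simple node-elimination rule)."""
    succ, pred = defaultdict(set), defaultdict(set)
    for a, b in edges:
        succ[a].add(b)
        pred[b].add(a)
    for node, motif in motifs.items():
        if motif in hide:
            for p in pred[node]:
                succ[p] |= succ[node]
                succ[p].discard(node)
            for s in succ[node]:
                pred[s] |= pred[node]
                pred[s].discard(node)
            succ.pop(node, None)
            pred.pop(node, None)
    return sorted((a, b) for a, bs in succ.items() for b in bs
                  if motifs[a] not in hide and motifs[b] not in hide)

print(abstract(edges, motifs))
# [('align_sequences', 'build_tree'), ('fetch_reference', 'align_sequences')]
```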
52

RZSweep: A New Volume-Rendering Technique for Uniform Rectilinear Datasets

Chaudhary, Gautam 10 May 2003 (has links)
A great challenge in the volume-rendering field is to achieve high-quality images in an acceptable amount of time; there is always a trade-off between speed and quality. Applications where only high-quality images are acceptable often use the ray-casting algorithm, but this method is computationally expensive and typically achieves low frame rates. The work presented here is RZSweep, a new volume-rendering algorithm for uniform rectilinear datasets that gives high-quality images in a reasonable amount of time. In this algorithm a plane sweeps the vertices of the implicit grid of regular datasets in depth order, projecting all the implicit faces incident on each vertex. The algorithm exploits the inherent properties of rectilinear datasets. RZSweep is an object-order, back-to-front, direct volume rendering, face-projection algorithm for rectilinear datasets using the cell approach. It is a single-processor serial algorithm. The simplicity of the algorithm allows the use of the graphics pipeline for hardware-assisted projection and also, with minimal modification, a version of the algorithm that is independent of graphics hardware. Lighting, color, and various opacity transfer functions are implemented to add realism to the resulting images. Finally, an image comparison is made between RZSweep and a 3D texture-based volume-rendering method using standard image metrics such as Euclidean and geometric differences.
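The core of any object-order, back-to-front method like the one described above is compositing contributions in depth order under an opacity transfer function. The snippet below is a minimal sketch of that compositing step only, not the RZSweep face-projection machinery; the transfer function and the sample values are invented for illustration:

```python
# Minimal sketch of back-to-front compositing under an opacity transfer
# function. The transfer function and sample values are illustrative
# assumptions, not the thesis implementation.

def transfer(scalar):
    """Map a scalar in [0, 1] to (r, g, b, alpha); a simple grey-scale ramp."""
    return (scalar, scalar, scalar, scalar * 0.5)

def composite_back_to_front(samples):
    """Composite samples given back to front: C = C_new*a_new + C_acc*(1 - a_new)."""
    r_acc = g_acc = b_acc = 0.0
    for s in samples:                      # samples ordered farthest-first
        r, g, b, a = transfer(s)
        r_acc = r * a + r_acc * (1.0 - a)
        g_acc = g * a + g_acc * (1.0 - a)
        b_acc = b * a + b_acc * (1.0 - a)
    return (r_acc, g_acc, b_acc)

# one pixel's samples, ordered back to front along the sweep direction
print(composite_back_to_front([0.2, 0.8, 0.4, 0.1]))
```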
53

Visualization of Computer-Modeled Forests for Forest Management

Mohammadi-Aragh, Mahnas Jean 11 December 2004 (has links)
Forest management is a costly and time-consuming activity. Remote sensing has the potential to make the process cheaper and more efficient, but only if the appropriate characteristics can be determined from computer models. This thesis describes the implementation of a forest visualization system and a corresponding user study that tests the accuracy of parameter estimation and forest characterization. The study uses data obtained from field surveys to generate a computer-modeled forest; five different stands were tested. Based on the quantitative results obtained, there is generally no statistically significant difference in parameter estimation between field-recorded movies and computer-generated movies.
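The kind of significance test implied by that comparison can be sketched in a few lines. The paired t-test and the numbers below are assumptions for illustration only; they are not the study's data or its analysis:

```python
# Illustrative paired comparison of parameter estimates obtained from
# field-recorded vs. computer-generated movies of the same stands.
# The values are made up; the study's own data are not reproduced here.
from scipy.stats import ttest_rel

field_estimates     = [310, 275, 420, 198, 350]   # e.g. stems per hectare, five stands
generated_estimates = [305, 290, 410, 205, 340]

t_stat, p_value = ttest_rel(field_estimates, generated_estimates)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")  # p > 0.05 -> no significant difference
```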
54

Waking the dead: Scientific analysis of an Egyptian tunic.

Haldane, E.A., Gillies, Sara, O'Connor, Sonia A., Batt, Catherine M., Stern, Ben January 2009 (has links)
The aim of the research is to identify and help to explain the unusual pattern of staining on the tunic, provide more specific information relating to the tunic's age and provenance and the chronology of alterations, and also inform the conservation decision-making process.
55

The Interim as developmental academic journal

Kokt, D., Lategan, L., Dessels, R. January 2012 (has links)
Published Article / Research has two important objectives: contributing to scientific discourse and identifying solutions for the challenges that societies, government, business, and industry face. Research should be in the public domain, and the publication and presentation of research results are important activities that academics need to engage with. Through publications and presentations, societies are informed of the positive influence and impact that research can bring them. This paper focuses on the importance of publications and on how emerging scholars can be assisted in getting their research published. A case study is presented of the Interim, an in-house academic journal.
56

Seafarers, silk, and science : oceanographic data in the making

Halfmann, Gregor January 2018 (has links)
This thesis comprises an empirical case study of scientific data production in oceanography and a philosophical analysis of the relations between newly created scientific data and the natural world. Based on qualitative interviews with researchers, I reconstruct research practices that lead to the ongoing production of digital data related to long-term developments of plankton biodiversity in the oceans. My analysis is centred on four themes: materiality, scientific representing with data, methodological continuity, and the contribution of non-scientists to epistemic processes. These are critically assessed against the background of today’s data-intensive sciences and increased automation and remoteness in oceanographic practices. Sciences of the world’s oceans have by and large been disregarded in philosophical scholarship thus far. My thesis opens this field for philosophical analysis and reveals various conditions and constraints of data practices that are largely uncontrollable by ocean scientists. I argue that the creation of useful scientific data depends on the implementation and preservation of material, methodological, and social continuities. These allow scientists to repeatedly transform visually perceived characteristics of research samples into meaningful scientific data stored in a digital database. In my case study, data are not collected but result from active intervention and subsequent manipulation and processing of newly created material objects. My discussion of scientific representing with data suggests that scientists do not extract or read any intrinsic representational relation between data and a target, but make data gradually more computable and compatible with already existing representations of natural systems. My arguments shed light on the epistemological significance of materiality, on limiting factors of scientific agency, and on an inevitable balance between changing conditions of concrete research settings and long-term consistency of data practices.
57

The Consideration of Scientific Methodology: Paul Feyerabend's Position of Scientific Rationality

Lee, Lai-Hsing 07 September 2005 (has links)
The thesis places its emphasis on scientific methodology, discussing the thinking of the philosopher of science Paul Feyerabend. It discusses whether or not science is a rational enterprise and reflects on how we think about scientific methodology. The research concludes that, in Feyerabend's view, science does not hold a privileged position: what we call "science" today also follows a conventional methodology. He suggests that we should abandon that fixed methodology and support scientists in doing their research freely, using whatever methodology is suitable, so that human beings can develop more possibilities of knowledge.
58

Voronoi site modeling: a computer model to predict the binding affinity of small flexible molecules.

Richardson, Wendy Westenberg. January 1993 (has links)
Thesis (Ph. D.)--University of Michigan. / eContent provider-neutral record in process. Description based on print version record.
60

Facilitating reproducible computing via scientific workflows – an integrated system approach

Cao, Yuan 04 May 2017 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Reproducible computing and research are of great importance for scientific investigation in any discipline. This thesis presents a general approach to provenance in the context of workflows for widely used scripting languages. Our solution is based on system integration and is demonstrated by integrating MATLAB with VisTrails, an open-source scientific workflow system. The integrated VisTrails-MATLAB system supports reproducible computing with truly prospective and retrospective provenance, at multiple granularity levels as scientists choose for their scripts, and is at the same time very easy to use.
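The abstract's central idea, capturing retrospective provenance around a script invocation inside a workflow step, can be illustrated with a small sketch. The wrapper below is a hypothetical stand-in that does not use the VisTrails or MATLAB APIs; the function names, record layout, and log file are assumptions:

```python
# Hypothetical sketch of retrospective provenance capture around a script call:
# inputs, parameters, outputs, and timing are appended to a JSON-lines log.
# The actual VisTrails-MATLAB integration is not reproduced here.
import hashlib
import json
import time

def file_digest(path):
    """Content hash of a file, so reruns can be checked against recorded inputs."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def run_with_provenance(step_name, script, inputs, params, run_script,
                        log="provenance.json"):
    """Run `run_script(script, inputs, params)` and append a retrospective
    provenance record describing the execution."""
    started = time.time()
    outputs = run_script(script, inputs, params)   # caller-supplied executor
    record = {
        "step": step_name,
        "script": script,
        "inputs": {p: file_digest(p) for p in inputs},
        "parameters": params,
        "outputs": outputs,
        "started": started,
        "duration_s": time.time() - started,
    }
    with open(log, "a") as f:
        f.write(json.dumps(record) + "\n")
    return outputs
```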
