  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Geschäftsmodelle für Grid Computing in der Medizin und der Biomedizin [Business Models for Grid Computing in Medicine and Biomedicine]

Scholz, Stefan January 2009 (has links)
Also published as: Hannover, University, doctoral dissertation, 2009.
2

Resource allocation in the Grid: a market engineering approach

Schnizler, Björn. January 2007 (has links)
Also published as: Karlsruhe, University, doctoral dissertation, 2007.
3

Community driven data grids

Scholl, Tobias. Unknown Date (has links)
Munich, Technical University, doctoral dissertation, 2010.
4

Einsatz von Risikomanagement bei der Steuerung von Grid-Systemen: eine Analyse von Versicherungen anhand einer simulierten Grid-Ökonomie [Use of Risk Management in the Control of Grid Systems: An Analysis of Insurance Schemes in a Simulated Grid Economy]

Streitberger, Werner January 2009 (has links)
Bayreuth, University, doctoral dissertation, 2009.
5

LHCb data management on the computing grid

Smith, Andrew Cameron January 2009 (has links)
The LHCb detector is one of the four experiments being built to harness the proton-proton collisions provided by the Large Hadron Collider (LHC) at the European Organisation for Nuclear Research (CERN). The data rate expected when the LHC experiments are fully operational eclipses that of any previous scientific experiment and has motivated the adoption of a grid computing paradigm to store and process the data. Managing petabytes of data in a distributed environment poses a rich set of challenges related to scalability, reliability and performance. This thesis presents the data management requirements for executing the workload of the LHCb collaboration. We present the systems designed to support all aspects of grid data management for LHCb, from data transfer to data integrity and efficient data access. The distributed computing environment is inherently unstable, and much focus has been placed on providing systems that are robust and resilient to observed failures.
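The data-management themes in this abstract (transfer, integrity checking, resilience to failure) reduce to a verify-and-retry pattern. The sketch below is a minimal, hypothetical Python illustration of that pattern only; it is not the LHCb implementation, and the copy_replica/checksum callbacks and the file names in the commented usage line are placeholders for whatever transfer and checksum tools a real grid deployment provides.

```python
import hashlib
import shutil
import time

def file_checksum(path, algo="sha256", chunk=1 << 20):
    """Stream a file through a hash; stands in for a storage-element checksum."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def replicate_with_verification(copy_replica, checksum, source, destination,
                                max_attempts=3, backoff_s=1.0):
    """Copy a file and verify its integrity, retrying on transient failures.

    copy_replica(source, destination) and checksum(path) are caller-supplied
    callbacks, so the same loop could wrap a real grid transfer tool.
    """
    expected = checksum(source)
    for attempt in range(1, max_attempts + 1):
        try:
            copy_replica(source, destination)
            if checksum(destination) == expected:
                return True                      # transfer verified
        except OSError:
            pass                                 # treat as a transient failure
        time.sleep(backoff_s * attempt)          # back off before retrying
    return False                                 # give up; caller reschedules

# Local stand-in for a grid transfer: plain file copy plus hash verification.
# ok = replicate_with_verification(shutil.copyfile, file_checksum,
#                                  "raw_event_data.bin", "replica.bin")
```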
6

Traitement et analyse de grands ensembles d'images médicales [Processing and Analysis of Large Sets of Medical Images]

Montagnat, Johan 20 December 2006 (has links) (PDF)
Not available.
7

Towards Grid-Wide Modeling and Simulation

Xie, Yong, Teo, Yong Meng, Cai, W., Turner, S. J. 01 1900 (has links)
Modeling and simulation permeate all areas of business, science and engineering. As simulations grow in scale and complexity, they require large amounts of computational resources and collaborative model development, since multiple parties may be involved in the development process. The Grid provides a platform for coordinated resource sharing and application development and execution. In this paper, we survey existing technologies in modeling and simulation, and we focus on interoperability and composability of simulation components for both simulation development and execution. We also present our recent work on an HLA-based simulation framework on the Grid, and discuss the issues involved in achieving composability. / Singapore-MIT Alliance (SMA)
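Composability of independently developed simulation components is the central technical point of this abstract. The sketch below is a generic, assumed illustration of the idea: components implement a shared time-stepped interface and a coordinator exchanges their published outputs each step. It is not the HLA RTI API or the authors' framework; all class and parameter names are invented for the example.

```python
from abc import ABC, abstractmethod

class SimulationComponent(ABC):
    """Minimal time-stepped component contract; a stand-in for the richer
    federate interfaces discussed in the abstract."""

    @abstractmethod
    def advance(self, t: float, dt: float) -> dict:
        """Advance internal state to time t + dt and publish outputs."""

    def receive(self, inputs: dict) -> None:
        """Accept outputs published by other components (default: ignore)."""

class WorkloadModel(SimulationComponent):
    def advance(self, t, dt):
        # Square-wave arrival pattern, purely illustrative.
        return {"arrival_rate": 8.0 if int(t) % 20 < 10 else 2.0}

class QueueModel(SimulationComponent):
    def __init__(self, service_rate: float):
        self.backlog, self.service_rate, self.arrivals = 0.0, service_rate, 0.0

    def receive(self, inputs: dict) -> None:
        self.arrivals = inputs.get("arrival_rate", 0.0)

    def advance(self, t, dt):
        self.backlog = max(0.0, self.backlog + (self.arrivals - self.service_rate) * dt)
        return {"backlog": self.backlog}

def run_composed(components, t_end=50.0, dt=1.0):
    """Step all components in lockstep and exchange their published outputs."""
    t, outputs = 0.0, {}
    while t < t_end:
        for c in components:
            c.receive(outputs)          # deliver last step's outputs
        merged = {}
        for c in components:
            merged.update(c.advance(t, dt))
        outputs, t = merged, t + dt
    return outputs

print(run_composed([WorkloadModel(), QueueModel(service_rate=6.0)]))
```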
8

Beyond music sharing: an evaluation of peer-to-peer data dissemination techniques in large scientific collaborations

Al Kiswany, Samer 05 1900 (has links)
The avalanche of data from scientific instruments and the ensuing interest from geographically distributed users to analyze and interpret it accentuates the need for efficient data dissemination. An optimal data distribution scheme will find the delicate balance between conflicting requirements of minimizing transfer times, minimizing the impact on the network, and uniformly distributing load among participants. We identify several data distribution techniques, some successfully employed by today's peer-to-peer networks: staging, data partitioning, orthogonal bandwidth exploitation, and combinations of the above. We use simulations to explore the performance of these techniques in contexts similar to those used by today's data-centric scientific collaborations and derive several recommendations for efficient data dissemination. Our experimental results show that the peer-to-peer solutions that offer load balancing and good fault tolerance properties and have embedded participation incentives lead to unjustified costs in today's scientific data collaborations deployed on over-provisioned network cores. However, as user communities grow and these deployments scale, peer-to-peer data delivery mechanisms will likely outperform other techniques.
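To make the comparison of dissemination techniques concrete, here is a back-of-the-envelope model contrasting central staging with partitioned, peer-assisted delivery. The cost model and all bandwidth numbers are illustrative assumptions, not the simulator or results reported in the thesis; the sketch only shows why peer-to-peer delivery tends to win as the number of participants grows.

```python
def staging_time(file_gb, peers, server_gbps=10.0):
    """Every peer pulls the whole file from one staging server whose uplink
    is shared among them; a deliberately rough model."""
    per_peer_gbps = server_gbps / peers            # server uplink is the bottleneck
    return file_gb * 8 / per_peer_gbps             # seconds

def partitioned_time(file_gb, peers, server_gbps=10.0, peer_gbps=1.0):
    """The server seeds each peer a distinct partition; peers then exchange
    partitions among themselves (orthogonal-bandwidth, BitTorrent-style)."""
    seed = file_gb * 8 / server_gbps                          # whole file leaves the server once
    exchange = file_gb * 8 * (peers - 1) / peers / peer_gbps  # fetch the remaining chunks from peers
    return seed + exchange

for peers in (4, 16, 64, 256):
    s, p = staging_time(100, peers), partitioned_time(100, peers)
    print(f"{peers:4d} peers: staging {s:9.1f} s   partitioned {p:9.1f} s")
```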
9

An e-Science Approach to Genetic Analysis of Quantitative Traits

Jayawardena, Mahen January 2010 (has links)
Many important traits in plants, animals and humans are quantitative, and most such traits are generally believed to be affected by multiple genetic loci. Standard computational tools for mapping of quantitative traits (i.e. for finding Quantitative Trait Loci, QTL, in the genome) use linear regression models for relating the observed phenotypes to the genetic composition of individuals in an experimental population. Using these tools to simultaneously search for multiple QTL is computationally demanding. The main reason for this is the complex optimization landscape for the multidimensional global optimization problems that must be solved. This thesis describes parallel algorithms, implementations and tools for simultaneous mapping of several QTL. These new computational tools enable genetic analysis exploiting new classes of multidimensional statistical models, potentially yielding interesting results in genetics. We first describe how the standard, brute-force algorithm for global optimization in QTL analysis is parallelized and implemented on a grid system. We then present a parallelized version of the more elaborate global optimization algorithm DIRECT and show how this can be efficiently deployed and used on grid systems and other loosely-coupled architectures. The parallel DIRECT scheme is further developed to exploit both coarse-grained parallelism in grid systems or clusters and fine-grained, tightly-coupled parallelism in multi-core nodes. The results show that excellent speedup and performance can be achieved on grid systems and clusters, even when using a tightly-coupled algorithm such as DIRECT. Finally, we provide two distinctly different front-ends for our code. One is a grid portal providing a graphical front-end suitable for novice users and standard forms of QTL analysis. The other is a prototype of an R-based grid-enabled problem solving environment. Both of these front-ends can, after some further refinement, be utilized by geneticists for performing multidimensional genetic analysis of quantitative traits on a regular basis. / eSSENCE
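The brute-force kernel described here, fitting a linear regression at every point of a two-dimensional grid of locus pairs, can be sketched in a few lines. The example below uses simulated data and plain NumPy least squares; it is an assumed illustration of the search structure only, not the thesis code, its statistical models, or the DIRECT algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated example data: 200 individuals, genotype values at 50 loci, and a
# phenotype driven by two loci (12 and 37) plus noise. All numbers are
# illustrative, not taken from the thesis.
n_ind, n_loci = 200, 50
genotypes = rng.random((n_ind, n_loci))
phenotype = 4.0 * genotypes[:, 12] + 3.0 * genotypes[:, 37] + rng.normal(0.0, 0.5, n_ind)

def rss_for_pair(i, j):
    """Residual sum of squares of a linear model with loci i and j as regressors."""
    X = np.column_stack([np.ones(n_ind), genotypes[:, i], genotypes[:, j]])
    _, residuals, _, _ = np.linalg.lstsq(X, phenotype, rcond=None)
    return residuals[0]

# Exhaustive scan of the two-dimensional search space. Each (i, j) cell is
# independent, which is what makes the coarse-grained grid/cluster
# parallelization described in the thesis natural.
best = min((rss_for_pair(i, j), i, j)
           for i in range(n_loci) for j in range(i + 1, n_loci))
print(f"best pair: loci {best[1]} and {best[2]} with RSS {best[0]:.1f}")
```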
