31

A Parallel Multidimensional Weighted Histogram Analysis Method

Potgieter, Andrew 01 January 2014 (has links)
The Weighted Histogram Analysis Method (WHAM) is a technique used to calculate free energy from molecular simulation data. WHAM recombines biased distributions of samples from multiple Umbrella Sampling simulations to yield an estimate of the global unbiased distribution. The WHAM algorithm iterates two coupled, non-linear equations until convergence at an acceptable level of accuracy. The equations have quadratic time complexity for a single reaction coordinate, but this cost grows exponentially with the number of reaction coordinates under investigation, which makes multidimensional WHAM a computationally expensive procedure. There is potential to use general-purpose graphics processing units (GPGPUs) to accelerate the execution of the algorithm. Here we develop and evaluate a multidimensional GPGPU WHAM implementation to investigate the potential speed-up attained over its CPU counterpart. In addition, to avoid the cost of multiple Molecular Dynamics simulations and to validate the implementations, we develop a test system that generates samples analogous to those from Umbrella Sampling simulations. We observe a maximum, problem-size-dependent speed-up of approximately 19x for the GPGPU-optimized WHAM implementation over our single-threaded, CPU-optimized version. We find that the WHAM algorithm is amenable to GPU acceleration, which provides the means to study ever more complex molecular systems in less time.
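For reference, the two coupled equations the abstract refers to take the following standard form in the umbrella-sampling literature (sketched here for a single reaction coordinate, not copied from the thesis: S is the number of simulations, n_i(ξ) the biased histogram count from simulation i, N_i its total sample count, U_i its biasing potential, F_i its free-energy shift, and β = 1/k_B T):

$$
P(\xi) = \frac{\sum_{i=1}^{S} n_i(\xi)}{\sum_{j=1}^{S} N_j \, e^{\beta \left( F_j - U_j(\xi) \right)}},
\qquad
e^{-\beta F_i} = \sum_{\xi} P(\xi) \, e^{-\beta U_i(\xi)}
$$

Each iteration updates P(ξ) from the current F_i estimates and then updates the F_i from the new P(ξ), repeating until the F_i change by less than a chosen tolerance.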
32

Real-time Generation of Procedural Forests

Kenwood, Julian 01 January 2014 (has links)
The creation of 3D models for games and simulations is generally a time-consuming and labour-intensive task. Forested landscapes are an important component of many large virtual environments in games and film, but creating the many individual tree models a forest requires takes a large number of artists and a great deal of time. To reduce modelling time, procedural methods are often used: such methods allow tree models to be created automatically and relatively quickly, albeit at potentially reduced quality. Although this process is faster than manual creation, it can still be slow and resource-intensive for large forests. The main contribution of this work is an efficient procedural generation system for creating large forests. Our system uses L-Systems, a grammar-based procedural technique, to generate each tree. We explore two approaches to accelerating the creation of large forests. First, we demonstrate performance improvements for the creation of individual trees in the forest by reducing the computation required by the underlying L-Systems. Second, we reduce the memory overhead by sharing geometry between trees using a novel branch-instancing approach. Test results show that our scheme significantly improves the speed of forest generation over naive methods: our system generates over 100,000 trees in approximately 2 seconds while using a modest amount of memory. With respect to improving L-System processing, one of our methods achieves a 25% speed-up over traditional methods at the cost of a small amount of additional memory, while the second achieves a 99% reduction in memory at the expense of a small amount of extra processing.
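To make the grammar-based technique concrete, the sketch below shows minimal L-System string rewriting, the expansion step whose cost the thesis reduces. This is an editorial illustration, not the thesis's implementation; the axiom and rules are hypothetical examples of a common bracketed tree grammar.

```python
# Minimal bracketed L-System expansion: a hedged sketch, not the thesis's code.
def expand(axiom: str, rules: dict[str, str], iterations: int) -> str:
    """Apply all rewrite rules in parallel for the given number of steps."""
    s = axiom
    for _ in range(iterations):
        # Every symbol is rewritten simultaneously; symbols without a rule
        # (e.g. '+', '-', '[', ']') are copied through unchanged.
        s = "".join(rules.get(c, c) for c in s)
    return s

if __name__ == "__main__":
    # A common tree-like example: 'F' draws a branch segment, '+'/'-' turn,
    # '[' and ']' push/pop turtle state (a branching point).
    tree = expand(axiom="X",
                  rules={"X": "F[+X]F[-X]+X", "F": "FF"},
                  iterations=4)
    print(len(tree), tree[:60], "...")
```

The exponential growth of the string under repeated rewriting is what makes per-tree optimization and geometry sharing worthwhile at forest scale.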
33

A GPU-Based Level of Detail System for the Real-Time Simulation and Rendering of Large-Scale Granular Terrain

Leach, Craig 01 June 2014 (has links)
Real-time computer games and simulations often contain large virtual outdoor environments, of which terrain forms an important part. This terrain may consist of various granular materials, such as sand, rubble, and rocks. Previous approaches to rendering such terrains rely on simple textured geometry, with little to no support for dynamic interactions. Recently, particle-based granular terrain simulations have emerged as an alternative: these systems represent the individual granules with particles and exhibit realistic, physically correct interactions with dynamic objects. However, they are extremely computationally expensive and may thus feasibly be used to simulate only small areas of terrain. To overcome this limitation, this thesis builds upon a previously created particle-based granular terrain simulation by integrating it with a heightfield-based terrain system, creating a level-of-detail system for simulating large-scale granular terrain. The particle-based system represents areas of terrain around dynamic objects, whereas the heightfield-based terrain is used elsewhere. This allows large-scale granular terrain to be simulated in real-time, with physically correct dynamic interactions, and is made possible by a novel system that converts terrain from one representation to the other in real-time while preserving changes made in the particle-based system within the heightfield-based system. The system also allows updates to particle systems to be paused, creating the illusion that more particle systems are active than actually are. We show that the system is capable of simulating and rendering multiple particle-based simulations across a large-scale terrain whilst maintaining real-time performance. However, the number of particles used, and thus the number of particle-based simulations which may be used, is limited by the computational resources of the GPU.
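As a rough sketch of the heightfield side of such a representation switch (our illustration, not the thesis's GPU implementation; the grid dimensions and the keep-the-maximum rule are assumptions), particle positions can be binned into a grid with each cell recording its highest particle:

```python
import numpy as np

# Hedged illustration of converting a particle region back to a heightfield:
# bin each particle into a grid cell and keep the highest surface per cell.
def particles_to_heightfield(positions: np.ndarray, cell: float,
                             nx: int, nz: int) -> np.ndarray:
    """positions: (N, 3) array of particle (x, y, z) coordinates; y is up."""
    height = np.zeros((nx, nz))
    ix = np.clip((positions[:, 0] / cell).astype(int), 0, nx - 1)
    iz = np.clip((positions[:, 2] / cell).astype(int), 0, nz - 1)
    # np.maximum.at accumulates correctly even with repeated (ix, iz) indices.
    np.maximum.at(height, (ix, iz), positions[:, 1])
    return height

# Example: 10,000 random particles over a hypothetical 64x64 grid, 1-unit cells.
pts = np.random.rand(10_000, 3) * [64.0, 5.0, 64.0]
hf = particles_to_heightfield(pts, cell=1.0, nx=64, nz=64)
print(hf.shape, hf.max())
```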
34

Performance Study and Dynamic Optimization Design for Thread Pool Systems

Dongping Xu January 2004 (has links)
19 Dec 2004. Published through the Information Bridge: DOE Scientific and Technical Information, report IS-T 2359. Also available in paper and microfiche from NTIS.
35

Benchmarking More Aspects of High Performance Computing

Rahul Ravindrudu January 2004 (has links)
Thesis (M.S.); submitted to Iowa State Univ., Ames, IA (US); 19 Dec 2004. Published through the Information Bridge: DOE Scientific and Technical Information, report IS-T 2196. Sponsored by the US Department of Energy. Also available in paper and microfiche from NTIS.
36

Dos pontapés na bola aos pontapés no Direito: para um entendimento do direito do desporto (From Kicks at the Ball to Kicks at the Law: Towards an Understanding of Sports Law)

Nolasco, Carlos Manuel Simões January 1999 (has links)
No description available.
37

Utilization of Personal Health Informatics Through Intermediaries

Katule, N.A. 01 July 2018 (has links)
Personal informatics are important tools in health self-management, as they support individuals in quantifying and reflecting on their lifestyle. Human-computer interaction researchers have devoted considerable resources to studying how to design such tools, and various motivational strategies have been explored for their capability to improve user engagement. However, such strategies are developed on the assumption that the targeted consumer of the information is the person directly manipulating the system's user interface. This is not always the case for users in developing regions. Such systems may therefore not scale well in contexts where the targeted consumer (the beneficiary) uses technology through the facilitation of another person (the intermediary) who manipulates the user interface on their behalf: these facilitators are not recognized as part of the system, so motivational strategies do not cater for them. To uncover design implications for intermediated technology use in the context of personal health informatics (PHI), the researcher started with a theoretical framing of the work, followed by a contextual enquiry that led to the development of prototype mobile applications for tracking nutrition and physical activity. Evaluation of the prototypes revealed that a familial relationship is a prerequisite for such an intervention; the most promising combination involves family members, possibly a child and a parent, working together. The study used self-determination theory to understand how a collaborative gamified system can increase engagement. The results revealed that gamification was the source of a significant increase in perceived competence among intermediary users, who also tended to consider themselves co-owners of the interaction experience. Gamification was therefore found to be a catalyst for increasing collaboration between the intermediary and beneficiary users of the technology, provided the two users forming a pair had a prior social relationship. In the absence of gamification, intermediary users tended to be less engaged in the intervention. The study highlights both the positive and negative aspects of gamification in promoting collaboration in intermediated use, along with its general implications in health settings, and proposes design considerations for improving the overall user experience of both users involved. In general, this work contributes to both the theory and the empirical validation of factors supporting proximate-enabled intermediated use of personal health informatics.
38

Maschinelle Generierung von Empfehlungen zur Lehr-/Lernunterstützung im Hochschulkontext (Machine Generation of Recommendations for Teaching and Learning Support in the Higher-Education Context)

Engelbert, Benedikt 20 April 2017 (has links)
The Internet, where information can be retrieved quickly and conveniently, has lastingly changed life in the information age. The benefits of easy access to data, services, and information are as tangible as the ever-growing volume of offerings, which makes the Internet as diverse as it is hard to survey. Selecting suitable offerings is a frequent problem for users, for which supporting systems known as recommender services have become established. Recommender services help select information based on user preferences or needs in a wide range of application contexts. Online retail, where recommender services suggest interesting or useful items, is one of the most popular and common application scenarios. Learning, too, has been strongly shaped by the Internet in recent years. A learning management system (LMS) is now a common standard in the university context, which has greatly simplified the distribution of and access to digital learning materials. The Internet gives learners the opportunity to expand their repertoire of learning materials, which, given the volume of available materials and the complexity of learning content in the university context, does not necessarily turn out to be an advantage. A learner faces an oversupply that is difficult to sift through and that, in the worst case, could affect motivation or performance. In any case, a search takes time if an adequate review is to be carried out, even though learning materials are usually already provided by the instructor. This points to a need for supporting systems in digital learning and e-learning that take on an assisting role in the selection of content and materials. This thesis investigates the development of a machine approach for the automated, content-based derivation of recommendations that uncover contextually relevant content in learning materials and link it to alternative materials. To this end, the thesis presents the development of a collaborative tagging approach to meet these objectives. The developed system aims to ease the use of instructors' existing materials and, further, to simplify the process of finding relevant alternative learning materials. Easy integration into teaching is also a development focus, so that extra effort for instructors is avoided and the information derived by the system adds as much value as possible. The thesis also describes the evaluation of the system, carried out in two evaluation scenarios, and discusses the results in relation to comparable approaches.
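As a loose illustration of how tagged materials can be linked by content (a sketch of the general idea only, not the thesis's system; all material names and tags below are hypothetical), tag sets can be compared with Jaccard similarity:

```python
# Hedged sketch of content linking via tag overlap (Jaccard similarity).
def jaccard(a: set[str], b: set[str]) -> float:
    """Ratio of shared tags to all tags across both sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(query_tags: set[str], materials: dict[str, set[str]],
              k: int = 3) -> list[tuple[str, float]]:
    """Return the k materials whose tag sets best overlap the query's."""
    scored = [(name, jaccard(query_tags, tags))
              for name, tags in materials.items()]
    return sorted(scored, key=lambda p: p[1], reverse=True)[:k]

# Hypothetical course materials and their tags.
materials = {
    "slides_sorting.pdf":  {"algorithms", "sorting", "quicksort"},
    "video_recursion.mp4": {"recursion", "algorithms"},
    "exercise_graphs.pdf": {"graphs", "bfs", "algorithms"},
}
print(recommend({"algorithms", "quicksort"}, materials))
```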
39

Computer Modeling of Blowback Oil Consumption in Internal Combustion Engines

Bilge, Egemen 01 September 2009 (has links) (PDF)
Environmental pollution is an important global problem. Governments are aware of this problem, and emission regulations are continuously tightened. Some of the strictest regulations concern burned and unburned hydrocarbon emissions. In internal combustion engines, the burned and unburned hydrocarbons originate from fuel and engine oil. Driven by these sanctions and the need for improved combustion performance, manufacturers work on manufacturing technology and engine tribology; improvements in these areas reduce oil loss from the engine. Engine oil consumption mechanisms are a specific research area in internal combustion engine development. Oil consumption occurs via two main routes: the valve train and the in-cylinder components. The in-cylinder route has three sub-mechanisms: evaporation, ring scraping, and blowback. In this thesis, the blowback oil loss mechanism is studied. A 2D flow model of the piston-cylinder mechanism is developed in Fluent, and land pressures and ring end-gap flow data are taken from this model. An iterative computer program is developed to calculate blowback oil consumption, using an empirical entrainment correlation compiled from the literature. The calculated oil consumption values fall within the range of values reported in the literature.
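The iterative structure described above might be sketched as a simple fixed-point loop. Everything below, including the entrainment correlation, all names, and all numbers, is a hypothetical placeholder rather than the thesis's model, which draws its land pressures and ring end-gap flows from the Fluent simulation:

```python
# Heavily hedged sketch of an iterative blowback oil-consumption estimate.
def entrainment_fraction(gas_flow: float) -> float:
    """Hypothetical empirical correlation: fraction of oil the gas carries."""
    return min(0.02 * gas_flow, 0.01)

def blowback_oil_rate(gas_flow: float, oil_supply: float,
                      tol: float = 1e-12, max_iter: int = 100) -> float:
    """Fixed-point iteration: entrained oil reduces the available supply,
    which feeds back into the next estimate, until the rate converges."""
    rate = 0.0
    for _ in range(max_iter):
        new_rate = entrainment_fraction(gas_flow) * (oil_supply - rate)
        if abs(new_rate - rate) < tol:
            return new_rate
        rate = new_rate
    return rate

print(blowback_oil_rate(gas_flow=0.05, oil_supply=1e-4))  # units hypothetical
```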
40

Application of optimal prediction to molecular dynamics

Barber IV, John Letherman January 2004 (has links)
Thesis (Ph.D.); submitted to the University of California at Berkeley, Berkeley, CA 94720 (US); 1 Dec 2004. Published through the Information Bridge: DOE Scientific and Technical Information, report LBNL-56842. Sponsored by the USDOE Director, Office of Science, Advanced Scientific Computing Research (US). Also available in paper and microfiche from NTIS.
