11

Das FEA-Assistenzsystem – Analyseteil FEdelM

Spruegel, Tobias C., Wartzack, Sandro 10 December 2016 (has links) (PDF)
Simulation-based validation of products in the early phases of product development is becoming increasingly important in order to meet the demand for ever greater efficiency. Since the supply of simulation engineers with many years of professional experience is limited, less experienced simulation users need support in carrying out meaningful finite element simulations. In this contribution, the authors present the concept of the analysis part FEdelM of an FEA assistance system, which checks structural-mechanical finite element (FE) simulations for plausibility and detects and corrects occurring errors as automatically as possible. The individual modules and their links to one another and to other applications are presented.
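The abstract describes the plausibility checks only at the concept level. As a hedged illustration of what one such automated check might look like (not FEdelM's actual implementation; the FEResult container, its field names, and all thresholds are hypothetical), a minimal Python sketch:

    from dataclasses import dataclass

    # Hypothetical container for results exported from an FE solver;
    # FEdelM's actual data model is not described in the abstract.
    @dataclass
    class FEResult:
        applied_load: float       # sum of applied forces [N]
        reaction_force: float     # sum of reaction forces [N]
        max_aspect_ratio: float   # worst element aspect ratio in the mesh
        max_displacement: float   # largest nodal displacement [mm]
        model_size: float         # characteristic model dimension [mm]

    def plausibility_report(res: FEResult) -> list:
        """Return warnings for a structural FE result (illustrative thresholds only)."""
        warnings = []
        # Global equilibrium: reaction forces should balance applied loads.
        if abs(res.applied_load + res.reaction_force) > 1e-3 * abs(res.applied_load):
            warnings.append("Force balance violated: check boundary conditions.")
        # Mesh quality: strongly distorted elements make stresses unreliable.
        if res.max_aspect_ratio > 10.0:
            warnings.append("Element aspect ratio > 10: refine or repair the mesh.")
        # Sanity check on deformation magnitude relative to model size.
        if res.max_displacement > 0.1 * res.model_size:
            warnings.append("Displacements exceed 10% of model size: verify loads and units.")
        return warnings

    print(plausibility_report(FEResult(1000.0, -999.9, 12.5, 0.8, 500.0)))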
12

Investigation of Laser-Induced-Liquid-Beam-Ion-Desorption (LILBID) with Molecular Dynamics Simulations

Wiederschein, Frank 13 January 2010 (has links)
No description available.
13

The Role of Dynamic Interactive Technology in Teaching and Learning Statistics

Burrill, Gail 12 April 2012 (has links) (PDF)
Dynamic interactive technology brings new opportunities for helping students learn central statistical concepts. Research and classroom experience can help identify concepts with which students struggle, and an "action-consequence" pre-made technology document can engage students in exploring these concepts. With the right questions, students can begin to make connections among their background in mathematics, the foundational ideas that undergird statistics, and the relationships among these ideas. The ultimate goal is to have students think deeply about simple and basic statistical ideas so they can see how these lead to reasoning and sense making about data and about making decisions about characteristics of a population from a sample. Technology has a critical role in teaching and learning statistics, enabling students to use real data in investigations, to model complex situations based on data, to visualize relationships using different representations, to move beyond calculations to interpreting statistical processes such as confidence intervals and correlation, and to generate simulations to investigate a variety of problems, including laying a foundation for inference. Thus, graphing calculators, spreadsheets, and interactive dynamic software can all be thought of as tools for statistical sense making in the service of developing understanding.
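As a hedged illustration of the kind of simulation-based exploration described above (a generic sketch, not a reconstruction of any particular classroom document; the population values are invented), a sampling-distribution simulation in Python:

    import random
    import statistics

    random.seed(42)

    # Hypothetical population: exam scores whose mean is "unknown" to students.
    population = [random.gauss(72, 10) for _ in range(10_000)]

    def sample_mean(n):
        """Draw one random sample of size n and return its mean."""
        return statistics.mean(random.sample(population, n))

    # Repeated sampling shows how sample means cluster around the population
    # mean -- the intuition underlying confidence intervals.
    means = [sample_mean(30) for _ in range(2_000)]
    cuts = statistics.quantiles(means, n=40)   # 2.5%, 5%, ..., 97.5% cut points
    print(f"population mean: {statistics.mean(population):.1f}")
    print(f"middle 95% of simulated sample means: [{cuts[0]:.1f}, {cuts[-1]:.1f}]")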
14

DeltaTick: Applying Calculus to the Real World through Behavioral Modeling

Wilkerson-Jerde, Michelle H., Wilensky, Uri 22 May 2012 (has links) (PDF)
Certainly one of the most powerful and important modeling languages of our time is the Calculus. But research consistently shows that students do not understand how the variables in calculus-based mathematical models relate to aspects of the systems that those models are supposed to represent. Because of this, students never access the true power of calculus: its suitability to model a wide variety of real-world systems across domains. In this paper, we describe the motivation and theoretical foundations for the DeltaTick and HotLink Replay applications, an effort to address these difficulties by a) enabling students to model a wide variety of systems in the world that change over time by defining the behaviors of that system, and b) making explicit how a system's behavior relates to the mathematical trends that behavior creates. These applications employ the visualization and codification of behavior rules within the NetLogo agent-based modeling environment (Wilensky, 1999), rather than mathematical symbols, as their primary building blocks. As such, they provide an alternative to traditional mathematical techniques for exploring and solving advanced modeling problems, as well as exploring the major underlying concepts of calculus.
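As a hedged, language-agnostic sketch of the underlying idea (behavior rules at the agent level producing an aggregate mathematical trend), the following Python loop is offered in place of a NetLogo/DeltaTick block; the reproduction probability and tick count are invented for the example:

    import random

    random.seed(1)

    # Behavior rule (agent level): each tick, every agent reproduces with probability p.
    p_reproduce = 0.05
    population = 100
    history = [population]

    for tick in range(50):
        births = sum(1 for _ in range(population) if random.random() < p_reproduce)
        population += births
        history.append(population)

    # Aggregate trend (model level): the per-tick relative change approximates the
    # calculus model dP/dt = r * P with r close to p_reproduce -- the link between
    # behavior and mathematical trend that DeltaTick aims to make explicit.
    rates = [(history[t + 1] - history[t]) / history[t] for t in range(len(history) - 1)]
    print(f"mean observed per-tick growth rate: {sum(rates) / len(rates):.3f}")
    print(f"behavior-rule parameter p_reproduce: {p_reproduce}")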
15

The Adaptive Particle Representation (APR) for Simple and Efficient Adaptive Resolution Processing, Storage and Simulations

Cheeseman, Bevan 29 March 2018 (has links) (PDF)
This thesis presents the Adaptive Particle Representation (APR), a novel adaptive data representation that can be used for general data processing, storage, and simulations. The APR is motivated, and designed, as a replacement representation for pixel images to address computational and memory bottlenecks in processing pipelines for studying spatiotemporal processes in biology using Light-sheet Fluorescence Microscopy (LSFM) data. The APR is an adaptive function representation that represents a function in a spatially adaptive way using a set of Particle Cells V and function values stored at particle collocation points P∗. The Particle Cells partition space, and implicitly define a piecewise constant Implied Resolution Function R∗(y) and particle sampling locations. As an adaptive data representation, the APR can be used to provide both computational and memory benefits by aligning the number of Particle Cells and particles with the spatial scales of the function. The APR allows reconstruction of a function value at any location y using any positive weighted combination of particles within a distance of R∗(y). The Particle Cells V are selected such that the error between the reconstruction and the original function, when weighted by a function σ(y), is below a user-set relative error threshold E. We call this the Reconstruction Condition and σ(y) the Local Intensity Scale. σ(y) is motivated by local gain controls in the human visual system, and for LSFM data can be used to account for contrast variations across an image. The APR is formed by satisfying an additional condition on R∗(y), which we call the Resolution Bound. The Resolution Bound relates R∗(y) to a local maximum of the absolute value of the function derivatives within a distance R∗(y) of y. Given restrictions on σ(y), satisfaction of the Resolution Bound also guarantees satisfaction of the Reconstruction Condition. In this thesis, we present algorithms and approaches that find the optimal Implied Resolution Function for general problems in the form of the Resolution Bound using Particle Cells, via an algorithm we call the Pulling Scheme. Here, optimal means the largest R∗(y) at each location. The Pulling Scheme has worst-case linear complexity in the number of pixels when used to represent images. The approach is general in that the same algorithm can be used for general (α,m)-Reconstruction Conditions, where α denotes the function derivative and m the minimum order of the reconstruction. Further, it can also be combined with anisotropic neighborhoods to provide adaptation in both space and time. The APR can be used with both noise-free and noisy data. For noisy data, the Reconstruction Condition can no longer be guaranteed, but numerical results show an optimal range of relative error E that provides a maximum increase in PSNR over the noisy input data. Further, if it is assumed the Implied Resolution Function satisfies the Resolution Bound, then the APR converges to a biased estimate (constant factor of E) at the optimal statistical rate. The APR continues a long tradition of adaptive data representations and represents a unique trade-off between the level of adaptation of the representation and simplicity, both regarding the APR's structure and its use for processing. Here, we numerically evaluate the adaptation and processing of the APR for use with LSFM data. This is done using both synthetic and LSFM exemplar data.
It is concluded from these results that the APR has the correct properties to provide a replacement for pixel images and address bottlenecks in processing for LSFM data. Removal of the bottleneck would be achieved by adapting to spatial, temporal and intensity scale variations in the data. Further, we propose that the simple structure of the general APR could provide benefit in areas such as the numerical solution of differential equations, adaptive regression methods, and surface representation for computer graphics.
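Read back from the verbal description above (a paraphrase in formulas, not necessarily the thesis's exact notation), the Reconstruction Condition bounds the σ-weighted error of any non-negatively weighted reconstruction built from particles within R∗(y) of y:

    \left| \frac{f(y) - \hat{f}(y)}{\sigma(y)} \right| \le E,
    \qquad
    \hat{f}(y) = \sum_{p\,:\,\lVert y_p - y\rVert \le R^{*}(y)} w_p\, f_p,
    \qquad w_p \ge 0,\ \sum_p w_p = 1.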
16

Ein Werkzeug zur schnellen Konfiguration biomechanischer Simulationen in der Produktentwicklung

Krüger, Daniel, Wartzack, Sandro January 2012 (has links)
From Section 1: "Besides fulfilling its function and meeting cost targets, the economic success of a product is determined not least by how well it matches the wishes, needs and abilities of its users. Between user and product there are manifold interactions that must be captured and understood in order to evaluate and optimise product concepts with respect to the criteria mentioned above. The philosophy of human-centred product development (MZP) therefore consists of consistently considering, in all phases of product development, the overall system comprising user, product and environment."
17

The Adaptive Particle Representation (APR) for Simple and Efficient Adaptive Resolution Processing, Storage and Simulations

Cheeseman, Bevan 28 November 2017 (has links)
This thesis presents the Adaptive Particle Representation (APR), a novel adaptive data representation that can be used for general data processing, storage, and simulations. The APR is motivated, and designed, as a replacement representation for pixel images to address computational and memory bottlenecks in processing pipelines for studying spatiotemporal processes in biology using Light-sheet Fluorescence Microscopy (LSFM) data. The APR is an adaptive function representation that represents a function in a spatially adaptive way using a set of Particle Cells V and function values stored at particle collocation points P∗. The Particle Cells partition space, and implicitly define a piecewise constant Implied Resolution Function R∗(y) and particle sampling locations. As an adaptive data representation, the APR can be used to provide both computational and memory benefits by aligning the number of Particle Cells and particles with the spatial scales of the function. The APR allows reconstruction of a function value at any location y using any positive weighted combination of particles within a distance of R∗(y). The Particle Cells V are selected such that the error between the reconstruction and the original function, when weighted by a function σ(y), is below a user-set relative error threshold E. We call this the Reconstruction Condition and σ(y) the Local Intensity Scale. σ(y) is motivated by local gain controls in the human visual system, and for LSFM data can be used to account for contrast variations across an image. The APR is formed by satisfying an additional condition on R∗(y), which we call the Resolution Bound. The Resolution Bound relates R∗(y) to a local maximum of the absolute value of the function derivatives within a distance R∗(y) of y. Given restrictions on σ(y), satisfaction of the Resolution Bound also guarantees satisfaction of the Reconstruction Condition. In this thesis, we present algorithms and approaches that find the optimal Implied Resolution Function for general problems in the form of the Resolution Bound using Particle Cells, via an algorithm we call the Pulling Scheme. Here, optimal means the largest R∗(y) at each location. The Pulling Scheme has worst-case linear complexity in the number of pixels when used to represent images. The approach is general in that the same algorithm can be used for general (α,m)-Reconstruction Conditions, where α denotes the function derivative and m the minimum order of the reconstruction. Further, it can also be combined with anisotropic neighborhoods to provide adaptation in both space and time. The APR can be used with both noise-free and noisy data. For noisy data, the Reconstruction Condition can no longer be guaranteed, but numerical results show an optimal range of relative error E that provides a maximum increase in PSNR over the noisy input data. Further, if it is assumed the Implied Resolution Function satisfies the Resolution Bound, then the APR converges to a biased estimate (constant factor of E) at the optimal statistical rate. The APR continues a long tradition of adaptive data representations and represents a unique trade-off between the level of adaptation of the representation and simplicity, both regarding the APR's structure and its use for processing. Here, we numerically evaluate the adaptation and processing of the APR for use with LSFM data. This is done using both synthetic and LSFM exemplar data.
It is concluded from these results that the APR has the correct properties to provide a replacement for pixel images and address bottlenecks in processing for LSFM data. Removal of the bottleneck would be achieved by adapting to spatial, temporal and intensity scale variations in the data. Further, we propose that the simple structure of the general APR could provide benefit in areas such as the numerical solution of differential equations, adaptive regression methods, and surface representation for computer graphics.
18

Multiscale modeling of structure formation and dynamic properties of organic molecules in hybrid inorganic/organic semiconductors

Pałczynski, Karol 29 July 2016 (has links)
The optoelectronic properties of hybrid inorganic/organic semiconductors (HIOS) strongly depend on the crystal structure and the alignment of the molecules relative to the surface. Structure and alignment, in turn, depend on the surface-molecule and molecule-molecule interactions as well as on transport processes such as diffusion during deposition of the organic molecules on an inorganic surface. However, due to their high complexity, fundamental questions pertaining to the design and prediction of HIOS structures are still unanswered. The aims of this thesis are therefore (1) to theoretically reproduce experimental bulk crystal structures of the widely used organic para-sexiphenyl molecule (p-6P) and (2) to investigate the self-diffusion of a single p-6P deposited on an inorganic zinc oxide (ZnO) surface. A multi-scale strategy is used, combining quantum density functional theory (DFT), all-atom molecular dynamics simulations, and classical diffusion theory. The thesis demonstrates that a classical force field model yields self-assembled bulk crystal structures and reproduces the real solid to liquid crystal phase behavior. The internal geometries and energies of the p-6P molecule and the structure of the p-6P bulk crystal are reproduced, all consistent with DFT and experiments. We investigate how the diffusion of the p-6P relates to the surface structure and the electrostatic coupling between the molecule and the ZnO (10-10) surface.
Using an advanced sampling strategy, we compute free-energy landscapes, diffusion coefficients and crossing rates over surface step edges. We find that the reciprocal values of the rates depend exponentially on the system temperature, the amplitude of the surface charges and the step-edge height, as well as linearly on the distance between equally high steps. We also discover two different crossing pathways for the molecule moving over the step, which simultaneously depend on the system temperature and the electrostatic coupling.
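The abstract reports that the inverse crossing rates depend exponentially on temperature, surface-charge amplitude, and step-edge height. A standard way to write such barrier-crossing behaviour (stated here as an assumed Arrhenius/Kramers-type form, not the thesis's fitted model) is a rate controlled by an effective free-energy barrier ΔF‡ that grows with the step height h and the electrostatic coupling strength λ:

    k_{\mathrm{cross}}(T) \;\propto\; \exp\!\left(-\frac{\Delta F^{\ddagger}(h,\lambda)}{k_{\mathrm B}\,T}\right),
    \qquad
    \tau_{\mathrm{cross}} = k_{\mathrm{cross}}^{-1}(T).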
19

Analysis of diagnostic climate model cloud parameterisations using large-eddy simulations

Rosch, Jan, Heus, Thijs, Salzmann, Marc, Mülmenstädt, Johannes, Schlemmer, Linda, Quaas, Johannes 28 April 2016 (has links) (PDF)
Current climate models often predict fractional cloud cover on the basis of a diagnostic probability density function (PDF) describing the subgrid-scale variability of the total water specific humidity, qt, favouring schemes with limited complexity. Standard shapes are uniform or triangular PDFs, the width of which is assumed to scale with the grid-box mean qt or the grid-box mean saturation specific humidity, qs. In this study, the qt variability is analysed from large-eddy simulations for two stratocumulus, two shallow cumulus, and one deep convective case. We find that in most cases, triangles are a better approximation to the simulated PDFs than uniform distributions. In two of the 24 slices examined, the actual distributions were so strongly skewed that the simple symmetric shapes could not capture the PDF at all. The distribution width for either shape scales acceptably well with both the mean value of qt and qs, the former being a slightly better choice. The qt variance is underestimated by the fitted PDFs, but overestimated by the existing parameterisations. While the cloud fraction is in general relatively well diagnosed from fitted or parameterised uniform or triangular PDFs, the diagnosis fails to capture cases with small partial cloudiness, and in 10-30% of the cases misdiagnoses clouds in clear skies or vice versa. The results suggest choosing a parameterisation with a triangular shape, where the distribution width would scale with the grid-box mean qt using a scaling factor of 0.076. This, however, is subject to the caveat that the reference simulations examined here were partly for rather small domains and driven by idealised boundary conditions.
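As a hedged illustration of the recommended parameterisation (triangular qt PDF with a width scaling as 0.076 times the grid-box mean qt), the sketch below diagnoses cloud fraction as the probability mass above saturation; the symmetric-triangle details and the example humidity values are assumptions for illustration, not values from the paper:

    def cloud_fraction_triangular(qt_mean, qs, scale=0.076):
        """Cloud fraction = P(qt > qs) for a symmetric triangular PDF centred on qt_mean.

        The half-width of the triangle is scale * qt_mean, following the suggested
        scaling with the grid-box mean total water specific humidity.
        """
        half_width = scale * qt_mean
        lo, hi = qt_mean - half_width, qt_mean + half_width
        if qs <= lo:
            return 1.0   # whole PDF supersaturated: overcast
        if qs >= hi:
            return 0.0   # whole PDF subsaturated: clear sky
        if qs >= qt_mean:
            # tail area of the right-hand half of the triangle
            return (hi - qs) ** 2 / (2.0 * half_width ** 2)
        # qs left of the mode: one minus the left-hand tail area
        return 1.0 - (qs - lo) ** 2 / (2.0 * half_width ** 2)

    # Made-up values (kg/kg): a grid box whose mean state is slightly subsaturated.
    print(f"diagnosed cloud fraction: {cloud_fraction_triangular(9.0e-3, 9.2e-3):.2f}")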
20

Untersuchung der Physisorption von Wasserstoff in porösen Materialien mit einer neuartigen volumetrischen Apparatur

Khvostikova, Olga 01 April 2011 (has links) (PDF)
Hydrogen is an ideal energy carrier, since it burns completely without pollutants and has a potentially high energy content per unit mass. The greatest challenge for the use of hydrogen as a fuel is storing it in safe and cost-effective systems. The aim of this doctoral thesis is to investigate porous materials of different structure and composition for the physisorption of hydrogen using a novel volumetric apparatus. Reaching maximum storage figures was not the primary goal of this work; far more important was understanding a structure-property (sorption) relationship on whose basis a systematic development of hydrogen storage materials could proceed. Two classes of potential hydrogen stores were investigated: expanded graphite materials and metal-organic frameworks. New experimental methods for determining hydrogen storage capacity on the modified volumetric apparatus were successfully developed and tested. Using one of the chambers as a reference chamber makes it possible to exclude experimental artefacts from the evaluation of the stored amount of hydrogen. No gas equation of state had to be used at low temperatures, which is very important in experiments with hydrogen.
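The abstract describes a volumetric apparatus in which one chamber serves as a reference. As a hedged, idealised sketch of how a generic volumetric (Sieverts-type) uptake evaluation works in principle (it uses the ideal gas law for simplicity, whereas the thesis's reference-chamber approach is aimed precisely at avoiding reliance on a gas equation of state at low temperature; all volumes, pressures and temperatures below are invented):

    R_GAS = 8.314  # J mol^-1 K^-1, universal gas constant

    def moles_ideal(p_pa, v_m3, t_k):
        """Ideal-gas estimate of the amount of gas in mol (a simplification only)."""
        return p_pa * v_m3 / (R_GAS * t_k)

    def adsorbed_amount(p_before, p_after, v_reference, v_sample_cell, t_k):
        """Moles of H2 taken up by the sample after expanding gas from the
        reference chamber into the sample cell (dead volume assumed known)."""
        n_dosed = moles_ideal(p_before, v_reference, t_k)
        n_gas_remaining = moles_ideal(p_after, v_reference + v_sample_cell, t_k)
        return n_dosed - n_gas_remaining

    # Made-up example: 50 cm^3 reference chamber, 25 cm^3 sample cell, 77 K.
    n_ads = adsorbed_amount(p_before=10e5, p_after=6.2e5,
                            v_reference=50e-6, v_sample_cell=25e-6, t_k=77.0)
    print(f"adsorbed amount: {n_ads * 1e3:.2f} mmol H2")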
