451 |
Two-way Coupled Multiscale Tsunami Modelling from Generation to Coastal Zone Hydrodynamics / Tsunami Analysis from the Source to the Coastal Zone Using a Two-way Coupled Multiscale Model [Japanese title] / William, James Pringle, 23 March 2016 (has links)
Kyoto University / 0048 / New-system doctorate by coursework / Doctor of Engineering / Degree No. Kou 19677 / Engineering Doctorate No. 4132 / Call no. Shinsei||Ko||1638 (University Library) / 32713 / Department of Urban Management, Graduate School of Engineering, Kyoto University / (Chief Examiner) Professor Akira Igarashi, Associate Professor Nozomu Yoneyama, Associate Professor Nobuhito Mori / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Philosophy (Engineering) / Kyoto University / DFAM
|
452 |
Pore-Scale Simulation of Cathode Catalyst Layers in Proton Exchange Membrane Fuel Cells (PEMFCs) / ZHENG, WEIBO, 11 July 2019 (has links)
No description available.
|
453 |
A Contribution to the Multidimensional and Correlative Tomographic Characterization of Micron–Sized Particle Systems / Ditscherlein, Ralf, 12 September 2022 (has links)
The present work was carried out within the framework of the priority programme SPP 2045. Technical ultrafine particle systems (< 10 μm) from highly specific separation processes are to be investigated with regard to multidimensional property distributions. Tomographic measurement methods allow a comprehensive 3D description of particle–discrete data sets of statistically relevant size. The focus of the work is on X–ray tomographic analysis by means of micro-computed tomography (micro–CT), which, where necessary, is extended to several size scales by including further measurement methods (nano–CT) and supplemented by suitable elemental analysis (FIB–SEM + EBSD, EDX). Two preparation methods (wax, epoxy resin) for different particle preparations are described methodically; these have either already been published in a case study or are the subject of ongoing studies discussed in the outlook of the work. Finally, the networked reuse of the generated data within an online particle database is shown and its application is explained using three concrete examples.
1 Outline
2 Description of Particle Properties
2.1 Integral or Class–Based Description
2.2 Particle–Discrete Description
2.2.1 2D Description
2.2.2 Full 3D Description
2.3 Multidimensional Characterization on Basis of Particle–Discrete 3D Data
2.3.1 Motivation
2.3.2 Kernel Density Approach
2.3.3 Copula Approach
3 X–ray Tomography
3.1 Historical Context
3.2 X–ray Physics
3.2.1 X–ray Generation
3.2.2 Polychromatic Spectrum
3.2.3 Interaction with Matter
3.3 Tomographic Imaging
3.3.1 Motivation
3.3.2 Basic Idea
3.3.3 X–ray Microscopy Measurement Setup and Workflow
3.3.4 Tomographic Reconstruction via Filtered Back Projection
3.3.5 Region of Interest Tomography
3.4 Relevant Artefacts Related to Particle Measurement
3.4.1 Temperature Drift
3.4.2 Penumbral Blurring and Shadow
3.4.3 Cone Beam
3.4.4 Out–of–Field
3.4.5 Center Shift
3.4.6 Sample Drift
3.4.7 Beam Hardening
3.4.8 Rings
3.4.9 Noise
3.4.10 Partial Volume
3.4.11 Summary
4 Practical Implementation
4.1 Particle Sample Requirements
4.1.1 Geometry
4.1.2 Dispersity and Homogeneity
4.2 Statistics
4.2.1 Single Particle Properties
4.2.2 Properties of a Limited Number of Particles (10 to several 100)
4.2.3 Particle Populations with Distributed Properties
4.3 2D Validation
4.4 Measurement
4.4.1 X–ray Microscope
4.4.2 Source Filter
4.4.3 Detector Binning
4.4.4 Cone Beam Artefact Compensation
4.4.5 Center Shift Correction
4.4.6 Dynamic Ring Removal
5 Image Analysis
5.1 Image Quality
5.1.1 Grey Value Histogram
5.1.2 Resolution
5.1.3 Signal–to–Noise Ratio
5.1.4 Contrast and Dynamic Range
5.1.5 Sharpness
5.1.6 Summary
5.2 Basic Image Processing Strategies
5.2.1 Threshold–Based Segmentation
5.2.2 Machine Learning Assisted Segmentation
6 Correlative Tomography
6.1 Scouting Approach
6.2 Multiscale Approach
6.3 Multidisciplinary Approach
7 Data Management
7.1 Data Quality
7.2 Data Availability
7.2.1 Tomographic Datasets
7.2.2 Particle Database
8 Outlook on Further Research Activities
9 Publications
9.1 Copyright Declaration
9.2 Overview
9.3 List of Publications
Paper A, Preparation techniques for micron–sized particulate samples in X–ray microtomography
Paper B, Self–constructed automated syringe for preparation of micron–sized particulate samples in X–ray microtomography
Paper C, Preparation strategy for statistically significant micrometer–sized particle systems suitable for correlative 3D imaging workflows on the example of X–ray microtomography
Paper D, Multi–scale tomographic analysis for micron–sized particulate samples
Paper E, PARROT: A pilot study on the open access provision of particle discrete tomographic datasets
10 Appendix
10.1 Application Example 1: Fracture Analysis
10.2 Application Example 2: 3D Contact Angle Measurement
10.3 Influence of the Source Filter
10.4 Influence of the X–rays on the Sample
10.5 Appropriate Filter Settings
10.6 Log File Parser
|
454 |
Magnetic APFC modeling and the influence of magneto-structural interactions on grain shrinkage / Backofen, Rainer, Salvalaglio, Marco, Voigt, Axel, 22 February 2024 (has links)
We derive the amplitude expansion for a phase-field-crystal (APFC) model that captures the basic physics of magneto-structural interactions. The symmetry breaking due to magnetization is demonstrated, and the characterization of the magnetic anisotropy for a bcc crystal is provided. This model enables a convenient coarse-grained description of crystalline structures, in particular when considering the features of the APFC model combined with numerical methods featuring inhomogeneous spatial resolution. This is shown by addressing the shrinkage of a spherical grain within a matrix, chosen as a prototypical system to demonstrate the influence of different magnetizations. These simulations serve as a proof of concept for the modeling of manipulation of dislocation networks and microstructures in ferromagnetic materials within the APFC model.
|
455 |
Numerical methods for computationally efficient and accurate blood flow simulations in complex vascular networks: Application to cerebral blood flow / Ghitti, Beatrice, 04 May 2023 (has links)
It is currently a well-established fact that the dynamics of interacting fluid compartments of the central nervous system (CNS) may play a role in the CNS fluid physiology and pathology of a number of neurological disorders, including neurodegenerative diseases associated with accumulation of waste products in the brain. However, the mechanisms and routes of waste clearance from the brain are still unclear. One of the main components of these interacting cerebral fluid dynamics is blood flow. In the last decades, mathematical modeling and fluid dynamics simulations have become a valuable complementary tool to experimental approaches, contributing to a deeper understanding of circulatory physiology and pathology. However, modeling blood flow in the brain remains a challenging and demanding task, due to the high complexity of cerebral vascular networks and the difficulties that consequently arise in describing and reproducing the blood flow dynamics in these vascular districts. The first part of this work is devoted to the development of efficient numerical strategies for blood flow simulations in complex vascular networks. In cardiovascular modeling, one-dimensional (1D) and lumped-parameter (0D) models of blood flow are nowadays well-established tools to predict flow patterns, pressure wave propagation and average velocities in vascular networks, with a good balance between accuracy and computational cost. Still, the purely 1D modeling of blood flow in complex and large networks can result in computationally expensive simulations, posing the need for extremely efficient numerical methods and solvers. To address these issues, we develop a novel modeling and computational framework to construct hybrid networks of coupled 1D and 0D vessels and to perform computationally efficient and accurate blood flow simulations in such networks. Starting from a 1D model and a family of nonlinear 0D models for blood flow, with either elastic or viscoelastic tube laws, this methodology is based on (i) suitable coupling equations ensuring conservation principles; (ii) efficient numerical methods and numerical coupling strategies to solve 1D, 0D and hybrid junctions of vessels; (iii) model selection criteria to construct hybrid networks, which provide a good trade-off between accuracy in the predicted results and computational cost of the simulations. By applying the proposed hybrid network solver to very complex and large vascular networks, we show how this methodology becomes crucial to gain computational efficiency when solving networks and models where the heterogeneity of spatial and/or temporal scales is relevant, while still ensuring a good level of accuracy in the predicted results. Hence, the proposed hybrid network methodology represents a first step towards a high-performance modeling and computational framework to solve highly complex networks of 1D-0D vessels, where the complexity depends not only on the anatomical detail by which a network is described, but also on the level at which physiological mechanisms and mechanical characteristics of the cardiovascular system are modeled. Then, in the second part of the thesis, we focus on the modeling and simulation of cerebral blood flow, with emphasis on the venous side.
We develop a methodology that, starting from the high-resolution MRI data obtained with a novel in vivo microvascular imaging technique of the human brain, makes it possible to reconstruct detailed subject-specific cerebral networks of specific vascular districts which are suitable for performing blood flow simulations.
First, we extract segmentations of cerebral districts of interest such that the arterio-venous separation is addressed and the continuity and connectivity of the vascular structures are ensured. Equipped with these segmentations, we propose an algorithm to extract a network of vessels with the properties necessary to perform blood flow simulations. Here, we focus on the reconstruction of detailed venous vascular networks, given that the anatomy and patho-physiology of the venous circulation are of great interest from both clinical and modeling points of view. Then, after calibration and parametrization of the MRI-reconstructed venous networks, blood flow simulations are performed to validate the proposed methodology and assess the ability of such networks to predict physiologically reasonable results in the corresponding vascular territories. The results show that it is possible to extract subject-specific cerebral networks from the novel high-resolution MRI data employed, laying the groundwork for an effective processing pipeline for detailed blood flow simulations from subject-specific data, to explore and quantify cerebral blood flow dynamics, with a focus on venous blood drainage.
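The lumped-parameter (0D) compartments referred to in this abstract can be pictured with a minimal sketch: a single 0D vessel with an elastic tube law and a distal resistance, integrated explicitly. All parameter names and values below are illustrative assumptions of this sketch, not the thesis solver or its calibrated parameters.

```python
import numpy as np

def simulate_0d_vessel(t_end=2.0, dt=1e-4,
                       R_out=1.0e8,    # distal resistance [Pa*s/m^3], assumed value
                       E=5.0e8,        # compartment elastance [Pa/m^3], assumed value
                       V0=1.0e-6,      # reference volume [m^3], assumed value
                       P_out=1.0e3):   # downstream pressure [Pa], assumed value
    """Single lumped-parameter (0D) vessel: elastic tube law plus distal resistance."""
    n = int(t_end / dt)
    V = V0
    pressure = np.zeros(n)
    for i in range(n):
        t = i * dt
        # Pulsatile inflow imposed at the proximal interface, standing in for a 1D-0D coupling condition.
        Q_in = 5.0e-6 * max(np.sin(2.0 * np.pi * t), 0.0)
        P = P_out + E * (V - V0)       # elastic tube law: pressure from stored volume
        Q_out = (P - P_out) / R_out    # flow through the distal resistance
        V += dt * (Q_in - Q_out)       # mass conservation for the compartment
        pressure[i] = P
    return pressure

p = simulate_0d_vessel()
print(f"peak compartment pressure ~ {p.max():.0f} Pa")
```

The actual framework couples many such 0D compartments to 1D vessels through conservation-preserving junction equations; this toy model only illustrates the structure of one lumped vessel.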
|
456 |
MECHANICS OF STRUCTURE GENOME-BASED MULTISCALE DESIGN FOR ADVANCED MATERIALS AND STRUCTURES / Su Tian (14232869), 09 December 2022 (has links)
Composite materials have been invented and used to make all kinds of industrial products, such as automobiles, aircraft, and sports equipment, for many years. Excellent properties such as high specific stiffness and strength have been recognized and studied for decades, motivating the use of composite materials. However, the design of composite structures still remains a challenge. Existing design tools are not adequate to exploit the full benefits of composites. Many tools are still based on the traditional material selection paradigm created for isotropic homogeneous materials, separated from the shape design. This loses the coupling effects between composite materials and geometry and leads to less optimal designs of the structure. Hence, due to the heterogeneity and anisotropy inherent in composites, it is necessary to model composite parts with appropriate microstructures instead of simplistically treating composites as black aluminum, and to consider materials and geometry at the same time.

This work mainly focuses on the design problems of complex material-structural systems through computational analyses. Complex material-structural systems are structures made of materials that have microstructures smaller than the overall structural dimension but still obeying the continuum assumption, such as fiber-reinforced laminates, sandwich structures, and meta-materials, to name a few. This work aims to propose a new design-by-analysis framework based on the mechanics of structure genome (MSG), because of its capability for accurate and efficient predictions of effective properties for different solid/structural models and of three-dimensional local fields (stresses, strains, failure status, etc.). The main task is to implement the proposed framework by developing new tools and integrating these tools into a complete design toolkit. The main contribution of this work is a new, efficient, high-fidelity design-by-analysis framework for complex material-structural systems.

The proposed design framework contains the following components. 1) MSG and its companion code SwiftComp form the theoretical foundation for structural analysis in this design framework; they are used to model the complex details of the composite structures. This approach gives engineers the flexibility to use different multiscale modeling strategies. 2) A Structure Gene (SG) builder creates finite element-based model inputs for SwiftComp from the design parameters defining the structure. This helps designers deal directly with realistic and meaningful engineering parameters without expert knowledge of finite element analysis. 3) An interface is developed in Python for easy access to needed data such as structural properties and failure status; it acts as the integrator linking all components and/or other tools outside this framework. 4) Design optimization methods and an iteration controller are used for conducting the actual design studies such as parametric studies, optimization, surrogate modeling, and uncertainty quantification; this is achieved by integrating Dakota into the framework. 5) A structural analysis tool is used for computing global structural responses when an integrated MSG-based global analysis process is needed.

Several realistic design problems of composite structures are used to demonstrate the capabilities of the proposed framework. A parametric study of a simple fiber-reinforced laminated structure is carried out to investigate whether, compared with traditional design-by-analysis approaches, the new approach can bring new understanding of parameter-response relations thanks to new parameterization methods and more accurate analysis results. A realistic helicopter rotor blade is used to demonstrate the optimization capability of the framework: the geometry and material of composite rotor blades are optimized to reach the desired structural performance. The rotor blade is also used to show the capability of strength-based design using surrogate models of sectional failure criteria. A thin-walled composite shell structure is used to demonstrate the capability of designing variable stiffness structures by steering the in-plane fiber orientations of the laminate. Finally, the tool is used to study and design auxetic laminated composite materials, which have negative Poisson's ratios.
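As a purely illustrative aid, the iteration-controller idea described above can be sketched as a bare parametric sweep. The function names below (`build_structure_gene`, `evaluate_effective_properties`) are hypothetical placeholders, and a toy rule-of-mixtures estimate stands in for an MSG/SwiftComp homogenization run; this is not the actual SwiftComp or Dakota API.

```python
import itertools

def build_structure_gene(ply_angle_deg, fiber_volume_fraction):
    # Hypothetical SG builder stub: a real implementation would turn these design
    # parameters into a finite element-based SG model input.
    return {"ply_angle_deg": ply_angle_deg, "vf": fiber_volume_fraction}

def evaluate_effective_properties(sg_model):
    # Hypothetical constitutive-modeling stub: a toy rule-of-mixtures axial modulus
    # [GPa] stands in for a homogenization run (ply angle is ignored in this toy).
    E_fiber, E_matrix = 230.0, 3.5
    return E_fiber * sg_model["vf"] + E_matrix * (1.0 - sg_model["vf"])

# Iteration controller: sweep the design parameters and collect responses,
# the role played by the optimization driver in the real framework.
results = []
for angle, vf in itertools.product([0.0, 45.0, 90.0], [0.5, 0.6, 0.7]):
    sg = build_structure_gene(angle, vf)
    results.append((angle, vf, evaluate_effective_properties(sg)))

for angle, vf, modulus in results:
    print(f"ply angle {angle:5.1f} deg, Vf {vf:.2f} -> E_eff ~ {modulus:.1f} GPa")
```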
|
457 |
Machine Learning-based Multiscale Topology Optimization / Joel Christian Najmon (17548431), 05 December 2023 (has links)
Multiscale topology optimization (MSTO) is a numerical method that enables the synthesis of hierarchical structures, offering greater design flexibility than single-scale topology optimization. However, this increased flexibility also incurs higher computational costs. Recent advancements have integrated machine learning models into MSTO methods to address this issue. Unfortunately, existing machine learning-based multiscale topology optimization (ML-MSTO) approaches underutilize the potential of machine learning models to surrogate the inner optimization, analysis, and numerical homogenization of arbitrary non-periodic microstructures. This dissertation presents an ML-MSTO method featuring displacement-driven topology-optimized microstructures (TOMs). The proposed method solves an outer optimization problem to design a homogenized macroscale structure and multiple inner optimization problems to obtain spatially distributed, non-periodic TOMs. The inner problem formulation employs the macroscale element densities and nodal displacements to define constraints and boundary conditions for microscale density-based topology optimization problems. Each problem yields a free-form TOM. To reduce computational costs, artificial neural networks (ANNs) are trained to predict their homogenized constitutive tensors. The ANNs also enable sensitivity coefficients to be approximated through a variety of standard derivative methods. The effect of the neural network-based derivative methods on topology optimization results is evaluated in a comparative study. An explicit dehomogenization approach is proposed, leveraging the TOMs of the ML-MSTO method. The explicit approach also features two post-processing schemes to improve the connectivity and clean up the final multiscale structure. A 2D and a 3D case study are designed with the ML-MSTO method and dehomogenized with the explicit approach. The resulting multiscale structures are non-periodic with free-form microstructures. In addition, a second, implicit dehomogenization approach is developed in this dissertation that allows the projection of homogenized mechanical property fields onto a discrete lattice structure of arbitrary shape. The implicit approach is capable of dehomogenizing any homogenized design. This is done by incorporating an optimization algorithm to find the lattice thickness distribution that minimizes the difference between a local target homogenized property and the corresponding lattice homogenized stiffness tensor. The result is a well-connected, functionally graded lattice structure that enables control over the length scale, orientation, and complexity of the final microstructured design.
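A minimal sketch of the surrogate idea, under assumptions of this sketch rather than the dissertation: synthetic training data, a 2D stiffness tensor with six independent entries, strain features derived from nodal displacements, and scikit-learn's MLPRegressor standing in for the ANN.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Synthetic stand-in data: in the actual method, each training target would come
# from microscale topology optimization plus numerical homogenization of a TOM.
n_samples = 2000
density = rng.uniform(0.1, 1.0, size=(n_samples, 1))    # macroscale element density
strain = rng.normal(0.0, 1e-3, size=(n_samples, 3))     # [eps_xx, eps_yy, gamma_xy] from nodal displacements
X = np.hstack([density, strain])

# Toy targets: six independent entries of a symmetric 2D constitutive tensor,
# scaled by density to mimic a stiffness-density coupling (purely illustrative).
base = np.array([1.0, 0.3, 0.0, 1.0, 0.0, 0.35]) * 1.0e5
y = density * base + rng.normal(0.0, 1.0e2, size=(n_samples, 6))

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500).fit(X, y)

# In the outer (macroscale) loop, each element queries the surrogate instead of
# solving an inner microscale problem; finite differences on these inputs give
# one approximate sensitivity option of the kind mentioned in the abstract.
C_pred = surrogate.predict(np.array([[0.5, 1.0e-3, -2.0e-4, 0.0]]))
print(C_pred.round(1))
```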
|
458 |
Consciousness Detection in a Complete Locked-in Syndrome Patient through Multiscale Approach Analysis / Wu, Shang-Ju, Nicolaou, Nicoletta, Bogdan, Martin, 13 April 2023 (has links)
Completely locked-in state (CLIS) patients are unable to speak and have lost all muscle movement. From the external view, the internal brain activity of such patients cannot be easily perceived, but CLIS patients are considered to still be conscious and cognitively active. Detecting the current state of consciousness of CLIS patients is non-trivial, and it is difficult to ascertain whether CLIS patients are conscious or not. Thus, it is important to find alternative ways to re-establish communication with these patients during periods of awareness, and one such alternative is through a brain–computer interface (BCI). In this study, multiscale-based methods (multiscale sample entropy, multiscale permutation entropy and multiscale Poincaré plots) were applied to analyze electrocorticogram signals from a CLIS patient to detect the underlying consciousness level. Results from these different methods converge to a specific period of awareness of the CLIS patient in question, coinciding with the period during which the CLIS patient is recorded to have communicated with an experimenter. The aim of the investigation is to propose a methodology that could be used to create reliable communication with CLIS patients.
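Of the methods named above, multiscale sample entropy has a standard form that can be sketched briefly: coarse-grain the signal at increasing scales, then compute sample entropy at each scale. The simplified implementation and parameter choices below (m = 2, r = 0.2 times the standard deviation, a random placeholder signal) are common defaults and assumptions of this sketch, not necessarily those used in the study.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Simplified sample entropy of a 1D signal (defaults m=2, r=0.2*SD)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def match_count(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1   # Chebyshev distance, excluding the self-match
        return count

    B, A = match_count(m), match_count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_sample_entropy(signal, max_scale=5, m=2, r_factor=0.2):
    """Coarse-grain the signal at scales 1..max_scale and compute SampEn at each."""
    signal = np.asarray(signal, dtype=float)
    mse = []
    for tau in range(1, max_scale + 1):
        n = len(signal) // tau
        coarse = signal[:n * tau].reshape(n, tau).mean(axis=1)
        mse.append(sample_entropy(coarse, m, r_factor))
    return np.array(mse)

# Placeholder signal standing in for one ECoG channel segment.
demo = np.random.default_rng(0).normal(size=2000)
print(multiscale_sample_entropy(demo))
```

Multiscale permutation entropy and multiscale Poincaré plots follow the same coarse-graining pattern, differing only in the per-scale statistic computed.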
|
459 |
MULTISCALE MODELING AND CHARACTERIZATION OF THE POROELASTIC MECHANICS OF SUBCUTANEOUS TISSUE / Jacques Barsimantov Mandel (16611876), 18 July 2023 (has links)
Injection into the subcutaneous (SC) tissue is one of the preferred methods for drug delivery of pharmaceuticals, from small molecules to monoclonal antibodies. Delivery to SC has become widely popular in part thanks to the low cost, ease of use, and effectiveness of drug delivery through the use of auto-injector devices. However, injection physiology, from initial plume formation to the eventual uptake of the drug in the lymphatics, is highly dependent on SC mechanics, poroelastic properties in particular. Yet, the poroelastic properties of SC have been understudied. In this thesis, I present a two-pronged approach to understanding the poroelastic properties of SC. Experimentally, mechanical and fluid transport properties of SC were measured with confined compression experiments and compared against gelatin hydrogels used as SC phantoms. It was found that SC tissue is a highly non-linear material that has viscoelastic and porohyperelastic dissipation mechanisms. Gelatin hydrogels showed a similar, albeit more linear, response, suggesting a micromechanical mechanism may underlie the nonlinear behavior. The second part of the thesis focuses on the multiscale modeling of SC to gain a fundamental understanding of how the geometry and material properties of the microstructure drive the macroscale response. SC is composed of adipocytes (fat cells) embedded in a collagen network. The geometry can be characterized with Voronoi-like tessellations. Adipocytes are fluid-packed, highly deformable, and capable of volume change through fluid transport. Collagen is highly nonlinear and nearly incompressible. Representative volume element (RVE) simulations with different Voronoi tessellations show that the different materials, coupled with the geometry of the packing, contribute to different material responses under different kinds of loading. Further investigation of the effect of geometry showed that cell packing density contributes nonlinearly to the macroscale response. The RVE models can be homogenized to obtain macroscale models useful in large-scale finite element simulations of injection physiology. Two types of homogenization were explored: fitting analytical constitutive models, namely the Blatz-Ko material model, and using Gaussian process surrogates, a data-driven non-parametric approach to interpolating the macroscale response.
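A hedged sketch of the Gaussian process surrogate idea mentioned above, with invented placeholder numbers for the RVE results (packing density versus an effective modulus) and scikit-learn standing in for whatever GP tooling the thesis actually used.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Placeholder RVE results: adipocyte packing density vs. an effective modulus [kPa].
# The numbers are invented for illustration, not measured or simulated values.
packing_density = np.array([[0.60], [0.70], [0.78], [0.85], [0.92]])
effective_modulus = np.array([2.1, 2.9, 4.0, 5.8, 8.6])   # nonlinear trend with density

kernel = ConstantKernel(1.0) * RBF(length_scale=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(
    packing_density, effective_modulus)

# A macroscale finite element routine could query the surrogate instead of
# re-running the RVE; the predictive standard deviation gives an uncertainty estimate.
mean, std = gp.predict(np.array([[0.80]]), return_std=True)
print(f"effective modulus at packing density 0.80: {mean[0]:.2f} +/- {std[0]:.2f} kPa")
```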
|
460 |
Multiscale Biomaterials for Cell and Tissue Engineering / Agarwal, Pranay, 10 August 2017 (has links)
No description available.
|