  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

Évaluation et modélisation des dispositifs de retenue pour motards / Design and Assessment of Motorcyclists Restraint Systems

Kodjo, Vidjannagni 02 September 2016 (has links)
In France in 2015, motorcyclists accounted for 43% of people seriously injured in road traffic accidents. Although impacts of motorcyclists against road restraint systems represent on average only 10% of road traffic accidents, motorcyclists face a risk of death 30 to 80 times higher than that of vehicle occupants involved in the same type of accident. To enhance motorcyclist safety, vehicle restraint systems (traditional road barriers) are fitted at their lower part with motorcyclist protection screens. These screens are designed to absorb a motorcyclist's impact gradually while preventing the rider from passing through the lower part of the barrier, hitting a post, and being severely injured. Before being installed along European roadsides, motorcyclist restraint systems (MRS) must undergo normative crash tests and meet certain requirements in order to guarantee rider safety. However, only two normative test procedures exist: the LIER procedure in the French context and EN 1317-8 in the European context. To assess both procedures and to propose new provisions that could serve as a more advanced normative protocol for evaluating MRS performance, numerical simulations were performed. A complete numerical model of a road restraint system fitted with a steel motorcyclist screen was developed and correlated against a real crash test on a device of the same type. Once correlated, the model was used to study the influence of test conditions and of the screen's material mechanical properties on its performance. Finally, a new approach for evaluating the performance of motorcyclist protection screens was established.
82

Three-dimensional geometry-based radio channel model: parametrization and validation at 10 GHz

Roivainen, A. (Antti) 05 May 2017 (has links)
Abstract This dissertation presents complete parameterizations for a three-dimensional (3-D) geometry-based stochastic radio channel model (GSCM) at 10 GHz based on measurement campaigns. The thesis is divided into the following main parts: radio channel measurements, the characterization of model parameters, and model validation. Experimental multiple-input multiple-output (MIMO) channel measurements carried out in two-story lobby and urban small cell scenarios are first described. The measurements were performed with a vector network analyzer and dual polarized virtual antenna arrays with a bandwidth over 500 MHz. The measurement data was post-processed using the ESPRIT algorithm and the post-processed data was verified using a semi-deterministic map-based model. The results showed a good match between estimated and modeled multipath components (MPCs). In addition, single-input single-output outdoor-to-indoor measurements were executed through a standard multi-pane glass window and concrete wall. A statistical analysis was carried out for defining full 3-D characterization of the propagation channel in both line-of-sight (LOS) and non-line-of-sight (NLOS) propagation conditions. The delay and angular dispersions of MPCs are smaller in comparison to lower frequency bands due to the higher attenuation of the delayed MPCs. Moreover, specular reflection is observed to be the more dominant propagation mechanism in comparison to diffuse scattering, leading to smaller cluster angle spreads in comparison to lower frequency bands. The penetration loss caused by a standard multi-pane glass window is on the same level as in the lower frequency bands, whereas the loss caused by the concrete wall is a few dBs higher than at lower frequency bands. Finally, the GSCM with determined parameters is validated. A MIMO channel was reconstructed by embedding 3-D radiation patterns of the antennas into the propagation path estimates. 
Correspondingly, channel simulations were performed with the quasi-deterministic radio channel generator (QuaDRiGa) using the defined parameters. The channel capacity, the Demmel condition number, and the relative condition number are used as comparison metrics between the reconstructed and modeled channels. The results show that the reconstructed MIMO channel matches the simulated MIMO channel well.
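The comparison metrics named in the abstract above have simple closed forms. The sketch below is illustrative only (not the author's code): it uses the textbook closed-form singular values of a 2x2 real matrix to compute the Demmel condition number (Frobenius norm over smallest singular value) and the equal-power MIMO capacity.

```python
import math

def svals_2x2(h):
    """Singular values of a real 2x2 matrix h, via the eigenvalues
    of the Gram matrix h^T h (closed form for the 2x2 case)."""
    a, b, c, d = h[0][0], h[0][1], h[1][0], h[1][1]
    g11, g22, g12 = a*a + c*c, b*b + d*d, a*b + c*d
    tr, det = g11 + g22, g11*g22 - g12*g12
    disc = math.sqrt(max(tr*tr - 4.0*det, 0.0))
    return math.sqrt((tr + disc) / 2.0), math.sqrt(max((tr - disc) / 2.0, 0.0))

def demmel_condition(h):
    """Demmel condition number: ||H||_F / sigma_min.
    Large values indicate a poorly conditioned channel matrix."""
    s_max, s_min = svals_2x2(h)
    return math.sqrt(s_max**2 + s_min**2) / s_min

def capacity_equal_power(h, snr):
    """Capacity (bit/s/Hz) of a 2x2 channel with equal power allocation:
    C = sum_i log2(1 + snr/Nt * sigma_i^2), with Nt = 2 transmit antennas."""
    nt = 2
    return sum(math.log2(1.0 + snr / nt * s * s) for s in svals_2x2(h))

identity = [[1.0, 0.0], [0.0, 1.0]]
print(round(demmel_condition(identity), 3))   # → 1.414 (well conditioned)
print(round(capacity_equal_power(identity, 10.0), 3))
```

A reconstructed and a simulated channel that agree should produce closely matching distributions of these two scalars across channel realizations.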
83

Habitat models to predict wetland bird occupancy influenced by scale, anthropogenic disturbance, and imperfect detection

Glisson, Wesley J., Conway, Courtney J., Nadeau, Christopher P., Borgmann, Kathi L. 06 1900 (has links)
Understanding species-habitat relationships for endangered species is critical for their conservation. However, many studies have limited value for conservation because they fail to account for habitat associations at multiple spatial scales, anthropogenic variables, and imperfect detection. We addressed these three limitations by developing models for an endangered wetland bird, Yuma Ridgway's rail (Rallus obsoletus yumanensis), that examined how the spatial scale of environmental variables, inclusion of anthropogenic disturbance variables, and accounting for imperfect detection in validation data influenced model performance. These models identified associations between environmental variables and occupancy. We used bird survey and spatial environmental data at 2473 locations throughout the species' U.S. range to create and validate occupancy models and produce predictive maps of occupancy. We compared habitat-based models at three spatial scales (100, 224, and 500 m radii buffers) with and without anthropogenic disturbance variables using validation data adjusted for imperfect detection and an unadjusted validation dataset that ignored imperfect detection. The inclusion of anthropogenic disturbance variables improved the performance of habitat models at all three spatial scales, and the 224-m-scale model performed best. All models exhibited greater predictive ability when imperfect detection was incorporated into validation data. Yuma Ridgway's rail occupancy was negatively associated with ephemeral and slow-moving riverine features and high-intensity anthropogenic development, and positively associated with emergent vegetation, agriculture, and low-intensity development. Our modeling approach accounts for common limitations in modeling species-habitat relationships and creating predictive maps of occupancy probability and, therefore, provides a useful framework for other species.
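The role of imperfect detection in the abstract above can be illustrated with a minimal, hedged sketch. This simple closed-form correction is a standard occupancy-modeling identity, not the authors' full model: if each of k surveys detects an occupying species with probability p, an occupied site goes entirely undetected with probability (1-p)^k, so a naive occupancy rate underestimates true occupancy.

```python
def detection_corrected_occupancy(naive_rate, p, k):
    """Adjust a naive occupancy rate for imperfect detection.
    p: per-survey detection probability; k: number of surveys per site."""
    p_star = 1.0 - (1.0 - p) ** k   # chance an occupied site is ever detected
    return naive_rate / p_star

# With detection probability 0.5 per survey and 3 surveys, an apparent 35%
# occupancy corresponds to 40% true occupancy (illustrative numbers).
print(round(detection_corrected_occupancy(0.35, 0.5, 3), 3))  # → 0.4
```

Validating predictions against the naive rate rather than the corrected one would therefore systematically penalize a well-calibrated model, which is consistent with the finding that all models showed greater predictive ability once imperfect detection was incorporated.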
84

A Combined Formal Model for Relational Context-Dependent Roles

Kühn, Thomas, Böhme, Stephan, Götz, Sebastian, Aßmann, Uwe 08 June 2021 (has links)
Role-based modeling has been investigated for over 35 years as a promising paradigm for modeling complex, dynamic systems. Although current software systems are characterized by increasing complexity and context dependence, this research has had almost no influence on current software development practice, as recent literature continues to note. One reason is the lack of a coherent, comprehensive, readily applicable notion of roles. Researchers have focused either on relational roles or on context-dependent roles rather than combining both natures. Currently, no role-based modeling language sufficiently incorporates both the relational and the context-dependent nature of roles together with the various proposed constraints. Hence, this paper formalizes a full-fledged role-based modeling language supporting both natures. To show its sufficiency and adequacy, a real-world example is employed.
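The combination of the two natures can be sketched as a small data model. All names here are hypothetical illustrations of the idea, not the paper's formal language: a role is played by a natural type, exists only inside a compartment (context-dependent nature), and is bound to a counterpart role (relational nature).

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Compartment:
    """A context in which roles can be played (context-dependent nature)."""
    name: str
    roles: list = field(default_factory=list)

@dataclass
class Role:
    """A role filled by a player inside a compartment, tied to a counterpart."""
    name: str
    player: str                           # the natural (rigid) type playing the role
    compartment: Compartment              # context-dependent nature
    counterpart: Optional["Role"] = None  # relational nature

    def bind(self, other: "Role") -> None:
        """Tie two roles together as the two ends of one relationship."""
        self.counterpart, other.counterpart = other, self

bank = Compartment("Bank")
customer = Role("Customer", player="Alice", compartment=bank)
advisor = Role("Advisor", player="Bob", compartment=bank)
bank.roles += [customer, advisor]
customer.bind(advisor)

assert customer.counterpart is advisor                       # relational
assert customer.compartment is advisor.compartment is bank   # shared context
```

A language that supports only one nature would drop either the `compartment` or the `counterpart` field; the paper's point is that a full-fledged formalization needs both, plus constraints over them.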
85

FRaMED: Full-Fledge Role Modeling Editor (Tool Demo)

Kühn, Thomas, Bierzynski, Kay, Richly, Sebastian, Aßmann, Uwe 09 June 2021 (has links)
Since 1977, role modeling has been continuously investigated as a promising paradigm for modeling complex, dynamic systems. However, this research has had almost no influence on the design of today's increasingly complex and context-sensitive software systems. The reason for this is twofold. First, most modeling languages have focused on either the behavioral, the relational, or the context-dependent nature of roles rather than combining them. Second, there is a lack of tool support for the design, validation, and generation of role-based software systems. In particular, no graphical role modeling editor supports all three natures as well as the various proposed constraints. To overcome this deficiency, we introduce the Full-fledged Role Modeling Editor (FRaMED), a graphical modeling editor embracing all natures of roles and modeling constraints, featuring generators for a formal representation and for source code of a role-based programming language. To show its applicability to the development of role-based software systems, an example from the banking domain is employed.
86

Validation of a Public Transport Model / Validering av en kollektivtrafikmodell

Aho, Yousef, de Jong, Johannes January 2020 (has links)
During 2018, the Public Transport Administration (Trafikförvaltningen) in the Stockholm region spent approximately 2.2 billion SEK on new infrastructure investments related to the public transport system, many of which were based on its public transport models. The previously used method for validating these models lacked scientific rigour, efficiency, and a systematic approach, which led to uncertainty in decision making. Furthermore, few scientific studies have been conducted to develop validation methodologies for large-scale models such as public transport models. For these reasons, a scientific validation methodology for public transport models is developed in this thesis. The methodology has been applied to the 2014 route assignment model used by Trafikförvaltningen, for the transport modes bus, commuter train, and local tram. In the developed methodology, the selected validation metrics MAPE, %RMSE, and R^2 are used to compare link loads from a route assignment model with observed link loads from an Automatic Passenger Counting (APC) system. To obtain an overview of the performance of the route assignment model, eight different scenarios are defined based on whether the validation metrics meet acceptable thresholds. In the application of the methodology, the average link loads for the morning rush have been validated. To adjust the methodology to system-specific factors and to set acceptable metric thresholds, discussions with model practitioners have taken place. The validation has been performed on both lines and links, and for bus, entire line-number series have been validated as well. The validation results show that commuter train meets the set threshold values in a higher proportion than bus and local tram do. However, Trafikförvaltningen is recommended to further calibrate the route assignment model in order to achieve better model performance. The developed validation methodology can be used for the validation of public transport models and can, in combination with model calibration, be used in an iterative process to fine-tune model parameters and optimise validation results. Finally, a number of recommendations are proposed for Trafikförvaltningen to increase the efficiency and quality of the validation process, such as synchronising model data with the observed data.
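The three validation metrics named in the abstract above are standard and easy to state. A minimal sketch follows; the link-load values are made up for illustration and are not the thesis data.

```python
import math

def mape(obs, model):
    """Mean absolute percentage error (%); observed loads assumed nonzero."""
    return 100.0 / len(obs) * sum(abs(o - m) / o for o, m in zip(obs, model))

def pct_rmse(obs, model):
    """RMSE expressed as a percentage of the mean observed link load."""
    rmse = math.sqrt(sum((o - m) ** 2 for o, m in zip(obs, model)) / len(obs))
    return 100.0 * rmse / (sum(obs) / len(obs))

def r_squared(obs, model):
    """Coefficient of determination between observed and modelled loads."""
    mean_o = sum(obs) / len(obs)
    ss_res = sum((o - m) ** 2 for o, m in zip(obs, model))
    ss_tot = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

observed = [120.0, 300.0, 80.0, 450.0]   # APC counts per link (made-up values)
modelled = [110.0, 320.0, 90.0, 430.0]   # route assignment output (made-up)
for name, fn in [("MAPE", mape), ("%RMSE", pct_rmse), ("R^2", r_squared)]:
    print(name, round(fn(observed, modelled), 3))
```

Each metric is then compared against an agreed threshold; the eight scenarios in the methodology correspond to the possible pass/fail combinations of the three metrics.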
87

Fluid Flow Characterization and In Silico Validation in a Rapid Prototyped Aortic Arch Model

Knauer, Alexandra Mariel 01 August 2016 (has links) (PDF)
Transcatheter aortic valve replacement (TAVR) is a procedure to replace a failing aortic valve and is becoming the new standard of care for patients who are not candidates for open-heart surgery [2]. However, this minimally invasive technique has been shown to cause ischemic brain lesions, or "silent infarcts", in 90% of TAVR patients, which can increase the patient's risk of stroke by two to four times in later years [3]. Claret Medical Inc., a medical device company, has developed a cerebral protection system that filters and captures embolic debris released during endovascular procedures such as TAVR. This thesis used CT scans from Claret Medical to create a physical construct of the aortic arch and experimentally validate a theoretical computer model through flow visualization. The hypothesis was that the empirical model can accurately mimic the fluid dynamic properties of the aortic arch in order to validate an in silico model built in the finite element program COMSOL Multiphysics® Modeling Software. The physical model was created from a patient CT scan of the aortic arch using additive manufacturing (3D printing) and polymer casting, resulting in the shape of the aortic arch within a transparent silicone material. Fluid was pumped through the model to visualize and quantify the velocity of the fluid within the aortic arch. COMSOL Multiphysics® was used to model the aortic arch and obtain velocity measurements, which were statistically compared to the velocity measurements from the physical model. There was no significant difference between the values of the physical model and the computer model, confirming the hypothesis. Overall, this study successfully used CT scans to create an anatomically accurate physical model that was validated by a computer model using a novel technique of flow visualization.
As TAVR and similar procedures continue to develop, the need for experimental evaluation and visualization of devices will continue to grow, making this project relevant to many companies in the medical device industry.
88

Direct Assessment and Investigation of Nonlinear and Nonlocal Turbulent Constitutive Relations in Three-Dimensional Boundary Layer Flow

Gargiulo, Aldo 12 July 2023 (has links)
Three-dimensional (3D) turbulent boundary layers (TBLs) play a crucial role in determining the aerodynamic properties of most aero-mechanical devices. However, accurately predicting these flows remains a challenge due to the complex nonlinear and nonlocal physics involved, which makes it difficult to develop universally applicable models. This limitation is particularly significant as industry increasingly relies on simulations to make decisions in high-consequence environments, such as the certification of aircraft, while high-fidelity simulation methods that do not rely on modeling are prohibitively expensive. To address this challenge, it is essential to gain a better understanding of the physics underlying 3D TBLs. This research aims to improve the predictive accuracy of turbulence models in 3D TBLs by examining the impact of the model assumptions underpinning turbulent constitutive relations, which are fundamental building blocks of every turbulence model. Specifically, the study focuses on the relevance and necessity of nonlinear and nonlocal model assumptions for accurately predicting 3D TBLs. The study considers the attached 3D boundary layer flow over the Benchmark Validation Experiment for RANS/LES Investigations (BeVERLI) Hill as a test case, together with corresponding particle image velocimetry data. In a first step, the BeVERLI Hill experiment is comprehensively described and the important characteristics of the flow over the BeVERLI Hill are elucidated, including the complex symmetry-breaking characteristics of this flow. Reynolds-averaged Navier-Stokes simulations of the case using standard eddy viscosity models are then presented to establish the baseline behavior of local and linear constitutive relations, i.e., the standard Boussinesq approximation. The tested eddy viscosity models fail in the highly accelerated hilltop region of the BeVERLI Hill and near separation.
In a further step, several nonlinear and nonlocal turbulent constitutive relations, including the QCR model, the model by Gatski and Speziale, and the difference-quotient model by Egolf are used as metrics to gauge the impact of nonlinearities and nonlocalities for the modeling of 3D TBLs. It is shown that nonlinear and nonlocal approaches are essential for effective 3D TBL modeling. However, simplified reduced-order models could accurately predict 3D TBLs without high computational costs. A constitutive relation with local second-order nonlinear mean strain relations and simplified nonlocal terms may provide such a minimal model. In a final step, the structure and response of non-equilibrium turbulence to continuous straining are studied to reveal new scaling laws and structural models. / Doctor of Philosophy / Airplanes and other flying objects rely on the way air flows around them to generate lift and stay in the sky. This airflow can be very complex, especially close to the surface of the object, where it is affected by friction with the object. This friction generates a layer of air called a boundary layer, which can become turbulent and lead to complex patterns of airflow. The boundary layer is generated by the friction between the air and the surface of the object, which causes the air molecules to "stick" to the surface. This sticking creates a layer of slow-moving air that slows down the flow of air around the object. This loss of momentum creates drag, which is one of the main factors that resist the motion of objects in the air. The slowing of the air flow in the boundary layer is due to the viscosity of the air, which is a measure of how resistant the air is to deformation. The molecules in the air have a tendency to stick together, making it difficult for them to move past each other. 
This resistance causes the momentum of the air to be lost as it flows over the surface of the object, because air molecules close to the surface "pull" on the ones farther away. Understanding how turbulent boundary layers (TBLs) work is essential to accurately predict the airflow around these objects using computer simulations. However, this is challenging because TBLs involve complex physics that are difficult to model accurately. This research focuses on a specific type of TBL, the three-dimensional (3D) TBL, and looks at how different modeling assumptions affect the accuracy of the computer simulations that predict this type of airflow. It is found that using more complex models that take into account nonlinear and nonlocal physics can help predict 3D TBLs more accurately. However, these models are computationally expensive, and it is also found that simpler models can work well enough and are cheaper. This research further establishes important physical relations of the mechanisms pertaining to 3D TBLs that could support the advancement of current models.
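The "standard Boussinesq approximation" invoked above as the linear, local baseline is the classical eddy viscosity closure. In standard notation (not taken from the dissertation itself: $\nu_t$ the eddy viscosity, $k$ the turbulent kinetic energy, $U_i$ the mean velocity, $\delta_{ij}$ the Kronecker delta) it reads:

```latex
% Linear, local (Boussinesq) closure: the Reynolds stress tensor is modeled
% as an isotropic eddy viscosity acting on the mean strain rate.
\[
  -\,\overline{u_i' u_j'}
  \;=\;
  \nu_t \left( \frac{\partial U_i}{\partial x_j}
             + \frac{\partial U_j}{\partial x_i} \right)
  \;-\; \frac{2}{3}\, k\, \delta_{ij}
\]
```

Nonlinear extensions such as the QCR model add terms quadratic in the mean velocity gradient, while nonlocal models let the stress at one point depend on strain elsewhere in the flow, which is the distinction the dissertation investigates.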
89

Enabling Digital Twinning via Information-Theoretic Machine Learning-Based Inference Intelligence

Jeongwon Seo (8458191) 30 November 2023 (has links)
Nuclear energy, renowned for its clean, carbon-free attributes and cost-effectiveness, stands as a pivotal pillar in the global quest for sustainable energy sources. As a spatially concentrated industry, nuclear power also offers an unparalleled energy density compared to other energy sources. Despite these advantages, if a nuclear power plant (NPP) is not operated safely, the consequences can include long-term shutdowns, radiation exposure of workers, radioactive contamination of surrounding areas, or even a national-scale disaster, as witnessed in the Chernobyl accident of 1986. Ensuring the safe operation of nuclear reactors is therefore considered the most important factor in their operation. Recognizing the intricate tradeoff between safety and economy, economic considerations are often sacrificed in favor of safety.

Given this context, it becomes crucial to develop technologies that ensure NPP safety while optimizing operational efficiency, thereby minimizing the sacrifice of economic benefits. In response to this critical need, scientists introduced the term "digital twin" (DT), derived from the concept of product lifecycle management. In its original formulation, the DT model comprises the physical product, its digital representation, data flowing from the physical to the digital twin, and information flowing from the digital to the physical twin. Various nuclear stakeholders, such as reactor designers, researchers, operators, and regulators, are pursuing DT technologies, which are expected to enable NPPs to be monitored, operated, and controlled in an automated and reliable manner. DT is now being actively pursued given its wide potential, including increased operational effectiveness, enhanced safety and reliability, uncertainty reduction, and more.

While a number of technical challenges must be overcome to successfully implement DT technology, this Ph.D. work limits its focus to one of the DT's top challenges, namely model validation, which ensures that model predictions can be trusted for a given application, e.g., the domain envisaged for code usage. Model validation is also a key regulatory requirement in support of the various developmental stages from conceptual design to deployment, licensing, operation, and safety. For a given model to be validated, the regulatory process requires the consolidation of two independent sources of knowledge: one from measurements collected under experimental conditions, and the other from code predictions that model the same experimental conditions.

This consolidation must bridge the experimental and computational domains in an optimal manner, considering the characteristics of predictor and target responses. Successful model validation necessitates a complete data analytics pipeline, generally including data preprocessing, data analysis (model training), and result interpretation. Therefore, this Ph.D. work begins by revisiting fundamental concepts such as uncertainty classification, sensitivity analysis (SA), similarity/representativity metrics, and outlier rejection techniques, which serve as robust cornerstones of validation analysis.

The ultimate goal of this Ph.D. work is to develop an intelligent inference framework that infers/predicts given responses while adaptively handling various levels of data complexity, i.e., residual shape, nonlinearity, heteroscedasticity, etc. These studies are expected to significantly advance DT technology, enabling support for various levels of operational autonomy in both existing and first-of-a-kind reactor designs. This extends to critical aspects such as nuclear criticality safety, nuclear fuel depletion dynamics, spent nuclear fuel (SNF) analysis, and the introduction of new fuel designs, such as high-burnup fuel and high-assay low-enriched uranium (HALEU) fuel. These advancements are crucial in scenarios where constructing new experiments is costly, time-consuming, or infeasible, as with new reactor systems or high-consequence events like criticality accidents.
90

PVactVal: A Validation Approach for Agent-based Modeling of Residential Photovoltaic Adoption

Johanning, Simon, Abitz, Daniel, Schulte, Emily, Scheller, Fabian, Bruckner, Thomas 19 October 2023 (has links)
Agent-based simulation models are an important tool for studying the effectiveness of policy interventions on the uptake of residential photovoltaic systems by households, a cornerstone of the sustainable energy system transition. For these models to be trustworthy, they require rigorous validation. However, the canonical approach of validating simulation models through calibration, choosing parameters that minimize the difference between model results and reference data, fails when the model is subject to many stochastic influences. The residential photovoltaic diffusion model PVact features numerous stochastic influences that prevent straightforward optimization-driven calibration. From an analysis of the results of a case study of the cities of Dresden and Leipzig (Germany) based on three error metrics (mean average error, root mean square error, and cumulative average error), this research identifies a parameter range in which stochastic fluctuations exceed the differences between results of different parameterizations, so that a minimization-based calibration approach fails. Based on this observation, an approach is developed that aggregates model behavior across multiple simulation runs and parameter combinations to compare results between scenarios representing different future developments or policy interventions of interest.
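The aggregation idea can be sketched with a toy stand-in for the simulation model. The noisy linear "adoption model" and all values below are illustrative assumptions, not PVact itself: each year's adoption is averaged over many stochastic runs before a parameterization is scored with the error metrics, so run-to-run noise does not dominate the comparison.

```python
import math
import random

def mae(ref, sim):
    """Mean absolute error between reference and simulated yearly adoptions."""
    return sum(abs(r - s) for r, s in zip(ref, sim)) / len(ref)

def rmse(ref, sim):
    """Root mean square error; penalizes large yearly deviations more."""
    return math.sqrt(sum((r - s) ** 2 for r, s in zip(ref, sim)) / len(ref))

def cae(ref, sim):
    """Cumulative average error: error of the accumulated total adoption."""
    return abs(sum(sim) - sum(ref)) / len(ref)

def aggregate_runs(model, params, n_runs, seed=0):
    """Average yearly adoption over repeated stochastic runs of one
    parameterization, to smooth out run-to-run fluctuations."""
    rng = random.Random(seed)
    runs = [model(params, rng) for _ in range(n_runs)]
    return [sum(year) / n_runs for year in zip(*runs)]

# Toy stand-in for the diffusion model: noisy linear adoption growth.
def toy_model(slope, rng):
    return [slope * t + rng.gauss(0.0, 5.0) for t in range(1, 11)]

reference = [4.0 * t for t in range(1, 11)]          # observed adoptions per year
mean_path = aggregate_runs(toy_model, 4.0, n_runs=50)
print(mae(reference, mean_path) <= rmse(reference, mean_path) + 1e-9)  # → True
```

Scoring the averaged path with all three metrics, across several parameter combinations, gives a noise-robust basis for comparing scenarios even where single-run calibration would fail.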
