  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
101

Analyse des modèles résines pour la correction des effets de proximité en lithographie optique / Resist modeling analysis for optical proximity correction effect in optical lithography

Top, Mame Kouna 12 January 2011 (has links)
Les progrès réalisés dans la microélectronique répondent à la problématique de la réduction des coûts de production et celle de la recherche de nouveaux marchés. Ces progrès sont possibles notamment grâce à ceux effectués en lithographie optique par projection, le procédé lithographique principalement utilisé par les industriels. La miniaturisation des circuits intégrés n’a donc été possible qu’en poussant les limites d’impression lithographique. Cependant, en réduisant les largeurs des transistors et l’espace entre eux, on augmente la sensibilité du transfert à ce que l’on appelle les effets de proximité optique au fur et à mesure des générations les plus avancées de 45 et 32 nm de dimension de grille de transistor. L’utilisation des modèles OPC est devenue incontournable en lithographie optique pour les nœuds technologiques avancés. Les techniques de correction des effets de proximité (OPC) permettent de garantir la fidélité des motifs sur plaquette, par des corrections sur le masque. La précision des corrections apportées au masque dépend de la qualité des modèles OPC mis en œuvre. La qualité de ces modèles est donc primordiale. Cette thèse s’inscrit dans une démarche d’analyse et d’évaluation des modèles résine OPC qui simulent le comportement de la résine après exposition. La modélisation de données et l’analyse statistique ont été utilisées pour étudier ces modèles résine de plus en plus empiriques. Outre la fiabilisation des données de calibrage des modèles, l’utilisation des plateformes de création de modèles dédiées en milieu industriel et la méthodologie de création et de validation des modèles OPC ont également été étudiées. Cette thèse expose le résultat de l’analyse des modèles résine OPC et propose une nouvelle méthodologie de création, d’analyse et de validation de ces modèles. / The progress made in microelectronics responds to the need to reduce production costs and to reach new markets. This progress has been possible largely thanks to advances in projection optical lithography, the printing process principally used in integrated circuit (IC) manufacturing. The miniaturization of integrated circuits has been possible only by pushing the limits of optical resolution. However, this miniaturization increases the sensitivity of the pattern transfer, leading to more proximity effects at progressively more advanced technology nodes (45 and 32 nm transistor gate size). The correction of these optical proximity effects is indispensable in photolithographic processes for advanced technology nodes. Optical proximity correction (OPC) techniques increase the achievable resolution and the pattern transfer fidelity for advanced lithographic generations. Corrections are made on the mask based on OPC models which link the image printed in the resist to the changes made on the mask. The reliability of these OPC models is essential for improving pattern transfer fidelity. This thesis analyses and evaluates the OPC resist models which simulate the behavior of the resist after the photolithographic process. Data modeling and statistical analysis have been used to study these increasingly empirical resist models. Besides the reliability of the model calibration data, we also examined how the model creation platforms generally used in IC manufacturing are employed, and the methodology for creating and validating OPC models. This thesis presents the results of the analysis of OPC resist models and proposes a new methodology for their creation, analysis and validation.
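By way of illustration only, the sketch below shows what calibrating a compact resist model can involve: a generic constant-threshold model with Gaussian acid diffusion is fitted to hypothetical measured critical dimensions (CDs) by least squares. The toy aerial image, the pitch/CD values and the parameter names are assumptions made for this sketch; they are not the models, data or platforms studied in the thesis.

```python
# Illustrative calibration of a generic compact resist model (constant threshold + Gaussian
# diffusion) against hypothetical measured CDs; not the thesis's models or data.
import numpy as np
from scipy.optimize import least_squares

pitches = np.array([130.0, 160.0, 200.0, 260.0])   # nm, hypothetical line/space pitches
cd_meas = np.array([55.0, 69.0, 90.0, 117.0])      # nm, hypothetical measured line widths

def simulated_cd(pitch, sigma_nm, threshold):
    """Toy aerial image I(x) = 0.5 - 0.4*cos(2*pi*x/pitch); Gaussian diffusion of std sigma_nm
    attenuates the cosine modulation analytically, and the line prints where the blurred image
    falls below the threshold, which gives a closed-form line width."""
    amplitude = 0.4 * np.exp(-0.5 * (2 * np.pi * sigma_nm / pitch) ** 2)
    u = (0.5 - threshold) / amplitude
    if abs(u) >= 1.0:                              # feature does not resolve at all
        return 0.0 if u > 0 else pitch
    return pitch * np.arccos(u) / np.pi

def residuals(params):
    sigma_nm, threshold = params
    return [simulated_cd(p, sigma_nm, threshold) - cd for p, cd in zip(pitches, cd_meas)]

fit = least_squares(residuals, x0=[15.0, 0.40], bounds=([1.0, 0.1], [80.0, 0.9]))
rms = np.sqrt(np.mean(np.square(residuals(fit.x))))
print(f"fitted diffusion sigma = {fit.x[0]:.1f} nm, threshold = {fit.x[1]:.3f}, RMS CD error = {rms:.2f} nm")
```

Real OPC resist models are far richer than this two-parameter toy, but the calibration loop has the same shape: simulate the printed dimension, compare with measurements, and adjust the empirical parameters until the residuals are minimal.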
102

Commissioning new applications on processing machines: Part II – implementation

Troll, Clemens, Schebitz, Benno, Majschak, Jens-Peter, Döring, Michael, Holowenko, Olaf, Ihlenfeldt, Steffen 07 June 2018 (has links) (PDF)
The subject of this two-part article is the commissioning of a new application that may become part of a processing machine. Using the example of the intermittent transport of small-sized goods, such as chocolate bars, ideas for increasing the maximum machine performance are discussed. To this end, optimal process motion profiles are synthesised with the help of a computer simulation. The first part of the paper presented the modelling of the process. This second part focuses on implementing the simulated motion approaches on an experimental test rig, where the new motion approach is compared with the conventional approach. In this way, the increase in performance can be demonstrated. Finally, possibilities for online process control, which are necessary to prevent unstable process conditions, are examined.
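As a rough illustration of what synthesising a process motion profile can mean here, the sketch below compares a conventional trapezoidal-velocity rest-to-rest move with a smooth 3-4-5 polynomial motion law; the stroke, the cycle time and the choice of the two motion laws are assumptions for this sketch and are not taken from the article. The smooth law accepts a somewhat higher peak acceleration in exchange for continuous acceleration (finite jerk), which is typically what matters when handling delicate goods at high rates.

```python
# Illustrative comparison of two rest-to-rest motion laws for an intermittent transport task
# (assumed stroke and move time; not the profiles used in the article).
import numpy as np

stroke = 0.100        # m, assumed displacement per cycle
t_move = 0.150        # s, assumed time available for the move
t = np.linspace(0.0, t_move, 1001)
tau = t / t_move      # normalised time, 0..1

# 3-4-5 polynomial motion law: zero velocity and zero acceleration at both ends.
s_poly = stroke * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
a_poly = np.gradient(np.gradient(s_poly, t), t)

# Conventional trapezoidal velocity profile (1/3 acceleration, 1/3 constant, 1/3 deceleration).
v_max = stroke / (t_move * (1 - 1 / 3))               # area under the trapezoid equals the stroke
v_trap = np.interp(tau, [0, 1 / 3, 2 / 3, 1], [0, v_max, v_max, 0])
a_trap = np.gradient(v_trap, t)

print(f"peak acceleration, 3-4-5 polynomial: {np.max(np.abs(a_poly)):.1f} m/s^2 (continuous acceleration)")
print(f"peak acceleration, trapezoidal:      {np.max(np.abs(a_trap)):.1f} m/s^2 (acceleration jumps at the corners)")
```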
103

Modélisation d’essais de choc sur dispositifs de retenue de véhicules : Application aux dispositifs mixtes acier-bois / Vehicle restraint system crash test modelling : Application to steel-wood structures

Goubel, Clément 13 December 2012 (has links)
En France, un tiers des personnes tuées sur la route le sont lors d’un accident sur un obstacle fixe. Dans 90% des cas, ces accidents surviennent après une perte de contrôle du véhicule. Les dispositifs de retenue de véhicule ont pour but de maintenir les véhicules en perdition sur la chaussée en limitant la sévérité de l’impact. Ces dispositifs doivent subir des essais de chocs normatifs afin de pouvoir être installés sur le bord des routes européennes et d’évaluer leurs performances en termes de sévérité et de déflexion. Les tolérances existantes sur les paramètres d’essai (véhicule, masse du véhicule, vitesse, angle et point d’impact …) et les incertitudes sur les caractéristiques mécaniques des matériaux constituant le dispositif ont un effet sur les performances de ces dispositifs et doivent être prises en compte lors des calculs. Les dispositifs mixtes (acier-bois) présentent une difficulté supplémentaire en raison de l’hétérogénéité du matériau et de sa sensibilité aux variables d’environnement telles que la température et l’humidité. Afin de prendre en compte cette variabilité et d’évaluer son impact sur les performances d’un dispositif, des essais dynamiques sur des échantillons de structure ont été réalisés et modélisés numériquement. Enfin, un modèle complet d’un dispositif de retenue de véhicule a été effectué et corrélé sur un essai de choc réel à l’aide d’une méthode prenant en compte la variation de paramètres physiques liés à l’apparition des modes de ruine de la structure. Une fois corrélé, le modèle a été utilisé afin d’évaluer l’incidence de la modification des caractéristiques mécaniques du bois liée aux variations des conditions environnementales. / In France, one third of the people dying on the roads are killed after impacting against a fixed obstacle. In 90% of the reported cases, these accidents result from a loss of control. Vehicle Restraint Systems (VRS) are specially designed to restrain an errant vehicle and to limit impact severity. Before being installed on the roadsides, these devices have to be crash-tested according to standards in order to evaluate their safety and deflection performance. Tolerances exist on the impact parameters (vehicle, vehicle mass, impact speed, impact angle, impact point …), and uncertainties on the mechanical characteristics of the materials have an effect on device performance; both have to be taken into account in numerical simulations. Steel-wood structures present an additional numerical challenge due to the heterogeneity of wood and its sensitivity to environmental variables such as temperature and moisture content. In order to assess the effect of this variability on safety performance, three-point bending dynamic experiments on structural samples are performed and modelled. Finally, a complete model of a vehicle restraint system is built and validated against real crash test results using a parametric method. This method takes into account the variability of the parameters associated with the failure modes of the structure. Once validated, the model is used to assess the effect of changes in the mechanical properties of wood due to variations in environmental conditions.
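As a loose illustration of propagating material variability through a structural response, the sketch below runs a Monte Carlo analysis on a toy cantilever post; the load, geometry, stiffness distribution and moisture-related reduction are invented for this sketch and bear no relation to the thesis's finite element model of a complete steel-wood barrier.

```python
# Illustrative Monte Carlo study of the effect of wood property variability on a toy
# cantilever post (invented numbers; not the thesis's crash test model).
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Assumed distribution of the modulus of elasticity of the wooden posts, reduced at high
# moisture content (both the distribution and the 15 % reduction are illustrative assumptions).
E_dry = rng.normal(11.0e9, 1.5e9, n)           # Pa
moisture_wet = True
E = E_dry * (0.85 if moisture_wet else 1.0)

# Toy structural response: tip deflection of a circular cantilever post under a lateral load.
F, L, d = 20e3, 1.2, 0.18                       # N, m, m (assumed load, post height, diameter)
I = np.pi * d**4 / 64.0                         # second moment of area of the circular section
deflection = F * L**3 / (3.0 * E * I)           # classical cantilever tip deflection

print(f"median deflection: {np.median(deflection) * 1e3:.1f} mm")
print(f"95th percentile:   {np.percentile(deflection, 95) * 1e3:.1f} mm")
```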
105

Zásobník tepla solární soustavy / Solar hot water storage tank

Vyhlídalová, Karolína January 2020 (has links)
The solar hot water storage tank is of great importance in a solar collector system. It allows the transformed energy to be accumulated and thus deals with the mismatch between supply and demand. A suitable design of the storage tank can improve system efficiency. The storage capacity represents a balance between the amount of stored hot water and the tank's heat losses. The design of the storage capacity is based on three hypotheses: the coverage of hot water demand by solar energy, the ratio between storage capacity and solar thermal collector area, and the assumption that the storage capacity corresponds to one to two times the hot water demand. The purpose of this thesis is to provide an understanding of solar storage tank design and to improve the design through numerical simulation, experiments and general calculations. It also focuses on confirming the hypotheses used and determining the best way to design a solar storage tank for general practice and further discussion. The simulation model has three variables – the storage capacity, the collector area and the number of occupants – and the intent is to find the interdependence of these three variables. The purpose of the simulations is to modify the design of the solar tank based on the mutual influence of the studied parameters. The modifications are performed based on the users' needs.
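To make the three sizing hypotheses concrete, the short sketch below turns them into numbers for a hypothetical household; the occupancy, the per-person demand (interpreted here as a daily demand), the target solar fraction and the litres-per-square-metre ratio are common rule-of-thumb values assumed for illustration, not figures from the thesis.

```python
# Rough solar storage tank sizing based on the three hypotheses named in the abstract
# (all input values are illustrative assumptions, not taken from the thesis).
occupants = 4
demand_per_person = 50.0           # litres of hot water per person per day (assumed)
daily_demand = occupants * demand_per_person

# Hypothesis 1: target solar fraction (share of the hot water demand covered by solar energy).
solar_fraction = 0.60

# Hypothesis 2: ratio between storage volume and collector area (assumed 50 l per m2).
litres_per_m2 = 50.0
collector_area = 1.5 * occupants   # m2, assumed collector sizing rule
volume_from_ratio = litres_per_m2 * collector_area

# Hypothesis 3: storage capacity of one to two times the (daily) hot water demand.
volume_from_demand = (1.0 * daily_demand, 2.0 * daily_demand)

print(f"daily demand:            {daily_demand:.0f} l (target solar fraction {solar_fraction:.0%})")
print(f"volume from area ratio:  {volume_from_ratio:.0f} l")
print(f"volume from demand rule: {volume_from_demand[0]:.0f}-{volume_from_demand[1]:.0f} l")
```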
106

Quality Assurance of Exposure Models for Environmental Risk Assessment of Substances / Qualitätssicherung von Expositionsmodellen zur Umweltrisikoabschätzung von Substanzen

Schwartz, Stefan 04 September 2000 (has links)
Environmental risk assessment of chemical substances in the European Union is based on a harmonised scheme. The required models and parameters are laid down in the Technical Guidance Document (TGD) and are implemented in the EUSES software. An evaluation study of the TGD exposure models was carried out. In particular, the models for estimating chemical intake by humans were investigated. The objective of this study was twofold: firstly, to develop an evaluation methodology, since no appropriate approach is available in the scientific literature; secondly, to elaborate the applicability and limitations of the models and to provide proposals for their improvement. The principles of model evaluation in terms of quality assurance, model validation and software evaluation were elaborated, and a suitable evaluation protocol for chemical risk assessment models was developed. Quality assurance of a model includes internal (e.g. an investigation of the underlying theory) and external (e.g. a comparison of the results with experimental data) validation, and addresses the evaluation of the respective software. It should focus not only on the predictive capability of a model, but also on the strength of the theoretical underpinnings, the evidence supporting the model's conceptualisation, the database and the software. The external validation was performed using a set of reference substances with different physico-chemical properties and use patterns. Additionally, sensitivity and uncertainty analyses were carried out, and alternative models were discussed. Recommendations for improvements and maintenance of the risk assessment methodology were presented. To perform the software evaluation, quality criteria for risk assessment software were developed. From a theoretical point of view, it was shown that the models strongly depend on the lipophilicity of the substance, that the underlying assumptions drastically limit the applicability, and that realistic concentrations may seldom be expected. If the models are applied without adjustment, high uncertainties must inevitably be expected. However, many cases were found in which the models deliver highly valuable results. The overall system was classified as a good compromise between complexity and practicability. However, several restrictions were revealed for particular chemicals and chemical classes: the investigated models for assessing indirect human exposure are currently not, or only partly, applicable to dissociating compounds, very polar compounds, very lipophilic compounds, ions, some surfactants, mixtures, and compounds whose metabolites cause the problems. In a strict sense, the method is only applicable to persistent, non-dissociating chemicals of intermediate lipophilicity. Further limitations may exist. Regarding the software, it was found that EUSES basically fulfils the postulated criteria but is highly complex and non-transparent. To overcome these inadequacies, a more modular design is proposed.
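As an illustration of why lipophilicity dominates such exposure estimates, the sketch below varies log Kow in a deliberately simplified fish-intake calculation; the bioconcentration regression, the consumption rate and the water concentration are generic textbook-style assumptions, not the TGD/EUSES equations evaluated in the thesis.

```python
# Generic one-at-a-time sensitivity sketch for a lipophilicity-driven exposure estimate.
# The regression and all numbers are generic illustrations, not the TGD/EUSES models.
import numpy as np

def fish_intake(log_kow, c_water_mg_per_l, fish_consumption_kg_per_d=0.115, body_weight_kg=70.0):
    """Toy daily dose via fish (mg/kg bw/d) using a generic log BCF = 0.85*logKow - 0.70 regression."""
    bcf = 10 ** (0.85 * log_kow - 0.70)        # l/kg, generic bioconcentration regression (assumed)
    c_fish = bcf * c_water_mg_per_l            # mg per kg of fish
    return c_fish * fish_consumption_kg_per_d / body_weight_kg

base = fish_intake(log_kow=4.0, c_water_mg_per_l=1e-3)
for dlk in (-1.0, 0.0, 1.0):                   # vary the lipophilicity by +/- one log unit
    dose = fish_intake(log_kow=4.0 + dlk, c_water_mg_per_l=1e-3)
    print(f"log Kow = {4.0 + dlk:.1f}: dose = {dose:.3e} mg/kg bw/d ({dose / base:.1f}x base case)")
```

Even in this toy version, a single log unit of Kow shifts the estimated dose by roughly a factor of seven, which mirrors the strong lipophilicity dependence reported in the abstract.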
107

Quantifying metabolic fluxes using mathematical modeling / Kvantifiering av metabola flöden genom matematisk modellering

Viberg, Victor January 2018 (has links)
Background: Cancer is one of the leading causes of death in Sweden. In order to develop better treatments against cancer we need to understand it better. One area of special interest is cancer metabolism and the metabolic fluxes. As these fluxes cannot be measured directly, modeling is required to determine them. Due to the complexity of cell metabolism, some limitations in the metabolism model are required. As the TCA cycle (TriCarboxylic Acid cycle) is one of the most important parts of cell metabolism, it was chosen as a starting point. The primary goal of this project has been to evaluate the previously constructed TCA-cycle model. The first step of the evaluation was to determine the CIs (confidence intervals) of the model parameters, in order to determine the parameters' identifiability. The second step was to validate the model, to see whether it could predict data on which it had not been trained. The last step of the evaluation was to determine the uncertainty of the model simulation. Method: The TCA-cycle model was created using isotopically labelled data and EMUs (Elementary Metabolic Units) in OpenFlux, an open-source toolbox. The CIs of the TCA-cycle model parameters were determined both with OpenFlux's built-in functionality and with a method called PL (Profile Likelihood). The model validation was done using a leave-one-out method. In conjunction with the leave-one-out method, a method called PPL (Prediction Profile Likelihood) was used to determine the CIs of the TCA-cycle model simulation. Results and Discussion: Using PL to determine CIs had mixed success. The failures of PL are most likely caused by a poor choice of settings. In the cases in which PL succeeded, however, it gave results comparable to those of OpenFlux. The settings in OpenFlux are important, and the wrong settings can severely underestimate the confidence intervals. The confidence intervals from OpenFlux suggest that approximately 30% of the model parameters are identifiable. The validation shows that the model is able to predict certain parts of the data on which it has not been trained. The PPL results yield a small confidence interval for the simulation. These two results regarding the model simulation suggest that, even though the identifiability of the parameters could be better, the model structure as a whole is sound. Conclusion: The majority of the model parameters in the TCA-cycle model are not identifiable, which is something future studies need to address. However, the model is able to predict data on which it has not been trained, and it has low simulation uncertainty.
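To illustrate the profile likelihood (PL) idea referred to above, the sketch below computes a PL-based 95% confidence interval for one parameter of a simple two-parameter exponential model; the model, the synthetic data, the noise level and the chi-squared threshold convention are assumptions chosen for illustration and are unrelated to the TCA-cycle model or to OpenFlux itself.

```python
# Profile likelihood confidence interval for one parameter of a toy two-parameter model
# (illustrative only; not the TCA-cycle model, its data, or the OpenFlux implementation).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import chi2

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 25)
sigma = 0.05
y_obs = 1.0 * np.exp(-0.3 * t) + rng.normal(0.0, sigma, t.size)   # synthetic data, true A=1, k=0.3

def ssr_given_k(k):
    """Minimise the sum of squared residuals over the nuisance parameter A for a fixed k."""
    res = minimize_scalar(lambda A: np.sum((A * np.exp(-k * t) - y_obs) ** 2),
                          bounds=(0.0, 5.0), method="bounded")
    return res.fun

k_grid = np.linspace(0.1, 0.6, 201)
ssr = np.array([ssr_given_k(k) for k in k_grid])
ssr_min = ssr.min()

# Likelihood-ratio threshold for a 95% CI of a single parameter, assuming Gaussian errors.
threshold = ssr_min + chi2.ppf(0.95, df=1) * sigma ** 2
inside = k_grid[ssr <= threshold]
print(f"best-fit k = {k_grid[ssr.argmin()]:.3f}, 95% PL interval = [{inside.min():.3f}, {inside.max():.3f}]")
```

A parameter whose profile never rises above the threshold within a plausible range would be flagged as non-identifiable, which is the situation the abstract reports for the majority of the TCA-cycle model parameters.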
108

Modélisation dynamique tridimensionnelle avec tache solaire pour la simulation du comportement thermique d’un bâtiment basse consommation / A three dimensional thermal room and sun patch model to simulate the transient behaviour of an energy efficient building

Rodler, Auline 25 November 2014 (has links)
Cette thèse s’inscrit dans le contexte du développement de Bâtiments Basse Consommation. La conception de telles constructions les rend sensibles aux sollicitations internes. Aussi, les outils de thermique du bâtiment existants ne sont pas adaptés pour simuler assez fidèlement ce type de bâtiments, si bien qu’un modèle tridimensionnel et dynamique a été développé ici. Celui-ci présente plusieurs particularités : il s’appuie sur une discrétisation spatiale optimisée des parois, la tache solaire y est localisée et l’intégration des dynamiques des conditions environnementales est assurée par un solveur numérique à pas de temps adaptatif et un seul nœud d’air est considéré. La validation du modèle s’est faite suivant une confrontation avec des mesures en conditions réelles réalisées dans une cellule de BESTlab d’EDF R&D. Un suivi visuel de la tache solaire a permis de confirmer sa bonne localisation par notre modèle. Des mesures de température en surface complétées par des cartographies thermographiques ont été comparées aux champs de températures simulés, montrant une bonne concordance. Les comparaisons de températures d’air mesurées et simulées ont montré des résidus ne dépassant pas 1,5 °C, pour des erreurs moyennes de 0,5 °C. La pertinence des deux principales innovations du modèle a été ensuite démontrée : l’utilisation d’entrées échantillonnées à la minute associées à un solveur à pas de temps adaptatif permet de minimiser les erreurs de simulation : en mi-saison, les résidus maximaux sont respectivement de 1 °C et 2 °C pour des entrées à la minute et à l’heure. En hiver, les températures d’air simulées tendent à plus osciller autour de la consigne quand le pas d’échantillonnage des entrées s’allonge. Deux modèles unidimensionnels, représentatifs de modèles courants, M1D,sol diluant le rayonnement solaire sur le sol seul et M1D,parois le distribuant de façon homogène sur les parois au prorata de la taille de la tache solaire censée les frapper, ne dégradent que légèrement la précision des calculs de température d’air. Cependant, ces modèles 1D ne permettent pas de calcul des champs de températures sur les parois, si bien qu’ils présentent des erreurs locales dépassant 20 °C aux endroits touchés par la tache solaire. Enfin, en hiver, le modèle 3D permet de prédire des consommations de chauffage surestimées de 6,5 % quand M1D,parois les surestime de 11 % et M1D,sol de 22 %. Les améliorations apportées par notre modèle ont été confirmées pour d’autres types de cellules. D’ailleurs, des écarts plus importants entre M1D,sol et le modèle 3D ont été observés pour une cellule dont parois et sol ont des compositions très différentes, alors que l’orientation a aussi un impact. Ce travail confirme la nécessité de représenter plus finement les phénomènes physiques pour des locaux fortement isolés. Des améliorations sont à intégrer, comme la description de l’anisothermie de l’air. / Low energy building constructions become sensitive to internal gains: any internal heating source has an impact on the envelope. Therefore, it is important to evaluate the performance of current transient thermal models when adapted to low energy buildings. This work describes a numerical model to simulate a single room, using a refined three-dimensional spatial description of heat conduction in the envelope while considering a single air node. The model has been developed for environmental conditions that vary over short time steps and integrates the projection of solar radiation through a window onto the interior walls: the sun patch. The model was validated through a detailed comparison with measurements. The in-situ experiment was carried out in one of the BESTlab cells (EDF R&D). The sun patch was followed by a camera to validate its calculated position and surface. Temperature measurements by thermocouples and by thermal cameras were compared with the model outputs. Differences between measured and simulated air and surface temperatures never exceeded 1.5 °C, with mean errors of 0.5 °C. The two innovations of the model were then demonstrated. Using minute-wise weather data as inputs, together with an adaptive time-step solver, reduces simulation errors: in May, maximum differences were 1 °C with one-minute inputs and 2 °C with hourly inputs. Larger errors are seen in summer, whereas in winter the simulated air temperatures tend to fluctuate more around the set-point temperature when the input sampling step gets longer. Two one-dimensional models, representative of commonly used simulation tools, were also tested: model M1D,sol assumes the incoming radiation reaches only the floor, while M1D,parois distributes it over the walls struck by the sun patch. These two models evaluate the air temperature with an acceptable error, but their surface temperatures remain subject to large errors: both 1D models show local differences of up to 20 °C on surfaces struck by the sun patch. In winter, the 3D model overestimates heating energy consumption by 6.5 %, whereas M1D,parois overestimates it by 11 % and M1D,sol by 22 %. The improvements brought by our model were also confirmed for other cells with different thermal masses; for these cells, differences between M1D,sol and the 3D model could reach 4.5 °C. The differences appeared to be larger for low-thermal-mass cells, and the orientation of the building also had a strong impact. This work confirms the necessity of representing the physical phenomena and the envelope more finely for highly insulated rooms. To improve the model, the non-isothermal behaviour of the air should be taken into account.
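A minimal sketch of the general idea of a single air node driven by minute-wise solar inputs and integrated with an adaptive time-step solver is given below; the envelope conductance, the thermal capacity and the solar-gain profile are invented for illustration, and the sketch deliberately ignores the 3D conduction and sun-patch tracking that are the actual contributions of the thesis.

```python
# A deliberately simplified single-air-node heat balance driven by minute-wise solar gains and
# integrated with an adaptive time-step solver; all parameter values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

C_eff = 5.0e5        # J/K, assumed effective thermal capacity coupled to the air node
UA = 35.0            # W/K, assumed overall envelope conductance
T_out = 5.0          # degC, assumed constant outdoor temperature

minutes = np.arange(0, 24 * 60)                       # one day of minute-wise solar input
solar_gain = np.clip(800.0 * np.sin(np.pi * (minutes - 8 * 60) / (10 * 60)), 0.0, None)  # W, toy profile

def dT_dt(t_s, T):
    q_sun = np.interp(t_s / 60.0, minutes, solar_gain)   # interpolate the minute-wise input
    return [(q_sun - UA * (T[0] - T_out)) / C_eff]

sol = solve_ivp(dT_dt, (0.0, 24 * 3600.0), [19.0], method="LSODA", rtol=1e-6, atol=1e-4)
print(f"simulated air temperature range: {sol.y[0].min():.1f} to {sol.y[0].max():.1f} degC")
print(f"time points chosen by the adaptive solver: {sol.t.size}")
```

The adaptive solver concentrates its steps where the minute-wise forcing changes rapidly, which is the same motivation given in the abstract for combining finely sampled inputs with an adaptive time-step scheme.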
109

Data preparation, hydrodynamic and contaminant transport shallow-water simulations of Lake Victoria

Paul, Seema January 2019 (has links)
This study explores shallow-lake numerical hydrodynamic processes that support model development and validation, extreme events and the effects of water circulation in Lake Victoria. Lake Victoria is the second largest freshwater lake in the world, and the largest in East Africa. It is the major freshwater reservoir and source of water for domestic, agricultural, industrial, fishery and transport uses. These resources support livelihoods and ecosystem services for over 40 million people. The lake is severely affected by water-quality degradation caused by pollution. This thesis aims at improving this understanding by following the recommendations of the Lake Victoria Environment Management Project, the Lake Victoria Basin Commission climate change adaptation strategy and action plan 2018-2023, the Lake Victoria Basin Commission operational plan 2015-2020, and Lake Victoria Basin Commission reports. These reports suggested a detailed lake bathymetry survey, modelling of lake flow, and study of lake hydrometeorological processes by modelling and simulation, in order to identify extreme weather events, assess the effect of water circulation, and study lake pollution near the shore. A numerical hydrodynamic model was built in the COMSOL Multiphysics (CM) software for assessing lake flows and water turn-over from river inflows which carry pollution. The work included the development of systematic methods for lake bathymetry that are relevant for numerical and hydrodynamic lake modelling. The hydrometeorologically driven simulation model was employed to assess the lake water balance, water circulation and solute transport. Paper 1 creates a bathymetry from several methods and several data sources, and a vertically integrated free-surface flow model was implemented in CM. The model was used to investigate outflow conditions and mean velocities driven by river inflow, outflow, precipitation and evaporation. It is shown to be exactly conservative and to give water-level variations in reasonable agreement with measurements. The results indicate that the shallow-water model is close to linear. An outflow model, linear in water level, predicts the water level in reasonable agreement with measurements. The findings suggest that the model should consider wind-stress-driven flow to provide more accurate lake flow behavior. Paper 2 performed an assessment of the hydro-meteorological processes and extreme weather events that are responsible for changing the characteristics of the lake water balance, streamflow variations and lake transport. We compare historical data over a long time with data from the model, including water balance, sources of data uncertainty, correlations, extreme rain and inflow years, and seasonal variations. Solute loading and transport were illustrated by tracing the water from the river inflows. The results indicate that the lake rainfall has a strong seasonal variation, with strong correlations between tributary inflows and precipitation, and between lake outflow and water level. The tracer transport by mean flow is very slow. Flow increases somewhat in wet periods and is faster in the shallow Kenyan lake zone than in the deeper Ugandan and Tanzanian lake zones, where the major inflow, from the Kagera River, appears to strongly influence transport. / Denna studie undersöker med numerisk metodik hydrodynamiska processer i den mycket grunda Victoriasjön och hur de påverkas av extrem väderlek, inflöden, och nederbörd. Victoriasjön är den andra största sötvattensjön i världen, och den största i Afrika. Den är färskvattenförråd och källa för hushåll, jordbruk, industri, fiske och transporter. Resurserna ger livsuppehåll och ekosystemtjänster för mer än 40 miljoner människor. Sjön är utsatt för allvarliga föroreningar som försämrar vattenkvaliteten. Detta arbete avser att förbättra förståelsen genom att följa rekommendationer som givits ut av Lake Victoria Environment Management Project (LVEMP), och Lake Victoria Basin Commissions (LVBC) rapporter om strategi för anpassning till klimatförändringar, åtgärdsplan 2018-2023 och översiktsplan 2015-2020. Rapporterna föreslår detaljerad genomgång av djupkartor, modellering av strömning i sjön i syfte att identifiera extrema väderhändelser, undersöka vattencirkulationen, och studera föroreningarna nära stränder. En hydrodynamisk numerisk modell har byggts i simuleringspaketet COMSOL Multiphysics (CM) för uppskattning av strömning och vattenutbyte från förorenade inflöden. Arbetet innefattade utveckling av metoder för vattendjups-modeller för hydrodynamiska studier. Simuleringsmodellen drivs av hydrometeorologiska data och används för vattenmängds-balans, cirkulation och föroreningstransport. Artikel 1 skapar vattendjupskartan från flera data-mängder med olika metoder. En vertikalt integrerad modell med fri yta implementerades i CM. Modellen ger vertikalt medelvärdesbildade hastigheter drivna av flodinflöden, utflöde, nederbörd och avdunstning. Modellen representerar vattenbalansen exakt och ger variationer i vattennivå i rimlig överensstämmelse med mätningar. Resultaten antyder att modellen är nära linjär och tids-invariant. En utflödesmodell ansatt som linjär i vatten-nivån kan anpassas noggrant till historiska data. Bättre realism kan uppnås om vindens pådrivande verkan inkluderas. Artikel 2 går igenom de hydro-meteorologiska processer och extrema väder-händelser som ändrar vattenbalans, strömningsmönster och transport. Vi har jämfört data över femtio år med modellens, inkluderande vattennivå, källor för osäkerhet i data, korrelationer, år med extrema regn och inflöden, och årstidsvariationer. Resultaten tyder på att nederbörden varierar kraftigt med årstiderna, och signifikanta korrelationer ses mellan nederbörd och inflöden, och mellan utflöde och vattennivå. Transport av lösliga föroreningar illustrerades genom spårning av vatten från de olika inflödena. Spårämnestransport med vertikalt medelvärdesbildade hastigheter är mycket långsam. Strömningen ökar något i våta årstider och är snabbare i den grunda zonen i Kenya än i de djupare delarna i Uganda och Tanzania. Det största inflödet som kommer från Kagera tycks ha stor inverkan på transporten.
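To illustrate the kind of lake water balance and level-linear outflow model discussed in Paper 1, a minimal sketch follows; the surface area is the commonly quoted figure for Lake Victoria, while the rainfall, evaporation, inflow and outflow coefficients are rough illustrative assumptions, not the calibrated values or the COMSOL model from the thesis.

```python
# Minimal lake water balance with an outflow that is linear in the water level
# (illustrative forcing and coefficients; not the calibrated Lake Victoria model).
import numpy as np

area = 68_800e6          # m2, approximate surface area of Lake Victoria
days = 365
dt = 86_400.0            # s, one day per forward Euler step

# Assumed forcing per unit lake area (m/s): seasonal rainfall, evaporation, and tributary inflow.
rain = 5.7e-8 * (1.0 + 0.8 * np.sin(2 * np.pi * np.arange(days) / 365.0))  # ~1.8 m/yr mean
evap = 4.75e-8 * np.ones(days)                                             # ~1.5 m/yr
inflow = 800.0 * np.ones(days) / area                                      # ~800 m3/s of tributaries

# Outflow linear in the water level h (m above a reference): Q_out = a + b*h (assumed coefficients).
a, b = 800.0, 700.0      # m3/s and m3/s per metre of level

h = 1.0                  # initial level above the reference (m)
levels = []
for d in range(days):
    q_out = (a + b * h) / area                        # outflow expressed per unit area (m/s)
    h += dt * (rain[d] + inflow[d] - evap[d] - q_out)
    levels.append(h)

print(f"water level after one year: {levels[-1]:.2f} m (min {min(levels):.2f}, max {max(levels):.2f})")
```

Because the level feedback is slow compared with a daily step, a simple explicit integration is enough here; the seasonal swing in the simulated level comes almost entirely from the assumed rainfall cycle, echoing the strong rainfall seasonality reported in Paper 2.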
110

Валидация модели машинного обучения для прогнозирования магнитных свойств нанокристаллических сплавов типа FINEMET : магистерская диссертация / Validation of machine learning model to predict magnetic properties of nanocrystalline FINEMET type alloys

Степанова, К. А., Stepanova, K. A. January 2022 (has links)
В работе была произведена разработка модели машинного обучения на языке программирования Python, а также проведена ее валидация на этапах жизненного цикла. Целью создания модели машинного обучения является прогнозирование магнитных свойств нанокристаллических сплавов на основе железа по химическому составу и условиям обработки. Процесс валидации модели машинного обучения позволяет не только произвести контроль за соблюдением требований, предъявляемых при разработке и эксплуатации модели, к результатам, полученных с помощью моделирования, но и способствует внедрению модели в процесс производства. Процесс валидации включал в себя валидацию данных, в ходе которой были оценены типы, пропуски данных, соответствие цели исследования, распределения признаков и целевых характеристик, изучены корреляции признаков и целевых характеристик; валидацию алгоритмов, применяемых в модели: были проанализированы параметры алгоритмов с целью соблюдения требования о корректной обобщающей способности модели (отсутствие недо- и переобучения); оценку работы модели, благодаря которой был произведен анализ полученных результатов с помощью тестовых данных; верификацию результатов с помощью актуальных данных, полученных из статей, опубликованных с 2010 по 2022 год. В результате валидации модели было показано высокое качество разработанной модели, позволяющее получить оценки качества R2 0,65 и выше. / In this work, a machine learning model was developed in the Python programming language and validated at the stages of its life cycle. The purpose of creating the machine learning model is to predict the magnetic properties of Fe-based nanocrystalline alloys from their chemical composition and processing conditions. Validating a machine learning model not only allows the requirements imposed on the development and operation of the model, and on the results obtained by modeling, to be controlled, but also contributes to introducing the model into the production process. The validation process included: data validation, in which data types and omissions, compliance with the purpose of the study, and the distributions of features and target characteristics were evaluated, and the correlations between features and target characteristics were studied; validation of the algorithms used in the model, whose parameters were analyzed to ensure the correct generalizing ability of the model (no under- or overfitting); evaluation of the model's performance, in which the obtained results were analyzed using test data; and verification of the results using up-to-date data taken from articles published from 2010 to 2022. As a result of the model validation, the high quality of the developed model was demonstrated, achieving a quality metric R2 of 0.65 and higher.
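As a rough illustration of the validation loop described above (a hold-out evaluation, a check for under- and overfitting, and an R2 quality metric), a generic scikit-learn sketch follows; the synthetic data, the feature stand-ins and the choice of regressor are assumptions for illustration and do not reproduce the thesis's dataset, model or validation protocol.

```python
# Generic validation sketch: hold-out split, R2 on train vs. test as an under-/overfitting check.
# Synthetic data and model choice are illustrative only; they do not reproduce the thesis's work.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 400
X = rng.uniform(0.0, 1.0, size=(n, 5))     # stand-ins for composition / annealing descriptors
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] ** 2 + 0.5 * X[:, 2] * X[:, 3] + rng.normal(0.0, 0.1, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

r2_train = r2_score(y_train, model.predict(X_train))
r2_test = r2_score(y_test, model.predict(X_test))
print(f"R2 train = {r2_train:.2f}, R2 test = {r2_test:.2f}")
# A large gap between the two scores would indicate overfitting; two low scores, underfitting.
```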
