41 |
Indoor overheating risk : a framework for temporal building adaptation decision-making / Gichuyia, Linda Nkatha. January 2017
Overheating in buildings is predicted to increase as a result of a warming climate and urbanisation in most cities. In responding to this challenge, decision makers ranging from design teams, local authorities and building users to national programmes and market innovators, at different stages of a building's service life, want answers to a few pertinent questions: Which spaces and buildings are at higher risk, and by how much? What are the trade-offs between alternative design and/or user-based actions? What are the likely or possible consequences of their decisions? What is the impact of climate change on indoor overheating? However, such decision-appraisal information remains buried and dispersed across existing simulation models and empirical studies, and has not yet been clearly articulated in any single study or model, particularly in a way that gives each decision maker the capacity to anticipate and respond to thermal discomfort in different spaces and through the lifetime of a building. There is a need for an integrated and systematic means of building adaptation decision support that provides analytical leverage to these decision makers: one that 1) assimilates a range of indoor thermal comfort's causal and solution-space processes; 2) reveals and enhances the exploration of the space- and time-dependent patterns created by the dynamics of the indoor overheating phenomenon; and 3) imparts insight into decision strategy and its synthesis across multiple decision makers. This study recognises the lack of an overarching framework attending to these concerns. The key aim of this thesis is therefore to develop and test a building adaptation decision-support framework that extends the scope of existing frameworks and indoor overheating risk models to facilitate trans-sectional evaluations that reveal temporal decision strategies. The generic framework frames a multi-method analysis aiming to underpin decision appraisal for different spaces over a 50 to 100-year time horizon. It constitutes an underlying architecture that engages the dimensions of decision-support information generation, structuring, exploration and dissemination, to ease the drawing of decision strategy flexibly and transparently. The multi-method framework brings together: 1) systems-thinking methods to a) facilitate the systematic exposure of the elements that shape indoor overheating risk, and b) reveal the processes that shape multi-stakeholder decision-making responses over time; 2) normative, predictive and exploratory building scenarios to a) examine the overheating phenomenon over time, and b) serve as a lens through which to explore the micro-dynamics brought about by heterogeneity and uncertainty; and 3) computational simulation and optimization techniques to appraise potential routes towards indoor thermal comfort over an extended time scale by a) tracking shifts in the frequency, intensity and distribution of indoor overheating vulnerability by causal element over time and space, and b) tracking the shifting optima of the heat-mitigation solution space with respect to time, climate futures, heterogeneity of spaces, and thermal comfort assumptions. The framework's potential has been demonstrated through its application to office buildings in Nairobi.
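The framework's first tracking task, quantifying the frequency and intensity of indoor overheating for a given space, can be illustrated with a minimal sketch. The metric names, the static 28 °C threshold and the uniform-warming scenario below are assumptions for illustration only; the thesis works with richer, scenario- and occupant-dependent comfort criteria.

```python
import numpy as np

def overheating_metrics(op_temp_hourly, comfort_threshold=28.0):
    """Illustrative overheating indicators for one space over one year.

    op_temp_hourly: hourly indoor operative temperatures (degC).
    comfort_threshold: assumed static comfort limit (a simplification;
    adaptive or occupancy-weighted criteria could be substituted).
    """
    temps = np.asarray(op_temp_hourly, dtype=float)
    exceedance = temps - comfort_threshold
    hot_hours = int(np.sum(exceedance > 0))                       # frequency
    degree_hours = float(np.sum(np.clip(exceedance, 0.0, None)))  # intensity
    peak = float(exceedance.max())                                # worst hour
    return {"hours_over": hot_hours,
            "degree_hours": degree_hours,
            "peak_exceedance": peak}

# Example: compare a baseline year with a uniformly warmed scenario
baseline = 24.0 + 4.0 * np.random.rand(8760)
warmed = baseline + 1.5
print(overheating_metrics(baseline))
print(overheating_metrics(warmed))
```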
|
42 |
Contributions to an Improved Oxygen and Thermal Transport Model and Development of Fatigue Analysis Software for Asphalt Pavements / Jin, Xin. August 2009
Fatigue cracking is a primary distress in asphalt pavements, dominant especially in later years of service. Prediction of mixture fatigue resistance is critical for various applications, e.g., pavement design and preventative maintenance. The goal of this work was to develop a tool for predicting binder aging level and mixture fatigue life in pavement from unaged binder/mixture properties. To fulfill this goal, binder oxidation during the early fast-rate period must be understood. In addition, a better hourly air temperature model is required to provide accurate input for the pavement temperature prediction model. Furthermore, user-friendly software needs to be developed to incorporate these findings.
Experiments were conducted to study carbonyl group formation in one unmodified binder (SEM 64-22) and one polymer-modified binder (SEM 70-22), aged at five elevated temperatures. Data for SEM 64-22, especially at low temperatures, supported a parallel-reaction model consisting of one first-order reaction and one zero-order reaction. The model did not fit the data for SEM 70-22; the polymer modification of SEM 70-22 might be responsible for this discrepancy. Nonetheless, more data are required to draw a conclusion.
Binder oxidation rate is highly temperature dependent, and hourly air temperature data are required as input for the pavement temperature prediction model. Herein, a new pattern-based air temperature model was developed to estimate hourly data from daily data, with the pattern obtained from time-series analysis of measured data. The new model yields consistently better results than the conventional sinusoidal model.
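For context, the conventional sinusoidal approach that the pattern-based model is compared against can be sketched as below. The cosine shape and the assumed hour of the daily maximum are generic textbook choices, not necessarily the exact formulation used in this work.

```python
import numpy as np

def sinusoidal_hourly_temps(t_min, t_max, hour_of_max=15):
    """Hourly air temperatures for one day interpolated from the daily
    minimum and maximum with a single cosine; a generic stand-in for the
    'conventional sinusoidal model' referred to above."""
    hours = np.arange(24)
    mean = 0.5 * (t_max + t_min)
    amplitude = 0.5 * (t_max - t_min)
    return mean + amplitude * np.cos(2.0 * np.pi * (hours - hour_of_max) / 24.0)

# Example: a day with a minimum of 18 degC and a maximum of 33 degC
print(np.round(sinusoidal_hourly_temps(18.0, 33.0), 1))
```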
The pavement aging and fatigue analysis (PAFA) software developed herein synthesizes the new findings from this work with constant-rate binder oxidation and hardening kinetics and the calibrated mechanistic approach with surface energy (CMSE) fatigue analysis algorithm from the literature. Input data include reaction kinetics parameters, mixture test results, and pavement temperature. Carbonyl area growth, dynamic shear rheometer (DSR) function hardening, and mixture fatigue life decline are predicted as functions of time. Results are plotted and saved in spreadsheets.
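The oxidation and hardening relationships that PAFA chains together can be sketched as follows. All parameter names and values are hypothetical placeholders, and the log-linear hardening relation is an assumption standing in for the calibrated kinetics used in the software.

```python
import numpy as np

def carbonyl_area(t_days, ca0, m_fast, k_fast, k_const):
    """Parallel-reaction oxidation sketch: a first-order 'fast' reaction
    that exhausts itself plus a constant-rate (zero-order) reaction."""
    t = np.asarray(t_days, dtype=float)
    return ca0 + m_fast * (1.0 - np.exp(-k_fast * t)) + k_const * t

def dsr_function(ca, intercept, hardening_susceptibility):
    """Assumed log-linear hardening: ln(DSR function) grows linearly with
    carbonyl area; the slope plays the role of a hardening susceptibility."""
    return np.exp(intercept + hardening_susceptibility * np.asarray(ca, dtype=float))

t = np.linspace(0.0, 3650.0, 11)   # ten years of service, yearly steps
ca = carbonyl_area(t, ca0=0.6, m_fast=0.4, k_fast=0.02, k_const=0.0004)
print(np.round(ca, 2))
print(np.round(dsr_function(ca, intercept=-2.0, hardening_susceptibility=3.5), 3))
```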
|
43 |
Novel technologies for cell culture and tissue engineering / Ge, Cheng. January 2016
Cell culture has been a fundamental tool for the study of cell biology, tissue engineering, stem cell technology and biotechnology in general. It is increasingly important to have a well-defined physiochemical microenvironment during cell culture. Conventional cell culture employs expensive, manually controlled incubation equipment, making it difficult to maximize a culture's yield. Furthermore, previous studies use qualitative methods to assess cell culture proliferation that are inherently inaccurate and labour intensive, thereby increasing the cost of production. In addition, three-dimensional cell culture in scaffolds has been shown to provide more physiologically relevant information than two-dimensional culture, as it more closely mimics the physiological conditions of the human body, which is of special interest to regenerative medicine. Therefore, a portable and automated micro-total-analysis-system (μTAS) was proposed, combining microenvironment control with quantitative analysis techniques to monitor cell proliferation and metabolic activity. The automated portable heating system was validated as capable of maintaining a stable physiochemical microenvironment, with little margin of error, for a cellular substrate outside conventional incubation. A standalone platform system was designed and fabricated with accurate temperature control by employing an optically transparent ITO film with a large heating area. The transparency of the film is critical for continuous in-situ microscopic observation over a long-term cell culture process. Previous studies have attempted to use ITO film as a heating element but were unable to distribute the heat evenly across the microbioreactor platform; this persistent problem in the literature was addressed through a novel film design. As a result, the ITO-film-based heating system was successfully constructed and evaluated as a heating element for long-term static cell culture, with an improved proliferation rate, in a gas-permeable PDMS microbioreactor outside conventional incubation. In addition to maintaining a stable microenvironment, a non-invasive in-situ technology for monitoring cell viability and proliferation rate was developed based on bioimpedance spectroscopy (BIS), focusing primarily on decisions about the structure and specification of the proposed system-on-a-chip BIS measurement. The miniaturization of the BIS system on the microbioreactor platform was achieved by integrating a switching matrix array and an impedance analyzer chip with reliable analogue front-end circuitry. The realized system was verified with DLD-1 cells and its monitoring data were validated against conventional bioassays. Three-dimensional cell culture with scaffolds is key to the success of tissue engineering. An engineered corneal collagen scaffold may be produced by re-seeding appropriate human cells onto a decellularized corneal scaffold. The quality of the scaffold and the interaction of the cells are critical to the key functions (i.e. transparency, haze and total transmittance) of the final product. An integrated corneal collagen scaffold quality assessment system, based on an optical property inspection unit, was designed and realized with non-invasive and non-destructive characteristics. H1299 cells were seeded onto the inspected corneal scaffolds, and the BIS system realized earlier was used to validate its applicability for 3D cell culture.
Cell adhesion outcomes on scaffolds with different optical properties revealed the importance of scaffold microstructure for cell function. The results showed that the developed technologies can be used for quality control of corneal scaffolds and that the fabricated μTAS not only enables environmental control but, with the BIS-based in-situ assay, also facilitates monitoring of function (i.e. adhesion) and viability through quantitative and qualitative analysis in 3D-like cell culture. Additionally, considering its low contamination risk, cost-effectiveness and compatibility with high-throughput screening applications, the fabricated and integrated system has significant applications in tissue engineering.
|
44 |
Desenvolvimento de um sistema de análise de imagem para quantificação do tamanho e distribuição de partículas de desgaste / Development of an image analysis system for quantifying the size and distribution of wear particles / Gonçalves, Valdeci Donizete. January 2009
Advisor: Mauro Hugo Mathias / Co-advisor: Mauro Pedro Peres / Committee: José Elias Tomazini / Committee: João Zangrandi Filho / Committee: Francisco Carlos Parquet Bizarria / Committee: Edson Antonio Capello Sousa / Abstract: This work describes the development of an image analysis system for wear particles found in the lubricating oil of industrial equipment. An image acquisition system was used to capture images of oil samples retained on filter membranes, and an analytical methodology was developed to classify the particles quantitatively and qualitatively, relating them to the wear mode in which they were generated. The ISO 4406 standard was applied for the quantitative classification and Artificial Neural Networks were used for the qualitative classification. The system consists of a digital camera, a monocular optical microscope, an oil filtering system and two software programs developed to perform automated analysis of the acquired wear-particle images. Mineral oils from industrial machine gearboxes were used to obtain the samples, and the wear particles were analyzed by optical microscopy to obtain the sample image related to the wear. The results showed that the analysis system classifies individual particles through Artificial Neural Networks with up to 96% accuracy, in addition to analyzing the multiple particles contained in the samples and producing a report tracking wear evolution. The software programs developed for the analysis have an easy-to-use graphical interface.
They can be widely used in the study and evaluation of wear particles obtained from industrial oil samples, in companies or in universities for educational purposes. The system can also be applied to analyze metal surfaces obtained by metallographic processes or to analyze images of microorganisms obtained from blood samples, opening a wide field of application in universities and research. / Doctorate
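The quantitative step, turning particle counts from the filtered-oil images into an ISO 4406 cleanliness code, can be sketched as below. The doubling-scale approximation and the input format (counts per millilitre at >4 µm, >6 µm and >14 µm) are assumptions for illustration; the standard's published table governs the exact class boundaries.

```python
import math

def iso4406_scale_number(particles_per_ml):
    """Simplified ISO 4406-style scale number: each scale step roughly
    doubles the allowed particle count per millilitre (boundary handling
    here is approximate; the standard's table is authoritative)."""
    if particles_per_ml <= 0:
        return 0
    return max(1, math.ceil(math.log2(particles_per_ml / 0.01)))

def iso4406_code(counts_per_ml):
    """counts_per_ml: hypothetical per-millilitre counts of particles
    larger than 4 um, 6 um and 14 um, as produced by the image-analysis
    step described above."""
    return "/".join(str(iso4406_scale_number(c)) for c in counts_per_ml)

# Example: 2300/mL > 4 um, 250/mL > 6 um, 30/mL > 14 um
print(iso4406_code([2300, 250, 30]))   # -> "18/15/12"
```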
|
45 |
Sistema de análise de movimento para avaliação da postura vertical durante a corrida no teste de esforço máximo incremental / Movement analysis system for evaluation of the spinal posture during running in the incremental maximum effort test / Campos, Mario Hebling. 16 August 2018
Advisor: Rene Brenzikofer / Doctoral thesis - Universidade Estadual de Campinas, Faculdade de Educação Física
Issue date: 2010 / Abstract: Introduction: Although the importance of spinal posture during running is recognized, little is known about this subject, and we did not find a low-cost, automatic method for detailed evaluation of spinal curvature during gait. The purpose of this study was to develop an automatic motion analysis system for evaluation of spinal posture during running in the incremental maximum effort test.
Materials and Methods: We developed a system for automatic tracking of retro-reflective markers placed on the spine, using three low-cost visible-light cameras. Illuminators with high-brightness LEDs were built to be attached to the cameras. Software for digital image processing and analysis was developed in Matlab, and an automatic tracking algorithm was implemented that uses control points to predict the trajectory of the spine points. The DLT was implemented for 3D reconstruction. Fifteen amateur athletes were evaluated (68.6 ± 10.4 kg, 1.73 ± 0.09 m, 41.8 ± 12.2 years); four of these volunteers performed a pretest three weeks earlier. The 2D geometric curvature of the spine was quantified, projected onto the sagittal and frontal planes of an instantaneous local coordinate system on the trunk with origin at the Geometric Thoraco-Lumbar Junction (GJTL), an inflection point of the spine displayed in the sagittal plane, close to T12. The Neutral Curve, the average posture presented over the gait cycle, was adopted as the descriptor of spinal posture. The intra- and inter-subject variability of the Neutral Curve in the effort test was measured, and the inter-day repeatability of this variable was estimated and compared with angular variables. Results and Discussion: Tracking the markers with control points enabled automatic tracking throughout the maximum effort test, even with the image noise of the visible-light cameras, occlusion, impact and the proximity of 2.3 ± 0.3 cm between the markers positioned along the spine. The accuracy of the system was 0.55 mm and 0.81°. The Neutral Curve is stable and presented individual characteristics in the effort test. On the other hand, in the sagittal plane there was a progressive and linear increase (p < 0.05) of the curvature peak of the Neutral Curve, especially in the lumbar region, which showed the greatest variation. In the frontal plane, in the final stages of the effort test, there was a significant increase (p < 0.05) of the curvature peak in the upper thoracic spine, suggesting that imminent fatigue causes an increase in lateral deviations in that region. The curvature peaks of the Neutral Curve showed higher inter-day reproducibility than the angular variables and, unlike these, do not depend on exact identification of the spinous process of T12 by palpation. Conclusion: The proposed system proved to be efficient and accurate, and the Neutral Curve is a good descriptor of spinal posture during running. / Doctorate / Biodynamics of Human Movement / Doctor of Physical Education
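The curvature measure at the heart of this analysis, the 2D geometric curvature of the projected spine line, can be sketched as below. The finite-difference formulation and the synthetic marker data are illustrative assumptions; the thesis's own smoothing and parameterisation may differ.

```python
import numpy as np

def geometric_curvature_2d(x, y):
    """Signed 2D geometric curvature kappa = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2)
    along a sampled curve, e.g. spine markers projected onto the sagittal plane."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return (dx * ddy - dy * ddx) / (dx * dx + dy * dy) ** 1.5

# Example: synthetic markers along a gentle arc in the sagittal projection
s = np.linspace(0.0, 0.4, 20)                # metres along the trunk axis
ap = 0.05 * np.sin(2.0 * np.pi * s / 0.8)    # anterior-posterior deviation
print(np.round(geometric_curvature_2d(s, ap), 2))
```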
|
46 |
Posouzení informačního systému firmy a návrh změn / Assessment of the Company's Information System and Suggested Changes / Klein, Lukáš. January 2014
This diploma thesis is focused on corporate information systems. It contains an analysis of the current state of the information system in a company whose main line of business is e-commerce. Based on this analysis, a solution is proposed that should improve the information system itself; the proposal also includes a schedule and a cost calculation for the project.
|
47 |
Implementace nástrojů vizualizace pro osazování DPS / Implementation of visualization tools for PCB assembly / Kolář, Radek. January 2016
This thesis covers the problems of PCB assembly and the means of interim and final inspection of PCBs in semi-automatic and manual assembly. It analyses visualization tools for use during PCB assembly, defining the strengths and weaknesses of each program, and introduces the operation of the program selected for visualization. The implementation of the program into production is described, including the problems encountered during its introduction and a comparison with the previous situation. Its contribution to manufacturing (time, quality, financial costs, human resources, ...) is evaluated, the problems encountered during implementation are addressed, and recommendations are given for the types of production for which the program is suitable.
|
48 |
Automatizované nastavení regulátoru pohonu / Automated tuning of drive controller / Adamec, Matúš. January 2017
This thesis deals with automated tuning of a drive controller. To achieve this goal, system identification is needed, so the issue of identification is described at the beginning of the thesis. Spectral analysis was selected from the many methods described. It was implemented in Matlab and also in C#, where averaging and the Blackman-Tukey method were used. The C# application is linked to the Beckhoff TwinCAT 3 and TwinCAT 2 runtime systems, which enable connection to a real drive. Next, the problem of drive regulation is discussed and the results of applying spectral analysis to real drives are shown. The thesis closes with a description of the algorithm for tuning the speed controller with different types of frequency converters.
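The spectral-estimation step named above can be illustrated with a minimal Blackman-Tukey sketch. The lag-window choice, scaling and test signal are assumptions for illustration; the thesis's C# implementation (with its averaging strategy) may differ in detail.

```python
import numpy as np

def blackman_tukey_psd(x, max_lag, fs=1.0):
    """Blackman-Tukey power-spectrum estimate: biased autocorrelation up to
    max_lag, a Blackman lag window, then an FFT of the windowed sequence."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])
    window = np.blackman(2 * max_lag + 1)
    r_sym = np.concatenate([r[::-1], r[1:]]) * window   # lags -max..+max
    psd = np.abs(np.fft.rfft(r_sym))
    freqs = np.fft.rfftfreq(len(r_sym), d=1.0 / fs)
    return freqs, psd

# Example: recover a 50 Hz component from noisy data sampled at 1 kHz
fs = 1000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
signal = np.sin(2.0 * np.pi * 50.0 * t) + 0.5 * np.random.randn(t.size)
freqs, psd = blackman_tukey_psd(signal, max_lag=256, fs=fs)
print(freqs[np.argmax(psd)])   # expected to be close to 50 Hz
```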
|
49 |
Statistical Methods for Launch Vehicle Guidance, Navigation, and Control (GN&C) System Design and Analysis / Rose, Michael Benjamin. 01 May 2012
A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical formulations that are discussed are applicable to ascent on Earth or other planets as well as other rocket-powered systems such as sounding rockets and ballistic missiles.
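The core comparison in this work, Monte Carlo dispersion analysis versus linear covariance propagation, can be illustrated on a toy linear system. The dynamics, noise levels and sample size below are arbitrary assumptions; they are not the launch vehicle GN&C models, only a demonstration that the two techniques agree when the dynamics are linear.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])              # toy position/velocity dynamics
Q = np.diag([1e-4, 1e-3])               # process-noise covariance
P0 = np.diag([1e-2, 1e-2])              # initial dispersion covariance
steps, n_samples = 100, 5000

# Linear covariance propagation: P_{k+1} = A P_k A^T + Q
P = P0.copy()
for _ in range(steps):
    P = A @ P @ A.T + Q

# Monte Carlo: propagate random samples through the same dynamics
x = rng.multivariate_normal(np.zeros(2), P0, size=n_samples)
for _ in range(steps):
    x = x @ A.T + rng.multivariate_normal(np.zeros(2), Q, size=n_samples)
P_mc = np.cov(x.T)

print(np.round(P, 4))
print(np.round(P_mc, 4))   # agrees with P to within sampling error
```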
|
50 |
Quantifying Uncertainty in Flood Modeling Using Bayesian Approaches / Tao Huang (15353755). 27 April 2023
Floods are among the most common and devastating natural disasters for human society worldwide, and flood risk has been increasing recently due to more frequent extreme climatic events. In the United States, one of the key resources providing flood risk information to the public is the Flood Insurance Rate Map (FIRM) administered by the Federal Emergency Management Agency (FEMA), and digitalized FIRMs have so far covered over 90% of the United States population. However, the uncertainty in the modeling process behind FIRMs is rarely investigated. In this study, we use two widely used multi-model methods, Bayesian Model Averaging (BMA) and generalized likelihood uncertainty estimation (GLUE), to evaluate and reduce the impacts of various uncertainties, with respect to modeling settings, evaluation metrics, and algorithm parameters, on the flood modeling of FIRMs. Accordingly, the three objectives of this study are to: (1) quantify the uncertainty in FEMA FIRMs by using BMA and Hierarchical BMA approaches; (2) investigate the inherent limitations and uncertainty in existing evaluation metrics of flood models; and (3) estimate the BMA parameters (weights and variances) using the Metropolis-Hastings (M-H) algorithm with multiple Markov chain Monte Carlo (MCMC) chains.
In the first objective, both the BMA and hierarchical BMA (HBMA) approaches are employed to quantify the uncertainty within the detailed FEMA models of the Deep River and the Saint Marys River in the State of Indiana based on water stage predictions from 150 HEC-RAS 1D unsteady flow model configurations that incorporate four uncertainty sources including bridges, channel roughness, floodplain roughness, and upstream flow input. Given the ensemble predictions and the observed water stage data in the training period, the BMA weight and the variance for each model member are obtained, and then the BMA prediction ability is validated for the observed data from the later period. The results indicate that the BMA prediction is more robust than both the original FEMA model and the ensemble mean. Furthermore, the HBMA framework explicitly shows the propagation of various uncertainty sources, and both the channel roughness and the upstream flow input have a larger impact on prediction variance than bridges. Hence, it provides insights for modelers into the relative impact of individual uncertainty sources in the flood modeling process. The results show that the probabilistic flood maps developed based on the BMA analysis could provide more reliable predictions than the deterministic FIRMs.
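The BMA combination itself, given trained weights and member variances, reduces to a simple mixture calculation. The sketch below shows that generic rule for a single water-stage prediction; the member values, weights and variances are made-up numbers, and the training of the weights and variances (for example by EM, as discussed later) is a separate step.

```python
import numpy as np

def bma_predict(member_preds, weights, variances):
    """BMA predictive mean and variance for one time step from each model
    member's prediction, its weight and its (Gaussian) error variance."""
    preds = np.asarray(member_preds, dtype=float)
    w = np.asarray(weights, dtype=float)
    s2 = np.asarray(variances, dtype=float)
    mean = np.sum(w * preds)
    # between-model spread plus weighted within-model variance
    variance = np.sum(w * (preds - mean) ** 2) + np.sum(w * s2)
    return mean, variance

# Example: three hypothetical HEC-RAS configurations predicting a stage (m)
mean, variance = bma_predict([181.2, 181.5, 180.9],
                             [0.5, 0.3, 0.2],
                             [0.04, 0.09, 0.06])
print(round(mean, 3), round(variance, 4))
```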
In the second objective, the inherent limitations and sampling uncertainty in several commonly used model evaluation metrics, namely the Nash-Sutcliffe efficiency (NSE), the Kling-Gupta efficiency (KGE), and the coefficient of determination (R²), are investigated systematically, and hence the overall performance of flood models can be evaluated in a comprehensive way. These evaluation metrics are then applied to the 1D HEC-RAS models of six reaches located in the states of Indiana and Texas of the United States to quantify the uncertainty associated with the channel roughness and upstream flow input. The results show that the model performances based on the uniform and normal priors are comparable. The distributions of these evaluation metrics are significantly different for the flood model under different high-flow scenarios, and it further indicates that the metrics should be treated as random statistical variables given both aleatory and epistemic uncertainties in the modeling process. Additionally, the white-noise error in observations has the least impact on the evaluation metrics.
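For reference, the three metrics named above have standard closed forms, sketched here for a pair of observed and simulated stage series. The tiny arrays are made-up illustration data, and this uses the original standard-deviation form of KGE; other variants of these metrics exist.

```python
import numpy as np

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]           # linear correlation
    alpha = sim.std() / obs.std()             # variability ratio
    beta = sim.mean() / obs.mean()            # bias ratio
    return 1.0 - np.sqrt((r - 1.0) ** 2 + (alpha - 1.0) ** 2 + (beta - 1.0) ** 2)

def r_squared(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.corrcoef(obs, sim)[0, 1] ** 2

observed = np.array([1.2, 1.8, 2.5, 3.1, 2.2, 1.6])
simulated = np.array([1.1, 1.9, 2.3, 3.3, 2.0, 1.7])
print(round(nse(observed, simulated), 3),
      round(kge(observed, simulated), 3),
      round(r_squared(observed, simulated), 3))
```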
In the third objective, the Metropolis-Hastings (M-H) algorithm, one of the most widely used algorithms in the MCMC method, is proposed to estimate the BMA parameters (weights and variances), since the reliability of the BMA parameters determines the accuracy of the BMA predictions. However, the uncertainty in BMA parameters with fixed values, which are usually obtained from the Expectation-Maximization (EM) algorithm, has not been adequately investigated in BMA-related applications over the past few decades. Both numerical experiments and two practical 1D HEC-RAS models in the states of Indiana and Texas of the United States are employed to examine the applicability of the M-H algorithm with multiple independent Markov chains. The results show that the BMA weights estimated by the two algorithms are comparable, while the BMA variances obtained from the M-H MCMC algorithm are closer to the given variances in the numerical experiment. Overall, the MCMC approach with multiple chains provides more information about the uncertainty of the BMA parameters, and its water-stage prediction performance is better than that of the default EM algorithm in terms of multiple evaluation metrics as well as algorithm flexibility.
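A compact random-walk Metropolis-Hastings sampler for BMA parameters is sketched below to show the mechanics. The flat priors, the single shared variance, the proposal step size and the toy two-member ensemble are all simplifying assumptions; the study's actual likelihood, priors and per-member variances may differ.

```python
import numpy as np

def log_post(theta, F, y):
    """Unnormalised log-posterior of BMA parameters under a Gaussian mixture
    likelihood with flat priors. theta = [unconstrained weights..., log s2]."""
    w = np.exp(theta[:-1] - np.max(theta[:-1]))
    w /= w.sum()                                   # softmax -> simplex weights
    s2 = np.exp(theta[-1])                         # shared error variance
    dens = np.exp(-0.5 * (y[:, None] - F) ** 2 / s2) / np.sqrt(2 * np.pi * s2)
    return np.sum(np.log(dens @ w + 1e-300))

def mh_chain(F, y, n_iter=5000, step=0.1, seed=0):
    """One random-walk M-H chain; several chains with different seeds mimic
    the multi-chain strategy described above."""
    rng = np.random.default_rng(seed)
    k = F.shape[1]
    theta = np.zeros(k + 1)
    lp = log_post(theta, F, y)
    samples = []
    for _ in range(n_iter):
        proposal = theta + step * rng.standard_normal(k + 1)
        lp_prop = log_post(proposal, F, y)
        if np.log(rng.random()) < lp_prop - lp:    # accept/reject
            theta, lp = proposal, lp_prop
        samples.append(theta.copy())
    return np.array(samples)

# Toy ensemble: member 1 tracks the observations, member 2 is biased
rng = np.random.default_rng(1)
y = rng.normal(2.0, 0.2, size=200)
F = np.column_stack([y + rng.normal(0.0, 0.2, 200),
                     y + rng.normal(0.5, 0.4, 200)])
chain = mh_chain(F, y, seed=42)[2500:]             # discard burn-in
w = np.exp(chain[:, :2])
w /= w.sum(axis=1, keepdims=True)
print(w.mean(axis=0), float(np.exp(chain[:, 2]).mean()))
```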
|