61 |
Multi-cavity molecular descriptor interconnections: Enhanced protocol for prediction of serum albumin drug binding. Akawa, O.B., Okunlola, F.O., Alahmdi, M.I., Abo-Dya, N.E., Sidhom, P.A., Ibrahim, M.A.A., Shibl, M.F., Khan, Shahzeb, Soliman, M.E.S. 03 November 2023 (has links)
The role of human serum albumin (HSA) in the transport of molecules underlies its involvement in determining drug distribution and metabolism. Because the optimization of ADME properties is closely tied to HSA binding, characterizing this binding is imperative to the drug discovery process. Various in silico predictive tools currently exist to complement the drug discovery process; however, predicting possible ligand-binding sites on HSA has posed several challenges. Herein, we present a strong and deeper-than-surface case for the prediction of HSA-ligand binding sites using multi-cavity molecular descriptors, exploiting all experimentally available, crystallized HSA-bound drugs. Unlike previously proposed models found in the literature, we established an in-depth correlation between the physicochemical properties of the available crystallized HSA-bound drugs and different HSA binding site characteristics to precisely predict the binding sites of investigational molecules. Molecular descriptors such as the number of hydrogen bond donors (nHD), number of heteroatoms (nHet), topological polar surface area (TPSA), molecular weight (MW), and distribution coefficient (LogD) were correlated with HSA binding site characteristics, including hydrophobicity, hydrophilicity, enclosure, exposure, contact, site volume, and donor/acceptor ratio. The descriptors nHD, TPSA, LogD, nHet, and MW were found to carry the greatest inherent capacity to provide baseline information for predicting the serum albumin binding site. We believe that these associations may form the bedrock for establishing a solid correlation between physicochemical properties and albumin binding site architecture. The information presented in this report should prove critical for rational drug design as well as for drug delivery, bioavailability, and pharmacokinetics.
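To make the descriptor-versus-site analysis concrete, the sketch below computes the ligand descriptors named in the abstract with RDKit and correlates them with binding-site properties. It is a minimal illustration only: the drug set, the site-property values, and the use of Crippen LogP as a stand-in for LogD are assumptions, not data from the study.

```python
# Hedged sketch: correlate ligand descriptors (nHD, nHet, TPSA, MW, LogP as a
# LogD proxy) with hypothetical HSA binding-site properties. Illustrative only.
import pandas as pd
from scipy.stats import pearsonr
from rdkit import Chem
from rdkit.Chem import Crippen, Descriptors, Lipinski, rdMolDescriptors

ligands = {  # a few drugs commonly reported to bind HSA (SMILES)
    "warfarin":  "CC(=O)CC(C1=CC=CC=C1)C1=C(O)C2=CC=CC=C2OC1=O",
    "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O",
    "naproxen":  "COc1ccc2cc(C(C)C(=O)O)ccc2c1",
    "aspirin":   "CC(=O)Oc1ccccc1C(=O)O",
}

rows = []
for name, smi in ligands.items():
    mol = Chem.MolFromSmiles(smi)
    rows.append({
        "drug": name,
        "nHD":  Lipinski.NumHDonors(mol),
        "nHet": rdMolDescriptors.CalcNumHeteroatoms(mol),
        "TPSA": Descriptors.TPSA(mol),
        "MW":   Descriptors.MolWt(mol),
        "LogP": Crippen.MolLogP(mol),   # LogD proper would also need a pKa model
    })
desc = pd.DataFrame(rows).set_index("drug")

# Hypothetical hydrophobicity scores for the binding cavity each drug occupies
# (made-up values standing in for the multi-cavity characterization in the study).
site_hydrophobicity = pd.Series(
    {"warfarin": 0.62, "ibuprofen": 0.71, "naproxen": 0.68, "aspirin": 0.45})

for col in ["nHD", "nHet", "TPSA", "MW", "LogP"]:
    r, p = pearsonr(desc[col], site_hydrophobicity.loc[desc.index])
    print(f"{col:5s} vs site hydrophobicity: r = {r:+.2f} (p = {p:.2f})")
```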
|
62 |
The application of PROMETHEE multi-criteria decision aid in financial decision making: case of distress prediction models evaluation. Mousavi, Mohammad M., Lin, J. 2020 May 1922 (has links)
Conflicting rankings corresponding to alternative performance criteria and measures are commonly reported in the mono-criterion evaluation of competing distress prediction models (DPMs). To overcome this issue, this study extends the application of expert systems to corporate credit risk and distress prediction by proposing a multi-criteria decision aid (MCDA), namely PROMETHEE II, which provides a multi-criteria evaluation of competing DPMs. In addition, using data on Chinese firms listed on the Shanghai and Shenzhen stock exchanges, we perform an exhaustive comparative analysis of the most popular DPMs, namely statistical, artificial intelligence and machine learning models, under both mono-criterion and multi-criteria frameworks. Further, we address two prevailing research questions, namely "which DPM performs better in predicting distress?" and "will training models with corporate governance indicators (CGIs) enhance their performance?", and discuss our findings. Our multi-criteria ranking suggests that non-parametric DPMs outperform parametric ones, with random forest and bagging CART among the best machine learning DPMs. Further, models fed with CGIs as features outperform those fed without CGIs.
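PROMETHEE II produces a complete ranking from net outranking flows. The sketch below implements that aggregation for a toy comparison of candidate DPMs; the model names, criterion values, weights and the choice of the usual (strict) preference function are illustrative assumptions, not the study's setup.

```python
import numpy as np

# Rows: candidate distress prediction models; columns: evaluation criteria
# (e.g. accuracy, AUC, type-I error). All values and weights are illustrative.
models = ["logit", "random_forest", "bagging_CART", "SVM"]
scores = np.array([
    [0.78, 0.81, 0.20],
    [0.86, 0.90, 0.12],
    [0.85, 0.89, 0.13],
    [0.82, 0.85, 0.16],
])
maximize = np.array([True, True, False])   # type-I error should be minimized
weights = np.array([0.4, 0.4, 0.2])

# Orient all criteria so that "larger is better"
oriented = np.where(maximize, scores, -scores)

n = len(models)
pi = np.zeros((n, n))                      # aggregated preference indices
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        d = oriented[a] - oriented[b]
        pref = (d > 0).astype(float)       # usual (strict) preference function
        pi[a, b] = np.dot(weights, pref)

phi_plus = pi.sum(axis=1) / (n - 1)        # positive (leaving) flow
phi_minus = pi.sum(axis=0) / (n - 1)       # negative (entering) flow
phi_net = phi_plus - phi_minus             # PROMETHEE II net flow

for m, f in sorted(zip(models, phi_net), key=lambda t: -t[1]):
    print(f"{m:15s} net flow = {f:+.3f}")
```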
|
63 |
A dynamic performance evaluation of distress prediction models. Mousavi, Mohammad M., Ouenniche, J., Tone, K. 27 October 2022 (has links)
So far, the dominant comparative studies of competing distress prediction models (DPMs) have been restricted to static evaluation frameworks and have thus overlooked their performance over time. This study fills this gap by proposing a Malmquist Data Envelopment Analysis (DEA)-based multi-period performance evaluation framework for assessing competing static and dynamic statistical DPMs, and by using it to address a variety of research questions. Our findings suggest that (1) dynamic models developed under duration-dependent frameworks outperform both dynamic models developed under duration-independent frameworks and static models; (2) models fed with financial accounting (FA), market variables (MV), and macroeconomic information (MI) features outperform those fed with either MVMI or FA, regardless of the frameworks under which they are developed; and (3) shorter training horizons seem to enhance the aggregate performance of both static and dynamic models.
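For reference, the output-oriented Malmquist productivity index that underlies this kind of multi-period DEA evaluation is conventionally written as below, with D denoting distance functions and t the period, and it decomposes into efficiency change and technical change. The abstract does not state the exact formulation used, so this is the standard textbook definition rather than the thesis's own specification; a value above 1 indicates an improvement between periods t and t+1.

```latex
M_o\!\left(x^{t+1},y^{t+1},x^{t},y^{t}\right)
  = \left[
      \frac{D_o^{t}\!\left(x^{t+1},y^{t+1}\right)}{D_o^{t}\!\left(x^{t},y^{t}\right)}
      \cdot
      \frac{D_o^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D_o^{t+1}\!\left(x^{t},y^{t}\right)}
    \right]^{1/2}
  = \underbrace{\frac{D_o^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D_o^{t}\!\left(x^{t},y^{t}\right)}}_{\text{efficiency change}}
    \underbrace{\left[
      \frac{D_o^{t}\!\left(x^{t+1},y^{t+1}\right)}{D_o^{t+1}\!\left(x^{t+1},y^{t+1}\right)}
      \cdot
      \frac{D_o^{t}\!\left(x^{t},y^{t}\right)}{D_o^{t+1}\!\left(x^{t},y^{t}\right)}
    \right]^{1/2}}_{\text{technical change}}
```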
|
64 |
Spatial crash prediction models: an evaluation of the impacts of enriched information on model performance and the suitability of different spatial modeling approaches / Modelos espaciais de previsão de acidentes: uma avaliação do desempenho dos modelos a partir da incorporação de informações aprimoradas e a adequação de diferentes abordagens de modelagem espacial. Gomes, Monique Martins 04 December 2018 (has links)
The unavailability of crash-related data has been a long-lasting challenge in Brazil. In addition to the poor implementation and follow-up of road safety strategies, this drawback has hampered the development of studies that could contribute to national road safety goals. In contrast, developed countries have built effective strategies on a solid data basis, investing considerable time and money in obtaining and creating pertinent information. In this research, we aim to assess the potential impacts of supplementary data on spatial model performance and the suitability of different spatial modeling approaches for crash prediction. The intention is to alert the authorities in Brazil and other developing countries to the importance of having appropriate data. In this thesis, we set two specific objectives: (I) to investigate spatial model prediction accuracy at unsampled subzones; and (II) to evaluate the performance of spatial data analysis approaches for crash prediction. Firstly, we carry out a benchmarking study based on Geographically Weighted Regression (GWR) models developed for Flanders, Belgium, and São Paulo, Brazil. Models are developed for two modes of transport: active (i.e. pedestrians and cyclists) and motorized transport (i.e. occupants of motorized vehicles). Subsequently, we apply the repeated holdout method to the Flemish models, introducing two GWR validation approaches, named GWR holdout1 and GWR holdout2. While the former is based on the local coefficient estimates derived from the neighboring subzones and measures of the explanatory variables for the validation subzones, the latter uses the casualty estimates of the neighboring subzones directly to estimate outcomes for the missing subzones. Lastly, we compare the performance of the GWR models with Mean Imputation (MEI), K-Nearest Neighbor (KNN) and Kriging with External Drift (KED). Findings showed that adding the supplementary data yielded reductions of 20% and 25% for motorized transport, and of 25% and 35% for active transport, in the corrected Akaike Information Criterion (AICc) and the Mean Squared Prediction Error (MSPE), respectively. From a practical perspective, the results could help identify hotspots and prioritize data collection strategies, as well as identify, implement and enforce appropriate countermeasures. Concerning the spatial approaches, GWR holdout2 outperformed all other techniques and showed that GWR is an appropriate spatial technique for both prediction and impact analyses. Especially in countries where data availability has been an issue, this validation framework allows casualties or crash frequencies to be estimated while effectively capturing the spatial variation of the data. / A indisponibilidade de variáveis explicativas de acidentes de trânsito tem sido um desafio duradouro no Brasil. Além da má implementação e acompanhamento de estratégias de segurança viária, esse inconveniente tem dificultado o desenvolvimento de estudos que poderiam contribuir com as metas nacionais de segurança no trânsito. Em contraste, países desenvolvidos têm construído suas estratégias efetivas com base em dados sólidos e, portanto, investindo tempo e dinheiro consideráveis na obtenção e criação de informações pertinentes. O objetivo dessa pesquisa é avaliar os possíveis impactos de dados suplementares sobre o desempenho de modelos espaciais, e a adequação de diferentes abordagens de modelagem espacial na previsão de acidentes.
A intenção é notificar as autoridades brasileiras e de outros países em desenvolvimento sobre a importância de dados adequados. Nesta tese, foram definidos dois objetivos específicos: (I) investigar a acurácia do modelo espacial em subzonas sem amostragem; (II) avaliar o desempenho de técnicas de análise espacial de dados na previsão de acidentes. Primeiramente, foi realizado um estudo comparativo, baseado em modelos desenvolvidos para Flandres (Bélgica) e São Paulo (Brasil), através do método de Regressão Geograficamente Ponderada (RGP). Os modelos foram desenvolvidos para dois modos de transporte: ativos (pedestres e ciclistas) e motorizados (ocupantes de veículos motorizados). Subsequentemente, foi aplicado o método de holdout repetido nos modelos flamengos, introduzindo duas abordagens de validação para RGP, denominadas RGP holdout1 e RGP holdout2. Enquanto a primeira é baseada nas estimativas de coeficientes locais derivadas das subzonas vizinhas e nas medidas das variáveis explicativas para as subzonas de validação, a última usa diretamente as estimativas de acidentes das subzonas vizinhas para estimar os resultados para as subzonas ausentes. Por fim, foi comparado o desempenho dos modelos RGP com outras abordagens, tais como Imputação pela Média de dados faltantes (IM), K-vizinhos mais próximos (KNN) e Krigagem com Deriva Externa (KDE). Os resultados mostraram que, adicionando os dados suplementares, foram obtidas reduções de 20% e 25% para o transporte motorizado, e de 25% e 35% para o transporte ativo, em termos de Critério de Informação de Akaike corrigido (AICc) e Erro Quadrático Médio da Predição (EQMP), respectivamente. Do ponto de vista prático, os resultados poderiam ajudar a identificar hotspots e priorizar estratégias de coleta de dados, além de identificar, implementar e aplicar contramedidas adequadas. No que diz respeito às abordagens espaciais, RGP holdout2 teve melhor desempenho em relação a todas as outras técnicas e provou que a RGP é uma técnica espacial apropriada tanto para análises de previsão quanto de impactos. Especialmente em países onde a disponibilidade de dados tem sido um problema, essa estrutura de validação permite que os acidentes sejam estimados capturando efetivamente a variação espacial dos dados.
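A minimal numpy sketch of the intuition behind the two holdout schemes, on synthetic subzones with a Gaussian kernel (the bandwidth, variables and data are illustrative, not the Flemish or São Paulo datasets): holdout1 fits local coefficients from the remaining subzones and applies them to the left-out subzone's own covariates, while holdout2 borrows the neighbours' casualty values directly (observed values stand in here for the neighbours' casualty estimates).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic subzones: coordinates, one explanatory variable and casualty counts.
n = 200
coords = rng.uniform(0, 10, size=(n, 2))
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + 1 covariate
beta_true = np.array([2.0, 1.5])
y = X @ beta_true + 0.3 * coords[:, 0] + rng.normal(scale=0.5, size=n)

def gaussian_weights(target, neighbours, bandwidth=2.0):
    d = np.linalg.norm(neighbours - target, axis=1)
    return np.exp(-0.5 * (d / bandwidth) ** 2)

def predict_holdout1(i):
    """Local WLS fit on the other subzones, applied to subzone i's own covariates."""
    mask = np.arange(n) != i
    w = gaussian_weights(coords[i], coords[mask])
    W = np.diag(w)
    Xm, ym = X[mask], y[mask]
    beta_i = np.linalg.solve(Xm.T @ W @ Xm, Xm.T @ W @ ym)
    return X[i] @ beta_i

def predict_holdout2(i):
    """Kernel-weighted average of the neighbours' casualty values."""
    mask = np.arange(n) != i
    w = gaussian_weights(coords[i], coords[mask])
    return np.average(y[mask], weights=w)

pred1 = np.array([predict_holdout1(i) for i in range(n)])
pred2 = np.array([predict_holdout2(i) for i in range(n)])
print("MSPE holdout1:", round(np.mean((y - pred1) ** 2), 3))
print("MSPE holdout2:", round(np.mean((y - pred2) ** 2), 3))
```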
|
66 |
Modellierung des Unfallgeschehens im Radverkehr am Beispiel der Stadt Dresden [Modelling of cycling accident occurrence, using the city of Dresden as an example]. Martin, Jacqueline 25 January 2021 (has links)
Bicycle traffic volume in Germany has grown in recent years, which is in turn reflected in a rise in accidents involving cyclists. To counteract the rising accident numbers, policymakers and associations recommend, above all, infrastructure measures. Building on this, the present thesis investigates, using the city of Dresden as an example, how individual infrastructure characteristics affect accidents between bicycle and motorized traffic. The study is based on 548 accidents involving cyclists from the years 2015 to 2019 and on the characteristics of 484 intersection approaches. Since infrastructure alone does not determine accident occurrence, traffic volume measures are also included. To investigate accident occurrence, the random forest method and negative binomial regression in the form of accident prediction models are applied, with prior variable selection using the LASSO procedure. Both methods are applied to two specific accident types at intersections in order to obtain differentiated results. The first accident type, the turning accident (Abbiege-Unfall), covers collisions between a right-turning road user and one travelling straight ahead in the same or the opposite direction, while the second type, the entering/crossing accident (Einbiegen-/Kreuzen-Unfall), covers collisions between a road user with the right of way and an entering or crossing road user obliged to yield. For turning accidents, the methods show, for example, that a cycle crossing marked fully or partially in red across the intersection, and an indirect routing of left-turning bicycle traffic instead of routing it in mixed traffic, are associated with higher expected accident numbers; the latter finding appears irrelevant to the situation under study and thus points to a weakness in the way the variables were included. In contrast, for entering/crossing accidents the methods estimate higher accident numbers, for example, when the number of straight-ahead lanes on an approach increases and when the intersection is controlled by traffic sign Z205 (give way) or a partial traffic signal instead of the right-before-left rule. In addition, for both accident types the methods mostly show that the number of accidents increases less steeply beyond a certain traffic volume, a phenomenon known in the literature as the safety-in-numbers effect. A comparison of model quality between the accident types further shows that both methods generate better predictions with their turning-accident model than with their entering/crossing-accident model. Moreover, for each accident type the model quality differs only slightly between the two methods, so both can be assumed to deliver qualitatively similar models of the corresponding accident type (an illustrative sketch of this modelling pipeline follows the table of contents below).
Table of contents:
1 Introduction
2 Literature review
2.1 The safety-in-numbers effect
2.2 Factors influencing bicycle crashes
3 Fundamentals of accident research
3.1 Accident categories
3.2 Accident types
4 Data basis
4.1 Accident data
4.2 Infrastructure characteristics
4.3 Overview of the variables used
5 Methodology
5.1 Correlation analysis
5.2 Random forest
5.2.1 Fundamentals
5.2.2 The random forest method
5.2.3 Model quality criteria
5.2.4 Variable importance
5.3 Negative binomial regression
5.3.1 Fundamentals
5.3.2 Accident prediction models
5.3.3 Variable selection
5.3.4 Model quality criteria
5.3.5 Variable importance
5.3.6 Model diagnostics
6 Implementation and results
6.1 Correlation analysis
6.2 Random forest
6.2.1 Model quality criteria
6.2.2 Variable importance
6.3 Negative binomial regression
6.3.1 Variable selection
6.3.2 Model quality criteria
6.3.3 Variable importance
6.3.4 Model diagnostics
6.4 Comparison of the two methods
6.4.1 Model quality criteria
6.4.2 Variable importance and recommendations for action
6.5 Comparison with findings from the literature
7 Critical appraisal
8 Summary and outlook
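The sketch below illustrates that pipeline on synthetic intersection-approach data: LASSO-based variable pre-selection, a negative binomial regression as the accident prediction model, and a random forest as the non-parametric counterpart. The variable names, data-generating assumptions and tuning values are illustrative, not the Dresden dataset or the thesis's specifications.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic intersection approaches (illustrative variables, not the Dresden data)
n = 484
df = pd.DataFrame({
    "bicycle_volume":     rng.lognormal(6, 0.5, n),
    "motor_volume":       rng.lognormal(8, 0.5, n),
    "straight_lanes":     rng.integers(1, 4, n),
    "red_cycle_crossing": rng.integers(0, 2, n),
    "partial_signal":     rng.integers(0, 2, n),
})
lam = np.exp(-6 + 0.4 * np.log(df["bicycle_volume"]) + 0.3 * np.log(df["motor_volume"])
             + 0.3 * df["straight_lanes"])
y = rng.poisson(lam)                    # turning-accident counts per approach

# 1) LASSO-based variable pre-selection on standardized covariates
Xs = StandardScaler().fit_transform(df)
lasso = LassoCV(cv=5).fit(Xs, y)
selected = df.columns[lasso.coef_ != 0]
print("LASSO-selected variables:", list(selected))

# 2) Negative binomial accident prediction model on the selected variables
X_nb = sm.add_constant(df[selected])
nb = sm.GLM(y, X_nb, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(nb.summary().tables[1])

# 3) Random forest as the non-parametric comparison model
rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0).fit(df, y)
print("Random forest OOB R^2:", round(rf.oob_score_, 3))
```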
|
67 |
Video quality prediction for video over wireless access networks (UMTS and WLAN). Khan, Asiya January 2011 (has links)
Transmission of video content over wireless access networks (in particular, Wireless Local Area Networks (WLAN) and Third Generation Universal Mobile Telecommunication System (3G UMTS)) is growing exponentially and gaining popularity, and is predicted to open up new revenue streams for mobile network operators. However, the success of these video applications over wireless access networks depends very much on meeting the user's Quality of Service (QoS) requirements. Thus, it is highly desirable to be able to predict and, if appropriate, to control video quality to meet the user's QoS requirements. Video quality is affected by distortions caused by the encoder and the wireless access network. The impact of these distortions is content dependent, but this feature has not been widely used in existing video quality prediction models. The main aim of the project is to develop novel and efficient models for non-intrusive video quality prediction for low-bitrate, low-resolution videos, and to demonstrate their application in QoS-driven adaptation schemes for mobile video streaming applications. This led to the five main contributions of the thesis, as follows:
(1) A thorough understanding of the relationships between video quality, wireless access network (UMTS and WLAN) parameters (e.g. packet/block loss, mean burst length and link bandwidth), encoder parameters (e.g. sender bitrate, frame rate) and content type is provided. An understanding of these relationships and interactions, and of their impact on video quality, is important as it provides a basis for the development of non-intrusive video quality prediction models.
(2) A new content classification method based on statistical tools was proposed, as content type was found to be the most important parameter.
(3) Efficient regression-based and artificial neural network-based learning models were developed for video quality prediction over WLAN and UMTS access networks. The models are lightweight (suitable for real-time monitoring) and provide a measure of user-perceived quality without time-consuming subjective tests. The models have potential applications in several other areas, including QoS control and optimization in network planning and content provisioning for network/service providers.
(4) The applications of the proposed regression-based models were investigated in (i) the optimization of content provisioning and network resource utilization, and (ii) a new fuzzy sender-bitrate adaptation scheme at the sender side over WLAN and UMTS access networks.
(5) Finally, Internet-based subjective tests were designed that capture distortions caused by the encoder and the wireless access network for different types of content. The database of subjective results has been made available to the research community, as there is a lack of subjective video quality assessment databases.
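As a rough illustration of the non-intrusive, learning-based approach described above, the sketch below trains a small neural-network regressor to map encoder and network parameters plus a content-type label to a predicted MOS. The feature list mirrors the parameters named in the abstract, but the synthetic data, the assumed quality surface and the model settings are illustrative assumptions rather than the thesis's models or subjective data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic samples: sender bitrate (kbps), frame rate (fps), packet loss (%),
# mean burst length, and a coded content type (0=slight, 1=gentle, 2=rapid movement).
n = 1000
sbr   = rng.uniform(32, 768, n)
fr    = rng.choice([7.5, 10, 15, 30], n)
loss  = rng.uniform(0, 20, n)
mbl   = rng.uniform(1, 5, n)
ctype = rng.integers(0, 3, n)

# Assumed ground-truth MOS surface (made up for this sketch): quality rises with
# bitrate, falls with loss/burstiness, and fast-moving content is more sensitive.
mos = (1 + 3.5 * np.log10(sbr) / np.log10(768)
         - 0.08 * loss * (1 + 0.3 * ctype)
         - 0.1 * mbl
         + 0.01 * fr)
mos = np.clip(mos + rng.normal(0, 0.2, n), 1, 5)

X = np.column_stack([sbr, fr, loss, mbl, ctype])
X_tr, X_te, y_tr, y_te = train_test_split(X, mos, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                                   random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out samples:", round(model.score(X_te, y_te), 3))
```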
|
68 |
A Mixed Effects Multinomial Logistic-Normal Model for Forecasting Baseball Performance. Eric A Gerber (7043036) 13 August 2019 (has links)
Prediction of player performance is a key component in the construction of baseball team rosters. Traditionally, the problem of predicting seasonal plate appearance outcomes has been approached univariately, that is, focusing on each outcome separately rather than jointly modeling the collection of outcomes. More recently, there has been a greater emphasis on joint modeling, thereby accounting for the correlations between outcomes. However, most of these state-of-the-art prediction models are the proprietary property of teams or industrial sports entities, and so little is available in open publications.

This dissertation introduces a joint modeling approach to predict seasonal plate appearance outcome vectors using a mixed-effects multinomial logistic-normal model. This model accounts for positive and negative correlations between outcomes both across and within player seasons. It is also applied to the important, yet unaddressed, problem of predicting performance for players moving between the Japanese and American major leagues.

This work begins by motivating the methodological choices through a comparison of state-of-the-art procedures, followed by a detailed description of the modeling and estimation approach, including model fit assessments. We then apply the method to longitudinal multinomial count data of baseball player-seasons for players moving between the Japanese and American major leagues and discuss the results. Extensions of this modeling framework to other similar data structures are also discussed.
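A generative sketch of the multinomial logistic-normal structure the dissertation describes: player-level random effects and correlated noise enter the logits of plate-appearance outcome probabilities, and seasonal counts are multinomial draws. The outcome categories, dimensions and covariance values below are illustrative assumptions, not the fitted model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Plate appearance outcome categories (last one is the multinomial reference)
outcomes = ["1B", "2B", "3B", "HR", "BB", "SO", "out_in_play"]
K = len(outcomes)

def softmax(eta):
    e = np.exp(eta - eta.max())
    return e / e.sum()

n_players, n_seasons = 50, 3
baseline = np.array([-1.6, -2.6, -3.9, -2.8, -1.8, -1.2])   # logits vs. reference

# Player random effects on the K-1 logits, with a shared covariance that induces
# correlation between outcome types (e.g. HR and SO moving together).
cov = 0.15 * np.eye(K - 1) + 0.05
player_effects = rng.multivariate_normal(np.zeros(K - 1), cov, size=n_players)

counts = np.zeros((n_players, n_seasons, K), dtype=int)
for p in range(n_players):
    for s in range(n_seasons):
        season_noise = rng.multivariate_normal(np.zeros(K - 1), 0.05 * np.eye(K - 1))
        eta = np.append(baseline + player_effects[p] + season_noise, 0.0)
        n_pa = rng.integers(350, 650)                 # plate appearances that season
        counts[p, s] = rng.multinomial(n_pa, softmax(eta))

# Empirical outcome rates for one player, aggregated across seasons
rates = counts[0].sum(axis=0) / counts[0].sum()
print(dict(zip(outcomes, rates.round(3))))
```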
|
69 |
Classement pour la résistance mécanique du chêne par méthodes vibratoires et par mesure des orientations des fibres / Mechanical grading of oak wood using vibrational and grain angle measurements. Faydi, Younes 11 December 2017 (has links)
En France, les feuillus constituent la part majoritaire du parc forestier, dont, notamment, le chêne de qualité secondaire. Ce dernier pourrait devenir une alternative à d’autres matériaux de construction. Cependant, en fonction des singularités relatives à chaque sciage, les performances mécaniques peuvent varier considérablement. Il est donc nécessaire de trier les sciages adaptés pour une application en structure. L’efficience des méthodes de classement du chêne apparaît comme une des problématiques majeures. Ce projet de recherche a pour but de développer des méthodes et moyens de mesure capables de classer convenablement le chêne de qualité secondaire et palier au classement visuel par un opérateur. Ce dernier sous-estime fortement les qualités du chêne mais reste fréquemment employé par les industriels faute d’alternative. Au cours de cette thèse, deux modèles de prédiction des propriétés mécaniques ont été développés pour classer par machine le chêne de qualité secondaire. Ces modèles se basent sur une large campagne expérimentale de contrôle non destructif, avec validation par essais destructifs. Le premier modèle est analytique, exploitant les cartographies d’orientation des fibres des sciages pour déterminer localement les résistances et modules élastiques, et en déduire les propriétés globales. Le second modèle est statistique, basé sur l’analyse des signaux vibratoires sous sollicitation longitudinale ou transversale. Les résultats obtenus montrent que la méthode vibratoire longitudinale, employée couramment en industrie dans le cas des résineux, n’est pas adaptée pour classer convenablement le chêne de qualité secondaire. A l’inverse, la méthode vibratoire transversale sur chant permet d’obtenir des rendements de classement pertinents mais nécessite des efforts de développement pour être industrialisée. Le modèle basé sur la mesure de l’orientation des fibres offre les meilleurs rendements et des résultats stables sur l’ensemble des classes étudiées. / Hardwoods account for the majority of the French forest resource, including a substantial amount of small, low-grade oak. This resource could be an alternative to typical construction materials. However, mechanical properties can vary considerably depending on timber defects, so the quality of each board must be verified before it is used in structural applications. The efficiency of grading methods is one of the principal challenges in promoting the use of oak in structures. The present work aims to provide new machine-grading solutions for low-grade oak that could replace the traditional, strongly downgrading method based on visual sorting by an operator. Two models were developed during this thesis, based on non-destructive measurements followed by destructive tests for validation. The first is an analytical model based on grain angle scanning measurements: from grain angle maps, local values of the modulus of elasticity and strength are computed, from which the global mechanical properties are derived. The second is a statistical model based on the analysis of longitudinal and transverse vibration measurements. The results show that the longitudinal vibration method based on the first longitudinal eigenfrequency, which is widely employed in the softwood industry, is not well suited to grading low-grade oak. In contrast, the methods based on edgewise transverse vibrations achieve good grading efficiency but need additional development for industrial application. Overall, the model based on grain angle scanning offers the best and most robust grading efficiency across all the grades studied.
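For context, the longitudinal vibration method mentioned above typically infers a dynamic modulus of elasticity from a board's first longitudinal eigenfrequency. A minimal sketch under the usual free-free prismatic bar assumption follows; the numeric values are illustrative, not measurements from the thesis.

```python
# Dynamic modulus of elasticity from the first longitudinal eigenfrequency of a
# free-free prismatic board: E_dyn = 4 * rho * L^2 * f1^2 (standard bar theory).
# Example values are illustrative, not measurements from the thesis.
def dynamic_moe(length_m: float, density_kg_m3: float, f1_hz: float) -> float:
    return 4.0 * density_kg_m3 * length_m**2 * f1_hz**2   # result in Pa

E = dynamic_moe(length_m=3.0, density_kg_m3=700.0, f1_hz=650.0)
print(f"E_dyn = {E / 1e9:.1f} GPa")   # about 10.6 GPa, a plausible order for oak
```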
|
70 |
Dados hiperespectrais na determinação do conteúdo relativo de água na folha em cana-de-açúcar / Hyperspectral data to determine the relative water content in the sugarcane leaf. Bonilla, Magda Maria Zuleta 23 July 2015 (has links)
A cadeia produtiva da cana-de-açúcar vem sofrendo problemas de diversas naturezas, sendo a mais comum a estiagem, agravada pelas mudanças climáticas que reduzem a disponibilidade de água no solo, afetando diretamente a produtividade da cultura. Uma grande proporção da cultura da cana-de-açúcar não é irrigada, sendo sujeita a alterações entre estações úmidas e secas em condições tropicais e subtropicais, mas quando é irrigada, tem-se observado um incremento significativo na produtividade da cultura. As necessidades hídricas da cultura devem ser atendidas, tanto na quantidade requerida, quanto no momento oportuno. Para isto, devem ser quantificados parâmetros relacionados com o seu estado hídrico. No entanto, os métodos empregados convencionalmente são demorados, custosos e invasivos. Como alternativa que ajuda a reduzir tempo e custos, o sensoriamento remoto hiperespectral vem sendo utilizado para estimar o estado hídrico em diferentes escalas, uma vez que permite a captura de grande quantidade de informação rapidamente. Para o presente trabalho, o comportamento espectral da vegetação de 400 a 2500 nm foi utilizado na quantificação de alguns parâmetros que estabelecem o seu estado hídrico. As avaliações tanto em casa de vegetação quanto em laboratório foram feitas em folhas de cana-de-açúcar submetidas a déficit hídrico programado. Para os dados de laboratório foram obtidos R2 > 0,8 na região do visível e R2 < 0,55 na região do infravermelho próximo para CRA (conteúdo relativo de água). Para EEA (espessura equivalente da água) foi obtido um R2 < 0,6 na região do infravermelho próximo. / The sugarcane agribusiness has been suffering from problems of several kinds, the most common being drought, aggravated by climate change, which reduces water availability in the soil and directly affects crop yield. A large proportion of the sugarcane crop is not irrigated and is therefore subject to the alternation of wet and dry seasons under tropical and subtropical conditions; when the crop is irrigated, a significant increase in yield has been observed. Crop water requirements must be met both in the required amount and at the right time, which calls for the quantification of parameters related to the crop's water status. However, conventional methods are slow, invasive and expensive. As an alternative that reduces time and cost, hyperspectral remote sensing has been used to estimate water status at different scales, since it allows large amounts of information to be captured quickly. In the present study, the spectral behavior of vegetation between 400 and 2500 nm was used to quantify parameters that characterize its water status. The evaluations were conducted both in the greenhouse and in the laboratory on sugarcane leaves subjected to a programmed water deficit. For the laboratory data, R2 > 0.8 was obtained in the visible region and R2 < 0.55 in the near-infrared region for RWC (relative water content). For EWT (equivalent water thickness), R2 < 0.6 was obtained in the near-infrared region.
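The two water-status parameters named above are conventionally computed from leaf fresh, turgid and dry weights and leaf area; a brief sketch with the standard formulas follows (the example values are illustrative, not data from the thesis).

```python
# Standard leaf water-status formulas used alongside hyperspectral calibration.
# FW = fresh weight, TW = turgid (fully hydrated) weight, DW = dry weight (g),
# area in cm^2. Example values are illustrative, not data from the thesis.
def relative_water_content(fw_g, tw_g, dw_g):
    """RWC (%) = (FW - DW) / (TW - DW) * 100."""
    return (fw_g - dw_g) / (tw_g - dw_g) * 100.0

def equivalent_water_thickness(fw_g, dw_g, area_cm2):
    """EWT (g/cm^2) = (FW - DW) / leaf area."""
    return (fw_g - dw_g) / area_cm2

print(f"RWC = {relative_water_content(1.80, 2.10, 0.60):.1f} %")      # 80.0 %
print(f"EWT = {equivalent_water_thickness(1.80, 0.60, 60.0):.3f} g/cm^2")  # 0.020
```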
|