61

Corrosion Assessment for Failed Bridge Deck Closure Pour

Abbas, Ebrahim K. 12 January 2012 (has links)
Corrosion of reinforcing steel in concrete is a significant problem around the world. Of the approximately 600,000 bridges in the United States, 24% are considered structurally deficient or functionally obsolete according to the latest (December 2010) statistics from the Federal Highway Administration (FHWA). This is mainly due to attack by the chlorides present in deicing salts, which cause the reinforcing steel to corrode. Different solutions have been developed and used in practice to delay and prevent corrosion initiation. The purpose of this research is to investigate the influence of corrosion on the failure that occurred on an Interstate 81 bridge deck: after 17 years in service, a 3 ft x 3 ft closure pour section punched through. The section was part of the left wheel path of the right lane of the southbound bridge deck. The deck had been replaced in 1992 as part of a bridge rehabilitation project, with epoxy-coated bars used as the reinforcing steel. Four slabs from the bridge deck, containing the closure pour, were removed and transported to the Virginia Tech Structures and Materials Research Laboratory for further evaluation, and three lab-cast slabs were fabricated as part of the assessment program. The research comprised a corrosion evaluation and a concrete shrinkage characterization. The corrosion evaluation included visual observation, clear concrete cover depth, concrete resistivity by the single-point method, half-cell potential, and linear polarization using the 3LP device. The shrinkage characterization was conducted on the lab-cast slabs only and consisted of monitoring their shrinkage behavior for 180 days and comparing the data with five different shrinkage models. Based on the results, guidance will be developed for assessing other bridge decks in similar condition, to avoid similar failures in the future. / Master of Science
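The half-cell potential survey mentioned above is usually interpreted against the probability thresholds commonly cited from ASTM C876 (copper-copper sulfate electrode). A minimal illustrative sketch, not taken from the thesis; the function name and example readings are invented:

```python
# Hedged sketch: classifying half-cell potential readings (mV vs CSE)
# against the commonly cited ASTM C876 probability thresholds.

def corrosion_risk(potential_mv: float) -> str:
    """Classify corrosion probability from a half-cell potential in mV vs CSE."""
    if potential_mv > -200:
        return "low (<10% probability of active corrosion)"
    elif potential_mv >= -350:
        return "uncertain"
    else:
        return "high (>90% probability of active corrosion)"

# illustrative survey points on a deck grid
for mv in (-150, -275, -420):
    print(mv, "->", corrosion_risk(mv))
```

In practice such readings are taken on a grid over the deck and contoured, which is why they pair naturally with the cover-depth and resistivity measurements listed in the abstract.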
62

The Prediction of Bank Certificates of Deposit Ratings

Kim, Mi-hyung 05 1900 (has links)
The purpose of the study was to find the best prediction models of short-term bank CD ratings using financial variables. This study used short-term bank CD ratings assigned by Moody's and Standard and Poor's.
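Rating-prediction models of this kind typically reduce to a scoring rule over financial ratios with cut-offs between rating classes. A purely illustrative sketch; the ratios, coefficients, and cut-offs below are invented, not from the study:

```python
# Hypothetical sketch of a linear scoring rule for short-term CD ratings.
# Coefficients and thresholds are invented for illustration only.

def predict_cd_rating(capital_ratio, roa, npl_ratio):
    score = 2.0 * capital_ratio + 5.0 * roa - 3.0 * npl_ratio
    if score > 0.30:
        return "P-1 / A-1"            # highest short-term grade
    elif score > 0.15:
        return "P-2 / A-2"
    else:
        return "P-3 / A-3 or below"

print(predict_cd_rating(0.12, 0.02, 0.01))  # well-capitalized, profitable bank
```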
63

Spatial crash prediction models: an evaluation of the impacts of enriched information on model performance and the suitability of different spatial modeling approaches / Modelos espaciais de previsão de acidentes: uma avaliação do desempenho dos modelos a partir da incorporação de informações aprimoradas e a adequação de diferentes abordagens de modelagem espacial

Gomes, Monique Martins 04 December 2018 (has links)
The unavailability of crash-related data has been a long-standing challenge in Brazil. Together with the poor implementation and follow-up of road safety strategies, this drawback has hampered the development of studies that could contribute to national road safety goals. In contrast, developed countries have built their effective strategies on a solid data basis, investing considerable time and money in obtaining and creating pertinent information. In this research, we assess the potential impact of supplementary data on spatial model performance and the suitability of different spatial modeling approaches for crash prediction. The intention is to show the authorities in Brazil and other developing countries the importance of having appropriate data. In this thesis, we set two specific objectives: (I) to investigate spatial model prediction accuracy at unsampled subzones; (II) to evaluate the performance of spatial data analysis approaches for crash prediction. First, we carry out a benchmark based on Geographically Weighted Regression (GWR) models developed for Flanders, Belgium, and São Paulo, Brazil. Models are developed for two modes of transport: active (pedestrians and cyclists) and motorized (occupants of motorized vehicles). Subsequently, we apply the repeated holdout method to the Flemish models, introducing two GWR validation approaches, named GWR holdout1 and GWR holdout2. While the former is based on the local coefficient estimates derived from the neighboring subzones and measures of the explanatory variables for the validation subzones, the latter uses the casualty estimates of the neighboring subzones directly to estimate outcomes for the missing subzones. Lastly, we compare the performance of the GWR models with Mean Imputation (MEI), K-Nearest Neighbors (KNN) and Kriging with External Drift (KED).
Findings showed that adding the supplementary data reduced the corrected Akaike Information Criterion (AICc) and the Mean Squared Prediction Error (MSPE) by 20% and 25% for motorized transport, and by 25% and 35% for active transport, respectively. From a practical perspective, the results could help identify hotspots and prioritize data collection strategies, besides helping to identify, implement and enforce appropriate countermeasures. Concerning the spatial approaches, GWR holdout2 outperformed all the other techniques, showing that GWR is an appropriate spatial technique for both prediction and impact analyses. Especially in countries where data availability has been an issue, this validation framework allows casualties or crash frequencies to be estimated while effectively capturing the spatial variation of the data.
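The holdout2 idea described above, estimating a held-out subzone directly from its neighbors' casualty values, can be sketched as a distance-weighted average with a Gaussian kernel. A minimal sketch under invented coordinates, counts, and bandwidth; not the thesis implementation:

```python
import math

# Sketch of a "holdout2"-style estimate: a held-out subzone's casualty
# count is predicted from neighboring subzones' observed values,
# distance-weighted with a Gaussian kernel. All values are invented.

def estimate_heldout(target_xy, neighbors, bandwidth=2.0):
    """neighbors: list of ((x, y), casualties) for sampled subzones."""
    num = den = 0.0
    for (x, y), casualties in neighbors:
        d = math.dist(target_xy, (x, y))
        w = math.exp(-0.5 * (d / bandwidth) ** 2)  # Gaussian kernel weight
        num += w * casualties
        den += w
    return num / den

neighbors = [((0, 1), 10), ((1, 0), 14), ((3, 3), 40)]
print(round(estimate_heldout((0, 0), neighbors), 1))
```

The two nearby subzones dominate the estimate, while the distant high-count subzone contributes little, which is the spatial smoothing the validation framework exploits.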
65

Modellierung des Unfallgeschehens im Radverkehr am Beispiel der Stadt Dresden / Modeling bicycle traffic crash occurrence: the example of the city of Dresden

Martin, Jacqueline 25 January 2021 (has links)
Bicycle traffic volumes in Germany have grown in recent years, which is in turn reflected in a rise in crashes involving cyclists. To counter the rising crash numbers, policymakers and associations recommend above all infrastructure measures. On this basis, the present thesis uses the city of Dresden as an example to investigate how individual infrastructure characteristics affect crashes between bicycle and motorized traffic. The data basis of the study consists of 548 crashes involving cyclists from the years 2015 to 2019 and the characteristics of 484 intersection approaches. Since infrastructure does not determine crash occurrence on its own, traffic volume measures are also included. Crash occurrence is analyzed with the random forest method and with negative binomial regression in the form of accident prediction models, with prior variable selection via the LASSO method. Each method is applied to two specific intersection crash types in order to obtain differentiated results. The first crash type, 'turning crash', covers collisions between a right-turning party and a party traveling straight ahead in the same or the opposite direction, while the second crash type, 'entering/crossing crash', covers collisions between a road user with right of way and an entering or crossing road user obliged to yield. For the turning crash type, the methods show, for example, that a cycle crossing colored fully or partly red across the intersection, as well as indirect guidance of left-turning cycle traffic instead of guidance in mixed traffic, leads to higher expected crash numbers; the latter appears irrelevant to the situation under study and thus points to a weakness in the variable selection.
In contrast, for the entering/crossing crash type the methods estimate higher crash numbers, for example, when the number of through lanes of an approach increases and when the intersection is controlled by sign Z205 (give way) or a partial traffic signal instead of the right-before-left rule. For both crash types, the methods mostly show that the number of crashes increases less steeply above a certain traffic volume, a phenomenon known in the literature as the 'safety in numbers' effect. A comparison of model quality between the crash types further shows that both methods generate better predictions with their turning crash model than with their entering/crossing crash model. Moreover, model quality per crash type differs only slightly between the two methods, so both methods can be assumed to deliver models of similar quality for the corresponding crash type.:1 Introduction 2 Literature review 2.1 Safety-in-numbers effect 2.2 Influencing factors of bicycle crashes 3 Fundamentals of crash research 3.1 Crash categories 3.2 Crash types 4 Data basis 4.1 Crash data 4.2 Infrastructure characteristics 4.3 Overview of variables used 5 Methodology 5.1 Correlation analysis 5.2 Random forest 5.2.1 Fundamentals 5.2.2 Random forest method 5.2.3 Model quality criteria 5.2.4 Variable importance 5.3 Negative binomial regression 5.3.1 Fundamentals 5.3.2 Accident prediction models 5.3.3 Variable selection 5.3.4 Model quality criteria 5.3.5 Variable importance 5.3.6 Model diagnostics 6 Implementation and results 6.1 Correlation analysis 6.2 Random forest 6.2.1 Model quality criteria 6.2.2 Variable importance 6.3 Negative binomial regression 6.3.1 Variable selection 6.3.2 Model quality criteria 6.3.3 Variable importance 6.3.4 Model diagnostics 6.4 Comparison of both methods 6.4.1 Model quality criteria 6.4.2 Variable importance and recommendations 6.5 Comparison with findings from the literature 7 Critical appraisal 8 Summary and outlook
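Accident prediction models of the kind used above conventionally take the form "expected crashes = a * volume^b * exp(sum of feature effects)", where an exponent b below one reproduces the safety-in-numbers effect. A sketch with invented coefficients, not the fitted Dresden model:

```python
import math

# Sketch of the standard accident-prediction-model functional form:
# expected crashes = a * bike_volume^b1 * motor_volume^b2 * exp(features).
# Exponents below 1 reproduce the "safety in numbers" effect.
# All coefficients and feature names here are invented.

def expected_crashes(bike_volume, motor_volume, features, coef):
    linear = sum(coef[name] * value for name, value in features.items())
    return (coef["a"] * bike_volume ** coef["b1"]
            * motor_volume ** coef["b2"] * math.exp(linear))

coef = {"a": 1e-4, "b1": 0.5, "b2": 0.7, "red_cycle_crossing": 0.3}
low  = expected_crashes(100, 1000, {"red_cycle_crossing": 1}, coef)
high = expected_crashes(400, 1000, {"red_cycle_crossing": 1}, coef)
# Quadrupling bike volume only doubles expected crashes (b1 = 0.5):
print(round(high / low, 2))  # → 2.0
```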
66

Video quality prediction for video over wireless access networks (UMTS and WLAN)

Khan, Asiya January 2011 (has links)
Transmission of video content over wireless access networks (in particular, Wireless Local Area Networks (WLAN) and the Third Generation Universal Mobile Telecommunication System (3G UMTS)) is growing exponentially and gaining popularity, and is predicted to open new revenue streams for mobile network operators. However, the success of these video applications over wireless access networks depends very much on meeting the user's Quality of Service (QoS) requirements. Thus, it is highly desirable to be able to predict and, if appropriate, to control video quality to meet those requirements. Video quality is affected by distortions caused by both the encoder and the wireless access network. The impact of these distortions is content-dependent, but this feature has not been widely used in existing video quality prediction models. The main aim of the project is the development of novel, efficient models for non-intrusive video quality prediction for low-bitrate, low-resolution video, and the demonstration of their application in QoS-driven adaptation schemes for mobile video streaming. This led to the five main contributions of the thesis: (1) A thorough understanding of the relationships between video quality, wireless access network (UMTS and WLAN) parameters (e.g. packet/block loss, mean burst length and link bandwidth), encoder parameters (e.g. sender bitrate, frame rate) and content type. Understanding these relationships and interactions, and their impact on video quality, provides the basis for developing non-intrusive video quality prediction models. (2) A new content classification method based on statistical tools, as content type was found to be the most important parameter. (3) Efficient regression-based and artificial-neural-network-based learning models for video quality prediction over WLAN and UMTS access networks. The models are lightweight (suitable for real-time monitoring) and provide a measure of user-perceived quality without time-consuming subjective tests. They have potential applications in several other areas, including QoS control and optimization in network planning and content provisioning for network/service providers. (4) Applications of the proposed regression-based models in (i) optimization of content provisioning and network resource utilization and (ii) a new fuzzy sender-bitrate adaptation scheme over WLAN and UMTS access networks. (5) Internet-based subjective tests capturing distortions caused by the encoder and the wireless access network for different content types. The database of subjective results has been made available to the research community, as there is a lack of subjective video quality assessment databases.
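A non-intrusive regression model of the kind described maps encoder and network parameters straight to a Mean Opinion Score (MOS) without access to the original video. A sketch with an invented functional form and coefficients, not the fitted model from the thesis:

```python
import math

# Illustrative sketch of a regression-based, non-intrusive quality model:
# MOS predicted from sender bitrate, frame rate, and packet loss.
# The functional form and coefficients are invented for illustration.

def predict_mos(sender_bitrate_kbps, frame_rate, packet_loss_pct):
    mos = (1.0 + 2.0 * math.log10(sender_bitrate_kbps / 10)
           + 0.01 * frame_rate - 0.25 * packet_loss_pct)
    return max(1.0, min(5.0, mos))  # clamp to the 1..5 MOS scale

clean = predict_mos(256, 25, 0)   # no network loss
lossy = predict_mos(256, 25, 5)   # 5% packet loss
print(round(clean, 2), round(lossy, 2))
```

Being a closed-form function of observable parameters is what makes such a model cheap enough for the real-time monitoring and sender-bitrate adaptation uses listed above.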
67

A Mixed Effects Multinomial Logistic-Normal Model for Forecasting Baseball Performance

Eric A Gerber (7043036) 13 August 2019 (has links)
Prediction of player performance is a key component in the construction of baseball team rosters. Traditionally, the problem of predicting seasonal plate appearance outcomes has been approached univariately, that is, focusing on each outcome separately rather than jointly modeling the collection of outcomes. More recently, there has been a greater emphasis on joint modeling, thereby accounting for the correlations between outcomes. However, most of these state-of-the-art prediction models are the proprietary property of teams or industrial sports entities, so little is available in open publications.
This dissertation introduces a joint modeling approach to predict seasonal plate appearance outcome vectors using a mixed-effects multinomial logistic-normal model. This model accounts for positive and negative correlations between outcomes both across and within player seasons. It is also applied to the important, yet unaddressed, problem of predicting performance for players moving between the Japanese and American major leagues.
This work begins by motivating the methodological choices through a comparison of state-of-the-art procedures, followed by a detailed description of the modeling and estimation approach that includes model fit assessments. We then apply the method to longitudinal multinomial count data of baseball player-seasons for players moving between the Japanese and American major leagues and discuss the results. Extensions of this modeling framework to other similar data structures are also discussed.
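The logistic-normal link at the heart of such a model maps a latent Gaussian vector (which can carry player and season random effects and the cross-outcome correlations) through a softmax to outcome probabilities. A minimal sketch with invented latent scores:

```python
import math

# Sketch of the multinomial logistic-normal link: a latent vector
# (normally distributed in the full model) is mapped through a softmax
# to plate-appearance outcome probabilities. Latent values are invented.

def softmax(latent):
    exps = [math.exp(v) for v in latent]
    total = sum(exps)
    return [e / total for e in exps]

# latent scores for (out, single, double, triple, home run, walk)
latent = [2.5, 0.8, -0.4, -2.0, -0.2, 0.3]
probs = softmax(latent)
print([round(p, 3) for p in probs])
```

The probabilities then parameterize a multinomial over the season's plate appearances; correlations between outcomes enter through the covariance of the latent vector rather than through the multinomial itself.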
68

Classement pour la résistance mécanique du chêne par méthodes vibratoires et par mesure des orientations des fibres / Mechanical grading of oak wood using vibrational and grain angle measurements

Faydi, Younes 11 December 2017 (has links)
Hardwoods make up the majority of the French forest resource, including, notably, a substantial amount of low-grade oak. This resource could become an alternative to other construction materials. However, mechanical properties can vary considerably depending on the defects of each board, so the boards suited to structural applications must be sorted out. The efficiency of grading methods for oak is one of the principal challenges. This research project aims to develop measurement methods and devices able to grade low-grade oak properly and to replace visual grading by an operator, which strongly underestimates oak quality yet remains widely used in industry for lack of an alternative. During this thesis, two models for predicting mechanical properties were developed for machine grading of low-grade oak, based on a large campaign of nondestructive measurements validated by destructive tests. The first is an analytical model based on grain angle scanning: from grain angle maps of the boards, local values of modulus of elasticity and strength are computed, and the global mechanical properties are derived from them. The second is a statistical model based on the analysis of vibrational signals under longitudinal or transverse excitation. The results show that the longitudinal vibration method based on the first longitudinal eigenfrequency, commonly employed in the softwood industry, is not suited to grading low-grade oak properly. In contrast, the edgewise transverse vibration method achieves good grading yields but requires further development for industrial application. The model based on measuring grain orientation offers the best yields and the most robust results across all the grades studied.
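The transverse vibration method above back-calculates a dynamic modulus of elasticity from the first flexural eigenfrequency of the board. A hedged sketch using standard free-free beam theory (first mode root lambda_1 of about 4.730) with invented board dimensions; not the thesis's calibrated model:

```python
import math

# Hedged sketch: dynamic modulus of elasticity from the first transverse
# (flexural) eigenfrequency of a free-free beam, per standard beam theory.
# Board dimensions, density, and frequency below are invented.

def dynamic_moe(freq_hz, length, width, height, density):
    lam1 = 4.730                          # first free-free mode root
    area = width * height                 # cross-section (m^2)
    inertia = width * height ** 3 / 12    # second moment for the bending plane
    return ((2 * math.pi * freq_hz * length ** 2 / lam1 ** 2) ** 2
            * density * area / inertia)

# e.g. a 3 m oak board, 25 mm x 100 mm, 700 kg/m^3, first mode near 45 Hz
E = dynamic_moe(45, 3.0, 0.025, 0.100, 700)
print(round(E / 1e9, 1), "GPa")
```

Because the modulus scales with frequency squared, small errors in picking the first eigenfrequency propagate strongly, one reason the edgewise setup needs care before industrialization.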
69

Dados hiperespectrais na determinação do conteúdo relativo de água na folha em cana-de-açúcar / Hyperspectral data to determine the relative water content in the sugarcane leaf

Bonilla, Magda Maria Zuleta 23 July 2015 (has links)
The sugarcane production chain has been suffering problems of several kinds, the most common being drought, aggravated by climate change, which reduces water availability in the soil and directly affects crop yield. A large proportion of the sugarcane crop is not irrigated, undergoing the alternation between wet and dry seasons in tropical and subtropical conditions; where irrigation is applied, a significant increase in crop yield has been observed. The crop's water requirements must be met both in the required amount and at the right time, which demands the quantification of parameters related to its water status. However, the conventional methods are slow, invasive and expensive. As an alternative that reduces time and cost, hyperspectral remote sensing has been used to estimate water status at different scales, since it captures large amounts of information quickly. In the present study, the spectral behavior of vegetation between 400 and 2500 nm was used to quantify some parameters that establish its water status. The evaluations were conducted both in the greenhouse and in the laboratory on sugarcane leaves under a programmed water deficit. For the laboratory data, R² > 0.8 was obtained in the visible region and R² < 0.55 in the near-infrared region for the RWC (relative water content). For the EWT (equivalent water thickness), R² < 0.6 was obtained in the near-infrared region.
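The band-wise R² figures reported above come from regressing the water-status parameter on reflectance at each wavelength. A minimal sketch of that computation with invented data points:

```python
# Sketch of the band-wise evaluation: regress relative water content (RWC)
# on reflectance at one wavelength and report R^2. Data points are invented.

def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)  # squared Pearson correlation

reflectance_550nm = [0.08, 0.10, 0.12, 0.15, 0.18]   # visible-band reflectance
rwc_percent       = [95,   88,   80,   70,   60]     # leaf relative water content
print(round(r_squared(reflectance_550nm, rwc_percent), 2))
```

Repeating this across all bands from 400 to 2500 nm yields an R² spectrum, from which the strongest regions for each parameter (visible for RWC, near-infrared for EWT) are identified.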
70

Análise de sobrevivência de bancos privados no Brasil / Survival analysis of private banks in Brazil

Alves, Karina Lumena de Freitas 16 September 2009 (has links)
Given the importance of the financial system to a country's economy, its constant supervision is necessary. In this sense, identifying problems in the banking scenario is fundamental, since the banking crises that have occurred worldwide throughout history have shown that a lack of banking credibility and instability of the financial system generate enormous financial and social costs. Bank failure prediction models are able to identify the financial condition of a bank through the corresponding value of its probability of insolvency. Thus, this study aimed to identify the main indicators characteristic of the insolvency of private banks in Brazil. For this, survival analysis was applied to a sample of 70 private banks in Brazil, comprising 33 insolvent banks and 37 solvent banks. It was possible to identify the key financial indicators that proved significant in explaining the insolvency of private banks in Brazil and to examine the relationship between these indicators and the probability of failure. The results allowed important findings to explain the phenomenon of private bank insolvency in Brazil, as well as the observation of some characteristics of banks in the periods preceding their insolvency.
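The survival-analysis setting above treats a bank's time to insolvency as the event of interest, with still-solvent banks entering as censored observations. A minimal sketch of the nonparametric baseline, a Kaplan-Meier estimator, with invented times and censoring flags:

```python
# Minimal Kaplan-Meier sketch: probability that a bank stays solvent past
# time t. Times (in years) and censoring flags are invented, not the
# study's sample of 70 banks.

def kaplan_meier(times, events):
    """times: observation times; events: 1 = insolvency, 0 = censored."""
    # at ties, process events before censored observations
    data = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    at_risk, surv, curve = len(data), 1.0, []
    for t, event in data:
        if event:
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1
    return curve

times  = [2, 3, 3, 5, 7, 8, 10]
events = [1, 1, 0, 1, 0, 1, 0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

The study's regression-style survival model goes further, relating the hazard of insolvency to the financial indicators, but the censoring logic shown here is what lets solvent banks contribute information without ever failing.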
