41 |
Evaluation of Resilient modulus of flexible pavements by back-calculation technique
Viswanathan, B. January 1989 (has links)
No description available.
|
42 |
A MODEL FOR THE PREDICTION OF SUBGRADE SOIL RESILIENT MODULUS FOR FLEXIBLE-PAVEMENT DESIGN: INFLUENCE OF MOISTURE CONTENT AND CLIMATE CHANGE
DAVIES, BERESFORD OBAFEMI ARNOLD January 2004 (has links)
No description available.
|
43 |
The role of fault management in the embedded system design
Vitucci, Carlo January 2024 (has links)
In the last decade, the world of telecommunications has seen the value of services definitively affirmed and the loss of the connectivity value. This change of pace in the use of the network (and available hardware resources) has led to continuous, unlimited growth in data traffic, increased incomes for service providers, and a constant erosion of operators' incomes for voice and Short Message Service (SMS) traffic. The change in mobile service consumption is evident to operators. The market today is in the hands of over-the-top (OTT) media content delivery companies (Google, Meta, Netflix, Amazon, etc.), and the fifth generation of mobile networks (5G), the latest generation of mobile architecture, is nothing other than the way operators can invest in system infrastructure to participate in the prosperous service business.
With the advent of 5G, the worlds of cloud and telecommunications have found their meeting point, paving the way for new infrastructures and services, such as smart cities, industry 4.0, industry 5.0, and Augmented Reality (AR)/Virtual Reality (VR). People, infrastructures, and devices are connected to provide services that we even struggle to imagine today, but a highly interconnected system requires high levels of reliability and resilience.
Hardware reliability has increased since the 1990s. However, it is equally correct to mention that the introduction of new technologies in the nanometer domain and the growing complexity of on-chip systems have made fault management critical to guarantee the quality of the service offered to the customer and the sustainability of the network infrastructure. In this thesis, our first contribution is a review of the fault management implementation framework for the radio access network domain. Our approach introduces a holistic vision of fault management in which increasingly significant attention is paid to the recovery action, the crucial target of the proposed framework.
A new contribution underlines the attention toward the recovery target: we revisited the taxonomy of faults in mobile systems to enhance the result of the recovery action, which, in our opinion, must be propagated between the different layers of an embedded system (hardware, firmware, middleware, and software). The practical adoption of the new framework and the new taxonomy allowed us to make a unique contribution in the thesis: the proposal of a new algorithm for managing system memory errors, both temporary (soft) and permanent (hard).
The holistic vision of error management we introduced in this thesis involves hardware that proactively manages faults. An efficient implementation of fault management is only possible if the hardware design considers error-handling techniques and methodologies. Another contribution of this thesis is the definition of the fault management requirements for the RAN embedded system hardware design.
Another primary function of the proposed fault management framework is fault prediction. Recognizing error patterns means allowing the system to react in time, even before the error condition occurs, or identifying the topology of the error to implement more targeted and, therefore, more efficient recovery actions. The operating temperature is always a critical characteristic of embedded radio access network systems. Base stations must be able to work in very different temperature conditions. However, the working temperature also directly affects the probability of error for the system. In this thesis, we have also contributed a machine-learning algorithm for predicting the working temperature of base stations in radio access networks, a first step towards a more sophisticated implementation of error prevention and prediction.
|
44 |
Floating Urbanization
Plasencia, Jacob 06 June 2024 (has links)
Climate change is a daily challenge; it has become part of our lives and is altering how we live. Architecture plays a vital role in life, and it is crucial for architecture to be able to adapt to the climate conditions that may arise. A large percentage of the population lives near coastal cities and faces the dangers of sea level rise, flooding, and coastal storms. Architects must design for the people within these cities, or else lives will be lost.
Designers understand the catastrophe we are currently facing and are finding innovative ways to protect our cities, from rebuilding coastlines to making cities imitate a sponge. These solutions all play an important role for future generations; no single design can work independently of the others, and they must work cohesively to create a resilient city.
This thesis explores the possibility of a dense neighborhood adjacent to a city. This neighborhood is fully independent of the main city's utilities, so that if a major event did occur, no systems would be interrupted. Adaptable architecture is able to change over time and with population growth. The design goal is to offer an optimal living option: the neighborhood is designed to grow with a family and offer aging-in-place options while continuously expanding over time. The structure grows by using a modular pontoon system in which modules can be attached to one another to create an interconnected city. These modules are Biocrete structures that contain the systems and utilities a building needs to function, giving full flexibility in what can be constructed on top.
Since the city is forever growing, the vertical core acts as a home for a mobile crane that can be attached to assist with the development of residential units or the larger urbanscape. This crane can also be positioned on a small mobile barge that floats around the city to serve any location at a given time. / Master of Architecture / Architecture must be adaptable in a changing climate; otherwise there will be catastrophic failures in society. Coastal cities face the most climate challenges: sea level rise, flooding, storm surge, hurricanes, tsunamis, etc. Due to these challenges, it is vital for architecture to be resilient and to remain functional after a storm.
The main challenge explored is how we can overcome the losses caused by flooding in coastal cities. This exploration suggests the idea of floating urbanization that is completely independent of the city and able to adapt to any sea level.
The initial response to flooding is to ask how we can keep the water out, when in reality water cannot be contained over a long period of time. Instead, the question should be: how can we live with water? The solution is to literally live with it, to design a city that floats on the water and moves with the sea level. This eliminates the worry of your home being flooded and offers another option for living. This city is a fully adaptable structure that grows with families and the population at the same time. Each residential unit offers modular components that allow the unit to expand with the size of the family; these residential modules sit on top of a floating modular pontoon that can then be attached to another pontoon to allow the city to grow horizontally.
This idea is neither foreign nor new; it has been a way of living for some people for many years, from a new single-family home in Denmark to a whole floating village in Peru. It is not a new concept, but it will become a more frequent design choice as climate change becomes more prevalent.
|
45 |
STUDY OF RESILIENT MODULUS AND GEOTECHNICAL PROPERTIES OF POLYMER STABILIZED HIGH PLASTICITY CLAY
Bhattarai, Sushanta 01 May 2024 (has links) (PDF)
Soil stabilization is a widely used technique in geotechnical engineering for a wide range of applications. Traditional stabilizers such as cement and lime, although very efficient, are not environmentally friendly, as they leave major carbon footprints; therefore, the demand for sustainable stabilization methods is escalating. This research investigates the potential of two different polymers, a biopolymer derived from an organic source and a commercially manufactured inorganic polymer, as viable alternatives for soil stabilization. The current study focuses on exploring the efficacy of polymer-stabilized soil in improving engineering and geotechnical properties such as plasticity, compressibility, shear strength, and stiffness behavior. The research methodology involves using locally available high-plasticity clay, stabilizing it with the two polymers, and performing laboratory experiments to analyze the strength parameters of the stabilized soil. Xanthan Gum (XG), the biopolymer studied, is used at 0.5%, 1.0%, and 1.5% by dry weight of the soil mass to understand the mechanism of biopolymer-soil interactions and to determine the optimum percentage for stabilization in terms of technical and economic value. Similarly, Soiltac (ST), a vinyl copolymer inorganic polymer, is used at 1.5% of the dry mass of soil (the optimum dosage per previous literature) to compare its effectiveness with that of Xanthan Gum. After determination of the Atterberg limits, Optimum Moisture Content (OMC), and Maximum Dry Density (MDD), the samples were subjected to Unconfined Compressive Strength (UCS), Ultrasonic Pulse Velocity (UPV), Resilient Modulus (RM), and consolidation tests. The prepared UCS samples were cured for 0, 7, 14, and 28 days in open-air conditions before testing.
Atterberg limits tests on the untreated Carbondale soil classified it as CH (clay with high compressibility) per the USCS (Unified Soil Classification System). Tests on the treated samples showed a significant increase in Liquid Limit (LL) and a slight increase in Plastic Limit (PL), and thus a considerable rise in the Plasticity Index (PI), with increasing XG percentage in the soil. The UCS value increased with the percentage of XG added. UCS results from both untreated and polymer-treated samples also showed increasing compressive strength with longer curing periods. The UCS value increased from 417.75 psi to 490.24 psi, 504.05 psi, and 542.91 psi for 0.5%, 1.0%, and 1.5% XG addition, respectively; this corresponds to increases of 17.35%, 20.66%, and 29.96% for those XG concentrations. The treated samples showed a significant increase in UCS for all curing periods in comparison to untreated samples cured for the same periods. The percentage increases in UCS for the 1.5% XG sample relative to the untreated sample cured for the same period are 6.45%, 59.57%, and 29.96% for 7, 14, and 28 days of curing, respectively. However, for the zero-day test, the UCS of the 1.5% XG stabilized sample was found to be less than that of the zero-day untreated sample. With the addition of ST polymer, the UCS value increased for all curing periods compared with the untreated soil at the same curing period. The UCS of the ST-treated soil increased from 58.56 psi at 0 days to 467.367 psi at 28 days, an increase of 698.1%, i.e., nearly eight times the 0-day strength. When UPV test results were compared with the UCS values for the same samples, higher UPV values corresponded to higher UCS values; this relationship was supported by the high degree of correlation between the two measurements.
The consolidation test showed that the Compression Index (Cc) of XG-stabilized soil decreased as the percentage of XG increased. Cc decreased from 0.2795 for pure Carbondale Soil (CS) to 0.2003 for 1.5% XG addition, a drop of 28.33%. Likewise, Cc decreased by 3.0% and 19.33% for the 0.5% and 1.0% XG doses, respectively. The primary aim of this study is to simplify the understanding of the Resilient Modulus (RM) test, which yields vital data for pavement design. The efficacy of the stabilizer was further substantiated by RM testing, which confirmed the enhancement of the soil's resilient qualities compared to the untreated soil. The RM values exhibited a growing trend, indicating an enhancement in the soil's stiffness and capacity to endure repetitive loads. This attribute is extremely important for applications such as pavements and foundations subjected to dynamic loads. The samples containing 1.0% XG showed significant increases in their RM values: specifically, 18.5%, 40%, and 39.5% after curing for 7, 14, and 28 days, respectively, at a confining pressure of 6 psi. Similarly, the RM for ST ranges from 15227.60 psi at 0 days of curing and 2 psi of confining stress to 45375 psi at 28 days of curing and 6 psi of confining pressure. Overall, ST performed better than XG.
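The percentage figures quoted in this abstract are internally consistent, which a few lines of arithmetic confirm (the psi and Cc values are copied from the abstract; the rounding convention is an assumption):

```python
# Untreated vs. XG-treated UCS (psi), as reported in the abstract.
base_ucs = 417.75
xg_ucs = {0.5: 490.24, 1.0: 504.05, 1.5: 542.91}
gains = {dose: round((ucs / base_ucs - 1) * 100, 2) for dose, ucs in xg_ucs.items()}
print(gains)  # → {0.5: 17.35, 1.0: 20.66, 1.5: 29.96}, matching the text

# Soiltac: 58.56 psi (0 days) to 467.367 psi (28 days).
st_gain = round((467.367 / 58.56 - 1) * 100, 1)
print(st_gain)  # → 698.1 percent, i.e. the 28-day strength is ~8x the 0-day value

# Compression index drop for 1.5% XG: 0.2795 -> 0.2003.
cc_drop = round((1 - 0.2003 / 0.2795) * 100, 1)
print(cc_drop)  # → 28.3, consistent with the 28.33% quoted in the text
```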
|
46 |
Patient and public co-creation of healthcare safety and healthcare system resilience: The case of COVID-19
Abutt, A., Ramsey, L., Fylan, Beth, Grindey, C., Hague, I., O'Hara, J.K. 18 September 2024 (has links)
Yes / Healthcare system resilience is a conceptual approach that seeks to explore how health services adapt and respond to variability in demand and resources. As has been witnessed since the beginning of the COVID-19 pandemic, healthcare services have undergone many reconfigurations. One understudied aspect of how the 'system' is able to adapt and respond is the contribution of key stakeholders: patients and families and, in the context of the pandemic, the general public as a whole. This study aimed to understand what people were doing during the first wave of the pandemic to protect the safety of their health and the health of others from COVID-19, and the resilience of the healthcare system.
Social media (Twitter) was used as a recruitment method due to its capacity for social reach. Twenty-one participants took part in 57 semistructured interviews over three time points from June to September 2020. This included an initial interview and invitations to two follow-up interviews after 3 and 6 weeks. Interviews were conducted virtually using Zoom, an encrypted, secure video-conferencing platform. A reflexive thematic analysis approach was used.
Three themes, each with its own subthemes were identified in the analysis: (1) A 'new safety normal'; (2) Existing vulnerabilities and heightened safety and (3) Are we all in this together?
This study found that the public had a role in supporting the resilience of healthcare services and systems during the first wave of the pandemic by adapting their behaviour to protect themselves and others, and to avoid overwhelming the National Health Service. People who had existing vulnerabilities were more likely to experience safety gaps in their care, and be required to step in to support their safety, despite it being more difficult for them to do so. It may be that the most vulnerable were previously required to do this extra work to support the safety of their care and that the pandemic has just illuminated this issue. Future research should explore existing vulnerabilities and inequalities, and the heightened safety consequences created by the pandemic.
The National Institute for Health Research (NIHR) Yorkshire and Humber Patient Safety Translational Research Centre (NIHR Yorkshire and Humber PSTRC), Patient and Public Involvement and Engagement Research Fellow and NIHR Yorkshire and Humber PSTRC Patient Involvement in Patient Safety theme lay leader are involved in the preparation of a lay version of the findings within this manuscript.
|
47 |
Resilient Navigation through Jamming Detection and Measurement Error Modeling
Jada, Sandeep Kiran 28 October 2024 (has links)
Global Navigation Satellite Systems (GNSS) provide critical positioning, navigation, and timing (PNT) services across various sectors. GNSS signals are weak when they reach Earth from Medium Earth Orbit (MEO), making them vulnerable to jamming. The jamming threat has been growing over the past decade, putting critical services at risk. In response, the National Space-Based PNT Advisory Board and the White House advocate for policies and technologies to protect, toughen, and augment GPS for a more resilient PNT.
Time-sequential estimation improves navigation accuracy and allows GNSS to be augmented with other sensors that are difficult to interfere with. Safety-critical navigation applications (e.g., GNSS/INS-based aircraft localization) that use time-sequential estimation require high-integrity measurement error time correlation models to compute estimation error bounds.
In response, two new methods to identify high-integrity measurement error time correlation models from experimental data are developed and evaluated in this thesis. As opposed to bounding autocorrelation functions in the time domain and power spectra in the frequency domain, the methods proposed in this thesis bound lagged product distributions in the time domain and scaled periodogram distributions in the frequency domain. The proposed methods can identify tight-bounding models from empirical data, resulting in tighter estimation error bounds. The sample distributions are bounded using theoretical first-order Gauss-Markov process (FOGMP) model distributions derived in this thesis. FOGMP models provide a means to account for error time correlation while being easily incorporated into linear estimators. The two methods were evaluated using simulated and experimental GPS measurement error data collected in a mild multipath environment.
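A first-order Gauss-Markov process is simple to state and simulate. The sketch below (parameters illustrative, not from the thesis) shows the discrete-time form commonly used in linear estimators, with the driving-noise variance chosen so the process holds a fixed steady-state variance:

```python
import math
import random

def simulate_fogmp(n: int, dt: float, tau: float, sigma: float, seed: int = 0):
    """Simulate x[k+1] = phi * x[k] + w[k] with phi = exp(-dt/tau).

    The driving noise w[k] ~ N(0, sigma**2 * (1 - phi**2)) keeps the
    steady-state variance at sigma**2; the autocorrelation then decays
    as sigma**2 * phi**|m| over lag m.
    """
    rng = random.Random(seed)
    phi = math.exp(-dt / tau)
    w_std = sigma * math.sqrt(1.0 - phi * phi)
    x = rng.gauss(0.0, sigma)  # draw the initial state from steady state
    xs = [x]
    for _ in range(n - 1):
        x = phi * x + rng.gauss(0.0, w_std)
        xs.append(x)
    return xs

xs = simulate_fogmp(n=20000, dt=1.0, tau=10.0, sigma=1.0)
var = sum(v * v for v in xs) / len(xs)
print(round(var, 1))  # the sample variance stays near sigma**2 = 1
```

Because the model is fully described by (tau, sigma), it slots directly into a Kalman-style state augmentation, which is what makes FOGMP bounds attractive for the linear estimators discussed above.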
To protect and alert GNSS end users of jamming, this thesis proposes and evaluates an autonomous algorithm to detect jamming using publicly available data from large receiver networks. The algorithm uses carrier-to-noise ratio (C/N0)-based jamming detectors that are optimal, self-calibrating, and receiver-independent while adhering to a predefined false alert rate. This algorithm was tested using data from networks with hundreds of receivers, revealing patterns indicative of intentional interference, which provided an opportunity to validate the detector. This validation activity, described in this thesis, consists of designing a portable hardware setup, deriving an optimal power-based jamming monitor for independent detection, and performing time-frequency analysis of wideband RF (WBRF) data collected during jamming events. The analysis of WBRF data from a genuine jamming event, detected while driving on I-25 in Denver, Colorado, USA, revealed power variations resembling a personal privacy device (PPD), validating the C/N0 detector's result.
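The flavor of a self-calibrating C/N0 detector with a fixed false-alert budget can be sketched as follows. The thesis derives its detector formally; this toy version, with a Gaussian assumption and made-up calibration numbers, only illustrates the threshold-from-false-alert-rate idea:

```python
import statistics
from statistics import NormalDist

def make_cn0_detector(calibration_cn0, false_alert_prob=1e-3):
    """Calibrate on jamming-free C/N0 samples (dB-Hz), then flag any sample
    falling below a low-side threshold chosen so that a nominal (Gaussian)
    sample trips the detector with probability false_alert_prob."""
    mu = statistics.fmean(calibration_cn0)
    sd = statistics.stdev(calibration_cn0)
    threshold = mu + sd * NormalDist().inv_cdf(false_alert_prob)  # below the mean
    return lambda cn0: cn0 < threshold

# Toy calibration data around a nominal 45 dB-Hz.
detect = make_cn0_detector([44.8, 45.1, 45.3, 44.9, 45.0, 45.2, 44.7, 45.1])
print(detect(30.0), detect(45.0))  # → True False : only a deep C/N0 drop is flagged
```

Because the threshold is learned from each receiver's own quiescent data, the same rule works across heterogeneous receivers, which is the self-calibrating, receiver-independent property described above.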
Finally, this thesis investigates the cause of recurring false alerts in our power-based jamming detectors. These false alerts are caused by a few short pulses of increased power, which other researchers also observe. Time-frequency analysis of the signals from the pulses revealed binary data encoded using frequency shift keying (FSK) in the GPS L1 band. Various experiments confirmed the signals are not aliases of out-of-band signals. A survey of similar encoded messages identified the source as car key fobs and other devices transmitting at 315 MHz, nowhere near the GPS L1 band, but with an unattenuated 5th harmonic in the GPS L1 band. The RF emission regulations were analyzed to identify mitigation. / Doctor of Philosophy / Global Navigation Satellite Systems (GNSS) have become integral to modern-day life. Many essential services rely on GNSS-provided Positioning, Navigation, and Timing (PNT): power grids rely on accurate GNSS-provided timing for synchronization; stock markets use it for time-stamping trades; and aircraft and ships use GNSS to regularly correct accumulated position errors, to name a few. In addition, the availability of cheap and accessible PNT services combined with mobile internet spawned new service sectors through mobile applications. A 2019 study published by the National Institute of Standards and Technology (NIST) estimates that GPS has generated $1.4 trillion in U.S. economic benefits since the system became available in the 1980s.
With the wide adoption of GNSS services come new motives for interference. These can range from delivery workers and truck drivers trying to hide their location from their employers to something more nefarious, such as criminals trying to evade law enforcement surveillance. GNSS jamming is a type of interference in which the attacker drowns out the faint GNSS signals, broadcast from medium Earth orbit (MEO) at 20,000 km, with a powerful RF transmitter. Some commonly used transmitters, known as personal privacy devices (PPDs), are cheaply available for as low as $10 on Amazon. Another source of jamming comes from militaries in conflict zones overseas, jamming GNSS signals over large areas of a country or a city. In the US, two major incidents have disrupted air traffic over busy airspace, near Denver and Dallas. This threat of GNSS interference has grown over the past decade and is only getting worse. The White House and other organizations advocate for policies for a more resilient PNT: to protect, toughen, and augment GNSS.
This thesis contributes to protecting GNSS frequencies through autonomous algorithms that process publicly available signal quality data from large receiver networks for jamming detection. This autonomous algorithm uses detectors that are self-calibrating and optimal, i.e., minimizing the probability of missed detection while targeting a predefined false alert probability. Several jamming event patterns consistent with intentional interference were detected using this algorithm. The signal-quality-based detectors were validated using an independent power-based optimal jamming detector derived in this thesis.
Spurious recurring false alerts triggered the power detector. An investigation described in the thesis discovered that car key fobs and other devices emit RF energy in restricted GPS frequencies. Based on the analysis of FCC regulation for RF transmitters, mitigation is proposed for power-based jamming detectors to prevent false alarms.
Time-sequential estimation improves navigation accuracy and allows GNSS to be augmented with other sensors that are difficult to interfere with, such as an IMU or lidar. Safety-critical navigation applications can benefit from time-sequential estimation, but they require high-integrity measurement error time correlation models to compute bounds on positioning errors. Two new methods to derive high-integrity measurement error time correlation models from experimental data are developed and evaluated in this thesis. These methods can derive tighter bounding models compared to existing methods, reducing the uncertainty in position estimates. The two methods were implemented and evaluated using simulated and experimental GPS measurement error data collected in a mild multipath environment.
|
48 |
Resilient Monitoring and Robust Control towards Blackout Prevention in Modern Power Grids
Banerjee, Abhishek January 2020 (has links)
This dissertation embodies a comprehensive approach towards resilient monitoring of grid events using Structure Preserving Energy Functions (SPEFs) and introduces a novel control architecture in Multi-Terminal Direct Current (MTDC) grids for inter-area oscillation damping and robustness to AC- as well as DC-side post-contingency events in the modern power grid. This work is presented as a collection of several publications which investigate and address the proposed research topics. At first, SPEFs are derived for multi-machine IEEE benchmark models with the help of Wide-Area Measurements (WAMs). A physics-based hybrid approach is introduced to develop a one-to-one mapping between properties of energy function components and the type of fault in the system. The proposed method is tested offline on the IEEE 39-bus New England Test System (NETS), with particular interest in monitoring the most sensitive energy functions during relay misoperations. Such events can be precipitated by zone 3 trips in distance relays due to load encroachment during stressed conditions. These might include a genuine misoperation, a false trip due to cyber-attacks, or a load encroachment, all of which are undesirable under normal operating circumstances. An online monitoring scheme is introduced in an actual blackout simulation of the Western Electricity Coordinating Council (WECC) system to examine what further indications these energy function components can provide, especially during a cascading sequence, and how they could supervise critical tripping decisions by distance relays. Next, a futuristic grid comprising a Voltage Source Converter (VSC)-based AC-MTDC system is considered due to its recent proliferation in integrating offshore wind farms with onshore grids and in improving the strength of weak AC grids. A robust controller is designed using the converter station poles as actuators to provide damping support to the surrounding AC grid.
Further, a design problem is envisioned and implemented that introduces disturbance rejection into the control architecture by designing a novel, explicitly modeled disturbance plant in the Linear Matrix Inequality (LMI) framework. Finally, a novel robust inter-area oscillation damping controller is designed that proves its effectiveness in inter-area mode settling times and provides robustness to (n-1) contingencies on the AC as well as the DC side of the meshed AC-MTDC grid.
|
49 |
Flexible and Redundant Supply Chain Practices to Build Strategic Supply Chain Resilience: Contingent and Resource-based Perspectives
Park, Kihyun January 2011 (has links)
No description available.
|
50 |
UTILIZAÇÃO DE PROCESSAMENTO DIGITAL DE IMAGENS E REDES NEURAIS ARTIFICIAIS PARA O RECONHECIMENTO DE ÍNDICES DE SEVERIDADE DA FERRUGEM ASIÁTICA DA SOJA
Melo, Geisla de Albuquerque 25 May 2015 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / According to Embrapa (2013), Brazil is the world's second-largest soy producer, behind only the United States. Season after season, Brazil's production and planted area have been growing; however, climatic factors and crop diseases affect plantations, preventing further growth and causing losses to farmers. Asian rust, caused by the fungus Phakopsora pachyrhizi, is a foliar disease considered one of the most important diseases at present because of its great potential for losses. Asian rust can be mistaken for other soybean diseases, such as Bacterial Blight, Brown Spot, and Bacterial Pustule, due to their similar visual appearance. Thus, the present study aimed to develop an application for mobile devices on the Android platform to perform automatic recognition of Asian soybean rust severity indices, to assist in early diagnosis and, consequently, in decision-making regarding management and control of the disease. For this, digital image processing (DIP) and Artificial Neural Network (ANN) techniques were used. First, around 3,000 soybean leaves were collected in the field, of which about 2,000 were usable. They were then separated by severity index, photographed in a controlled environment, and processed to eliminate noise and the image background. The filtering stage of preprocessing consisted of a median filter, a Gaussian filter, conversion to grayscale, the Canny edge detector, dilation, the findContours and drawContours operations, and finally cropping of the leaf. After this, color and texture features were extracted from the images: the means and variances of the R, G, and B channels, the angular second moment, entropy, contrast, homogeneity, and correlation, together with the previously known severity degree. With these data, an ANN was trained using the BrNeural neural network simulator.
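The texture measures listed above (angular second moment, entropy, contrast, homogeneity) are classically computed from a gray-level co-occurrence matrix. The sketch below is a generic GLCM implementation for illustration, not the thesis's code, and the tiny test image is made up:

```python
import math
from collections import Counter

def glcm_features(img, dx=1, dy=0):
    """Co-occurrence texture features for a small integer grayscale image.

    Counts pixel pairs at offset (dx, dy), normalizes counts to probabilities
    p(i, j), then returns (angular second moment, entropy, contrast,
    homogeneity) computed from that distribution.
    """
    h, w = len(img), len(img[0])
    pairs = Counter(
        (img[y][x], img[y + dy][x + dx])
        for y in range(h - dy)
        for x in range(w - dx)
    )
    total = sum(pairs.values())
    p = {ij: c / total for ij, c in pairs.items()}
    asm = sum(v * v for v in p.values())                      # angular 2nd moment
    entropy = -sum(v * math.log2(v) for v in p.values())
    contrast = sum(v * (i - j) ** 2 for (i, j), v in p.items())
    homogeneity = sum(v / (1 + abs(i - j)) for (i, j), v in p.items())
    return asm, entropy, contrast, homogeneity

# A perfectly uniform patch maximizes ASM and homogeneity (both 1)
# and zeroes entropy and contrast.
print(glcm_features([[3, 3], [3, 3]]))
```

Feeding such per-image feature vectors, alongside the channel means and variances, into a small classifier is the general pattern the study describes.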
During training, parameters such as the number of severity levels and the number of neurons in the hidden layer were varied. After training, the network architecture that gave the best results was chosen, with 78.86% accuracy for the Resilient-propagation algorithm. This network was saved in an object and inserted into the application, ready to be used with new data. Thus, the application takes a picture of the soybean leaf and filters the acquired image. It then extracts the features and passes them internally to the trained neural network, which analyzes them and reports the severity. Optionally, it is also possible to view a georeferenced map of the property, with the identified severities shown as small colored squares, each representing a different index.
|