  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Análise de alterações em fenômenos agroambientais utilizando o método de entropia de permutação

FERREIRA, Diego Vicente de Souza 18 February 2016 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / In this work we analyze the complexity of time series of stream flow in the São Francisco River and of hot pixels (fires) detected in Amazonia, in order to evaluate alterations caused by human activity. We employ permutation entropy, which incorporates temporal causality through a symbolic representation based on the ordering of consecutive values within the series. We also use this method to analyze the rainfall regime in Pernambuco, Brazil. For the São Francisco River, we analyze the influence of the construction of the Sobradinho dam on the hydrological regime. The results for the 1929-2009 stream-flow series show that entropy increases after the dam's construction in 1979, indicating more disordered and less predictable flow dynamics in this period. For the daily series of hot pixels detected in Amazonia during 1999-2012, increases in entropy are related to the severe droughts of 2005, 2007, and 2010. For the precipitation series in Pernambuco, permutation entropy decreases with increasing distance of the stations from the coast, indicating greater variability and less predictable monthly rainfall in the coastal zona da mata and agreste regions, and lower variability and more predictable rainfall dynamics in the inland sertão and vale do São Francisco regions.
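The ordinal-pattern scheme the abstract describes (Bandt-Pompe permutation entropy, built from comparisons of consecutive values) can be sketched in a few lines. This is a generic illustration, not the author's code: the function name and parameters are mine, and the normalization by log(order!) is one common convention.

```python
import math

def permutation_entropy(series, order=3, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D series (illustrative sketch)."""
    # Map each window of `order` consecutive values to its ordinal pattern,
    # i.e. the permutation of indices that sorts the window.
    counts = {}
    n_windows = len(series) - order + 1
    for i in range(n_windows):
        window = series[i:i + order]
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    # Shannon entropy over the distribution of ordinal patterns.
    h = -sum((c / n_windows) * math.log(c / n_windows) for c in counts.values())
    if normalize:
        # Maximum entropy: all order! patterns equally likely.
        h /= math.log(math.factorial(order))
    return h
```

A monotone series uses a single ordinal pattern and yields zero entropy, while an irregular series spreads its windows over many patterns and approaches the normalized maximum of 1, the "less predictable" regime the abstract refers to.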
2

Addressing nonlinear systems with information-theoretical techniques

Castelluzzo, Michele 07 July 2023 (has links)
The study of experimental recordings of dynamical systems often consists in the analysis of the signals produced by the system. Time series analysis comprises a wide range of methodologies that aim at characterizing such signals and, eventually, at gaining insight into the underlying processes that govern the evolution of the system. A standard approach is spectrum analysis, which uses Fourier or Laplace transforms to convert time-domain data into a more useful frequency representation. These analytical methods make it possible to highlight periodic patterns in a signal and to reveal essential characteristics of linear systems. Most experimental signals, however, exhibit strange and apparently unpredictable behavior that requires more sophisticated analytical tools to gain insight into the nature of the underlying processes generating them. This is the case when nonlinearity enters the dynamics of a system. Nonlinearity gives rise to unexpected and fascinating behavior, including the emergence of deterministic chaos. In recent decades, chaos theory has become a thriving field of research for its potential to explain complex and seemingly inexplicable natural phenomena. The peculiarity of chaotic systems is that, despite being governed by deterministic principles, their evolution shows unpredictable behavior and a lack of regularity. These characteristics make standard techniques, such as spectrum analysis, ineffective for studying such systems. Furthermore, their irregular behavior gives the appearance of being governed by stochastic processes, even more so with experimental signals, which are inevitably affected by noise. Nonlinear time series analysis comprises a set of methods that aim to overcome the strange and irregular evolution of these systems by measuring characteristic invariant quantities that describe the nature of the underlying dynamics.
Among those quantities, the most notable are possibly the Lyapunov exponents, which quantify the unpredictability of the system, and measures of dimension, such as the correlation dimension, which unravel the peculiar geometry of a chaotic system's state space. These are ultimately analytical techniques: they can often be computed exactly for simulated systems, where the differential equations governing the evolution are known, but can prove difficult or even impossible to compute on experimental recordings. A different approach to signal analysis is provided by information theory. Despite being initially developed in the context of communication theory, in the seminal 1948 work of Claude Shannon, information theory has since become a multidisciplinary field, finding applications in biology and neuroscience as well as in the social sciences and economics. From the physical point of view, the most remarkable contribution of Shannon's work was the discovery that entropy is a measure of information: computing the entropy of a sequence, or a signal, answers the question of how much information the sequence contains. Alternatively, considering the source, i.e. the system that generates the sequence, entropy gives an estimate of how much information the source is able to produce. Information theory comprises a set of techniques that can be applied to the study of, among other things, dynamical systems, offering a framework complementary to standard signal analysis. The concept of entropy, however, was not new in physics: it had first been defined in the deeply physical context of heat exchange in 19th-century thermodynamics. Half a century later, in the context of statistical mechanics, Boltzmann revealed the probabilistic nature of entropy, expressing it in terms of the statistical properties of particle motion in a thermodynamic system.
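Shannon's idea that the entropy of a sequence measures its information content can be illustrated with a minimal plug-in estimator over symbol frequencies. This is a generic sketch, not code from the thesis:

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """Plug-in Shannon entropy (bits per symbol) of a discrete sequence."""
    counts = Counter(sequence)          # empirical symbol frequencies
    n = len(sequence)
    # H = -sum_i p_i * log2(p_i), with p_i the relative frequency of symbol i.
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A constant sequence carries no information (zero bits per symbol), while a sequence of two equiprobable symbols carries one bit per symbol; estimating this quantity reliably from finite data is exactly the uncertainty problem the thesis takes up.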
A first link between entropy and the dynamical evolution of a system was thus made. In the following years, building on Shannon's work, the concept of entropy was further developed through the works of, to cite only a few, von Neumann and Kolmogorov, and came to be used as a tool in computer science and complexity theory. In Kolmogorov's work in particular, information theory and entropy are revisited from an algorithmic perspective: given an input sequence and a universal Turing machine, Kolmogorov found that the length of the shortest set of instructions, i.e. the program, that enables the machine to reproduce the input sequence is related to the sequence's entropy. This definition of the complexity of a sequence already hints at the difference between random and deterministic signals: a truly random sequence requires as many instructions as the sequence itself is long, since there is no option other than programming the machine to copy the sequence point by point. A sequence generated by a deterministic system, on the other hand, simply requires knowing the rules governing its evolution, for example the equations of motion in the case of a dynamical system. It is therefore through the work of Kolmogorov, and independently of Sinai, that entropy came to be applied directly to the study of dynamical systems and, in particular, deterministic chaos. The so-called Kolmogorov-Sinai entropy is indeed a well-established measure of how complex and unpredictable a dynamical system can be, based on the analysis of trajectories in its state space. In recent decades, the use of information theory in signal analysis has led to the elaboration of many entropy-based measures, such as sample entropy, transfer entropy, mutual information and permutation entropy, among others.
These quantities make it possible not only to characterize single dynamical systems, but also to highlight correlations between systems and even more complex interactions such as synchronization and chaos transfer. The wide spectrum of applications of these methods, as well as the need for theoretical studies to provide them with a sound mathematical background, keeps information theory a thriving topic of research. In this thesis, I approach the use of information theory on dynamical systems starting from fundamental issues, such as estimating the uncertainty of Shannon entropy measures on a sequence of data in the case of an underlying memoryless stochastic process. This result, besides giving insight into sensitive and still-unsolved aspects of entropy-based measures, provides a relation between the maximum uncertainty of Shannon entropy estimates and the size of the available sequences, thus serving as a practical rule for experiment design. Furthermore, I investigate the relation between entropy and characteristic quantities of nonlinear time series analysis, namely Lyapunov exponents; some examples of this analysis on recordings of a nonlinear chaotic system are also provided. Finally, I discuss other entropy-based measures, among them mutual information, and how they compare with analytical techniques aimed at characterizing nonlinear correlations between experimental recordings. In particular, the complementarity between information-theoretical and analytical tools is shown on experimental data from the field of neuroscience, namely magnetoencephalography and electroencephalography recordings, as well as on meteorological data.
3

Consciousness Detection in a Complete Locked-in Syndrome Patient through Multiscale Approach Analysis

Wu, Shang-Ju, Nicolaou, Nicoletta, Bogdan, Martin 13 April 2023 (has links)
Completely locked-in state (CLIS) patients are unable to speak and have lost all muscle movement. From the outside, the internal brain activity of such patients cannot be easily perceived, but CLIS patients are considered to still be conscious and cognitively active. Detecting the current state of consciousness of CLIS patients is non-trivial: it is difficult to ascertain whether they are conscious at any given moment. It is therefore important to find alternative ways to re-establish communication with these patients during periods of awareness, and one such alternative is a brain–computer interface (BCI). In this study, multiscale-based methods (multiscale sample entropy, multiscale permutation entropy and multiscale Poincaré plots) were applied to analyze electrocorticogram signals from a CLIS patient to detect the underlying consciousness level. Results from these different methods converge on a specific period of awareness of the CLIS patient in question, coinciding with the period during which the patient is recorded to have communicated with an experimenter. The aim of the investigation is to propose a methodology that could be used to establish reliable communication with CLIS patients.
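The "multiscale" in the methods above refers to re-evaluating an entropy measure on progressively coarse-grained copies of the signal. A minimal sketch of that coarse-graining step follows (a generic illustration, not the study's code; function names are mine, and `entropy_fn` stands for any single-scale measure such as sample or permutation entropy):

```python
def coarse_grain(series, scale):
    """Average non-overlapping windows of length `scale` (trailing remainder dropped)."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def multiscale_profile(series, entropy_fn, max_scale=5):
    """Apply a single-scale entropy measure at scales 1..max_scale."""
    return [entropy_fn(coarse_grain(series, s)) for s in range(1, max_scale + 1)]
```

The resulting entropy-versus-scale profile, rather than a single value, is what the multiscale methods compare across recording periods.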
