1 |
Efficient Ways to Upgrade Docker Containers in Cloud to Support Backward Compatibility: Various Upgrade Strategies to Measure Complexity. Madala, Sravya, January 2016 (has links)
In the present telecommunication landscape, thousands of systems are being moved into the cloud because of the wide range of features it offers. This thesis examines efficient ways to upgrade Docker containers so as to support backward compatibility, with particular attention to maintaining high availability of cloud systems during upgrades. Smaller changes can be handled automatically to some extent: minor changes can be managed by Apache Avro, in which the schema is defined explicitly. At some point, however, the changes become too complex for Avro to handle, and in a real-world setting major changes must be performed on top of a running application. We therefore test different upgrade strategies and compare the code complexity, total upgrade time, and network usage of a single-upgrade strategy versus a multiple-upgrade strategy, with and without the use of Avro. When code complexity is compared, the case without Avro performs well under the single-upgrade strategy, taking less time to upgrade all six instances, although its network usage is higher than that of multiple upgrades. Overall, the single-upgrade strategy is the better option for maintaining high availability in the cloud by performing upgrades efficiently.
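As a rough illustration of the kind of minor, backward-compatible change that Avro can absorb automatically, the following sketch writes a record with an old schema and reads it back with a newer schema that adds a field carrying a default value. This is a minimal example assuming the third-party fastavro package; the record name and fields are hypothetical and are not taken from the thesis.

import io
import fastavro

# Writer's (old) schema: the version the already-deployed containers use.
schema_v1 = fastavro.parse_schema({
    "type": "record", "name": "Subscriber",
    "fields": [{"name": "id", "type": "int"}],
})

# Reader's (new) schema: adds a field with a default, so old data stays readable.
schema_v2 = fastavro.parse_schema({
    "type": "record", "name": "Subscriber",
    "fields": [
        {"name": "id", "type": "int"},
        {"name": "region", "type": "string", "default": "unknown"},
    ],
})

buf = io.BytesIO()
fastavro.schemaless_writer(buf, schema_v1, {"id": 42})   # old producer writes
buf.seek(0)

# New consumer resolves the old bytes against its own, newer schema.
record = fastavro.schemaless_reader(buf, schema_v1, schema_v2)
print(record)   # {'id': 42, 'region': 'unknown'}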
|
2 |
Analysis of Security Measures for Sequences. Kavuluru, Ramakanth, 01 January 2009 (has links)
Stream ciphers are private key cryptosystems used for security in communication and data transmission systems. Because they are used to encrypt streams of data, it is necessary for stream ciphers to use primitives that are easy to implement and fast to operate. LFSRs and the recently invented FCSRs are two such primitives, which give rise to certain security measures for the cryptographic strength of sequences; following convention, we refer to these as complexity measures henceforth. The linear (resp. N-adic) complexity of a sequence is the length of the shortest LFSR (resp. FCSR) that can generate the sequence. Due to the availability of shift register synthesis algorithms, sequences used for cryptographic purposes should have high values for these complexity measures. It is also essential that the complexity of these sequences does not decrease when a few symbols are changed. The k-error complexity of a sequence is the smallest complexity value obtainable by altering k or fewer symbols in the given sequence. For a sequence to be considered cryptographically ‘strong’ it should have both high complexity and high error complexity values.
An important problem regarding sequence complexity measures is to determine good bounds on a specific complexity measure for a given sequence. In this thesis we derive new nontrivial lower bounds on the k-operation complexity of periodic sequences in both the linear and N-adic cases. Here the operations considered are combinations of insertions, deletions, and substitutions. We show that our bounds are tight and also derive several auxiliary results based on them.
A second problem on sequence complexity measures useful in the design and analysis of stream ciphers is to determine the number of sequences with a given fixed (error) complexity value. In this thesis we address this problem for the k-error linear complexity of 2^n-periodic binary sequences. More specifically:
1. We characterize 2^n-periodic binary sequences with fixed 2- or 3-error linear complexity and obtain the counting function for the number of such sequences with fixed k-error linear complexity for k = 2 or 3.
2. We obtain partial results on the number of 2^n-periodic binary sequences with fixed k-error linear complexity when k is the minimum number of changes required to lower the linear complexity.
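As a concrete illustration of the linear complexity defined above (the length of the shortest LFSR generating a sequence), the sketch below computes it for a short binary sequence with the Berlekamp-Massey algorithm and, by brute force over all changes of up to k symbols, a naive k-error linear complexity of one period. This is an illustrative sketch rather than code from the thesis, and the brute-force search is only feasible for very short sequences.

from itertools import combinations

def linear_complexity(seq):
    # Berlekamp-Massey over GF(2): length of the shortest LFSR generating seq.
    n = len(seq)
    c = [0] * n
    b = [0] * n
    c[0] = b[0] = 1
    L, m = 0, -1
    for i in range(n):
        # Discrepancy between the current LFSR's prediction and the actual bit.
        d = seq[i]
        for j in range(1, L + 1):
            d ^= c[j] & seq[i - j]
        if d:
            t = c[:]
            for j in range(n - i + m):
                c[j + i - m] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L

def k_error_linear_complexity(seq, k):
    # Naive k-error linear complexity: minimum over all changes of up to k symbols.
    best = linear_complexity(seq)
    for e in range(1, k + 1):
        for positions in combinations(range(len(seq)), e):
            s = list(seq)
            for p in positions:
                s[p] ^= 1
            best = min(best, linear_complexity(s))
    return best

period = [1, 0, 1, 1, 0, 0, 1, 0]   # one period of a hypothetical binary sequence
print(linear_complexity(period))
print(k_error_linear_complexity(period, 2))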
|
3 |
Towards immunization of complex engineered systems: products, processes and organizations. Efatmaneshnik, Mahmoud, Mechanical & Manufacturing Engineering, Faculty of Engineering, UNSW, January 2009 (has links)
Engineering complex systems and New Product Development (NPD) are major challenges for contemporary engineering design and must be studied at the three levels of Products, Processes and Organizations (PPO). The science of complexity indicates that complex systems share a common characteristic: they are robust yet fragile. Complex and large scale systems are robust in the face of many uncertainties and variations; however, they can collapse when facing certain conditions. This is because complex systems embody many subtle, intricate and nonlinear interactions. If formal modelling exercises with available computational approaches cannot assist designers in arriving at accurate predictions, then how can we immunize our large scale and complex systems against sudden catastrophic collapse? This thesis is an investigation into complex product design. We tackle the issue first by introducing a template and design methodology for complex product design. This template is an integrated product design scheme which embodies and combines elements of both design theory and organization theory; in particular, distributed (spatial and temporal) problem solving and adaptive team formation are brought together. The design methodology harnesses emergence and innovation through the incorporation of a massive amount of numerical simulation, which characterizes the problem structure as well as the solution space. Within the context of this design methodology, three design methods based on measures of complexity are presented. Complexity measures generally reflect holistic structural characteristics of systems. At the levels of PPO, correspondingly, the Immunity Index (global modal robustness) as an objective function for solutions, the real complexity of decompositions, and the cognitive complexity of a design system are introduced. These three measures help immunize the complex PPO against chaos and catastrophic failure. Finally, a conceptual decision support system (DSS) for complex NPD, based on the presented design template and the complexity measures, is introduced. This support system (IMMUNE) is implemented as a Multi Agent Blackboard System and combines the character of distributed problem-solving environments with the centralized viewpoint on process monitoring. In other words, IMMUNE advocates autonomous problem-solving (design) agents, a necessary attribute of innovative design organizations and innovation networks, while at the same time promoting the coherence in the design system that is usually seen in centralized systems.
|
4 |
Independent component analysis for maternal-fetal electrocardiography. Marcynuk, Kathryn L., 09 January 2015 (has links)
Separating unknown signal mixtures into their constituent parts is a difficult problem in signal processing called blind source separation. One of the benchmark problems in this area is the extraction of the fetal heartbeat from an electrocardiogram (ECG) in which it is overshadowed by a strong maternal heartbeat. This thesis presents a study of a signal separation technique called independent component analysis (ICA) in order to assess its suitability for the maternal-fetal ECG separation problem. This includes an analysis of ICA on deterministic, stochastic, simulated, and recorded ECG signals. The experiments presented in this thesis demonstrate that ICA is effective on linear mixtures of known simulated or recorded ECGs. The performance of ICA was measured using visual comparison, heart rate extraction, and energy-based, information-theoretic, and fractal-based measures. ICA extraction of clinically recorded maternal-fetal ECG mixtures, in which the source signals were unknown, was successful at recovering the fetal heart rate.
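As a rough sketch of the kind of separation studied here, the snippet below mixes two synthetic source waveforms (stand-ins for a strong maternal and a weak fetal component) into two observed channels and recovers estimates of the sources with scikit-learn's FastICA. It is a toy illustration on assumed synthetic signals, not the thesis's experimental pipeline.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 4000)

# Synthetic stand-ins: a strong, slow "maternal" component and a weak, fast "fetal" one.
maternal = 1.0 * np.sin(2 * np.pi * 1.2 * t) ** 5
fetal = 0.2 * np.sin(2 * np.pi * 2.3 * t) ** 5
sources = np.c_[maternal, fetal] + 0.02 * rng.standard_normal((t.size, 2))

# Two abdominal "electrodes", each recording a different linear mixture of the sources.
mixing = np.array([[1.0, 0.5],
                   [0.7, 0.9]])
observed = sources @ mixing.T

# Blind source separation: ICA sees only the observed mixtures.
ica = FastICA(n_components=2, random_state=0)
estimated = ica.fit_transform(observed)   # columns are the estimated components
print(estimated.shape)                    # (4000, 2), recovered up to permutation and scaling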
|
5 |
Pareto multi-objective evolution of legged embodied organisms. Teo, Jason T. W., Information Technology & Electrical Engineering, Australian Defence Force Academy, UNSW, January 2003 (has links)
The automatic synthesis of embodied creatures through artificial evolution has become a key area of research in robotics, artificial life and the cognitive sciences. However, the research has mainly focused on genetic encodings and fitness functions. Considerably less has been said about the role of controllers and how they affect the evolution of morphologies and behaviors in artificial creatures. Furthermore, the evolutionary algorithms used to evolve the controllers and morphologies are predominantly based on a single objective or a weighted combination of multiple objectives, and a large majority of the behaviors evolved are for wheeled or abstract artifacts. In this thesis, we present a systematic study of evolving artificial neural network (ANN) controllers for the legged locomotion of embodied organisms. A virtual but physically accurate world is used to simulate the evolution of locomotion behavior in a quadruped creature. An algorithm using a self-adaptive Pareto multi-objective evolutionary optimization approach is developed. The experiments are designed to address five research aims investigating: (1) the search space characteristics associated with four classes of ANNs with different connectivity types, (2) the effect of selection pressure from a self-adaptive Pareto approach on the nature of the locomotion behavior and capacity (VC-dimension) of the ANN controller generated, (3) the efficiency of the proposed approach against more conventional methods of evolutionary optimization in terms of computational cost and quality of solutions, (4) a multi-objective approach towards the comparison of evolved creature complexities, and (5) the impact of relaxing certain morphological constraints on evolving locomotion controllers. The results showed that: (1) the search space is highly heterogeneous with both rugged and smooth landscape regions, (2) pure reactive controllers not requiring any hidden layer transformations were able to produce sufficiently good legged locomotion, (3) the proposed approach yielded competitive locomotion controllers while requiring significantly less computational cost, (4) multi-objectivity provided a practical and mathematically-founded methodology for comparing the complexities of evolved creatures, and (5) co-evolution of morphology and mind produced significantly different creature designs that were able to generate similarly good locomotion behaviors. These findings attest that a Pareto multi-objective paradigm can spawn highly beneficial robotics and virtual reality applications.
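As a small illustration of the Pareto idea underlying such an algorithm (kept deliberately generic, since the thesis's self-adaptive algorithm is far more involved), the sketch below extracts the non-dominated front from a set of candidate controllers scored on two objectives to be minimised, such as locomotion error and network size. The objective values are invented for the example.

def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly better in at least one (minimisation).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Return the non-dominated subset of a list of objective vectors.
    return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical (locomotion error, hidden-unit count) scores for candidate controllers.
candidates = [(0.40, 8), (0.35, 12), (0.55, 2), (0.42, 8), (0.30, 20), (0.60, 1)]
print(pareto_front(candidates))   # [(0.4, 8), (0.35, 12), (0.55, 2), (0.3, 20), (0.6, 1)]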
|
6 |
Antecipação de crises financeiras por meio de medidas de complexidade: evidências do Brasil / Complexity measures as crises early warning: evidence from Brazil. Mortoza, Leticia Pelluci Duarte, 11 October 2017 (has links)
The classical Economic Equilibrium has never been a reality, especially after the first financial market crises. It is now understood that economies are far from equilibrium: they are better seen as a process under construction than as a static state, and as a stochastic rather than a deterministic process, as was once thought. Brazil is a young country, and its economic and political systems are still maturing. In light of the changes and crises it has undergone in its recent history, this research seeks an alternative mechanism to detect and, to some extent, anticipate these crisis events, so that action can be taken in time to avoid massive financial losses. To this end, the LMC and SDL complexity measures are applied to the Dollar-Real exchange rate, Ibovespa, and Brazilian CDS time series and evaluated during crisis events. After detecting the main events in each series, the idea is to "turn back in time" to each crisis's inception and assess whether, given only the information available at that moment, the crisis could have been detected in its early stages. The research concludes that both the LMC and SDL complexity measures are robust in detecting volatility increases in financial series and therefore show strong potential as early warnings of financial crises. Moreover, no extensive computation, long data histories, or assumptions about the data's probability distributions are required for this purpose. This is believed to be the first step towards building a real-time crisis monitor.
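A minimal sketch of how the two measures could be computed over a rolling window of a return series is given below, using the standard definitions: LMC as the product of normalised Shannon entropy and disequilibrium, and SDL as order times disorder, Δ(1 − Δ), with Δ the normalised entropy. The fixed-bin histogram, window length and synthetic data are illustrative assumptions, not the settings used in the thesis.

import numpy as np

def complexity_measures(x, bins=16):
    # LMC and SDL complexity of one data window, from a histogram estimate of its PDF.
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    nz = p[p > 0]
    H = -np.sum(nz * np.log(nz))          # Shannon entropy
    H_norm = H / np.log(bins)             # normalised entropy, Delta = S / S_max
    D = np.sum((p - 1.0 / bins) ** 2)     # disequilibrium: distance from the uniform PDF
    lmc = H_norm * D                      # LMC: entropy x disequilibrium
    sdl = H_norm * (1.0 - H_norm)         # SDL: order x disorder
    return lmc, sdl

# Illustrative synthetic "returns" with a burst of volatility in the middle.
rng = np.random.default_rng(1)
returns = np.concatenate([rng.normal(0, 0.01, 500),
                          rng.normal(0, 0.05, 100),    # crisis-like volatility spike
                          rng.normal(0, 0.01, 500)])

window = 120
values = [complexity_measures(returns[i - window:i]) for i in range(window, len(returns))]
lmc, sdl = np.array(values).T
print(lmc.max(), sdl.max())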
|
7 |
Application of Complexity Measures to Stratospheric Dynamics. Krützmann, Nikolai Christian, January 2008 (has links)
This thesis examines the utility of mathematical complexity measures for the analysis of stratospheric dynamics. Through theoretical considerations and tests with artificial data sets, e.g., the iteration of the logistic map, suitable parameters are determined for the application of the statistical entropy measures sample entropy (SE) and Rényi entropy (RE) to methane (a long-lived stratospheric tracer) data from simulations of the SOCOL chemistry-climate model.
The SE is shown to be useful for quantifying the variability of recurring patterns in a time series and is able to identify tropical patterns similar to those reported by previous studies of the "tropical pipe" region. However, the SE is found to be unsuitable for use in polar regions, due to the non-stationarity of the methane data at extra-tropical latitudes. It is concluded that the SE cannot be used to analyse climate complexity on a global scale.
The focus is turned to the RE, which is a complexity measure of probability distribution functions (PDFs). Using the second order RE and a normalisation factor, zonal PDFs of ten consecutive days of methane data are created with a Bayesian optimal binning technique. From these, the RE is calculated for every day (moving 10-day window). The results indicate that the RE is a promising tool for identifying stratospheric mixing barriers. In Southern Hemisphere winter and early spring, RE produces patterns similar to those found in other studies of stratospheric mixing. High values of RE are found to be indicative of the strong fluctuations in tracer distributions associated with relatively unmixed air in general, and with gradients in the vicinity of mixing barriers, in particular. Lower values suggest more thoroughly mixed air masses.
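For illustration, the sketch below computes a normalised second-order Rényi entropy from binned tracer values in a moving window, in the spirit of the procedure described above. It uses a fixed number of histogram bins rather than the Bayesian optimal binning of the thesis, and the synthetic "tracer" field is purely an assumption for the example.

import numpy as np

def renyi_entropy_2(values, bins=32):
    # Normalised second-order Rényi entropy: H2 = -log(sum p_i^2) / log(bins).
    counts, _ = np.histogram(values, bins=bins)
    p = counts / counts.sum()
    return -np.log(np.sum(p ** 2)) / np.log(bins)

# Synthetic daily "tracer" field (days x grid points); one entropy value per day
# from a moving 10-day window, mirroring the windowing described above.
rng = np.random.default_rng(2)
tracer = np.cumsum(rng.standard_normal((365, 180)), axis=0)

window = 10
re2 = np.array([renyi_entropy_2(tracer[d - window:d].ravel())
                for d in range(window, tracer.shape[0])])
print(re2.shape, re2.min(), re2.max())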
The analysis is extended to eleven years of model data. Realistic inter-annual variability of some of the RE structures is observed, particularly in the Southern Hemisphere. By calculating a climatological mean of the RE for this period, additional mixing patterns are identified in the Northern Hemisphere. The validity of the RE analysis and its interpretation is underlined by showing that qualitatively similar patterns can be seen when using observational satellite data of a different tracer. Compared to previous techniques, the RE has the advantage that it requires significantly less computational effort, as it can be used to derive dynamical information from model or measurement tracer data without relying on any additional input such as wind fields.
The results presented in this thesis strongly suggest that the RE is a useful new metric for analysing stratospheric mixing and its variability from climate model data. Furthermore, it is shown that the RE measure is very robust with respect to data gaps, which makes it ideal for application to observations. Hence, using the RE for comparing observations of tracer distributions with those from model simulations potentially presents a novel approach for analysing mixing in the stratosphere.
|