1

Advancing egress complexity in support of rule-based evacuation modelling

Livesey, Gillian Elizabeth January 2003 (has links)
No description available.
2

Evaluation of E-Participation Efficiency with Biodiversity Measures - the Case of the Digital Agenda Vienna

May, John, Leo, Hannes, Taudes, Alfred January 2015 (has links) (PDF)
We introduce the Effective Number of Issues (ENI) measure for e-participation efficiency. This novel index is based on the Shannon entropy measure of biodiversity and summarizes the amount of information gained through an e-participation project in one number. This makes the comparison between different e-participation projects straightforward and lays the foundation for rigorous, data-driven analysis of the success factors of e-participation projects. After providing the formula and rationale for the new measure, we use the ENI index to benchmark the idea generation process for the Digital Agenda Vienna against other projects. It turns out that the efficiency of this project is significantly higher than that observed for the other cases. We conjecture that this can be attributed to the user-friendly design of the software platform and the effective communication strategy of the process management. Finally, suggestions for further research are given. (authors' abstract)
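The abstract does not reproduce the formula, but the standard way to turn a Shannon entropy into an "effective number" in biodiversity work is to exponentiate it (the Hill number of order 1). A minimal sketch under that assumption, with made-up distributions of contributions across issues:

```python
import math

def effective_number_of_issues(counts):
    """Exponential of the Shannon entropy of the issue distribution.

    counts[i] is the number of contributions assigned to issue i.
    Ranges from 1 (everything on one issue) to len(counts) (perfectly even spread).
    """
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    shannon_entropy = -sum(p * math.log(p) for p in probs)  # natural log
    return math.exp(shannon_entropy)

# Hypothetical projects with the same number of issues but different spreads:
print(effective_number_of_issues([40, 30, 20, 10]))  # ~3.60
print(effective_number_of_issues([97, 1, 1, 1]))     # ~1.18
```

This is an illustrative sketch of the exp-entropy construction, not the exact index definition used in the paper.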
3

Investigation and improvement of criticality calculations in MCNP5 involving Shannon entropy convergence

Koch, David 08 June 2015 (has links)
Criticality calculations are often performed in MCNP5 using the Shannon entropy as an indicator of source convergence for the given neutron transport problem. The Shannon entropy, a concept from information theory, is calculated for each batch in MCNP5, and it has been shown to converge to a single value as the source distribution converges. MCNP5 has its own criteria for deciding when the Shannon entropy has converged and recommends how many batches should be skipped; however, this recommendation is often not very accurate and has room for improvement. This work investigates an approach that uses the Shannon entropy convergence information obtained from a shorter simulation to predict, with the desired statistical precision, the number of generations to skip in the reference case. In several test cases, it has been found that running fewer particles per batch produces a Shannon entropy graph similar to that obtained with more particles per batch. Then, by appropriate adjustment through a synthetic model, one can determine when the Shannon entropy will converge by running fewer particles, find the point where it converges, and use this value to determine how many batches should be skipped for a given problem. This reduces computational time and the "guessing" involved when deciding how many batches to skip. Thus, the purpose of this research is to develop a model showing how this concept can be used and to produce a streamlined approach for applying it to a criticality problem.
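For illustration, the diagnostic described here is essentially the Shannon entropy of the fission source sites binned on a spatial mesh, tracked batch by batch. A minimal sketch of that bookkeeping follows; the mesh, the site format, and the stability criterion are assumptions for illustration, not MCNP5's internals:

```python
import math

def source_entropy(site_bins, num_bins):
    """Shannon entropy (bits) of fission source sites binned on a spatial mesh.

    site_bins: list of mesh-bin indices, one per source site in the batch.
    """
    counts = [0] * num_bins
    for b in site_bins:
        counts[b] += 1
    total = len(site_bins)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def batches_to_skip(entropies, window=10, tol=0.01):
    """Crude convergence check (assumed criterion): first batch after which the
    entropy stays within +/- tol of its trailing-window mean."""
    for i in range(window, len(entropies)):
        mean = sum(entropies[i - window:i]) / window
        if all(abs(h - mean) <= tol for h in entropies[i - window:i]):
            return i - window
    return None  # not converged within the run
```

In the spirit of the abstract, `batches_to_skip` would be run on the entropy trace of a cheap low-particle simulation and the result carried over, after adjustment, to the full-size run.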
4

Information Theoretical Measures for Achieving Robust Learning Machines

Zegers, Pablo, Frieden, B., Alarcón, Carlos, Fuentes, Alexis 12 August 2016 (has links)
Information theoretical measures are used to design, from first principles, an objective function that can drive a learning machine process to a solution that is robust to perturbations in parameters. Full analytic derivations are given and tested with computational examples showing that the procedure is indeed successful. The final solution, implemented by a robust learning machine, expresses a balance between Shannon differential entropy and Fisher information. It is also surprising that this is an analytical relation, given the purely numerical operations of the learning machine.
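The paper's objective function is not reproduced in the abstract, but the entropy-Fisher trade-off it mentions can already be seen in the Gaussian case: widening the distribution raises the Shannon differential entropy while lowering the Fisher information about the location parameter. A small numeric illustration (not the paper's objective):

```python
import math

def gaussian_differential_entropy(sigma):
    """Shannon differential entropy of N(mu, sigma^2) in nats: 0.5*ln(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def gaussian_fisher_information(sigma):
    """Fisher information about the mean of N(mu, sigma^2): 1/sigma^2."""
    return 1.0 / sigma ** 2

for sigma in (0.5, 1.0, 2.0):
    print(sigma, gaussian_differential_entropy(sigma), gaussian_fisher_information(sigma))
# Entropy increases with sigma while Fisher information decreases, so an
# objective that balances the two selects an intermediate width.
```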
5

Bayesian estimation of Shannon entropy for bivariate beta priors

Bodvin, Joanna Sylvia Liesbeth 10 July 2010 (has links)
Having just survived what is arguably the worst financial crisis in recent times, it is expected that the focus on regulatory capital held by financial institutions such as banks will increase significantly over the next few years. The probability of default is an important determinant of the amount of regulatory capital to be held, and the accurate calibration of this measure is vital. The purpose of this study is to propose the use of the Shannon entropy when determining the parameters of the prior bivariate beta distribution as part of a Bayesian calibration methodology. Various bivariate beta distributions will be considered as priors to the multinomial distribution associated with rating categories, and the appropriateness of these bivariate beta distributions will be tested on default data. The formulae derived for the Bayesian estimation of the Shannon entropy will be used to measure the certainty obtained when selecting the prior parameters. / Dissertation (MSc)--University of Pretoria, 2010. / Statistics / unrestricted
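The dissertation works with bivariate beta priors; as a simpler illustration of the kind of quantity involved, the Shannon (differential) entropy of a univariate Beta(a, b) has the closed form H = ln B(a,b) - (a-1)ψ(a) - (b-1)ψ(b) + (a+b-2)ψ(a+b), which can be checked against SciPy. This univariate sketch is an assumption-free sanity check, not the bivariate formulae derived in the study:

```python
from scipy.special import betaln, digamma
from scipy.stats import beta

def beta_shannon_entropy(a, b):
    """Differential entropy of Beta(a, b) via the closed-form expression."""
    return (betaln(a, b)
            - (a - 1) * digamma(a)
            - (b - 1) * digamma(b)
            + (a + b - 2) * digamma(a + b))

for a, b in [(2.0, 5.0), (0.5, 0.5), (10.0, 10.0)]:
    # The closed form and SciPy's numerical value should agree.
    print(beta_shannon_entropy(a, b), beta(a, b).entropy())
```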
6

The Effective Spin Concept to Study the Properties of the Shannon Entropy of Arrays of Elastic Scatterers

Liu, Wei 19 April 2012 (has links)
No description available.
7

Security of Lightweight Cryptographic Primitives

Vennos, Amy Demetra Geae 10 June 2021 (has links)
Internet-of-Things (IoT) devices are increasing in popularity due to their ability to help automate many aspects of daily life while performing these necessary duties on billions of low-power appliances. However, the perks of these small devices also come with additional constraints to security. Security has always been an issue, with the rise of cryptographic backdoors and hackers reverse engineering the security protocols within devices to reveal the original state that was encrypted. Security researchers have done much work to prevent attacks with high-power algorithms, such as the international effort to develop the current Advanced Encryption Standard (AES). Unfortunately, IoT devices do not typically have the computational resources to implement high-power algorithms such as AES, and must rely on lightweight primitives such as pseudorandom number generators, or PRNGs. This thesis explores the effectiveness, functionality, and use of PRNGs in different applications. First, this thesis investigates the confidentiality of a single-stage residue number system (RNS) PRNG, which has previously been shown to provide extremely high quality outputs for simulation and digital communication applications when evaluated through traditional techniques, such as the battery of statistical tests used in the NIST Random Number Generation and DIEHARD test suites, or through Shannon entropy metrics. In contrast, rather than blindly performing statistical analyses on the outputs of the single-stage RNS PRNG, this thesis provides both white box and black box analyses that facilitate reverse engineering of the underlying RNS number generation algorithm to obtain the residues, or equivalently the key, of the RNS algorithm. This thesis develops and demonstrates a conditional entropy analysis that permits extraction of the key given a priori knowledge of state transitions, as well as reverse engineering of the RNS PRNG algorithm and parameters (but not the key) in problems where the multiplicative RNS characteristic is too large to obtain a priori state transitions. This thesis then discusses multiple defenses and perturbations for the RNS system that defeat the original attack algorithm, including deliberate noise injection and code hopping. We present a modification to the algorithm that accounts for deliberate noise but rapidly increases the search space and complexity. Lastly, a comparison of the memory and time required for the attacker and defender to maintain these defenses is presented. The next application of PRNGs is in building a translation from binary PRNG outputs to non-binary uses like card shuffling in a casino. This thesis explores a shuffler algorithm that utilizes RNS in Fisher-Yates shuffles and that accepts inputs from any PRNG. Entropy is lost through this algorithm by the use of a PRNG in lieu of a TRNG and by its RNS component: a surjective mapping from a large domain of size $2^J$ to a substantially smaller set of arbitrary size $n$. Previous research on the specific RNS mapping process had developed a lower bound on the Shannon entropy loss from such a mapping, but this bound eliminates the mixed-radix component of the original formulation. This thesis derives a more precise formula which takes into account the radix, $n$. This formulation is later used to specify the optimal parameters for simulating the shuffler with different test PRNGs.
After implementing the shuffler with PRNGs of varying output entropies, the thesis examines the output value frequencies to discuss whether utilizing a PRNG is a feasible alternative for casinos to the higher-cost TRNG. / Master of Science / Cryptography, or the encrypting of data, has drawn widespread interest for years, initially sparking public concern through headlines and dramatized reenactments of hackers targeting security protocols. Previous cryptographic research commonly focused on developing the quickest, most secure ways to encrypt information on high-power computers. However, as wireless low-power devices such as smart home devices, security sensors, and learning thermostats gain popularity in ordinary life, interest is rising in protecting information sent between devices that do not necessarily have the same power and capabilities as those in a government facility. Lightweight primitives, the algorithms used to encrypt information between low-power devices, are one solution to this concern, though they are more susceptible to attackers who wish to reverse engineer the encrypting process. The pseudorandom number generator (PRNG) is a type of lightweight primitive that generates numbers that are essentially random even though it is possible to determine the input value, or seed, from the resulting output values. This thesis explores the effectiveness and functionality of PRNGs in different applications. First, this thesis explores a PRNG that has passed many statistical tests to prove its output values are random enough for certain applications. This project analyzes the quality of this PRNG through a new lens: its resistance to reverse engineering attacks. The thesis describes and implements an attack on the PRNG that allows an individual to reverse engineer the initial seed. The thesis then changes perspective from attacker to designer and develops defenses to this attack: by slightly modifying the algorithm, the designer can ensure that the reverse engineering process is so complex, time-consuming, and memory-intensive that implementing such an attack would be impractical for an attacker. The next application of PRNGs is in the casino industry, in which low-power and cost-effective automatic card shufflers for games like poker are becoming popular. This thesis explores a solution for optimal shuffling of a deck of cards.
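The thesis' exact bound is not reproduced in the abstract. As a sketch of the effect it quantifies, reducing a uniform J-bit value modulo n makes some of the n outputs slightly more likely than others, and the resulting Shannon entropy loss can be computed exactly for small parameters. The mod-n map here is an illustrative stand-in for the RNS mapping, not the thesis' formulation:

```python
import math

def mod_reduction_entropy_loss(J, n):
    """Entropy loss (bits) when a uniform value in [0, 2^J) is reduced mod n.

    r = 2^J mod n outputs receive ceil(2^J / n) preimages and the other n - r
    receive floor(2^J / n), so the output is only approximately uniform.
    The loss is measured relative to an ideal uniform n-ary symbol, log2(n).
    """
    domain = 2 ** J
    assert domain >= n, "illustration assumes the PRNG domain is at least n"
    q, r = divmod(domain, n)
    p_hi, p_lo = (q + 1) / domain, q / domain
    h_out = -(r * p_hi * math.log2(p_hi) + (n - r) * p_lo * math.log2(p_lo))
    return math.log2(n) - h_out

# e.g. a hypothetical 16-bit PRNG word mapped onto one of 52 cards:
print(mod_reduction_entropy_loss(16, 52))
```

The loss shrinks as J grows relative to log2(n), which is the intuition behind choosing shuffler parameters so that the PRNG word is much wider than the card index.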
8

On the assessment of manufacturing systems complexity / Εκτίμηση πολυπλοκότητας συστημάτων παραγωγής

Ευθυμίου, Κωνσταντίνος 12 October 2013 (has links)
The objective of the present study is the development of methods for assessing manufacturing systems complexity and the investigation of the relationship between flexibility and complexity. Towards this target, a complete approach based on information theory is proposed, permitting the analytical, quantitative and systematic modeling and quantification of both static and dynamic manufacturing complexity. Static complexity concerns the structure of the manufacturing system, namely the products, processes and resources that constitute the system, as well as their interconnections. Static complexity is treated as the information required to describe a manufacturing system. Multi-domain matrices modeling the relationships between products, processes and resources are formalized as networks following the notions of graph theory. The information content of each matrix is assessed employing the Shannon entropy measure, and the aggregation over all matrices yields the static complexity. Dynamic complexity is related to the uncertainty in the behaviour of a manufacturing system and in the present study is associated with the unpredictability of the performance indicator time series. The unpredictability of the performance indicator time series, which are provided by computer simulation, is captured employing the Lempel-Ziv algorithm, which estimates the Kolmogorov complexity. The dynamic complexity is either the unpredictability of a specific time series or the weighted mean over a set of performance indicator time series produced under different product demand scenarios. The relationship between flexibility and complexity is investigated for a group of 19 different configurations of a manufacturing system. In particular, operation flexibility, which refers to the system's ability to produce a set of products through different machines, materials, operations and sequences of operations, and total complexity, both static and dynamic, are examined employing a utility function. As a case study, two assembly lines producing three car floor model types at three different product mixes are investigated. The dynamic complexity of each assembly line is assessed, and the relationship between product mix and dynamic complexity is studied. The evaluation of the case study revealed the efficiency of the suggested approach and validated its applicability to industrial environments.
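As an illustration of the dynamic-complexity side, a common way to apply the Lempel-Ziv idea to a performance-indicator time series is to binarize it about its median and count the phrases of a Lempel-Ziv (1976) parse, normalized so that a random sequence scores near 1. The binarization rule and normalization below are assumptions for illustration, not necessarily the thesis' exact scheme:

```python
import math
import random

def lz76_phrase_count(s: str) -> int:
    """Number of phrases in a Lempel-Ziv (1976) parse of the symbol string s."""
    n, i, phrases = len(s), 0, 0
    while i < n:
        l = 1
        # grow the current phrase while it already occurs in the history seen so far
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        phrases += 1
        i += l
    return phrases

def dynamic_complexity(series):
    """Binarize a performance-indicator series about its (rough) median and return
    the LZ76 phrase count normalized by the random-sequence rate n / log2(n)."""
    median = sorted(series)[len(series) // 2]
    symbols = "".join("1" if x >= median else "0" for x in series)
    n = len(symbols)
    return lz76_phrase_count(symbols) * math.log2(n) / n

random.seed(0)
print(dynamic_complexity([0, 0, 1, 1] * 100))                    # regular: low score
print(dynamic_complexity([random.random() for _ in range(400)])) # irregular: close to 1
```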
9

Acoustic emission monitoring of damage progression in fiber reinforced polymer rods

Shateri, Mohammadhadi 09 March 2017 (has links)
Fiber reinforced polymer (FRP) bars have been widely used in pre-stressing applications and in reinforcing civil structures. Their high strength-to-weight ratio and high resistance to corrosion make FRP bars a good replacement for steel reinforcing bars in civil engineering applications. According to the CAN/CSA-S806-12 standard, the maximum recommended stress in FRP bars under service loads should not exceed 25% and 65% of the ultimate strength for glass FRP (GFRP) and carbon FRP (CFRP), respectively. These stress limits are set to prevent creep failure in FRP bars. However, for in-service applications, there are few physical indicators that these values have been reached or exceeded. In this work, analysis of acoustic emission (AE) signals is used. Two new techniques, based on pattern recognition and on the frequency entropy of the isolated AE signal, are presented for monitoring damage progression and predicting failure in FRPs. / May 2017
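The abstract does not define the frequency-entropy measure; a common construction, used here purely as an assumption, is the Shannon entropy of the normalized power spectrum of each isolated AE burst, which rises as the bursts become broadband. A minimal sketch with a synthetic burst and noise:

```python
import numpy as np

def frequency_entropy(signal, normalize=True):
    """Shannon entropy (bits) of the normalized power spectrum of one AE burst.

    A narrowband burst gives a low value; a broadband burst approaches
    log2(number of frequency bins); normalize scales the result to [0, 1].
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    p = spectrum / spectrum.sum()
    p = p[p > 0]
    h = -np.sum(p * np.log2(p))
    return h / np.log2(len(spectrum)) if normalize else h

# Illustrative signals: a decaying 150 kHz tone vs. broadband noise,
# sampled at an assumed 1 MHz.
t = np.arange(1024) / 1e6
tone = np.exp(-t * 5e3) * np.sin(2 * np.pi * 150e3 * t)
noise = np.random.default_rng(0).normal(size=1024)
print(frequency_entropy(tone), frequency_entropy(noise))  # the tone scores lower
```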
10

Entropies and predictability of variability indices of the tropical Pacific

Tánchez, Luis Eduardo Ortiz 05 October 2004 (has links)
This doctoral thesis is concerned with the predictability and the temporal structure of indices of climatic variability in the tropical Pacific, known on the decadal scale as the El Niño-Southern Oscillation (ENSO). For this purpose, time series of the anomalies and persistences of the Southern Oscillation Index (SOI), the Multivariate ENSO Index (MEI) and the Sea Surface Temperature (SST) were investigated. Methods based on dynamical and conditional Shannon entropies were applied to investigate the predictability of symbolic sequences derived from the time series. The investigation of the conditional entropies for symbolic sequences shows that the most predictable events of ENSO occur after constant short sequences. Time correlations are found for several events; these determine the predictability of the symbol following a subsequence as a function of the subsequence's length. The evolution after short sequences representing transitions between ENSO states is comparatively less predictable. The most predictable short sequences have been studied in detail. It was further found that, in most cases, SST is the most reliable information source. The analysis of the wavelet spectra of the time series shows strong periodicities of 2 to 4 years, which appear in ENSO between 1900 and 1960 and between 1970 and 2000. There is evidence that these frequency components cannot be attributed to a fitted first-order Markov process. Furthermore, the SST anomalies show a shift of frequency components towards shorter periods.
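As a sketch of the conditional-entropy machinery described above: symbolize an index series into a small alphabet and measure how uncertain the next symbol is after each length-k subsequence. The three-symbol alphabet, thresholds, and synthetic index values below are assumptions for illustration, not the thesis' exact symbolization scheme:

```python
import math
import random
from collections import Counter, defaultdict

def symbolize(series, low, high):
    """Map an index series to symbols: '-' below low, '+' above high, '0' otherwise."""
    return "".join("-" if x < low else "+" if x > high else "0" for x in series)

def conditional_entropies(symbols, k):
    """Shannon entropy (bits) of the next symbol after each observed length-k context."""
    contexts = defaultdict(Counter)
    for i in range(len(symbols) - k):
        contexts[symbols[i:i + k]][symbols[i + k]] += 1
    result = {}
    for ctx, counts in contexts.items():
        total = sum(counts.values())
        result[ctx] = -sum(c / total * math.log2(c / total) for c in counts.values())
    return result

# Hypothetical monthly index values: a seasonal cycle plus noise.
random.seed(1)
cycle = [0.3, 1.2, 1.5, 0.8, -0.2, -1.1, -1.4, -0.6, 0.1, 0.9, 1.3, 0.7]
values = [x + random.gauss(0, 0.4) for _ in range(20) for x in cycle]

# Low entropy after a context means the next symbol is highly predictable there.
for ctx, h in sorted(conditional_entropies(symbolize(values, -1.0, 1.0), k=2).items()):
    print(ctx, round(h, 3))
```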
