101

The control and transformation metric: a basis for measuring model complexity

Wallace, Jack C. January 1985 (has links)
The purpose of this report is to develop a complexity metric suitable for discrete event simulation model representations. Current software metrics, based upon graphical analysis or static program characteristics, do not capture the influence on complexity stemming from the inherent dynamics of a model. A study of extant software metrics provides a basis for identifying desirable properties for model application. Various approaches are examined, and a set of characteristics for a model complexity metric is defined. A metric evolves from the recognition of the two types of complexity: transformation and control, both of which appear prominently in model representations. Experimental data are presented to verify that the Control and Transformation (CAT) metric reflects the desired behavior. Experimental evaluation supports the claim that the CAT metric is an improvement over existing software metrics for measuring the complexity of model representations. / Master of Science
102

Rijndael Circuit Level Cryptanalysis

Pehlivanoglu, Serdar 05 May 2005 (has links)
The Rijndael cipher was chosen as the Advanced Encryption Standard (AES) in October 2000. Its internal structure exhibits unusual properties, such as a clean and simple algebraic description for the S-box. In this research, we construct a scalable family of ciphers which behave very much like the original Rijndael. This approach gives us the opportunity to use computational complexity theory. In the main result, we generate a candidate one-way function family from the scalable Rijndael family. We note that, although reduction to one-way functions is a common theme in the theory of public-key cryptography, it is rare to have such a defense of security in the private-key theatre. In this thesis a plan of attack is introduced at the circuit level whose aim is not to break the cryptosystem in any practical way, but simply to break the very bold Rijndael security claim. To achieve this goal, we are led to a formal understanding of the Rijndael security claim, juxtaposing it with rigorous security treatments. Several of the questions that arise in this regard are as follows: "Do invertible functions represented by circuits with very small numbers of gates have better-than-worst-case implementations for their inverses?" and "How many plaintext/ciphertext pairs are needed to uniquely determine the Rijndael key?"
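As a reference point for the "clean and simple algebraic description for the S-box" mentioned in the abstract, the sketch below reproduces the standard published definition of the AES/Rijndael S-box: the multiplicative inverse in GF(2^8) modulo the AES polynomial x^8 + x^4 + x^3 + x + 1, followed by a fixed affine map. This is textbook material, not code from the thesis; the exhaustive-search inverse is used only to keep the sketch short.

```python
# Sketch: algebraic definition of the AES/Rijndael S-box.
# S(x) = affine(x^{-1} in GF(2^8)), with 0 mapped to 0 before the affine step.

def gf_mul(a: int, b: int) -> int:
    """Multiply two bytes in GF(2^8) modulo the AES polynomial 0x11B."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B          # reduce by x^8 + x^4 + x^3 + x + 1
        b >>= 1
    return p

def gf_inv(x: int) -> int:
    """Multiplicative inverse in GF(2^8) by exhaustive search (0 -> 0)."""
    if x == 0:
        return 0
    return next(y for y in range(1, 256) if gf_mul(x, y) == 1)

def sbox(x: int) -> int:
    """AES S-box: field inverse followed by the affine transformation."""
    b = gf_inv(x)
    c = 0x63                   # affine constant
    out = 0
    for i in range(8):
        bit = ((b >> i) ^ (b >> ((i + 4) % 8)) ^ (b >> ((i + 5) % 8)) ^
               (b >> ((i + 6) % 8)) ^ (b >> ((i + 7) % 8)) ^ (c >> i)) & 1
        out |= bit << i
    return out

assert sbox(0x00) == 0x63 and sbox(0x01) == 0x7C   # known S-box values
```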
103

A generic predictive information system for resource planning and optimisation

Tavakoli, Siamak January 2010 (has links)
The purpose of this research work is to demonstrate the feasibility of creating a quick-response decision platform for middle management in industry. It utilises the strengths of current Supervisory Control and Data Acquisition (SCADA) systems and Discrete Event Simulation and Modelling (DESM), but more importantly creates a leap forward in their theory and practice. The proposed research platform uses real-time data and creates an automatic platform for real-time and predictive system analysis, giving current and ahead-of-time information on the performance of the system in an efficient manner. Data acquisition, as the back-end connection of a data integration system to the shop floor, faces both hardware and software challenges in coping with large-scale real-time data collection. The limited scope of SCADA systems makes them unsuitable candidates for this, and issues with the cost-effectiveness, complexity, and efficiency-orientation of proprietary solutions leave space for further challenges. A Flexible Data Input Layer Architecture (FDILA) is proposed as a generic data integration platform so that a multitude of data sources can be connected to the data processing unit. The efficiency of the proposed integration architecture lies in decentralising and distributing services between different layers. A novel Sensitivity Analysis (SA) method called EvenTracker is proposed as an effective tool to measure the importance and priority of inputs to the system. The EvenTracker method is introduced to deal with complex systems in real time. The approach takes advantage of an event-based definition of the data involved in the process flow. The underpinning logic behind the EvenTracker SA method is capturing the cause-effect relationships between triggers (input variables) and events (output variables) over a period of time specified by an expert. The approach does not require estimating data distributions of any kind, nor does the performance model require execution beyond real time. The proposed EvenTracker sensitivity analysis method has the lowest computational complexity compared with other popular sensitivity analysis methods. For proof of concept, a three-tier data integration system was designed and developed using National Instruments' LabVIEW programming language, Rockwell Automation's Arena simulation and modelling software, and OPC data communication software. A laboratory-based conveyor system with 29 sensors was installed to simulate a typical shop-floor production line. In addition, the EvenTracker SA method was applied to data extracted from 28 sensors of one manufacturing line in a real factory. The experiment found 14% of the input variables to be unimportant for evaluating the model outputs, and the method yielded a time-efficiency gain of 52% in analysing the filtered system once the unimportant input variables were no longer sampled. Compared with the entropy-based SA technique, the only other method that can be used for real-time purposes, the EvenTracker SA method is quicker, more accurate, and less computationally burdensome. Additionally, a theoretical estimation of the computational complexity of SA methods, based on both structural complexity and energy-time analysis, favoured the efficiency of the proposed EvenTracker SA method. Both the laboratory and factory-based experiments demonstrated the flexibility and efficiency of the proposed solution.
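The actual EvenTracker algorithm is defined in the thesis itself; the toy sketch below only illustrates the general idea as described in the abstract, namely scoring input "triggers" by how often an output "event" follows within an expert-chosen time window. The function names, data layout, and scoring rule are illustrative assumptions, not the thesis's method.

```python
# Hypothetical illustration only: rank input "triggers" by how often an output
# "event" follows within a fixed time window. This is NOT the EvenTracker
# algorithm from the thesis, just a minimal reading of the abstract's idea.

def rank_triggers(trigger_times: dict[str, list[float]],
                  event_times: list[float],
                  window: float) -> list[tuple[str, float]]:
    """Score each input by the fraction of its triggers followed by an event
    within `window` time units; a higher score suggests a more important input."""
    events = sorted(event_times)

    def followed_by_event(t: float) -> bool:
        return any(t < e <= t + window for e in events)

    scores = {
        name: sum(followed_by_event(t) for t in times) / len(times)
        for name, times in trigger_times.items() if times
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example with made-up sensor data:
ranking = rank_triggers(
    {"sensor_a": [1.0, 5.0, 9.0], "sensor_b": [2.0, 6.0]},
    event_times=[1.5, 9.4],
    window=1.0,
)
print(ranking)   # sensor_a scores 2/3, sensor_b scores 0.0
```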
104

Convergence rates of adaptive algorithms for deterministic and stochastic differential equations

Moon, Kyoung-Sook January 2001 (has links)
No description available.
105

A model-independent theory of computational complexity : from patience to precision and beyond

Blakey, Edward William January 2010 (has links)
The field of computational complexity theory--which chiefly aims to quantify the difficulty encountered when performing calculations--is, in the case of conventional computers, correctly practised and well understood (some important and fundamental open questions notwithstanding); however, such understanding is, we argue, lacking when unconventional paradigms are considered. As an illustration, we present here an analogue computer that performs the task of natural-number factorization using only polynomial time and space; the system's true, exponential complexity, which arises from requirements concerning precision, is overlooked by a traditional, 'time-and-space' approach to complexity theory. Hence, we formulate the thesis that unconventional computers warrant unconventional complexity analysis; the crucial omission from traditional analysis, we suggest, is consideration of relevant resources, these being not only time and space, but also precision, energy, etc. In the presence of this multitude of resources, however, the task of comparing computers' efficiency (formerly a case merely of comparing time complexity) becomes difficult. We resolve this by introducing a notion of overall complexity, though this transpires to be incompatible with an unrestricted formulation of resource; accordingly, we define normality of resource, and stipulate that considered resources be normal, so as to rectify certain undesirable complexity behaviour. Our concept of overall complexity induces corresponding complexity classes, and we prove theorems concerning, for example, the inclusions therebetween. Our notions of resource, overall complexity, normality, etc. form a model-independent framework of computational complexity theory, which allows: insightful complexity analysis of unconventional computers; comparison of large, model-heterogeneous sets of computers, and correspondingly improved bounds upon the complexity of problems; assessment of novel, unconventional systems against existing, Turing-machine benchmarks; increased confidence in the difficulty of problems; etc. We apply notions of the framework to existing disputes in the literature, and consider in the context of the framework various fundamental questions concerning the nature of computation.
106

Fractional Brownian motion and dynamic approach to complexity.

Cakir, Rasit 08 1900 (has links)
The dynamic approach to fractional Brownian motion (FBM) establishes a link between non-Poisson renewal processes with abrupt jumps resetting the system's memory to zero and correlated dynamic processes, whose individual trajectories keep a non-vanishing memory of their past time evolution. It is well known that the recrossing times of the origin by an ordinary 1D diffusion trajectory generate a distribution of time distances between two consecutive origin recrossings following an inverse power law with index m = 1.5. However, with theoretical and numerical arguments, it is proved that this is a special case of a more general condition, insofar as the recrossing times produced by dynamic FBM generate a process with m = 2 - H. Later, the model of ballistic deposition is studied, as a simple way to establish cooperation among the columns of a growing surface, to show that cooperation generates memory properties and, at the same time, non-Poisson renewal events. Finally, the connection between trajectory and density memory is discussed, showing that trajectory memory does not necessarily yield density memory, and that density memory might be compatible with the existence of abrupt jumps resetting the system's memory to zero.
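A minimal numerical sketch of the recrossing-time statistics discussed above: generate FBM as the cumulative sum of fractional Gaussian noise (here via a straightforward Cholesky factorisation of the fGn covariance), collect the waiting times between consecutive origin recrossings, and compare their tail with the predicted power-law index m = 2 - H. The generator, sample sizes, and crude tail estimator are illustrative assumptions, not the procedure used in the dissertation.

```python
import numpy as np

def fbm(n: int, hurst: float, rng: np.random.Generator) -> np.ndarray:
    """Fractional Brownian motion of length n via Cholesky factorisation
    of the fractional Gaussian noise covariance (O(n^2) memory, small n only)."""
    k = np.arange(n)
    # fGn autocovariance: 0.5 * (|k+1|^2H - 2|k|^2H + |k-1|^2H)
    gamma = 0.5 * (np.abs(k + 1) ** (2 * hurst)
                   - 2 * np.abs(k) ** (2 * hurst)
                   + np.abs(k - 1) ** (2 * hurst))
    cov = gamma[np.abs(k[:, None] - k[None, :])]
    noise = np.linalg.cholesky(cov) @ rng.standard_normal(n)
    return np.cumsum(noise)

def recrossing_times(path: np.ndarray) -> np.ndarray:
    """Waiting times between consecutive sign changes (origin recrossings)."""
    crossings = np.where(np.diff(np.sign(path)) != 0)[0]
    return np.diff(crossings)

rng = np.random.default_rng(0)
H = 0.7
waits = np.concatenate([recrossing_times(fbm(2000, H, rng)) for _ in range(20)])
# Crude tail-exponent estimate to compare with the predicted m = 2 - H.
tail = waits[waits >= 2]
m_hat = 1 + len(tail) / np.sum(np.log(tail / 2))   # Hill-type estimator
print(f"predicted m = {2 - H:.2f}, estimated m = {m_hat:.2f}")
```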
107

Fractional Calculus and Dynamic Approach to Complexity

Beig, Mirza Tanweer Ahmad 12 1900 (has links)
Fractional calculus enables the use of real or complex powers of the differentiation operator. The fundamental connection between fractional calculus and subordination processes is explored and affords a physical interpretation for a fractional trajectory, namely an average over an ensemble of stochastic trajectories. With this ensemble-average perspective, the explanation of the behavior of fractional chaotic systems changes dramatically. What has until now been interpreted as intrinsic friction is actually a form of non-Markovian dissipation that arises automatically from adopting the fractional calculus, and is shown to be a manifestation of decorrelation between trajectories. A nonlinear Langevin equation describes the mean field of a finite-size complex network at criticality. Critical phenomena and temporal complexity are two very important issues of modern nonlinear dynamics, and the link between them found by the author can significantly improve the understanding of the behavior of dynamical systems at criticality. The subject of temporal complexity is challenging, and especially helpful in addressing fundamental physical-science issues beyond the limits of reductionism.
108

Cyclic Codes and Cyclic Lattices

Maislin, Scott 01 January 2017 (has links)
In this thesis, we review basic properties of linear codes and lattices with a certain focus on their interplay. In particular, we focus on the analogous constructions of cyclic codes and cyclic lattices. We start out with a brief overview of the basic theory and properties of linear codes. We then demonstrate the construction of cyclic codes and emphasize their importance in error-correcting coding theory. Next we survey properties of lattices, focusing on algorithmic lattice problems, exhibit the construction of cyclic lattices and discuss their applications in cryptography. We emphasize the similarity and common properties of the two cyclic constructions.
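A small sketch of the defining property surveyed above: a binary cyclic code of length n is generated by a polynomial g(x) dividing x^n - 1, and every cyclic shift of a codeword is again a codeword. The example uses the (7, 4) Hamming code with g(x) = 1 + x + x^3, a standard textbook instance rather than anything specific to the thesis.

```python
# Sketch: encoding with a binary cyclic code and checking closure under
# cyclic shifts. Polynomials over GF(2) are lists of bits, lowest degree first.

def poly_mul_gf2(a: list[int], b: list[int]) -> list[int]:
    """Multiply two GF(2) polynomials (coefficient lists, low degree first)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj
    return out

def encode(msg: list[int], gen: list[int], n: int) -> list[int]:
    """Non-systematic cyclic encoding: c(x) = m(x) * g(x), padded to length n."""
    c = poly_mul_gf2(msg, gen)
    return (c + [0] * n)[:n]

def is_codeword(word: list[int], gen: list[int]) -> bool:
    """A word is a codeword iff g(x) divides it (polynomial remainder is zero)."""
    w = word[:]
    for i in range(len(w) - 1, len(gen) - 2, -1):
        if w[i]:
            for j, gj in enumerate(gen):
                w[i - (len(gen) - 1) + j] ^= gj
    return not any(w)

g = [1, 1, 0, 1]                            # g(x) = 1 + x + x^3 divides x^7 - 1
codeword = encode([1, 0, 1, 0], g, n=7)
shifted = codeword[-1:] + codeword[:-1]     # cyclic shift by one position
print(codeword, is_codeword(codeword, g), is_codeword(shifted, g))
```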
109

Prospect Theory Preferences in Noncooperative Game Theory

Leclerc, Philip 01 January 2014 (has links)
The present work seeks to incorporate a popular descriptive, empirically grounded model of human preference under risk, prospect theory, into the equilibrium theory of noncooperative games. Three primary candidate definitions are systematically identified on the basis of classical characterizations of Nash Equilibrium; in addition, three equilibrium subtypes are defined for each primary definition, in order to enable modeling of players' reference points as exogenous and fixed; slowly and myopically adaptive; or highly flexible and non-myopically adaptive. Each primary equilibrium concept was analyzed both theoretically and empirically; for the theoretical analyses, prospect theory, game theory, and computational complexity theory were all brought to bear. In chapter 1, the reader is provided with background on each of these theoretical underpinnings of the current work, the scope of the project is described, and its conclusions are briefly summarized. In chapters 2 and 3, each of the three equilibrium concepts is analyzed theoretically, with emphasis placed on issues of classical interest (e.g. existence, dominance, rationalizability) and computational complexity (i.e. assessing how difficult each concept is to apply in algorithmic practice, with particular focus on comparison to classical Nash Equilibrium). This theoretical analysis leads us to discard the first of our three equilibrium concepts as unacceptable. In chapter 4, our remaining two equilibrium concepts are compared empirically, using average-level data originally aggregated from a number of studies by Camerer (2003, Ch. 3) and by Selten and Chmura; the results suggest that PT preferences may improve on the descriptive validity of NE, and they pose some interesting questions about the nature of the PT weighting function. Chapter 5 concludes, systematically summarizes the theoretical and empirical differences and similarities between the three equilibrium concepts, and offers some thoughts on future work.
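For readers unfamiliar with prospect theory, the sketch below evaluates a simple two-outcome gamble with the standard Tversky-Kahneman (1992) value and probability-weighting functional forms. The parameter values are the commonly cited median estimates, and the separable evaluation with a single weighting function is a deliberate simplification; none of this is taken from the thesis itself.

```python
# Sketch: prospect-theory style evaluation of a simple two-outcome gamble,
# using the Tversky-Kahneman (1992) functional forms.

def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Reference-dependent value function: concave for gains, convex and
    steeper (loss aversion lam) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def weight(p: float, gamma: float = 0.61) -> float:
    """Inverse-S probability weighting: overweights small probabilities."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def pt_value(outcomes_probs: list[tuple[float, float]]) -> float:
    """Simplified (separable) PT evaluation of a gamble given (outcome, prob)
    pairs, relative to a reference point of 0."""
    return sum(weight(p) * value(x) for x, p in outcomes_probs)

# A 50/50 gamble: win 100 or lose 100. Loss aversion makes it unattractive.
gamble = [(100.0, 0.5), (-100.0, 0.5)]
print(pt_value(gamble))           # negative, although the expected value is 0
```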
110

Výpočetní složitost problémů kombinatorické optimalizace pro specifické třídy grafů / Computational complexity of combinatorial problems in specific graph classes

Masařík, Tomáš January 2014 (has links)
The topic of this diploma thesis is the edge distance labeling problem with specified parameters p, q and λ. We establish a dichotomy for p = 2 and q = 1: the problem is polynomial if λ ≤ 4 and NP-complete for λ > 4. The boundary is shifted by one compared to the vertex distance labeling problem, which has already been solved. The polynomial cases are characterized as certain special paths and cycles with a few additional vertices. To show NP-completeness we use the well-known NP-complete problem Monotone Not-All-Equal 3-SAT. That section has four parts: one for odd λ, one for even λ, and two more reductions for λ = 5 and λ = 6.
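To make the kind of labeling question concrete, the brute-force sketch below handles the well-known vertex L(2,1)-labeling problem (labels of adjacent vertices differ by at least 2, labels of vertices at distance two differ by at least 1); the edge variant studied in the thesis corresponds to labeling the line graph instead. The graph encoding and the exhaustive search are illustrative assumptions, not the thesis's algorithms.

```python
from itertools import product

def is_l21_labeling(adj: dict[int, set[int]], labels: dict[int, int]) -> bool:
    """Check the L(2,1) conditions: |l(u)-l(v)| >= 2 for adjacent u, v and
    l(u) != l(w) for vertices u, w at distance exactly two."""
    for u in adj:
        for v in adj[u]:
            if u < v and abs(labels[u] - labels[v]) < 2:
                return False
        # vertices at distance two: neighbours of neighbours, minus u and adj[u]
        dist2 = {w for v in adj[u] for w in adj[v]} - adj[u] - {u}
        if any(labels[u] == labels[w] for w in dist2):
            return False
    return True

def min_span(adj: dict[int, set[int]]) -> int:
    """Smallest lambda such that an L(2,1)-labeling with labels 0..lambda exists
    (exponential brute force, only for tiny graphs)."""
    vertices = sorted(adj)
    lam = 0
    while True:
        for assignment in product(range(lam + 1), repeat=len(vertices)):
            if is_l21_labeling(adj, dict(zip(vertices, assignment))):
                return lam
        lam += 1

# Path on four vertices; for the edge variant one would label its line graph.
p4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(min_span(p4))   # expected 3 for P4
```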
