431

Numerical computation as deduction in constraint logic programming

Lee, Jimmy Ho Man 04 July 2018 (has links)
Logic programming realizes the ideal of "computation as deduction," except when floating-point arithmetic is involved. In that respect, logic programming languages suffer the same deficiency as conventional algorithmic languages: floating-point operations are only approximate and it is not easy to tell how good the approximation is. This dissertation proposes a framework to extend the benefits of logic programming to computations involving floating-point arithmetic. John Cleary incorporated a relational form of interval arithmetic into Prolog so that variables already bound can be bound again. In this way, the usual logical interpretation of computation no longer holds. Based on Cleary's idea, we develop a technique for narrowing intervals. We present a relaxation algorithm for coordinating the applications of the interval narrowing operations to constraints in a network. We incorporate relational interval arithmetic into two constraint logic programming languages: CHIP and CLP(R). We modify CHIP by allowing domains to be intervals of real numbers. In CLP(R) we represent intervals by inequality constraints. The enhanced languages ICHIP and ICLP(R) preserve the semantics of logic so that numerical computations are deductions, even when floating-point arithmetic is used. We have constructed a prototype of ICLP(R) consisting of a meta-interpreter executed by an existing CLP(R) system. We show that interval narrowing belongs to the class of domain restriction operations in constraint-satisfaction algorithms. To establish a general framework for these operations, we use and generalize Ashby's notions of cylindrical closure and cylindrance. We show that Mackworth's algorithms can be placed in our framework. / Graduate
432

A product cost system selection framework for the banking industry

Ndwandwe, Sbusiso Vusi 20 October 2014 (has links)
M.Com. (Business Management) / Large organisations, such as banks, compete through a variety of product, geographic and service dimensions (Stenzel & Stenzel, 2004). Firms achieve sustainable competitive advantage if they are able to generate higher economic profits than competitors in the long term. Market economics, combined with relative strength in market position and cost structure, enhances the ability of a firm to generate superior economic profits (Besanko et al., 2004). Determining the use and allocation of investment resources is one of the core and critical strategic tasks for management in a large firm. Firms use management accounting information to determine product profitability, understand cost drivers and assess the implications of investment decisions for overall product and market performance. This report proceeds from the premise that the extent of use, accuracy and deeper understanding of management accounting information are crucial for the strategic management of the firm. Product cost systems produce the cost side of this management information, and their use can therefore have far-reaching implications for the firm. The study explores the various uses of product cost information and positions the product cost system in the context of strategic management. The main aim of the study is to determine the key factors that management should consider when selecting a product cost system. This was achieved through a comprehensive discussion of each product cost system type and the cost implications associated with each. Furthermore, the product cost systems are discussed in terms of the level of sophistication, which increases or decreases the complexity of the product cost system design.
The theoretical foundation was applied to the South African banking industry to illustrate the problem in the real world and the importance of the study, demonstrate the complexity of product cost systems in two-sided markets, and show the implications of implementing an incorrect system. The research questions were tested and answered using quantitative techniques. Data was collected primarily by questionnaire from a sample representing the big four banks in South Africa; a purposive sampling technique was used.
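One family of product cost systems such a study would compare is activity-based costing, in which overhead is allocated to products in proportion to their consumption of cost drivers. A minimal sketch of that allocation arithmetic, with invented banking products and figures (the thesis itself provides no code):

```python
# Activity-based overhead allocation: each activity's overhead pool is
# spread over products in proportion to their cost-driver consumption.

def abc_cost(overhead_by_activity, driver_totals, driver_use):
    """Return total allocated overhead per product."""
    cost = {}
    for product, use in driver_use.items():
        total = 0.0
        for activity, units in use.items():
            # Driver rate = activity overhead / total driver units consumed.
            rate = overhead_by_activity[activity] / driver_totals[activity]
            total += rate * units
        cost[product] = total
    return cost
```

With, say, a R1000 processing pool over 100 transactions and a R500 support pool over 50 calls, a product consuming 60 transactions and 20 calls is allocated R800; the allocations across all products sum back to the total overhead.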
433

Probabilistic timing verification and timing analysis for synthesis of digital interface controllers

Escalante, Marco Antonio 08 September 2017 (has links)
In this dissertation we present two techniques on the topic of digital interface design: a probabilistic timing verification and a timing analysis for synthesis, both rooted in a formal specification. Interface design arises when two digital components (e.g., a processor and a memory device) are to be interconnected to build up a system. We have extended a Petri net specification to describe the temporal behavior of the interface protocols of digital components. The specification describes circuit delays as random variables thus making it suitable to model process variations and timing correlation. Interface probabilistic timing verification checks that a subsystem, composed of components to be interconnected and the associated interface logic, satisfies the timing constraints specified by the components' specifications. Our verification technique not only yields tighter results than previous techniques that do not take timing correlation into consideration but also, if the timing constraint is not satisfied, determines the probability that a constraint will be violated. The second technique, timing analysis for synthesis, finds tight bounds on the delays of the interface logic, which are unknown prior to synthesis, such that all the timing constraints given in the component specifications are satisfied. / Graduate
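The probabilistic flavour of the verification can be sketched with a Monte Carlo estimate: treat the delays as random variables, share a common "process" term between them to model timing correlation, and estimate the probability that a setup-style constraint (data arrives before the clock edge) is violated. The distributions and figures below are invented for illustration and are not the thesis's analytical method.

```python
import random

def violation_probability(trials=20_000, seed=1):
    """Estimate P(data_delay >= clock_edge) under correlated random delays."""
    rng = random.Random(seed)
    violations = 0
    for _ in range(trials):
        process = rng.gauss(0.0, 0.5)          # shared process variation
        data_delay = 8.0 + process + rng.gauss(0.0, 0.3)
        clock_edge = 10.0 + process + rng.gauss(0.0, 0.3)
        if data_delay >= clock_edge:           # constraint: data before clock
            violations += 1
    return violations / trials
```

Because the shared process term cancels in the difference, the estimated violation probability is far lower than an analysis that ignored correlation would predict, which mirrors the abstract's claim that accounting for timing correlation yields tighter results.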
434

NetPro neural network simulator for Windows

Burger, Dewald 14 October 2015 (has links)
M.Ing. (Mechanical Engineering) / This thesis involves the development of a neural network software package for the Windows environment. This package is called NetPro. It contains most of the standard tools found in existing neural network packages, e.g. shuffling of facts, automatic extraction of test file facts, randomizing of weight values (before and during training), automatic/manual construction of network files, logging of network properties during training, addition of noise to inputs, etc. NetPro has three additional tools: (a) time delay actions on inputs, (b) a neural network calculator, and (c) automatic saving of the best network during training. The calculator is used to calculate the number of training facts needed for optimum generalization ...
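One of the extra tools mentioned, automatic saving of the best network during training, amounts to simple checkpoint bookkeeping: after each epoch, keep the network whose validation error is lowest so far. A minimal sketch of that logic (the training loop and error values are placeholders, not NetPro's actual code):

```python
def train_with_best(errors_per_epoch):
    """Track the epoch with the lowest validation error seen so far."""
    best_err, best_epoch = float("inf"), -1
    for epoch, err in enumerate(errors_per_epoch):
        if err < best_err:
            best_err, best_epoch = err, epoch   # "save" this network
    return best_epoch, best_err
```

The payoff is that training can continue past the point of overfitting without losing the best-generalizing network encountered along the way.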
435

Evaluation in built-in self-test

Zhang, Shujian 21 August 2017 (has links)
This dissertation addresses two major issues associated with a built-in self-test environment: (1) how to measure whether a given test vector generator is suitable for testing faults with sequential behavior, and (2) how to measure the safety of self-checking circuits. Measuring the two-vector transition capability of a given test vector generator is key to selecting generators for stimulating sequential faults. The dissertation studies general properties of these transitions and presents a novel, comprehensive analysis of linear feedback shift registers and linear hybrid cellular automata. As a result, the analysis solves the open problem of how to properly separate the inputs when an LHCA-based generator is used for detecting delay faults. In general, a self-checking circuit has more hardware redundancy than the original circuit, and as a result may have a higher failure rate than the original one. The dissertation proposes a fail-safe evaluation to predict the probability of the circuit not being in the fail-state. Compared with existing evaluation methods, the fail-safe evaluation is more practical because it estimates the safety of the circuit, which decreases over time, instead of giving a constant probability measure. Various other results on improving fault coverage for transition delay faults and on testing macro-based combinational circuits are derived as well. / Graduate
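The two-vector transition capability at issue can be made concrete with a small example: delay-fault testing needs consecutive vector *pairs*, so what matters about a generator is which state-to-state transitions it can apply. Below, a 4-bit maximal-length Fibonacci LFSR (feedback polynomial x⁴ + x³ + 1; the specific parameters are an illustrative choice, not the thesis's) is stepped through its full cycle and its transitions enumerated.

```python
def lfsr_states(seed=0b0001, taps=(3, 2), width=4):
    """Enumerate states of a 4-bit Fibonacci LFSR (x^4 + x^3 + 1)."""
    state, seen = seed, []
    while True:
        seen.append(state)
        fb = ((state >> taps[0]) ^ (state >> taps[1])) & 1
        state = ((state << 1) | fb) & ((1 << width) - 1)
        if state == seed:          # cycle closed
            return seen

def transitions(states):
    """Consecutive vector pairs the generator can apply to the circuit."""
    return [(states[i], states[i + 1]) for i in range(len(states) - 1)]
```

A maximal-length 4-bit LFSR visits all 15 nonzero states, so it applies only 15 of the 16 × 15 possible two-vector transitions per cycle; analyses like the one in the dissertation characterize exactly which pairs a given generator structure can and cannot produce.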
436

A measure of the investment climate in South Africa

Foto, Tongai January 2009 (has links)
Magister Scientiae - MSc / Investor confidence is a concept many investors are constantly trying to gauge. In practice, however, such concepts are not easy to measure. This study attempts to capture the total sum of investor perception in South Africa by examining market behaviour. Data from the JSE/FTSE (1995-2009) is used to determine an equity risk premium. Bond yield spreads are also calculated from data provided by I-NET BRIDGE. An amalgamation of these components produces the proposed Investment Confidence Index. Similar indices currently on the South African market are based on subjective surveys and might therefore be biased. The proposed index, which is a first in SA, will prove invaluable to practitioners in the financial sector. / South Africa
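The two components the abstract names are straightforward to compute; how they are amalgamated is the thesis's contribution, so the weighting below is purely an invented placeholder, as are the sample figures.

```python
def equity_risk_premium(equity_return, risk_free_rate):
    """Excess return of equities over a risk-free proxy."""
    return equity_return - risk_free_rate

def bond_yield_spread(risky_yield, benchmark_yield):
    """Yield of a risky bond over a benchmark government yield."""
    return risky_yield - benchmark_yield

def confidence_index(erp, spread, w=0.5):
    """Toy amalgamation: wider spreads pull the score down.

    The weight w and the sign convention are illustrative assumptions,
    not the construction used in the thesis.
    """
    return w * erp - (1 - w) * spread
```

For instance, a 12% equity return against an 8% risk-free rate gives a 4% premium; paired with a 2% yield spread and equal weights, the toy index reads 0.01.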
437

Predicting program complexity from Warnier-Orr diagrams

White, Barbara January 1982 (has links)
Typescript (photocopy).
438

An economic analysis of multiple use forestry using FORPLAN-Version 2

Hackett, James Simpson January 1989 (has links)
This thesis examines a mathematical programming model called FORPLAN as a planning tool for strategic analysis of forest management alternatives. This model uses economic efficiency as the objective of forest management planning. The dynamic theory of multiple use forestry is analyzed and expressed as a linear programming analogue in FORPLAN. The main weakness of this theory is that it focuses on single stand analysis. Even so, forest-wide constraints applied to certain FORPLAN formulations compensate for this weakness. A strata-based forest management problem is developed to show the economic implications of four forest management alternatives: (1) timber production; (2) timber production subject to a non-declining yield limitation; (3) timber and black-tailed deer (Odocoileus hemionus columbianus) production; and (4) timber and black-tailed deer production, again including a non-declining yield of timber. Demand curves for two analysis areas and a supply curve for deer winter range are developed using parametric analysis. The ability of FORPLAN to address economic implications of current forest management policies is discussed. Economic analysis of forest management alternatives would play a useful role in forest planning in British Columbia. The need for such evaluation is underlined by the ever increasing number of resource conflicts caused by the dominance of the timber industry and the continually growing demand for other forest resources. Three conclusions are drawn from this study. First, FORPLAN has the technical capability to be an effective tool for analyzing strategic multiple use plans under economic efficiency criteria. It does not have the timber bias of earlier models, and its capability to integrate area and strata-based variables makes it a very powerful model. Second, parametric programming of FORPLAN solutions provides marginal analysis for inputs and outputs.
Comparative examination of these curves and their elasticities provides information about the relative importance of different analysis areas. Lastly, managing for timber and for black-tailed deer hunting services by preserving old-growth winter range is not an economically viable management option: the relative value of the timber is so much greater than that of the hunting services that it is just not worth managing for both. / Forestry, Faculty of / Graduate
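The non-declining yield limitation in alternatives (2) and (4) is a constraint that harvest volume never falls from one planning period to the next. A minimal sketch of checking a candidate harvest schedule against it (the volumes are invented, and the real FORPLAN formulation embeds this as rows in a linear program rather than a post-hoc check):

```python
def non_declining(harvest_by_period):
    """True if harvest volume never decreases between adjacent periods."""
    return all(later >= earlier
               for earlier, later in zip(harvest_by_period,
                                         harvest_by_period[1:]))
```

The constraint trades some present-value timber revenue for an even flow of harvests over time, which is precisely why the constrained alternatives in the study carry an economic cost relative to unconstrained timber production.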
439

Semi-automatic protocol implementation using an Estelle-C compiler, LAPB and RTS protocols as examples

Lu, Jing January 1990 (has links)
Formal Description Techniques allow for the use of automated tools during the specification and development of communication protocols. Estelle is a standardized formal description technique developed by ISO to remove ambiguities in the specification of communication protocols and services. The UBC Estelle-C compiler automates the implementation of protocols by producing an executable C implementation directly from its Estelle specification. In this thesis, we investigate the automated protocol implementation methodology using the Estelle-C compiler. First, we describe the improvements made to the compiler to support the latest version of Estelle. Then, we present and discuss the semiautomated implementations of the LAPB protocol in the CCITT X.25 Recommendation and the RTS protocol in the CCITT X.400 MHS series using this compiler. Finally, we compare the automatic and manual protocol implementations of LAPB and RTS protocols in terms of functional coverage, development time, code size, and performance measure. The results strongly indicate the overall advantages of automatic protocol implementation method over the manual approach. / Science, Faculty of / Computer Science, Department of / Graduate
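Estelle specifies a protocol as communicating extended finite state machines, and the compiler translates those machines into C. A toy Python rendering of the same idea, using an LAPB-like link-setup exchange (SABM, UA and DISC are real LAPB frame types, but the states and transition table here are simplified far beyond the actual protocol, and unknown events simply raise an error):

```python
class LinkMachine:
    """Tiny state machine: DISCONNECTED -> SETUP -> CONNECTED."""

    # (current state, input event) -> (next state, frame to emit or None)
    TRANSITIONS = {
        ("DISCONNECTED", "connect_request"): ("SETUP", "SABM"),
        ("SETUP", "UA"):                     ("CONNECTED", None),
        ("CONNECTED", "disconnect_request"): ("DISCONNECTED", "DISC"),
    }

    def __init__(self):
        self.state = "DISCONNECTED"

    def handle(self, event):
        """Fire a transition; return the frame to emit (or None)."""
        self.state, output = self.TRANSITIONS[(self.state, event)]
        return output
```

An Estelle-to-C compiler essentially generates a (much richer) version of this dispatch table, with timers, channels and message queues, directly from the formal specification, which is what makes the semi-automatic implementations comparable to hand-written ones.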
440

A computer program analysing transients in multistage pumping systems

Schmitt, Klaus January 1980 (has links)
Transient pressures subsequent to simultaneous power failure at all pumps of a multistage pumping system are analysed. Distributing pumping stations along a pipeline, rather than placing all of the required pumps within one pumping station, significantly reduces transient pressure fluctuations within the system. A computer program written in FORTRAN is developed to analyse multistage pumping systems, with appropriate surge controls, in the event of such a power failure. These surge controls consist of valves, vacuum breakers, air chambers and reservoirs, with other controls easily added as they are developed. Boundary conditions determining system controls are not developed as part of this thesis, but are described for completeness. By comparing the maximum and minimum transient pressures occurring within single stage and multistage systems, the premise that multistage systems give significantly lower transient pressures than single stage systems is substantiated. This reduction in transient pressures allows for possible savings in costs, as pipe wall thicknesses and the size of large, expensive control structures may be reduced. Examples demonstrating the use of the program are included. / Applied Science, Faculty of / Civil Engineering, Department of / Graduate
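A back-of-envelope sketch of why staging can help: the Joukowsky relation ΔH = a·ΔV/g ties the surge head to the abrupt change in flow velocity, and that surge is superimposed on each station's steady head, which is lower when the total lift is split across stations. The wave speed and heads below are invented illustration values; a full transient analysis like the one in the thesis integrates the governing equations along the pipeline rather than applying this single formula.

```python
G = 9.81  # gravitational acceleration, m/s^2

def joukowsky_head_rise(wave_speed, dv):
    """Surge head (m) caused by an abrupt velocity change dv (m/s)."""
    return wave_speed * dv / G

def peak_head(steady_head, wave_speed, dv):
    """Steady head at a station plus the superimposed surge."""
    return steady_head + joukowsky_head_rise(wave_speed, dv)
```

With a wave speed of 1000 m/s and a 1 m/s velocity change, the surge is about 102 m; one station lifting 300 m then sees a peak near 402 m, while each of three stations lifting 100 m sees only about 202 m, so thinner pipe walls can contain the transient.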
