121

A server consolidation solution

Swain, Richard J. January 2006 (has links) (PDF)
Thesis (M.S.C.I.T.)--Regis University, Denver, Colo., 2006. / Title from PDF title page (viewed on Mar. 27, 2006). Includes bibliographical references.
122

The design of a protocol for collaboration in a distributed repository - Nomad

Rama, Jiten. January 2007 (has links)
Thesis (M.Sc.)(Computer Science)--University of Pretoria, 2006 / Includes summary. Includes bibliographical references. Available on the Internet via the World Wide Web.
123

Efficient coordination techniques for non-deterministic multi-agent systems using distributed constraint optimization

Atlas, James. January 2009 (has links)
Thesis (Ph.D.)--University of Delaware, 2009. / Principal faculty advisor: Keith S. Decker, Dept. of Computer & Information Sciences. Includes bibliographical references.
124

An adaptive group mutual exclusion algorithm for message passing asynchronous distributed systems

Patil, Vasant, January 2008 (has links)
Thesis (M.S.)--University of Texas at Dallas, 2008. / Includes vita. Includes bibliographical references (leaves 51-52)
125

Abstract machine design for increasingly more powerful Algol-languages

Gunn, Hamish Iain Elston January 1985 (has links)
This thesis presents the work and results of an investigation into language implementation. Some work on language design has also been undertaken. Three languages have been implemented which may be described as members of the Algol family, with features and constructs typical of that family. These include block structure, nested routines, variables, and dynamic allocation of data structures such as vectors and user-defined structures. The underlying technique behind these implementations has been that of abstract machine modelling. For each language an abstract intermediate code has been designed. Unlike other such codes, we have raised the level of abstraction so that the code lies closer to the language than to the real machine on which the language may be implemented. Each successive language is more powerful than the previous by the addition of constructs which were felt to be useful. These were routines as assignable values, dynamically initialised constant locations, types as assignable values, and lists. The three languages were: Algol R, a "typical" Algol based on Algol W; h, an Algol with routines as assignable values, enumerated types, restriction of pointers to sets of user-defined structures, and constant locations; and nsl, a polymorphic Algol with types as assignable values, routines as assignable values, lists, and type- and value-constant locations. The intermediate code for Algol R was based on an existing abstract machine. The code level was raised and the code designed to be used as the input to a code generator. Such a code generator was written, improving a technique called simulated evaluation. The language h was designed and a recursive descent compiler written for it which produced an intermediate code similar in level to the previous one. Again a simulated evaluation code generator was written, this time generating code for an interpreted abstract machine which implemented routines as assignable and storable values. Finally the language nsl was designed. The compiler for it produced code for an orthogonal, very high level tagged architecture abstract machine which was implemented by interpretation. This machine implemented polymorphism, assignable routine values, and type- and value-constancy. Descriptions of the intermediate codes/abstract machines are given in appendices.
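As an illustration of the abstract-machine-modelling technique the abstract describes, the sketch below runs a tiny stack-based machine with tagged values, echoing the tagged architecture of the nsl machine. The opcodes and value tags are invented for illustration; they are not the actual intermediate codes designed for Algol R, h, or nsl.

```python
# A minimal sketch of a stack-based abstract machine with tagged values.
# The instruction set here is hypothetical, not the thesis's codes.

def run(program):
    stack, env = [], {}
    for op, *args in program:
        if op == "PUSH":                      # push a tagged integer constant
            stack.append(("int", args[0]))
        elif op == "LOAD":                    # push the value of a named location
            stack.append(env[args[0]])
        elif op == "STORE":                   # pop the top of stack into a location
            env[args[0]] = stack.pop()
        elif op == "ADD":                     # tag-checked integer addition
            (t2, v2), (t1, v1) = stack.pop(), stack.pop()
            assert t1 == t2 == "int", "type tags must agree"
            stack.append(("int", v1 + v2))
    return env

# x := 2 + 3
env = run([("PUSH", 2), ("PUSH", 3), ("ADD",), ("STORE", "x")])
print(env["x"])                               # ('int', 5)
```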
126

Tools and techniques for machine-assisted meta-theory

Adams, Andrew January 1997 (has links)
Machine-assisted formal proofs are becoming commonplace in certain fields of mathematics and theoretical computer science. New formal systems and variations on old ones are constantly invented. The meta-theory of such systems, i.e. proofs about the system as opposed to proofs within the system, is mostly done informally with pen and paper. Yet the meta-theory of deductive systems is an area which would obviously benefit from machine support for formal proof. Is the software currently available sufficiently powerful, yet easy enough to use, to make machine assistance for formal meta-theory a viable proposition? This thesis presents work done by the author on formalising proof theory from [DP97a] in various formal systems: SEQUEL [Tar93, Tar97], Isabelle [Pau94] and Coq [BB+96]. SEQUEL and Isabelle were found to be difficult to use for this type of work. In particular, the lack of automated production of induction principles in SEQUEL and Isabelle undermined confidence in the resulting formal proofs. Coq was found to be suitable for the formalisation methodology first chosen: the use of nameless dummy variables (de Bruijn indices) as pioneered in [dB72]. A second approach (inspired by the work of McKinna and Pollack [vBJMR94, MP97]) formalising named variables was also the subject of some initial work, and a comparison of these two approaches is presented. The formalisation was restricted to the implicational fragment of propositional logic. The informal theory has been extended to cover full propositional logic by Dyckhoff and Pinto, and extension of the formalisation using de Bruijn indices would appear to present few difficulties. An overview of other work in this area, in terms of both the tools and formalisation methods, is also presented. The theory formalised differs from other such work in that other formalisations have involved only one calculus; [DP97a] involves the relationships between three different calculi. There is consequently a much greater requirement for equality reasoning in the formalisation. It is concluded that a formalisation of any significance is still difficult, particularly one involving multiple calculi. No tools currently exist that allow for the easy representation of even quite simple systems in a way that fits human intuitions while still allowing for automatic derivation of induction principles. New work on integrating higher order abstract syntax and induction may be the way forward, although such work is still in its early stages.
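The nameless-dummy-variable representation from [dB72] that the abstract mentions can be sketched briefly: a variable is an index counting binders outward, so alpha-equivalent terms are syntactically identical. The constructors and the shift operation below are a generic textbook rendering, not the thesis's Coq development.

```python
# De Bruijn indices: terms carry indices instead of names. Illustrative only.

from dataclasses import dataclass
from typing import Union

@dataclass
class Var:
    k: int           # index: number of binders between the use and its binder

@dataclass
class Lam:
    body: "Term"     # a binder; index 0 inside body refers to it

@dataclass
class App:
    f: "Term"
    a: "Term"

Term = Union[Var, Lam, App]

def shift(t: Term, d: int, cutoff: int = 0) -> Term:
    """Add d to every free index in t (indices >= cutoff are free)."""
    if isinstance(t, Var):
        return Var(t.k + d) if t.k >= cutoff else t
    if isinstance(t, Lam):
        return Lam(shift(t.body, d, cutoff + 1))
    return App(shift(t.f, d, cutoff), shift(t.a, d, cutoff))

k_comb = Lam(Lam(Var(1)))          # \x. \y. x, written without names
print(shift(Lam(Var(1)), 1))       # Lam(body=Var(k=2)): the free index is lifted
```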
127

A multichannel, general-purpose data logger

Gardener, Michael Edwin January 1986 (has links)
Thesis (Diploma (Electrical Engineering))--Cape Technikon, 1986. / This thesis describes the implementation of a general-purpose, microprocessor-based data logger. The hardware allows analog data acquisition from one to thirty-two channels with 12-bit resolution and at a data throughput of up to 2 kHz. The data is logged directly to a buffer memory and from there, at the end of each log, it is dumped to an integral cassette data recorder. The recorded data can be transferred from the logger to a desk-top computer, via the IEEE 488 port, for further processing and display. All log parameters are user-selectable by means of menu-prompted keyboard entry, and a real-time clock (RTC) provides date and time information automatically.
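A rough software analogue of the buffer-then-dump architecture described above, with read_adc() and a CSV file standing in for the real 12-bit ADC and the cassette recorder; all names and parameters here are invented for illustration.

```python
# Samples are accumulated per log and flushed to bulk storage when the log ends.

import random
import time

def read_adc(channel):
    """Placeholder for one 12-bit conversion (0..4095) on a channel."""
    return random.randrange(4096)

def dump_to_storage(buffer):
    """Stand-in for the cassette recorder: flush the whole log at once."""
    with open("log.csv", "w") as f:
        for row in buffer:
            f.write(",".join(map(str, row)) + "\n")

def run_log(channels=4, samples=100, rate_hz=2000):
    buffer, period = [], 1.0 / rate_hz
    for _ in range(samples):
        buffer.append([read_adc(ch) for ch in range(channels)])
        time.sleep(period)          # pace acquisition at the selected rate
    dump_to_storage(buffer)         # end of log: dump the buffer to bulk storage

run_log()
```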
128

Digital analysis of mass spectra

Weichert, Dieter Horst January 1965 (has links)
The purpose of this thesis is twofold. A special type of signal is studied and methods for its reduction investigated. The results of this work are then applied to the automatic reduction of the trimethyllead mass spectrum with the aim of improving the information recovery from this signal, using modern data processing techniques. The precision measurement of amplitudes must always provide some form of averaging or filtering. It is demonstrated that the results become increasingly sensitive to irregularities in the abscissa of the record as the width of the averaging function increases. The continuity of the transition from a purely height-sensitive calculation to an area-sensitive calculation depends on the shape of the impulse response of the measuring arrangement. For all physical shapes, a least squares fit will lead to area-sensitive amplitudes. These results have not been known in mass spectrometry or, to our knowledge, in other fields. The mass spectrum is regarded as the convolution of an ideal line spectrum and a peak shape which is determined by the characteristics of the mass spectrometer and the associated electronics. The reduction presents essentially a problem in deconvolution. Although occasionally suggested in the literature, methods involving transformation to the frequency domain were not found to be useful. Thus it is necessary to carry out the analyses entirely in the time domain. Even then, methods requiring the use of derivatives of the spectrum, which have been successfully applied in special cases, usually suffer seriously from noise. Correlation techniques are especially valuable for signals with a high noise content. Correlation is an integration in the time domain and corresponds to a general filtering process in the frequency domain. A close relationship between the correlation technique and a least squares method is emphasized. The development of a practical procedure for this laboratory was part of the study. The trimethyllead group of the tetramethyllead spectrum is recorded in digital form on paper tape. At a constant sampling rate of two per second, about fifty readings per mass interval are recorded. The reduction of the paper tape record to relative isotopic abundances of lead has been automated, applying the experience gained from the general study. The saving in reduction time over former standard methods is substantial. There seems to be a real reduction in the standard deviations of individual peak measurements, but this does not exceed a factor of two. / Science, Faculty of / Earth, Ocean and Atmospheric Sciences, Department of / Graduate
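The area-sensitive least-squares amplitude recovery the abstract describes can be sketched on synthetic data: the recorded spectrum is modelled as known peak shapes at known line positions with unknown amplitudes, fitted entirely in the time domain. The Gaussian peak shape, line positions, and noise level below are invented for illustration, not the thesis's data.

```python
# Least-squares recovery of line amplitudes from a peak-shape-convolved record.

import numpy as np

def peak(x, centre, width=2.0):
    """A smooth instrumental peak shape (Gaussian stand-in)."""
    return np.exp(-0.5 * ((x - centre) / width) ** 2)

x = np.arange(200.0)
centres = [60.0, 100.0, 140.0]          # hypothetical line positions
true_amps = [5.0, 8.0, 3.0]

# synthetic record: the line spectrum convolved with the peak shape, plus noise
rng = np.random.default_rng(0)
y = sum(a * peak(x, c) for a, c in zip(true_amps, centres))
y = y + rng.normal(0.0, 0.1, x.size)

# one design-matrix column per line; least squares recovers the amplitudes
A = np.column_stack([peak(x, c) for c in centres])
amps, *_ = np.linalg.lstsq(A, y, rcond=None)
print(amps)                             # close to [5.0, 8.0, 3.0]
```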
129

Computer analysis of planar and spatial grid frameworks.

Kinra, Ravindar Kumar January 1964 (has links)
The application of the stiffness approach to the exact analysis of both planar and spatial grid frameworks of any complexity and high degree of statical indeterminacy is presented. Ordinarily, even the simplest planar grid is such a highly redundant structure that it cannot be analyzed rigorously by manual methods without recourse to some simplifying assumptions at the expense of accuracy. In general, most authors neglect the effect of member torsional rigidities in order to reduce the size of the problem and make use of the plate theory for the purpose of evaluating deflections. Matrix methods of analysis, however, remove the necessity for resorting to any such approximations and prove extremely convenient for computer application. The fundamentals of the stiffness approach are explained in complete detail and applied to the analysis of rectangular planar grids. For the purpose of comparison, example grids given by Ewell, Okubo & Abrams and Woinowsky-Krieger have been analyzed by the stiffness method and the comparative results are tabulated. The principle of orthogonal transformation, which is an essential part of the analysis of diagrids and spatial grids, is fully described and its application demonstrated by various numerical examples including a skew bridge, a cantilever diagrid and a hyperbolic paraboloid space grid. The application of stiffness analysis has been further extended to problems involving temperature changes and support settlements and, also, the procedure to reduce the size of symmetrical structures is described. A special successive elimination and matrix partitioning technique has also been introduced in order to enable the solution of extremely large numbers of simultaneous equations within the limited core memory capacity of digital computers, by taking advantage of the band form of the stiffness matrices of structures. A complete Fortran II computer program for the IBM 1620 and a 1405 disk file is given, as well as sample inputs and outputs of the IBM 1620 and 7090. After the first attempts by Engesser in 1889 and Zschetzsche in 1893, a great variety of hand calculation methods have been developed for the analysis of planar grid frameworks. Among these, Hendry & Jaeger's harmonic analysis and C. Massonet's anisotropic plate theory methods are the most convenient and easily applicable, in the opinion of the author. The basic assumptions and underlying principles of both these methods are outlined and the procedure of analysis is illustrated by means of a numerical example in each case. Furthermore, in order to obtain an idea of their accuracy, several planar grids with 2 to 6 longitudinals have been analyzed by the stiffness method, harmonic analysis and anisotropic plate theory. In every case, two solutions have been performed, assuming the constituent members of the grid to possess first zero and then maximum values of torsional rigidity. The comparative values of the load distribution factors for the longitudinal and transversal bending moments have been tabulated. / Applied Science, Faculty of / Civil Engineering, Department of / Graduate
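The band-form economy that the elimination and partitioning technique exploits can be shown with a toy system: stiffness matrices couple only nearby degrees of freedom, so K u = f can be stored and solved in banded form. The one-dimensional chain of springs below stands in for a real grid framework; it is a sketch, not the thesis's Fortran II program.

```python
# Solving K u = f in banded storage for a tridiagonal toy stiffness matrix.

import numpy as np
from scipy.linalg import solve_banded

n, k = 6, 100.0                    # 6 free nodes, spring stiffness k
ab = np.zeros((3, n))              # banded storage: one super- and one subdiagonal
ab[0, 1:] = -k                     # superdiagonal
ab[1, :] = 2.0 * k                 # main diagonal
ab[2, :-1] = -k                    # subdiagonal

f = np.zeros(n)
f[-1] = 10.0                       # point load at the last node

u = solve_banded((1, 1), ab, f)    # nodal displacements
print(u)
```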
130

Examples of the use of a computer as a planning aid

Thom, Jane Elizabeth January 1973 (has links)
The growing amount of data available to planners today, and the necessity to easily order, display, and assimilate this data, have thrust the computer into the foreground of planning tools. Computers are unequalled in their ability to handle, reorganize and manipulate large volumes of data. Computer techniques are developed here to handle and display data for the planner so that he can more effectively spend his time on the evaluation and decision-making aspects of planning. Thus a minimum amount of time need be spent in assimilating the information necessary for a decision; this is particularly beneficial in the preliminary phase of planning. Three computer techniques to simplify data handling and visually display data are described in this report. One generates simple three-dimensional drawings on a graphics display terminal. A second technique visually and dynamically displays growth and change by simulating the evolution of a cityscape. The third technique extends McHarg's space allocation map overlay technique. It utilizes spatially distributed data, and allows interactive manipulation of this data to indicate areas of "suitability" for a particular use. / Applied Science, Faculty of / Civil Engineering, Department of / Graduate
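The map-overlay idea behind the third technique can be sketched with raster layers: each layer scores grid cells for one criterion, and a weighted combination yields a composite suitability map. The layers and weights below are invented for illustration, not taken from the thesis.

```python
# Weighted raster overlay in the spirit of McHarg's map-overlay technique.

import numpy as np

rows, cols = 4, 5
rng = np.random.default_rng(0)
slope      = rng.random((rows, cols))   # 0 = flat, 1 = steep
flood_risk = rng.random((rows, cols))   # 0 = dry, 1 = floodplain
access     = rng.random((rows, cols))   # 0 = remote, 1 = well served

# higher score = more suitable; the weights reflect a hypothetical priority
suitability = 0.4 * (1 - slope) + 0.3 * (1 - flood_risk) + 0.3 * access
print(np.round(suitability, 2))
```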
