  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
131

Exercises in chemical engineering using GPSS

Schultheisz, Daniel Joseph 12 1900 (has links)
No description available.
132

Interactive IMGRID, direct communication with a geographic information system

Schoen, David A. January 1980 (has links)
The planning field is presently undergoing a technical revolution; computers are increasingly being used to store, manipulate and display planning data. Computer software used by planners for spatial analysis is generally still limited, but one program, IMGRID, developed at the Harvard University Graduate School of Design by David Sinton, is unusually powerful and adaptable. One major drawback does exist with this program: input of user commands to IMGRID is highly structured and in many situations requires the use of technical manuals even by those familiar with the technical aspects of IMGRID. Consequently, this creative project develops additional algorithms which make IMGRID an interactive program used from standard computer terminals, thus eliminating the rigid input format. In addition, an educational element is added which makes the program even more useful. A description of the user keywords and new subroutines is given, along with a discussion of a trial computer run indicating the potential of the new interactive system. / Department of Urban Planning
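As an illustration only — the abstract does not list IMGRID's actual keywords or subroutines — the following Python sketch shows the general shape of an interactive, keyword-driven front end with a built-in HELP facility of the kind described above; every keyword and message here is hypothetical.

    # Hypothetical sketch of an interactive keyword front end (not the actual IMGRID code).
    # The user types keywords at a terminal prompt instead of preparing a rigidly
    # formatted command deck; HELP supplies the educational element mentioned above.
    COMMANDS = {
        "LOAD":    "Read a gridded data layer into the workspace.",
        "WEIGHT":  "Assign a weight to the current data layer.",
        "OVERLAY": "Combine the weighted layers into a composite map.",
        "DISPLAY": "Print the composite map on the terminal.",
        "QUIT":    "Leave the interactive session.",
    }

    def interactive_session():
        print("Interactive front end. Type HELP for the keyword list.")
        while True:
            word = input("> ").strip().upper()
            if word == "QUIT":
                break
            if word == "HELP":
                for kw, text in COMMANDS.items():
                    print(f"  {kw:8s} {text}")
            elif word in COMMANDS:
                print(f"[{word}] {COMMANDS[word]}")   # a real system would run the command here
            else:
                print(f"Unknown keyword '{word}'. Type HELP for the available keywords.")

    if __name__ == "__main__":
        interactive_session()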
133

Creating an Ada module description tool

Rice, Richard M. January 1988 (has links)
The purpose of this project was to develop, using Object Oriented Development (OOD), a software tool identified as the Ada Module Description Tool (AMDT). The AMDT provides an automated way to get a module-level description of Ada code. A module-level description identifies packages, subprograms, objects and type declarations and their relationships. The tool also has the ability to compare Ada source code with a module-level description; the comparison identifies any object, type, subprogram, or package declared in the module-level description that does not match the provided source code. The AMDT is made up of two executable programs that run on a VAX/VMS system: the Module Description Generator (MDG), which generates a module-level description from a set of Ada source code files, and the Module Description Checker (MDC), which compares a module-level description to the Ada source code. Ada is the required High Order Language for the Department of Defense. The development methodology used was basically Object Oriented Development as described in the book Software Engineering with Ada by Grady Booch and the Software Standards and Procedure Manual for Object Oriented Development (SSPM-M02.04 Draft). Booch's book is a description of the Object Oriented Development methodology, while the SSPM is a set of instructions and a standard format for implementing the methodology. The total design of the AMDT is documented in five segments; the SSPM defines a segment as the code and documentation resulting from a pass through the OOD process. From a Software Quality Engineer's point of view, the AMDT saves time by removing the need to check module descriptions by hand. From the Software Engineer's point of view, when the code is updated a new module description can be generated easily to keep the documentation current with the code. The AMDT as written does not find object declarations in the code; fortunately the effect is minor because the module descriptions need to be edited anyway. The module description generated by the MDG may contain too much information: the designer wants only the types, objects, and operations that aid the understandability of the design and how it is implemented. The only check the MDC makes is whether an identifier in the module description appears in the code; it does not check whether there are extra items in the code that should be required in the module description. / Department of Computer Science
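A minimal Python sketch of the MDC-style check described above: identifiers named in a module-level description are compared against declarations found in Ada source, and only items missing from the source are reported (extra items in the code are not flagged, matching the limitation noted in the abstract). The regular expression, function names and sample code are hypothetical, not the actual AMDT implementation.

    # Hypothetical sketch of an MDC-style consistency check (not the actual AMDT).
    import re

    # Very rough Ada declaration patterns; real Ada parsing is far more involved.
    DECL_RE = re.compile(
        r"\b(?:package|procedure|function)\s+(\w+)|\btype\s+(\w+)\s+is\b",
        re.IGNORECASE)

    def declared_names(ada_source: str) -> set:
        """Collect identifiers declared as packages, subprograms, or types."""
        return {m.group(1) or m.group(2) for m in DECL_RE.finditer(ada_source)}

    def check_module_description(described: set, ada_source: str) -> set:
        """Return the described identifiers that do not appear in the source."""
        return described - declared_names(ada_source)

    ada = """
    package Stack is
       type Item is new Integer;
       procedure Push (X : Item);
       function Pop return Item;
    end Stack;
    """
    print(check_module_description({"Stack", "Push", "Pop", "Clear"}, ada))  # {'Clear'}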
134

Some PL/1 subroutines for natural language analysis

Fink, John William January 1973 (has links)
The purpose of this dissertation was to write and make available a small set of PL/1 computer subroutines that can be used in other computer programs attempting to do any kind of analysis of natural language data. The subroutines presented in the dissertation handle some of the housekeeping, that is, the jobs that must be done before analysis can begin. Four subroutines were written and tested: a subroutine called FINDONE (find one) that isolates words in an input string of characters, and three subroutines, called the LAGADOs, that find words or word parts on lists of words or word parts. The reliability of the subroutines was tested in small testing programs and in a larger lexical diversity program that was modified to use the subroutines. FINDONE finds graphemic words and punctuation marks in an input character string. In addition, it truncates the input string from the left so that repeated calls of the subroutine find the words in the input string in sequence. FINDONE takes as parameters the name of the input string and a name to be associated with the word found. The three LAGADO functions search for words on lists of words. Each of the functions is designed to search a list of a certain structure. LAGADO1 searches an alphabetized list where the length of the list is known. It uses the economical binary search technique. LAGADO1 takes as parameters the name of the word searched for, the name of the list to be searched and the length of the list to be searched. LAGADO2 searches a list in any order that is alphabetically indexed by an indexing array. LAGADO2 takes as parameters the name of the word being searched for, the name of the list being searched, the name of the indexing array, and the length of the list being searched. LAGADO3 searches any list that has an end-of-list symbol. LAGADO3 uses a linear search technique and looks at each element of the list being searched in order until it either finds the word being searched for or reaches the end-of-list symbol. LAGADO3 takes as parameters the name of the word searched for, the name of the list being searched, and the name of the end-of-list symbol. Each of the LAGADO functions returns a positive value equal to the subscript of the list element that matches the input word if the input word is matched, or a negative number whose absolute value is the subscript of the cell where the input word would have to be inserted into the list if the input word is not matched. Two of the subroutines, FINDONE and LAGADO2, were tested by being incorporated into SUPRFRQ, a lexical diversity program developed from an earlier program written by Robert Wachal. An appendix includes the documented texts of the subroutines and of the lexical diversity program. In addition, the appendix includes the result of a run of SUPRFRQ on four short dialect texts collected by Charles Houck in Leeds, England.
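A minimal sketch, in Python rather than PL/1, of the LAGADO1 return convention described above: a binary search over an alphabetized list of known length that returns the 1-based subscript of a match, or the negative of the subscript at which the word would have to be inserted. The function and variable names are illustrative, not the dissertation's code.

    # Illustrative sketch of the LAGADO1 return convention (not the original PL/1 code):
    # binary search over an alphabetized list, returning the 1-based subscript of the
    # match, or the negative of the subscript where the word would be inserted.
    def lagado1(word, word_list, list_length):
        low, high = 1, list_length          # 1-based bounds, as in PL/1 arrays
        while low <= high:
            mid = (low + high) // 2
            entry = word_list[mid - 1]      # adjust for Python's 0-based indexing
            if entry == word:
                return mid                  # found: positive subscript
            if entry < word:
                low = mid + 1
            else:
                high = mid - 1
        return -low                         # not found: negative insertion point

    words = ["apple", "grape", "pear"]
    print(lagado1("grape", words, 3))   # -> 2
    print(lagado1("fig", words, 3))     # -> -2 (would be inserted at subscript 2)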
135

Evaluation and denotation of pure LISP programs : a worked example in semantics

Gordon, Michael J. C. January 1974 (has links)
A Scott/Strachey-style denotational semantics intended to describe pure LISP is examined. I present evidence that it is an accurate rendering of the language described in chapter 1 of the LISP 1.5 Programmer's Manual; in particular I show that call-by-value and fluid variables are correctly handled. To do this I have: (1) written an operational 'semantics' of pure LISP and shown it equivalent to the denotational one, and (2) proved that, relative to the denotational semantics, the LISP functions apply, eval, ..., etc. correctly compute meanings. The proof techniques used are derived from the work of Wadsworth; roughly, one first proves the results for a class of 'finite' programs and then extends them to all programs by a limiting argument. Conceptually these arguments are inductions on the length of computation, and to bring this out I have formulated a rule of inference which enables such operational reasoning to be applied to the denotational semantics.
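A minimal sketch (hypothetical, not the thesis's semantics or the LISP 1.5 definition) of what call-by-value means operationally for a tiny pure-LISP-like language: eval_ evaluates every argument before apply_ is invoked. For brevity it uses lexical environments rather than the fluid (dynamically scoped) variables the abstract treats.

    # Hypothetical sketch of call-by-value eval/apply for a tiny pure-LISP-like language.
    def eval_(exp, env):
        if isinstance(exp, str):                       # variable reference
            return env[exp]
        if not isinstance(exp, list):                  # self-evaluating constant
            return exp
        op, *args = exp
        if op == "quote":
            return args[0]
        if op == "lambda":                             # (lambda (params) body)
            params, body = args
            return ("closure", params, body, env)
        fn = eval_(op, env)
        vals = [eval_(a, env) for a in args]           # call-by-value: args evaluated first
        return apply_(fn, vals)

    def apply_(fn, vals):
        if callable(fn):                               # built-in primitive
            return fn(*vals)
        _tag, params, body, defenv = fn                # user-defined closure
        return eval_(body, {**defenv, **dict(zip(params, vals))})

    env = {"add": lambda a, b: a + b}
    print(eval_([["lambda", ["x", "y"], ["add", "x", "y"]], 1, 2], env))   # -> 3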
136

Programmer-friendly decompiled Java

Naeem, Nomair A. January 2006 (has links)
Java decompilers convert Java class files to Java source. Common Java decompilers are javac-specific: they target bytecode produced by a particular javac compiler. We present work carried out on Dava, a tool-independent decompiler that decompiles bytecode produced by any compiler. A known deficiency of tool-independent decompilers is the generation of complicated decompiled Java source which does not resemble the original source as closely as the output produced by javac-specific decompilers. This thesis tackles this shortcoming, for Dava, by introducing a new back-end consisting of simplifying transformations. / The work presented can be broken into three major categories: transformations using tree traversals and pattern matching to simplify the control flow, the creation of a flow analysis framework for an Abstract Syntax Tree (AST) representation of Java source code, and the implementation of flow analyses and their use in complicated transformations. / The pattern-matching transformations rewrite the ASTs into semantically equivalent ASTs that correspond to code that is easier for programmers to understand. The targeted Java constructs include If and If-Else aggregation, for-loop creation, and the removal of abrupt control flow. Pattern matching using tree traversals has its limitations, so we introduce a new structure-based data flow analysis framework that can be used to gather information required by more complex transformations. Popular compiler analyses, e.g., reaching definitions and constant propagation, were implemented using the framework. Information from these analyses is then leveraged to perform more advanced AST transformations. / We performed experiments comparing different decompiler outputs for different sources of bytecode. The results from these experiments indicate that the new Dava back-end considerably improves code comprehensibility and readability.
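A minimal sketch of the kind of pattern-matching AST rewrite described above, here If aggregation: two nested single-branch Ifs are collapsed into one If whose condition is the conjunction of both tests. The node classes and the traversal are hypothetical, not Dava's actual AST API.

    # Hypothetical AST nodes and rewrite illustrating If aggregation (not Dava's API).
    from dataclasses import dataclass

    @dataclass
    class If:
        cond: object
        body: list            # list of statements

    @dataclass
    class And:
        left: object
        right: object

    def aggregate_ifs(stmt):
        """Rewrite  if (a) { if (b) { S } }  into  if (a && b) { S }."""
        if isinstance(stmt, If):
            stmt.body = [aggregate_ifs(s) for s in stmt.body]
            if len(stmt.body) == 1 and isinstance(stmt.body[0], If):
                inner = stmt.body[0]
                return If(And(stmt.cond, inner.cond), inner.body)
        return stmt

    tree = If("a", [If("b", ["doWork()"])])
    print(aggregate_ifs(tree))   # If(cond=And(left='a', right='b'), body=['doWork()'])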
137

Computerized Test Procedure For Industrial Radiographic Examination Of Metallic Welded Joints

Gunes, E. Evren 01 November 2004 (has links) (PDF)
Radiography is an extensively used NDT method, especially in the nuclear, aerospace and automotive industries, where optimal designs call for greater reliability. The rules governing industrial radiography are defined in a system of radiographic standards. The standards related to the radiographic testing of metallic welded joints were harmonised all over Europe and, finally, in 1997 the standard "EN 1435" was established and published. Since then, this standard has become the most widely used standard where radiographic applications are necessary. To eliminate person-based errors during application of the standard, and moreover to save time, cost and effort in radiographic exposures, this study aimed to write a computer program able to calculate all parameters necessary for a radiographic exposure according to the standard EN 1435. In the programming stage, Visual Basic 6.0 was used. The program consists of many windows, each giving and controlling separate parameters related to the exposure. Besides giving all the needed parameters, the developed program is able to prepare a report with these parameters, so both radiography technicians and experts can use it. It is believed that this study constructs a basis for developing other computerised test procedures for any kind of non-destructive testing method used in industry today.
138

Prediction of construction project cashflow using Fincash /

Owusu, Albert H. Unknown Date (has links)
Thesis (M App Sci (Project Management))--University of South Australia, 1992
139

Cash flows' forecasting and monitoring for building construction projects (case studies in Indonesian housing projects) /

Tresani, Nurahma. Unknown Date (has links)
Thesis (MAppSc in Project Management)--University of South Australia, 1996
140

A study of PostScript as a graphics programming language /

Miller, Ruane. January 1988 (has links)
Thesis (M.F.A.)--Rochester Institute of Technology, 1988. / Includes bibliographical references (leaves 48-49).
