461

On the adaptability of multipass Pascal compilers to variants of (Pascal) P-code machine architectures

Litteken, Mark A January 2011 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
462

Exploiting and/or Parallelism in Prolog

Shah, Bankim 01 January 1991 (has links)
Logic programming languages have generated increasing interest over the last few years. Logic programming languages like Prolog are being explored for different applications. Prolog is inherently parallel, and attempts are being made to utilize this inherent parallelism. There are two kinds of parallelism present in Prolog: OR parallelism and AND parallelism. OR parallelism is relatively easy to exploit, while AND parallelism poses interesting issues. One of the main issues is dependencies between literals. It is very important to use the AND parallelism available in the language structure, as not exploiting it would result in a substantial loss of parallelism. Any system trying to make use of either or both kinds of parallelism needs the capability of performing faster unification, as unification greatly affects the overall execution time. A new architecture design is presented in this thesis that exploits both kinds of parallelism. The architecture efficiently implements some of the key concepts in Conery's approach to parallel execution [5]. The architecture has a memory hierarchy that uses associative memory; associative memories provide fast lookup and hence quick response times. Along with the memory hierarchy, execution algorithms and rules for the ordering of literals are presented. The ordering rules help determine the order of execution. The analysis of response time is done for different configurations of the architecture, from sequential execution with one processor to multiple processing units having multiple processors. A benchmark program, "query," is used for obtaining results, and the map coloring problem is also solved on different configurations and the results compared. To obtain results, the goals and subgoals are assigned to different processors by creating a tree; these assignments and goal transfers are simulated by hand. The total time includes the time needed for moving goals back and forth from one processor to another, and is calculated in number of cycles with some assumptions about memory response time, communication time, the number of messages that can be sent on the bus at a particular instant, etc. The results show that the architecture efficiently exploits the AND parallelism and OR parallelism available in Prolog. The total time needed for the different configurations is then compared and conclusions are drawn.
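To make the notion of AND parallelism concrete, the sketch below groups the body literals of a clause so that literals sharing no still-unbound variables may run together, while dependent literals wait for their bindings. This is a simplified illustration of independent AND-parallel scheduling only; it is not Conery's generator/consumer algorithm or the ordering rules developed in the thesis, and the example clause and class names are assumptions.

```java
import java.util.*;

// Simplified sketch of independent AND-parallel grouping: literals whose
// still-unbound variables do not overlap are placed in the same parallel
// group; a literal sharing an unbound variable with an earlier group waits
// until that group has produced the binding. Illustrative only.
public class LiteralOrdering {
    record Literal(String name, Set<String> vars) {
        @Override public String toString() { return name + vars; }
    }

    static List<List<Literal>> parallelGroups(List<Literal> body, Set<String> initiallyBound) {
        List<Literal> remaining = new ArrayList<>(body);
        Set<String> bound = new HashSet<>(initiallyBound);
        List<List<Literal>> groups = new ArrayList<>();
        while (!remaining.isEmpty()) {
            List<Literal> group = new ArrayList<>();
            Set<String> claimed = new HashSet<>();          // unbound vars used by this group
            for (Literal l : remaining) {
                Set<String> unbound = new HashSet<>(l.vars());
                unbound.removeAll(bound);
                // Independent of everything already in the group: no shared unbound variable.
                if (Collections.disjoint(unbound, claimed)) {
                    group.add(l);
                    claimed.addAll(unbound);
                }
            }
            remaining.removeAll(group);
            group.forEach(l -> bound.addAll(l.vars()));     // bindings become available later
            groups.add(group);
        }
        return groups;
    }

    public static void main(String[] args) {
        // Body of a clause:  ?- p(X), q(X, Y), r(Z).
        // p(X) and r(Z) share no variables and can run in parallel; q(X, Y) waits for X.
        List<Literal> body = List.of(
            new Literal("p", Set.of("X")),
            new Literal("q", Set.of("X", "Y")),
            new Literal("r", Set.of("Z")));
        System.out.println(parallelGroups(body, Set.of()));  // e.g. [[p[X], r[Z]], [q[X, Y]]]
    }
}
```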
463

Measurement Techniques for Noise Figure and Gain of Bipolar Transistors

Jung, Wayne Kan 02 June 1993 (has links)
First, the concepts of reflection coefficients, s-parameters, the Smith chart, noise figure, and available power gain will be introduced. This lays the foundation for the presentation of techniques for measuring the noise figure and gain of high-speed bipolar junction transistors. Noise sources in a bipolar junction transistor and an equivalent circuit including these noise sources will be presented. The process of determining the noise parameters of a transistor will also be discussed. A Pascal program and several TEKSPICE scripts are developed to calculate the stability, available power gain, and noise figure circles. Finally, these circles are plotted on a Smith chart to give a clear view of how a transistor will perform as the source impedance changes.
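For context, noise-figure circles follow from the standard two-port noise parameters (Fmin, Rn, Gamma_opt). The sketch below computes the centre and radius of one such circle on the Smith chart using the textbook expressions; the parameter values are illustrative assumptions rather than measured data, and the code is Java rather than the Pascal program and TEKSPICE scripts actually developed in the thesis.

```java
/** Minimal sketch: noise-figure circle on the Smith chart for a target noise
 *  figure F, given a transistor's noise parameters (Fmin, Rn, Gamma_opt).
 *  Uses the standard two-port noise-circle formulas; values are illustrative. */
public class NoiseCircle {
    public static void main(String[] args) {
        double z0 = 50.0;                       // reference impedance (ohms)
        double fminDb = 1.2, fDb = 2.0;         // minimum and target noise figure (dB)
        double rn = 8.0;                        // equivalent noise resistance (ohms)
        double gOptMag = 0.45, gOptDeg = 60.0;  // optimum source reflection coefficient

        double fmin = Math.pow(10, fminDb / 10);   // dB -> linear
        double f    = Math.pow(10, fDb / 10);
        double th   = Math.toRadians(gOptDeg);
        // |1 + Gamma_opt|^2
        double onePlusGoptSq = Math.pow(1 + gOptMag * Math.cos(th), 2)
                             + Math.pow(gOptMag * Math.sin(th), 2);
        // Noise-circle parameter N = (F - Fmin) |1 + Gamma_opt|^2 / (4 Rn / Z0)
        double n = (f - fmin) * onePlusGoptSq / (4 * rn / z0);
        // Centre lies along Gamma_opt, scaled by 1/(1+N); radius from the standard formula.
        double centerMag = gOptMag / (1 + n);
        double radius = Math.sqrt(n * (n + 1 - gOptMag * gOptMag)) / (1 + n);

        System.out.printf("center = %.3f at %.1f deg, radius = %.3f%n",
                          centerMag, gOptDeg, radius);
    }
}
```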
464

Annotation-Enabled Interpretation and Analysis of Time-Series Data

Venugopal, Niveditha 07 November 2018 (has links)
As we continue to produce large amounts of time-series data, the need for data analysis is growing rapidly to help gain insights from this data. These insights form the foundation of data-driven decisions in various aspects of life. Data annotations are information about the data, such as comments, errors and provenance, which provide context to the underlying data and aid in meaningful data analysis in domains such as scientific research, genomics and ECG analysis. Storing such annotations in the database along with the data makes them available to help with analysis of the data. In this thesis, I propose a user-friendly technique for Annotation-Enabled Analysis through which a user can employ annotations to help query and analyze data without having prior knowledge of the details of the database schema or any kind of database programming language. The proposed technique receives the request for analysis as a high-level specification that hides the details of the schema, joins, etc., then parses it, validates the input and converts it into SQL. This SQL query can then be executed in a relational database and the result returned to the user. I evaluate this technique using real-world data from a building-data platform containing data about Portland State University buildings, such as room temperature, air volume and CO2 level. This data is annotated with information such as class schedules, power outages and control modes (for example, day or night mode). I test my technique with three increasingly sophisticated levels of use cases drawn from this building science domain: (1) retrieve data with include or exclude annotation selection; (2) correlate data with include or exclude annotation selection; and (3) align data based on include annotation selection to support aggregation over multiple periods. I evaluate the technique by performing two kinds of tests: (1) to validate correctness, I generate synthetic datasets for which I know the expected result of these annotation-enabled analyses and compare the expected results with the results generated by my technique; and (2) to assess performance, I compare the execution time in the database of the queries generated by this service with alternative SQL translations that I developed.
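As an illustration of the translation step described above, the sketch below turns a simple include/exclude request into SQL that keeps or drops observations falling inside annotated intervals. The table and column names (observations, annotations, stream, label, start_ts, end_ts) are assumptions made for the example, not the schema or specification grammar used in the thesis; a real implementation would also use parameterized queries rather than string concatenation.

```java
/** Minimal sketch of translating a high-level analysis request into SQL.
 *  Schema names are illustrative assumptions; injection-safe parameter
 *  binding is omitted for brevity. */
public class AnnotationQueryBuilder {
    enum Mode { INCLUDE, EXCLUDE }

    static String buildQuery(String stream, String annotationLabel, Mode mode) {
        // Keep only (or drop) observations whose timestamp falls inside an
        // annotated interval carrying the requested label.
        String overlap =
            "SELECT 1 FROM annotations a " +
            "WHERE a.label = '" + annotationLabel + "' " +
            "AND o.ts BETWEEN a.start_ts AND a.end_ts";
        String predicate = (mode == Mode.INCLUDE ? "EXISTS (" : "NOT EXISTS (") + overlap + ")";
        return "SELECT o.ts, o.value FROM observations o " +
               "WHERE o.stream = '" + stream + "' AND " + predicate +
               " ORDER BY o.ts";
    }

    public static void main(String[] args) {
        // e.g. room-temperature readings, excluding periods annotated as power outages
        System.out.println(buildQuery("room_temperature", "power_outage", Mode.EXCLUDE));
    }
}
```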
465

Common subexpression detection in dataflow programs

Jones, Philip E. C. (Philip Ewan Crossley) January 1989 (has links) (PDF)
Processed. Bibliography: leaves 123-124.
466

Dynamic data flow analysis for object oriented programs

Cain, Andrew Angus, n/a January 2005 (has links)
There are many tools and techniques to help developers debug and test their programs. Dynamic data flow analysis is one such technique. Existing approaches for performing dynamic data flow analysis for object oriented programs have tended to be data focused and procedural in nature. An approach to dynamic data flow analysis that uses object oriented principles would provide a more natural solution to analysing object oriented programs. Dynamic data flow analysis approaches consist of two primary aspects: a model of the data flow information, and a method for collecting action information from a running program. The model for data flow analysis presented in this thesis uses a meta-level object oriented approach. To illustrate the application of this meta-level model, a model for the Java programming language is presented, providing an instantiation of the meta-level model. Finally, several methods are presented for collecting action information from Java programs. The meta-level model contains elements to represent both data items and scoping components (i.e. methods, blocks, objects, and classes). At runtime the model is used to create a representation of the executing program that is used to perform dynamic data flow analysis. The structure of the model is created in such a way that locating the appropriate meta-level entity follows the scoping rules of the language. In this way, actions that are reported to the meta-model are routed through the model to their corresponding meta-level elements. The Java model presented contains classes that can be used to create the runtime representation of the program under analysis. Events from the program under analysis are then used to update the model. Using this information, developers are able to locate where data items are incorrectly used within their programs. Methods for collecting action information from Java programs include source code instrumentation, as used in earlier approaches, as well as approaches that use Java byte code transformation and the facilities of the Java Platform Debugger Architecture. While these approaches aimed to achieve a comprehensive analysis, there are several issues that could not be resolved using the approaches covered. Of the approaches presented, byte code transformation is the most practical.
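To illustrate the kind of runtime checking such an analysis performs, the sketch below keeps a per-item state (undefined, defined, referenced) and flags suspicious action sequences as define/use/undefine events arrive from an instrumented program. The class and method names are illustrative assumptions, not the meta-level model or the Java model presented in the thesis.

```java
import java.util.*;

/** Minimal sketch of the state machine behind dynamic data flow analysis:
 *  each data item moves between states as actions arrive, and suspicious
 *  transitions (use while undefined, redefinition without use, definition
 *  never used) are reported. Illustrative only. */
public class DataFlowMonitor {
    enum State { UNDEFINED, DEFINED, REFERENCED }

    private final Map<String, State> items = new HashMap<>();

    void define(String item) {
        if (items.get(item) == State.DEFINED)
            System.out.println("warning: " + item + " redefined before being used");
        items.put(item, State.DEFINED);
    }

    void use(String item) {
        if (items.getOrDefault(item, State.UNDEFINED) == State.UNDEFINED)
            System.out.println("warning: " + item + " used while undefined");
        items.put(item, State.REFERENCED);
    }

    void undefine(String item) {             // e.g. the variable goes out of scope
        if (items.get(item) == State.DEFINED)
            System.out.println("warning: " + item + " defined but never used");
        items.put(item, State.UNDEFINED);
    }

    public static void main(String[] args) {
        DataFlowMonitor m = new DataFlowMonitor();
        m.use("x");        // use before define
        m.define("y");
        m.define("y");     // redefinition without use
        m.undefine("y");
    }
}
```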
467

Maori language integration in the age of information technology: a computational approach

Laws, Mark R., n/a January 2001 (has links)
A multidisciplinary approach involving language universals, linguistic discourse analysis and computer information technology supports the descriptive nature of this research dissertation, utilising comparative methods to determine rudimentary language structures which reflect both the scientific and historic parameters that are embedded in all languages. From a hypothesis to the proof of concept, a multitude of computer applications have been used to test these language models, templates and frameworks. The entire approach is best described as "designing then building the theoretical, experimental, and practical projects that form the structural network of the Maori language system". The focus on methods for integrating the language is to investigate shared characteristics between Maori and New Zealand English. This has provided a complete methodology for a bilingual based system: a system with text and speech for language generation and classification. This approach has looked at existing computational linguistic and information processing techniques for the analysis of each language's phenomena, where data from basic units to higher-order linguistic knowledge has been analysed in terms of its characteristics for similar and/or dissimilar features. The notion that some language units can have similar acoustic sounds, structures or even meanings in other languages is plausible; how these are identified was the key concept in building an integrated language system. This research has permitted further examination into developing a new series of phonological and lexical self-organising maps of Maori, using phoneme and word maps spatially organised around lower- to higher-order concepts such as "sounds like". To meet the high demands placed on very large data stores, a speech database management system containing phonological, phonetic, lexical, semantic, and other language frameworks was also developed further. This database has helped to examine how effectively Maori has been integrated into an existing English framework. The bilingual system will allow full interaction with a computer-based speech architecture and will contribute to the existing knowledge being constructed by the many different disciplines associated with languages, naturally or artificially derived. Evolving connectionist systems are new tools that are trained in an unsupervised manner to be both adaptable and flexible. This hybrid approach is an improvement on past methods in the development of more effective and efficient ways of solving applied problems for speech data analysis, classification, rule extraction, information retrieval and knowledge acquisition. A preliminary study applies bilingual data to an "evolving clustering method" algorithm that returns a structure containing acoustic clusters plotted using visualisation techniques. In the true practical sense, the complete bilingual system has had a bi-directional approach: both languages have undergone similar data analysis, language modelling, data access, text and speech processing, and human-computer network interface interaction.
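As a rough illustration of the clustering step mentioned above, the sketch below performs a single online pass over feature vectors, joining each sample to the nearest cluster centre within a distance threshold or starting a new cluster otherwise. It is only in the spirit of an evolving clustering method; the data, threshold and centre-update rule are assumptions, not the ECM algorithm or the bilingual acoustic data used in the study.

```java
import java.util.*;

/** Minimal sketch of online, threshold-based clustering of feature vectors:
 *  each sample joins the nearest centre if close enough (the centre moves
 *  towards it as a running mean), otherwise it seeds a new cluster. */
public class OnlineClustering {
    static double dist(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(s);
    }

    public static void main(String[] args) {
        double threshold = 1.0;                      // maximum distance to join a cluster
        List<double[]> centres = new ArrayList<>();
        List<Integer> counts = new ArrayList<>();

        double[][] samples = { {0.1, 0.2}, {0.15, 0.25}, {3.0, 3.1}, {0.2, 0.1}, {3.1, 2.9} };
        for (double[] x : samples) {
            int best = -1; double bestDist = Double.MAX_VALUE;
            for (int c = 0; c < centres.size(); c++) {
                double d = dist(x, centres.get(c));
                if (d < bestDist) { bestDist = d; best = c; }
            }
            if (best >= 0 && bestDist <= threshold) {
                int n = counts.get(best) + 1;
                double[] c = centres.get(best);
                for (int i = 0; i < c.length; i++) c[i] += (x[i] - c[i]) / n;  // running mean
                counts.set(best, n);
            } else {
                centres.add(x.clone());
                counts.add(1);
            }
        }
        System.out.println(centres.size() + " clusters found");
    }
}
```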
468

Memory management strategies to improve the space-time performance of Java programs

Yu, Ching-han. January 2006 (has links)
Thesis (Ph. D.)--University of Hong Kong, 2006. / Title proper from title frame. Also available in printed format.
469

Development of a discrete-event, object-oriented framework for network-centric simulation modeling using Java

Colvin, Kurt 21 May 1997 (has links)
The primary objective of this research is to develop a network-centric simulation modeling framework that can be used to build simulation models through the use of Internet-based resources. An object-oriented programming approach was used to build a Java-based modeling framework focused on modeling a semiconductor fabrication system. This research is an initial step in what may be a new network-centric simulation modeling methodology, where simulation models are created using software objects that are physically located in many different sites across the Internet. Once the ability to create and run a relatively simple model using a network-centric approach has been established, future research may lead to a simulation environment that not only lets a user interactively build models but also allows concurrent model development between a group of users, independent of their location, operating system, or computer architecture. The prototype system implemented as part of this research is written in the Java object-oriented programming language. A target system model is presented as an example of how the environment can be used to apply the network-centric simulation modeling methodology. / Graduation date: 1998
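For reference, the sketch below shows the discrete-event core that such a framework typically builds on: a time-ordered event queue driving arrival and departure events for lots at a single processing station. The class names, event kinds and timings are illustrative assumptions, not the Java framework developed in this research.

```java
import java.util.*;

/** Minimal sketch of a discrete-event simulation loop: events are processed
 *  in time order from a priority queue; an arrival at a single-machine
 *  station schedules the corresponding departure. Illustrative only. */
public class DiscreteEventSim {
    record Event(double time, String kind, int lot) {}

    public static void main(String[] args) {
        PriorityQueue<Event> queue = new PriorityQueue<>(Comparator.comparingDouble(Event::time));
        double processTime = 5.0;                 // fixed processing time per lot
        // three lots arrive at the station
        queue.add(new Event(0.0, "arrival", 1));
        queue.add(new Event(2.0, "arrival", 2));
        queue.add(new Event(4.0, "arrival", 3));

        double machineFreeAt = 0.0;
        while (!queue.isEmpty()) {
            Event e = queue.poll();
            if (e.kind().equals("arrival")) {
                // a lot starts when both it and the machine are available
                double start = Math.max(e.time(), machineFreeAt);
                machineFreeAt = start + processTime;
                queue.add(new Event(machineFreeAt, "departure", e.lot()));
            } else {
                System.out.printf("lot %d leaves the station at t=%.1f%n", e.lot(), e.time());
            }
        }
    }
}
```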
470

Memory management strategies to improve the space-time performance of Java programs

Yu, Ching-han. January 2006 (has links)
Thesis (Ph. D.)--University of Hong Kong, 2006. / Also available online.
