321

A study of abstract syntax notation 1: value processing

Smith, Graeme Richard January 1992 (has links)
No description available.
322

Machine learning approaches to medical decision making

Veropoulos, Konstantinos January 2001 (has links)
No description available.
323

A rapid response multilevel differential modem for narrowband mobile fading channels

Castle, Robert John January 1993 (has links)
No description available.
324

Molecular and conventional data sets and the systematics of Rhododendron L. subgenus Hymenanthes (Blume) K.Koch

Hyam, Roger January 1997 (has links)
No description available.
325

The design of a meteorological facsimile converter

Andrews, Anthony W. January 1989 (has links)
No description available.
326

Computer vision and control for autonomous robotic assembly

Wright, Stephen Michael January 1991 (has links)
No description available.
327

Systems for incoherent optical convolution with application in computed tomography

Gmitro, Arthur Frank January 1982 (has links)
This dissertation discusses a certain aspect of optical data processing--namely, the concept of performing a convolution of an incoherent optical light field with a specified processing kernel. The theory showing that an incoherent imaging system performs a convolution by the very process of imaging is reviewed. The constraints on the form of the processing kernel are discussed; the most severe is the restriction to positive real kernels. Methods for extending the versatility of incoherent systems to include bipolar and even complex kernels are described. The most promising methods are those that encode the bipolar or complex information on either a spatial or temporal carrier frequency. The dissertation includes a presentation of two systems applicable to the demodulation of the signals generated by a temporal carrier approach. One of the systems introduces the concept of bipolar detection, which may have a strong influence on the performance of incoherent optical processing systems in the future. The other system is a synergism of optical and digital components that produces a hybrid system capable of high performance. This investigation grew out of our interest in developing a computed tomography system based on film recording of the projection data. The theory of computed tomography is reviewed, and an optical processing system based in part on the hybrid approach to the filtering operation is presented. This system represents a very concrete example of the capabilities of an incoherent optical processor.
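The positive-kernel constraint and the bipolar workaround described in this abstract can be sketched with a discrete numerical analogue (a pure-Python illustration of the idea, not the optical system itself; the two-nonnegative-channel decomposition stands in for the carrier-frequency encodings the dissertation actually studies):

```python
def convolve(signal, kernel):
    """Discrete 1-D convolution ('full' mode), the operation an incoherent
    imaging system performs optically; kernel entries must be nonnegative,
    mirroring the positive-real-kernel constraint."""
    n, k = len(signal), len(kernel)
    out = [0.0] * (n + k - 1)
    for i, s in enumerate(signal):
        for j, w in enumerate(kernel):
            out[i + j] += s * w
    return out

def convolve_bipolar(signal, kernel):
    """Emulate a bipolar kernel with two nonnegative passes: split the
    kernel into its positive and negative parts, convolve each on its own
    channel, and subtract the results."""
    pos = [max(w, 0.0) for w in kernel]
    neg = [max(-w, 0.0) for w in kernel]
    a = convolve(signal, pos)
    b = convolve(signal, neg)
    return [x - y for x, y in zip(a, b)]
```

The subtraction happens electronically or digitally after the two optical passes, which is why hybrid optical/digital systems like the one the abstract describes are attractive.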
328

Optical computing in Boltzmann machines

Ticknor, Anthony James January 1987 (has links)
This dissertation covers theoretical and experimental work on applying optical processing techniques to the operation of a Boltzmann machine. A Boltzmann machine is a processor that solves a problem by iteratively optimizing an estimate of the solution. The optimization is done by finding a minimum of an energy surface over the solution space. The energy function is designed to consider not only data but also a priori information about the problem to assist the optimization. The dissertation first establishes a generic line of approach for designing an algorithmic optical computer that might successfully operate using currently realizable analog optical systems for highly parallel operations. Simulated annealing, the algorithm of the Boltzmann machine, is then shown to be adaptable to this line of approach and is chosen as the algorithm to demonstrate these concepts throughout the dissertation. The algorithm is analyzed, and optical systems are outlined that will perform the appropriate tasks within the algorithm. From this analysis and design, realizations of the optically-assisted Boltzmann machine are described, and it is shown that the optical systems can be used in these algorithmic computations to produce solutions as precise as the single-pass operations of the analog optical systems. Further considerations are discussed for increasing the usefulness of the Boltzmann machine with respect to operating on larger data sets while maintaining the full degrees of parallelism, and for increasing the speed by reducing the number of electronic-optical transducers and by utilizing more of the available parallelism. It is demonstrated how, with a little digital support, the analog optical systems can be used to produce solutions with digital precision but without compromising the speed of the optical computations.
Finally there is a short discussion as to how the Boltzmann machine may be modelled as a neuromorphic system for added insight into the computational functioning of the machine.
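The simulated annealing loop at the heart of a Boltzmann machine can be sketched in a few lines (a generic scalar toy problem, not the optical implementation; the function names and parameters are illustrative):

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, t0=1.0, cooling=0.95,
                        steps=2000, seed=0):
    """Minimize `energy` over candidate solutions: accept uphill moves
    with probability exp(-dE/T), the Boltzmann acceptance rule, while the
    temperature T is lowered geometrically."""
    rng = random.Random(seed)
    x, t = x0, t0
    best, best_e = x0, energy(x0)
    for _ in range(steps):
        y = neighbor(x, rng)
        de = energy(y) - energy(x)
        if de <= 0 or rng.random() < math.exp(-de / t):
            x = y
            if energy(x) < best_e:
                best, best_e = x, energy(x)
        t *= cooling  # geometric cooling schedule
    return best

# Toy energy surface with a single minimum at x = 3.
best = simulated_annealing(lambda x: (x - 3.0) ** 2,
                           lambda x, rng: x + rng.uniform(-0.5, 0.5),
                           x0=0.0)
```

The appeal for optics, as the abstract argues, is that the expensive inner step (evaluating the energy of many candidate states) maps naturally onto a highly parallel analog optical operation, with the acceptance logic left to electronics.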
329

UTS: A type system for facilitating data communication.

Hayes, Roger Leonard January 1989 (has links)
This dissertation presents a type scheme called UTS. The goal of UTS is to support composition between autonomous systems and programs. Composition is defined to include procedure call and message passing; it also includes command invocation and the use of stored data. The design of UTS and the principles that guided that design are discussed. The UTS type system is intended as an easily-ported pidgin language. It includes the most common scalar types, such as integer, floating point, and string; the common type constructors such as record and array; and it supports a mechanism for reference to procedures. An innovation of the type scheme is that every value, including procedure values, is tagged with a type indicator, so that it is self-describing. In order to provide a high degree of portability, to provide access to a wide variety of systems, and to support dynamic binding, UTS requires a minimum of centralized knowledge and shared data definitions. It does provide a mechanism for underspecification of types that supports flexible commands and generic procedures. UTS was originally developed as the type system for the Saguaro distributed operating system. UTS is used in Saguaro for all stored data and for procedure invocation both at the system call level and the user interface level. UTS is also used as the type system for MLP, a system that provides heterogeneous remote procedure calls. MLP is designed to minimize the cost of adding new languages while providing the ability to handle common situations easily and automatically. More complex situations can be handled by making use of routines for programmer-controlled inspection and translation of UTS values. Two implementations of MLP are described, with the changes between the versions and the rationale for those changes. The run-time systems for the two versions are also described. The use of MLP is illustrated by projects built on an MLP platform. 
The largest of these is a prototype of the Saguaro command interpreter. Another is an interface between MLP and the Emerald programming language. The dissertation ends with a summary and discussion of possibilities for future research.
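The self-describing tagged-value idea central to UTS can be illustrated with a toy encoder (the tag names and nesting scheme below are invented for illustration and are not the actual UTS representation):

```python
# Every value carries a type tag, so a receiver can interpret it without
# any shared schema or centralized data definitions -- the property the
# abstract calls "self-describing".

def encode(value):
    """Wrap a Python value in (tag, payload) pairs, recursively."""
    if isinstance(value, bool):          # bool before int: bool is an int subclass
        return ("boolean", value)
    if isinstance(value, int):
        return ("integer", value)
    if isinstance(value, float):
        return ("real", value)
    if isinstance(value, str):
        return ("string", value)
    if isinstance(value, list):
        return ("array", [encode(v) for v in value])
    if isinstance(value, dict):
        return ("record", {k: encode(v) for k, v in value.items()})
    raise TypeError(f"unsupported type: {type(value).__name__}")

def decode(tagged):
    """Recover the plain value by dispatching on the tag."""
    tag, payload = tagged
    if tag == "array":
        return [decode(v) for v in payload]
    if tag == "record":
        return {k: decode(v) for k, v in payload.items()}
    return payload
```

Because the tags travel with the data, two autonomous programs can exchange values with no prior agreement beyond the tag vocabulary itself, which is the portability property UTS was designed around.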
330

Factors affecting the reduction of narrative data

Engle, Molly Ann January 1983 (has links)
Narrative data enable evaluators to understand other people's viewpoints without predetermining those viewpoints through preselected questionnaire categories. Narrative data yield rich detail, insight, and information. However, reducing narrative data into meaningful conclusions is difficult and time consuming, and requires attention, commitment, and skill on the part of trained coders. The personal and situational characteristics of the coders (called value inertia and cognitive limitation biases) affect data reduction. The effects of coder exposure to expected project outcomes and the level of coder research methodology sophistication were investigated. Coders considered either sophisticated or naive in research methodology were exposed to positive, ambiguous, or negative project outcome expectations. The coders reduced, or categorized, 25 open-ended interview response sets into previously established positive, negative, and ambiguous statement-type, content-code categories. The effectiveness of coder training was also explored by computing generalizability (reliability) coefficients. High generalizability coefficients were found regardless of level of exposure to project outcome expectations, indicating that coders coded the same statements in the same way and could reproduce the results. Results of this study also indicate that evaluators should use sophisticated coders for the reduction of narrative data, given that option. Sophisticated coders appear more resistant to the effect of exposure to project outcome expectations, coding narrative data more positively with less variability than naive coders when exposed to positive outcome expectations.
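The idea of quantifying inter-coder consistency can be illustrated with Cohen's kappa, a simpler chance-corrected agreement statistic (a stand-in for this purpose only; the study itself used generalizability coefficients, which are computed differently):

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders who assigned
    categories to the same sequence of items."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    # Observed proportion of items coded identically.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Agreement expected by chance from each coder's marginal frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)

# Two coders assigning the study's statement-type categories to four items.
k = cohens_kappa(["pos", "pos", "neg", "neg"],
                 ["pos", "neg", "neg", "neg"])
```

A value of 1 means perfect agreement, 0 means no better than chance; the high reliability the study reports corresponds to coefficients near the top of that range.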
