121

An investigation on the applicability of multi-microprocessing in the two dimensional digital filtering problem

Whitcher, Timothy J. January 1980
Thesis (M.S.)--Rochester Institute of Technology, 1980. / Typescript.
122

Low complexity, narrow baseline beamformer for hand-held devices

Kale, Kaustubh R. January 2003
Thesis (M.S.)--University of Florida, 2003. / Title from title page of source document. Includes vita. Includes bibliographical references.
123

Characterization of spatial and temporal brain activation patterns in functional magnetic resonance imaging data

Lee, Jae-Min. January 2005
Thesis (Ph. D.)--University of Florida, 2005. / Title from title page of source document. Document formatted into pages; contains 123 pages. Includes vita. Includes bibliographical references.
124

Blind separation of convolutive mixtures using Rényi's divergence

Hild, Kenneth E. January 2003
Thesis (Ph. D.)--University of Florida, 2003. / Title from title page of source document. Includes vita. Includes bibliographical references.
125

A computer-aided design scheme for drainage and runoff systems

Battle, Timothy P. January 1985
A computer-aided design scheme for both man-made and natural runoff systems is presented. The model uses linear programming to solve the Muskingum routing equations through a drainage system, and provides design information through post-optimality (sensitivity) analysis. With the objective of minimizing the peak outflow from the system, and with hydrograph ordinates as the decision variables, the output of the linear programming analysis shows the extent to which each flow ordinate at every node in the network influences the peak flow at some downstream location. This information can help the user reach an efficient design more quickly, that is, one which either minimizes construction costs or reduces the potential risk of flood damage. / Faculty of Applied Science / Department of Civil Engineering / Graduate
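As a rough illustration of the formulation this abstract describes, the toy below (my own sketch, not the thesis's model) routes a single reach with SciPy's linprog, minimizing a peak variable that bounds every outflow ordinate from above. In this reduced form the routing equalities fully determine the outflow, so the LP simply recovers the routed hydrograph; the dual values on the equalities stand in for the post-optimality (sensitivity) information the abstract mentions. The inflow series and parameter values are invented.

```python
# Minimal sketch: Muskingum routing of one reach cast as a linear
# program, assuming SciPy. Not the thesis's actual model.
import numpy as np
from scipy.optimize import linprog

I = np.array([10.0, 30.0, 60.0, 45.0, 25.0, 12.0])  # hypothetical inflow (m^3/s)
K, X, dt = 2.0, 0.2, 1.0  # storage constant, weighting factor, time step

# Standard Muskingum coefficients: O[t] = c0*I[t] + c1*I[t-1] + c2*O[t-1]
d = K - K * X + dt / 2
c0 = (dt / 2 - K * X) / d
c1 = (dt / 2 + K * X) / d
c2 = (K - K * X - dt / 2) / d

n = len(I)
# Decision variables: outflow ordinates O[0..n-1] plus a peak variable P.
cost = np.zeros(n + 1)
cost[-1] = 1.0  # objective: minimize P

# Equality constraints encode the routing recursion (with O[0] = I[0]).
A_eq = np.zeros((n, n + 1))
b_eq = np.zeros(n)
A_eq[0, 0], b_eq[0] = 1.0, I[0]
for t in range(1, n):
    A_eq[t, t] = 1.0
    A_eq[t, t - 1] = -c2
    b_eq[t] = c0 * I[t] + c1 * I[t - 1]

# Inequalities O[t] - P <= 0 make P an upper bound on every ordinate.
A_ub = np.zeros((n, n + 1))
A_ub[:, :n] = np.eye(n)
A_ub[:, -1] = -1.0
b_ub = np.zeros(n)

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (n + 1), method="highs")
print("peak outflow:", res.x[-1])
# Duals on the routing equalities: how much the peak would change per
# unit change in each ordinate's balance, i.e. sensitivity information.
print("duals:", res.eqlin.marginals)
```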
126

Reliable client-server communication in distributed programs

Ravindran, K. January 1987
Remote procedure call (RPC) and the shared variable are communication abstractions which allow the various processes of a distributed program, often modelled as clients and servers, to communicate with one another across machine boundaries. A key requirement of the abstractions is to mask the machine and communication failures that may occur during client-server communication. In practice, many distributed applications can inherently tolerate failures in certain situations. If such application-layer information is available to the client-server communication layer (RPC and shared variable), the failure-masking algorithms in the communication layer can relax the constraints under which they would otherwise have to operate. The relaxation significantly simplifies the algorithms and the underlying message transport layer, and allows formulation of efficient algorithms. This application-driven approach forms the backbone of the failure-masking techniques described in the thesis, as outlined below.

Orphan handling in RPCs: Using the application-driven approach, the thesis introduces a new technique of adopting the orphans caused by failures during RPCs. Adoption is preferable to orphan killing because killing wastes any work already completed and requires rollback, which may be expensive and sometimes not meaningful. The thesis incorporates orphan adoption into two schemes of replicating a server: i) a primary-secondary scheme, in which one of the replicas of the server acts as the primary and executes RPCs from clients while the other replicas stand by as secondaries; when the primary fails, one of the secondaries becomes the primary, restarts the server execution from the most recent checkpoint, and adopts the orphan; ii) a replicated execution scheme, in which an RPC on the server is executed by more than one replica of the server; when any of the replicas fails, the orphan generated by the failure is adopted by the surviving replicas. Both schemes employ call re-execution by servers, based on the application-level idempotency properties of the calls.

Access to shared variables: Contemporary distributed programs deal with a new class of shared variables, such as information on name bindings, distributed load, and leadership within a service group. Since the consistency constraints on such system variables need not be as strong as those for user data, the access operations on the variables may be made simpler using this application-layer information. Along this direction, the thesis introduces an abstraction, called the application-driven shared variable, to govern access operations on the variables. The algorithms for the access operations on a variable use intra-server-group communication and enforce consistency of the variable only to the extent required by the application. The thesis describes complete communication models incorporating the application-driven approach to mask failures. / Faculty of Science / Department of Computer Science / Graduate
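As a loose illustration of the orphan-adoption idea, here is a minimal sketch with invented names and none of the thesis's actual protocol machinery: each replica checkpoints completed calls by call id, so when the primary fails, the new primary can adopt and re-execute an orphaned call without rollback, relying on the application-level idempotency the abstract assumes.

```python
# Toy primary-secondary failover with orphan adoption via idempotent
# re-execution. All class and method names are invented for this sketch.

class Replica:
    def __init__(self, name):
        self.name = name
        self.checkpoint = {}  # call_id -> result of completed calls

    def execute(self, call_id, op, arg):
        # Idempotent re-execution: a repeated call id is served from
        # the checkpoint rather than recomputed, so adopting and
        # re-running an orphan never performs its work twice.
        if call_id not in self.checkpoint:
            self.checkpoint[call_id] = op(arg)
        return self.checkpoint[call_id]


class PrimarySecondaryGroup:
    """Primary executes RPCs; secondaries stand by with the checkpoint."""

    def __init__(self, *names):
        self.replicas = [Replica(n) for n in names]

    def rpc(self, call_id, op, arg):
        return self.replicas[0].execute(call_id, op, arg)

    def fail_primary(self):
        # Simplification: the successor inherits the crashed primary's
        # most recent checkpoint (in a real system it would already
        # hold a replicated copy). In-flight calls become orphans that
        # the client's retry re-executes on the new primary.
        crashed = self.replicas.pop(0)
        self.replicas[0].checkpoint = dict(crashed.checkpoint)


group = PrimarySecondaryGroup("primary", "secondary")
print(group.rpc("call-1", lambda x: x + 1, 41))  # 42, run by the primary
group.fail_primary()
# The client stub retries with the same call id; the new primary adopts
# the orphan and replays it from the checkpoint, with no rollback.
print(group.rpc("call-1", lambda x: x + 1, 41))  # 42 again
```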
127

A direct temporal domain approach for ultrafast optical signal processing and its implementation using planar lightwave circuits

Xia, Bing. January 2006
No description available.
128

Use of a priori information for improved tomographic imaging in coded-aperture systems

Gindi, Gene Robert. January 1982
Coded-aperture imaging offers a method of classical tomographic imaging by encoding the distance of a point from the detector by the lateral scale of the point response function. An estimate, termed a layergram, of the transverse sections of the object can be obtained by performing a simple correlation operation on the detector data. The estimate of one transverse plane contains artifacts contributed by source points from all other planes. These artifacts can be partially removed by a nonlinear algorithm which incorporates a priori knowledge of total integrated object activity per transverse plane, positivity of the quantity being measured, and lateral extent of the object in each plane. The algorithm is iterative and contains, at each step, a linear operation followed by the imposition of a constraint. The use of this class of algorithms is tested by simulating a coded-aperture imaging situation using a one-dimensional code and two-dimensional (one axis perpendicular to aperture) object. Results show nearly perfect reconstructions in noise-free cases for the codes tested. If finite detector resolution and Poisson source noise are taken into account, the reconstructions are still significantly improved relative to the layergram. The algorithm lends itself to implementation on an optical-digital hybrid computer. The problems inherent in a prototype device are characterized and results of its performance are presented.
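A toy one-dimensional rendition of the iterate-then-constrain loop this abstract outlines: a linear correlation step nudges the estimate toward agreement with the coded detector data, and each pass then re-imposes the a priori constraints (positivity, known lateral extent, known total activity). This is my own sketch under those stated assumptions, not the dissertation's implementation.

```python
# Landweber-style linear step plus constraint projection, as a stand-in
# for the abstract's "linear operation followed by the imposition of a
# constraint". Code, object, and support are invented test data.
import numpy as np

rng = np.random.default_rng(0)
n = 64
code = rng.integers(0, 2, size=n).astype(float)  # 1-D coded aperture
C = np.fft.fft(code)

truth = np.zeros(n)
truth[20:28] = 1.0                               # object on known support
support = np.zeros(n, dtype=bool)                # known lateral extent
support[15:35] = True
activity = truth.sum()                           # known total activity

def encode(x):
    """Detector data: circular convolution of the object with the code."""
    return np.real(np.fft.ifft(np.fft.fft(x) * C))

def correlate(d):
    """Correlation decoding, the layergram-style linear operation."""
    return np.real(np.fft.ifft(np.fft.fft(d) * np.conj(C)))

data = encode(truth)
mu = 1.0 / (np.abs(C) ** 2).max()                # safe step size

est = np.zeros(n)
for _ in range(500):
    est = est + mu * correlate(data - encode(est))  # linear operation
    est = np.clip(est, 0.0, None)                   # positivity
    est[~support] = 0.0                             # lateral extent
    if est.sum() > 0:
        est *= activity / est.sum()                 # total activity
print("max abs error:", np.abs(est - truth).max())
```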
129

A case study of flexible distributed processing system in copier development (prototype, driver, protocol)

Nguyen, Thuyen Dinh. January 1986
No description available.
130

The development and application of informatics-based systems for the analysis of the human transcriptome.

Kelso, Janet. January 2003
<p>Despite the fact that the sequence of the human genome is now complete it has become clear that the elucidation of the transcriptome is more complicated than previously expected. There is mounting evidence for unexpected and previously underestimated phenomena such as alternative splicing in the transcriptome. As a result, the identification of novel transcripts arising from the genome continues. Furthermore, as the volume of transcript data grows it is becoming increasingly difficult to integrate expression information which is from different sources, is stored in disparate locations, and is described using differing terminologies. Determining the function of translated transcripts also remains a complex task. Information about the expression profile &ndash / the location and timing of transcript expression &ndash / provides evidence that can be used in understanding the role of the expressed transcript in the organ or tissue under study, or in developmental pathways or disease phenotype observed.<br /> <br /> In this dissertation I present novel computational approaches with direct biological applications to two distinct but increasingly important areas of research in gene expression research. The first addresses detection and characterisation of alternatively spliced transcripts. The second is the construction of an hierarchical controlled vocabulary for gene expression data and the annotation of expression libraries with controlled terms from the hierarchies. In the final chapter the biological questions that can be approached, and the discoveries that can be made using these systems are illustrated with a view to demonstrating how the application of informatics can both enable and accelerate biological insight into the human transcriptome.</p>
