171

Gestion d'information sur les procédés thermiques par base de données [Management of thermal-process information using a database]

Gagnon, Bertrand. January 1986
No description available.
172

A Computer Method to Evaluate the Vibration Response of Stiffened Floor Systems to Impact Load

Donahue, Michael E. 01 January 1982
Composite floor systems with steel-joist or I-beam stiffeners supporting a concrete slab have a tendency to sustain perceptible vibration from small impacts. An engineering design aid, using a small microcomputer and a BASIC computer program, was developed to analyze rectangular floor systems for susceptibility to vibrations perceptible to humans. The analysis of the normal modes of vibration by the Rayleigh-Ritz method, and the superposition of the normal-mode responses, is derived. A vibration index is calculated to rate the floor qualitatively. The computer method was validated against published data for an I-beam-supported concrete slab, and a design application for a steel-joist-supported floor is demonstrated using a representative human-induced impact load.
173

A software system for laptop performance and improvisation /

Zadel, Mark. January 2006
No description available.
174

Effects of automated cartographic generalization on linear map features

Young, John A. 04 December 2009
The process of automated cartographic generalization is critically reviewed, and methods developed for its implementation and analysis are discussed. The relationship of automated generalization to manual cartographic methods and feature representation is analyzed, and it is suggested that the nature of linear-feature representation on maps be considered when evaluating the effectiveness of automated generalization. The development of a computer platform for evaluating linear generalization algorithms is described, and three studies that make use of the platform are discussed. The performance of five simplification algorithms is compared with that of a random simplification algorithm; in most cases tested, the five algorithms performed better than random. An analysis of the stability of fractal dimension estimated on simplified lines suggests that fractal dimension is a poor guide for linear simplification because of its instability in measurement. An examination of the effect of generalization on linear features represented by contoured topography and paired stream-bank lines, using measurements of slope on contour lines and width on stream lines, shows that automated generalization affects linear-feature representations. Guidelines for applying linear generalization algorithms are suggested, and needs and directions for future research are discussed. / Master of Science
175

Purchase selection of a major computer system in Hong Kong: considerations and marketing implications

聶禮明, Nip, Lai-ming. January 1988
published_or_final_version / Business Administration / Master / Master of Business Administration
176

An optimizing code generator generator.

Wendt, Alan Lee. January 1989
This dissertation describes a system that constructs efficient, retargetable code generators and optimizers. chop reads nonprocedural descriptions of a computer's instruction set and of a naive code generator for the computer, and it writes an integrated code generator and peephole optimizer for it. The resulting code generators are very efficient because they interpret no tables; they are completely hard-coded. Nor do they build complex data structures to communicate between code generation and optimization phases. Interphase communication is reduced to the point that the code generator's output is often encoded in the program counter and conveyed to the optimizer by jumping to the right label. chop's code generator and optimizer are based on a very simple formalism, namely rewriting rules. An instrumented version of the compiler infers the optimization rules as it compiles a training suite, and it records them for translation into hard code and inclusion in the production version. I have replaced the Portable C Compiler's code generator with one generated by chop. Despite a costly interface, the resulting compiler runs 30% to 50% faster than the original Portable C Compiler (pcc) and generates comparable code. This figure is diluted by shared lexical analysis, parsing, and semantic analysis and by comparable code emission. Allowing for these, the new code generator appears to run approximately seven times faster than that of the original pcc.
177

The design of special purpose finite element packages

Butterfield, David. January 1984
No description available.
178

QoS-enabled integration of wireless sensor networks and the internet

AlMaharmeh, Bassam T. 09 1900
Recent developments in sensor networking for both military and civilian applications have emphasized the need for reliable integration of sensor networks with the Internet. For sensor networks deployed in military applications, it is important that collected information be delivered as quickly as possible, with minimum delay. Information collected by sensor networks has different priority levels, so QoS profiles must be provided in accordance with those priorities. In this study, an integration module is proposed whose objective is to provide preferential service to high-priority traffic. The integration process consists of three phases: registration, control, and monitoring. These phases are conducted by three software components: the registration service manager (RSM), the QoS control manager (QCM), and the network monitor manager (NMM). The three components run on a stand-alone laptop and together form the integration controller (IC), the core of the integration module.
179

A novel approach to modeling tunnel junction diodes using Silvaco Atlas software

Gelinas, Robert J. 12 1900
This thesis investigates the ability to model a tunnel junction device using the ATLAS device simulator by Silvaco International. The tunnel junction is a critical component of a multijunction solar cell, and this thesis concentrates on simulating the tunnel junction for application as part of a multijunction solar cell. Several methods are tried in the ATLAS device simulator to produce a model of the tunnel junction that can later be used in designing multijunction devices: reviewing past work, attempting to modify past work for the current design, producing a new tunnel junction simulation from the ground up, and reviewing simulations of similar devices to learn whether they can be modified into a working tunnel junction model.
180

Analysis of the reliability disparity and reliability growth analysis of a combat system using AMSAA extended reliability growth models

Er, Kim Hua. 12 1900
The first part of this thesis identifies and analyzes which aspects of the MIL-HDBK-217 prediction model cause the large variation between predicted and field reliability. The key finding of the literature research is that the main reason for the inaccuracy is that the constant-failure-rate assumption used in MIL-HDBK-217 is usually not applicable. Secondly, even when the constant-failure-rate assumption is applicable, the disparity may still exist in the presence of design- and quality-related problems in new systems. A possible solution is to apply reliability growth testing (RGT) to new systems during the development phase in an attempt to remove these design deficiencies so that the system's reliability grows toward the predicted value. In view of the importance of RGT in minimizing the disparity, this thesis provides a detailed application of the AMSAA Extended Reliability Growth Models to the reliability growth analysis of a combat system. It shows how program managers can analyze test data using commercial software to estimate the system's demonstrated reliability and the increase in reliability due to delayed fixes.
