About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
351

Problems concerning the design of an electronic instrument equivalent to a pipe organ

Kent, Earle L. (Earle Lewis), 1910-1994. January 2011 (has links)
Typescript, etc. / Digitized by Kansas State University Libraries
352

Design and construction of mass spectrograph components

Hartke, Jerome Luther. January 1956 (has links)
Call number: LD2668 .T4 1956 H38 / Master of Science
353

A proportional counter spectrometer

Cox, Eugene B. January 1953 (has links)
Call number: LD2668 .T4 1953 C69 / Master of Science
354

Accounting for financial instruments in corporate treasuries

Mulder, Ignatius Jacobus 06 1900 (has links)
The purpose of this thesis is to demonstrate the need for an enhanced accounting methodology for financial instruments traded in the global financial markets. The thesis proposes an accounting framework within which the value-at-risk of financial instruments can be disclosed in the financial statements of enterprises. It considers accounting developments in recent years and analyses the latest proposals of international accounting bodies, and it furthermore contemplates the requirements of the Bank for International Settlements in terms of capital adequacy and value-at-risk. To provide a meaningful analysis of financial instruments, the various market risks pertaining to their accounting are discussed and considered in terms of their application to the underlying business of the enterprise. Extensive analysis is done of valuation techniques and the mathematical concepts of value-at-risk; in this regard the pioneering work of Professor Philippe Jorion of the University of California is used to illustrate the application of value-at-risk. The objective of this comprehensive analysis is to suggest a meaningful method of accounting for risk exposures in financial instruments and to ensure greater transparency in disclosure. The thesis follows the guidelines proposed by the International Accounting Standards Committee in terms of recognition (definitions), measurement (valuation), presentation (classification) and disclosure (terms, conditions and accounting policies) of financial instruments. Consideration is also given to global accounting harmonisation and a number of accounting concerns that are presently unresolved, including certain hedging issues and the differences between accrual accounting and fair value accounting. Disclosure requirements are analysed in detail, especially in respect of value-at-risk accounting. Finally, the thesis illustrates the significant growth of products and instruments in the financial markets and the severe financial impact this growth has had in terms of global capital and global financial losses. / Financial Accounting / DCom (Applied Accountancy)
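The value-at-risk measure discussed in the abstract above is commonly computed, in the parametric (variance-covariance) form popularized by Jorion, as the position value times its return volatility scaled to the holding period, times a normal quantile at the chosen confidence level. The sketch below only illustrates that formula; the position size, volatility, horizon and confidence level are invented for the example and are not taken from the thesis.

```python
from math import sqrt
from statistics import NormalDist

def parametric_var(position_value, daily_volatility, horizon_days=10, confidence=0.99):
    """Parametric (variance-covariance) value-at-risk for a single position."""
    z = NormalDist().inv_cdf(confidence)          # one-sided quantile of N(0, 1)
    return position_value * daily_volatility * sqrt(horizon_days) * z

# Hypothetical example: a 100m position, 1% daily volatility,
# 10-day horizon, 99% confidence level.
print(f"{parametric_var(100_000_000, 0.01):,.0f}")   # about 7.36 million
```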
355

Graphical Programming and Implementation of the NI-7962 and NI-5781 FPGA Interface

Al-Daghestani, Anas, AlKassem, Mahinour January 2016 (has links)
FPGA systems have a wide variety of applications within electrical engineering, product development, and prototyping. Their flexibility, low cost, and high performance have propelled them into the market with results that exceeded many expectations. National Instruments offers several software and hardware products that integrate FPGA systems into their design and implementation. In this thesis work, an NI FPGA system is used together with LabVIEW myRIO 2014 software to run graphical FPGA code, identifying best practices for using the software and hardware of National Instruments FPGA interfaces and comparing different methods for programming, communication, and data conversion of the FPGA interfaces.
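Beyond the graphical (LabVIEW) side, part of working with such an interface is converting the raw fixed-point samples delivered by the FPGA into engineering units on the host. The snippet below is a generic, hypothetical sketch of that data-conversion step; the 14-bit code width and 2 Vpp full-scale range are illustrative assumptions, not specifications taken from the thesis or from NI-5781 documentation.

```python
import numpy as np

def codes_to_volts(raw_codes, bits=14, full_scale_vpp=2.0):
    """Convert raw signed fixed-point ADC codes read from an FPGA FIFO into volts.

    The bit width and full-scale range are placeholder assumptions for
    illustration only.
    """
    codes = np.asarray(raw_codes, dtype=np.int64)
    lsb = full_scale_vpp / (2 ** bits)            # volts per code
    return codes * lsb

# e.g. a few signed codes as they might arrive from a DMA FIFO:
print(codes_to_volts([-8192, 0, 8191]))           # ~ [-1.0, 0.0, 0.99988]
```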
356

Flexural rigidity of nickel-titanium instruments

Ho, Wing-lam, 何潁琳. January 2003 (has links)
published_or_final_version / Dentistry / Master / Master of Dental Surgery
357

A study of magnetic spectrograph measurements of muons at sea-level

柳啓瑞, Lau, Kai-shui. January 1973 (has links)
published_or_final_version / Physics / Doctoral / Doctor of Philosophy
358

Low-cycle fatigue of NiTi rotary instruments in hypochlorite

Abduljabbar, Fouad Abdulbaky. January 2009 (has links)
published_or_final_version / Endodontics / Master / Master of Dental Surgery
359

OPTICS FOR LARGE TELESCOPE.

WAN, DER-SHEN. January 1987 (has links)
There are two topics in this dissertation: one is to develop new phase-reduction algorithms for test interferograms, especially of large optics, and the other is to find a more accurate analytical expression for the surface deflection due to gravity when a mirror is supported in the axial direction. Two new algorithms for generating phase maps from interferograms are developed. Both methods are sensitive to small-scale as well as large-scale surface errors. The first method is designed to generate phase from an interferogram that is sampled and digitized only along fringe centers, as in the case of manual digitization. A new interpolation algorithm uses the digitized data more efficiently than the fitting of Zernike polynomials, so the new method can detect small-scale surface errors better than Zernike polynomial fitting. The second algorithm is an automatic phase-reduction process that works on test interferograms recorded by a CCD camera and transferred digitally to a personal computer through a frame grabber. The interferogram results from interference of the test wavefront with a tilted reference wavefront. Phase is generated by assuming it to be proportional to the intensity of the interferogram, apart from changes of sign and offset occurring every half fringe so as to make the phase increase monotonically. The error of the new algorithm is less than 1/20 wave in the wavefront, and it can be reduced further by averaging several phase maps generated from interferograms with random phase shifts. The new algorithm is quick and involves no smoothing, so it can detect surface errors on large mirrors on a scale of several centimeters. A new model is developed to calculate analytically the surface deflection of a mirror supported axially on multiple points. It is based on thin-plate theory, but thickness variation of a curved mirror, lightweight honeycomb structure, and shear are also taken into account. These additions improve the accuracy of the calculated surface deflection, giving results close to those obtained from the accurate but computer-intensive finite-element model.
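The half-fringe idea described in this abstract can be sketched for a one-dimensional fringe trace: treat the normalized intensity as the cosine of the phase, fold it with an arccosine, and flip the fold direction at each extremum so that the recovered phase increases monotonically. The code below is only an illustrative reconstruction of that idea under simplifying assumptions (full-contrast fringes, a phase that increases monotonically because of the tilted reference); it is not the dissertation's implementation.

```python
import numpy as np

def unfold_phase(intensity):
    """Recover a monotonically increasing phase from a 1-D fringe trace."""
    i = np.asarray(intensity, dtype=float)
    cos_phi = 2.0 * (i - i.min()) / (i.max() - i.min()) - 1.0   # normalize to [-1, 1]
    w = np.arccos(np.clip(cos_phi, -1.0, 1.0))                  # phase folded into [0, pi]

    phase = np.empty_like(w)
    n = 0                          # number of half-fringe folds encountered so far
    phase[0] = w[0]
    for k in range(1, len(w)):
        if n % 2 == 0 and w[k] < w[k - 1]:      # folded back at an odd multiple of pi
            n += 1
        elif n % 2 == 1 and w[k] > w[k - 1]:    # folded back at an even multiple of pi
            n += 1
        phase[k] = n * np.pi + w[k] if n % 2 == 0 else (n + 1) * np.pi - w[k]
    return phase

# Check against a synthetic tilted-fringe pattern (about 8 fringes).
x = np.linspace(0.0, 1.0, 2000)
true_phase = 2.0 * np.pi * 8.0 * x + 0.3 * np.sin(2.0 * np.pi * x)
fringes = 0.5 + 0.5 * np.cos(true_phase)
print(np.max(np.abs(unfold_phase(fringes) - true_phase)))   # small residual from sampling
```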
360

ANALYSIS OF ALIGNMENT AND SURFACE FIGURE ERRORS IN OPTICAL SYSTEMS.

SHU, KER-LI. January 1982 (has links)
The effects of alignment and surface figure errors, and their mutual compensation in optical systems, are analyzed based on computer simulations with exact ray-tracing data. These effects are included in the prediction of system performance and in the testing of optics, and several simple systems are used as examples. In the prediction of system performance, a Ritchey-Chrétien telescope and a Reflaxicon system are studied; an alignment can be found that compensates certain surface figure errors, which allows larger surface figure errors to be tolerated in the system. In the testing of optics, a method to separate figure errors from alignment-error contributions is discussed, and an off-axis test configuration, the Ritchey-Common test, is studied thoroughly. A figure-design approach is suggested and compared with other approaches for reducing the measured wavefront data in the Ritchey-Common test.
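One way to picture the separation of alignment contributions from figure error mentioned in this abstract: small misalignments of an optical system show up in the measured wavefront mainly as low-order terms (tilt, defocus, coma), so fitting and subtracting those terms leaves an estimate of the surface-figure error. The sketch below illustrates that general idea with a least-squares fit of Zernike-style terms; it is not the specific procedure developed in the dissertation.

```python
import numpy as np

def remove_alignment_terms(wavefront, x, y):
    """Fit and subtract alignment-driven low-order terms from a wavefront map.

    x, y are normalized pupil coordinates; wavefront holds the measured OPD
    at those points. Returns the fitted coefficients and the residual map,
    which serves as an estimate of the surface-figure error.
    """
    r2 = x**2 + y**2
    # Columns: piston, x-tilt, y-tilt, defocus, x-coma, y-coma (unnormalized Zernike forms).
    basis = np.column_stack([
        np.ones_like(x),
        x,
        y,
        2.0 * r2 - 1.0,
        (3.0 * r2 - 2.0) * x,
        (3.0 * r2 - 2.0) * y,
    ])
    coeffs, *_ = np.linalg.lstsq(basis, wavefront, rcond=None)
    return coeffs, wavefront - basis @ coeffs

# Usage on synthetic data: coma-like misalignment term plus a small figure ripple.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 500)
rho = np.sqrt(rng.uniform(0, 1, 500))
x, y = rho * np.cos(theta), rho * np.sin(theta)
wf = 0.2 * (3 * (x**2 + y**2) - 2) * x + 0.01 * np.sin(12 * x)
coeffs, figure_estimate = remove_alignment_terms(wf, x, y)
```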
