About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
61

Photometric observations of auroral oscillations

Singh, S. January 1980 (has links)
No description available.
62

Assessing materials for use as outdoor insulation

Watson, J. F. January 1980 (has links)
No description available.
63

Design methods for production machinery companies

Gouvinhas, Reidson Pereira January 1998 (has links)
No description available.
64

Structure and activity of the amide group : conformational and stereoelectronic effects on biological and chemical activity

Lewis, Richard J. January 1990 (has links)
No description available.
65

Active isolation of vibration

McKinnell, Robert James January 1989 (has links)
No description available.
66

A non-parametric test procedure based on range statistics to identify causes of non-normality in speculative price change distributions.

Abrahamson, Allen Arnold. January 1982 (has links)
Most models of asset pricing or market equilibrium require the assumption of stationary price change generation; that is, the mean and/or variance of the price change is hypothesized to be constant over time. On the other hand, the widely accepted models of speculative price change generation, such as the subordinated stochastic process models, have their basis in mixtures of random variables. These mixtures, or compositions, define non-stationary, non-Normally distributed forms; therefore, the models based on mixtures cannot be reconciled with the requirement of stationarity. A contaminated process, such as that suggested by Mandelbrot, implies a continuously changing mean and/or variance. However, an alternative concept of mixture exists which is consistent with models requiring stationary moments. This process is referred to as slippage. Slippage defines a state in which moments are constant for intervals of time but do change value. If speculative price changes were found to be characterized by slippage, rather than by contamination, that finding would still be consistent with the empirical distributions of price changes. More importantly, slippage would meet the requirement of stationarity imposed by the capital market and options models. This work advances a methodology that discriminates between contamination-based and slippage-based non-stationarity in speculative price changes. Such a technique is necessary inasmuch as curve fitting or estimation of moments cannot make this discrimination. The technique employs non-parametric range estimators; any given form of non-Normality induces an identifiable pattern of bias on these estimators. Once the pattern induced by a time series of price changes is identified, it indicates whether contamination or, alternatively, slippage generated the series. Owing to the composition and technique of the procedure developed here, it is referred to as a "Range Spectrum." The results find that stocks do display contamination, as hypothesized by the subordinated stochastic models. A broad-based index of price changes, however, displays the characteristics of slippage. This quality has implications for, and suggests further research in, diversification, securities and options pricing, and market timing.
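
As a rough illustration of the distinction the abstract draws, the sketch below simulates a contaminated series, in which the variance is redrawn for every observation, alongside a slippage series, in which the variance is constant over long regimes, and compares a simple range-based statistic over non-overlapping windows. This is not the thesis's "Range Spectrum"; the window length, variance levels, and the studentized-range statistic are illustrative assumptions only.

```python
# Minimal sketch (not the thesis's "Range Spectrum"): simulate the two kinds of
# non-stationarity contrasted in the abstract and compare a range-based statistic.
# Window length, variance levels, and the statistic itself are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n, window = 5000, 50

# Contamination: the variance is redrawn for every observation (continuous mixture).
sigma_contam = rng.choice([0.5, 2.0], size=n)
contaminated = rng.normal(0.0, sigma_contam)

# Slippage: the variance is constant within long regimes and jumps between them.
regime_len = 500
sigma_slip = np.repeat(rng.choice([0.5, 2.0], size=n // regime_len), regime_len)
slippage = rng.normal(0.0, sigma_slip)

def studentized_ranges(x, w):
    """Range divided by sample std dev over non-overlapping windows of length w."""
    blocks = x[: len(x) // w * w].reshape(-1, w)
    return (blocks.max(axis=1) - blocks.min(axis=1)) / blocks.std(axis=1, ddof=1)

for name, series in [("contaminated", contaminated), ("slippage", slippage)]:
    q = studentized_ranges(series, window)
    print(f"{name:13s} mean studentized range {q.mean():.3f}  spread {q.std():.3f}")
```

Under contamination every window mixes variance levels, so the windowed ranges behave like those of a heavy-tailed sample; under slippage most windows fall inside a single regime and the statistic stays close to its Normal-sample behaviour, which is the kind of pattern a range-based diagnostic can exploit.
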
67

Complex seismic sources and time-dependent moment tensor inversion.

Kim, Junkyoung. January 1989 (has links)
There are many examples of earthquakes whose surface expressions are much more complicated than the seismologically derived faulting models. Seismologists have also found seismic source complexity, and improved seismicity data have shown that rupture may occur on irregular or multiple shear surfaces. To simultaneously map both the geometrical and the temporal variation of seismic sources with a complex rupture history from observed seismograms, it is possible to use a time-dependent moment tensor (TDMT) inversion. The TDMT inversion algorithm has been tested with three synthetic data examples of varying complexity. The first example demonstrates that a multiple source with no focal depth change can be recovered and that the source parameters of each of the subevents can be accurately determined. In the second case the depth was allowed to vary between subevents (9 km and 13 km source depths, respectively); the two subevents can be identified on the basis of a simultaneous shape change of the moment tensor elements, along with non-causality and the size of the CLVD component. The third example introduced source complexity by having two subevents that overlapped in time; the overlapping period could be seen in the moment tensor elements as unusually abrupt changes in the time-function shape. The TDMT inversion was also performed on long-period body waves for three earthquakes: the 1982 Yemen earthquake, the 1971 San Fernando earthquake, and the 1952 Kern County earthquake. The Yemen earthquake was mapped as two simple, normal-slip subevents (with the second subevent beginning 5 seconds after the first) without a significant component of left- or right-lateral displacement or source depth change. The San Fernando earthquake is interpreted as two shear dislocation sources with changing source depths, possibly indicating upward rupture propagation (from 13 km to 7 km). The TDMT inversion for the Kern County earthquake is likewise interpreted as a double point source that propagates upward, from 20 km to 5 km. Finally, the moment tensor functions obtained by inverting synthetic waveforms combining isotropic and tectonic-release sources demonstrate that the tectonic release associated with an underground nuclear explosion can be separated and identified if the source depths of the explosion and the tectonic release differ.
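
As a minimal sketch of the linear-algebra core behind a time-dependent moment tensor inversion, the example below builds synthetic data from two overlapping subevents and recovers the six moment tensor time functions by least squares at each time sample. This is a heavily simplified, hypothetical setup: the excitation matrix, channel counts, and Gaussian subevent pulses are illustrative assumptions, and the convolution with Green's functions used in real inversions is ignored.

```python
# Minimal sketch of the linear-algebra core of a time-dependent moment tensor
# inversion: at each time sample the data d(t) are modelled as G @ m(t), where G maps
# the six independent moment tensor elements to the recorded channels. Real inversions
# convolve Green's functions with the source time history; here G is a fixed synthetic
# matrix, purely for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_mt, n_times = 12, 6, 200

G = rng.normal(size=(n_channels, n_mt))          # synthetic excitation matrix

# A "complex" source: two overlapping subevents with different mechanisms.
t = np.arange(n_times)
pulse1 = np.exp(-0.5 * ((t - 60) / 10.0) ** 2)
pulse2 = np.exp(-0.5 * ((t - 90) / 10.0) ** 2)
mech1 = rng.normal(size=n_mt)
mech2 = rng.normal(size=n_mt)
m_true = np.outer(mech1, pulse1) + np.outer(mech2, pulse2)      # (6, n_times)

d = G @ m_true + 0.05 * rng.normal(size=(n_channels, n_times))  # noisy synthetics

# Least-squares recovery of the moment tensor time functions, sample by sample.
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)

print("max recovery error:", np.abs(m_est - m_true).max())
```

Abrupt changes in the recovered time functions, as in the third synthetic case described above, are the kind of signature a TDMT analysis can use to identify overlapping subevents.
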
68

The application of the jackknife in geostatistical resource estimation: Robust estimator and its measure of uncertainty.

Adisoma, Gatut Suryoprapto January 1993 (has links)
The application of the jackknife in geostatistical resource estimation (in conjunction with kriging) is shown to yield two significant contributions. The first is a robust new estimator, called jackknife kriging, which retains ordinary kriging's simplicity and global unbiasedness while at the same time reducing its local bias and oversmoothing tendency. The second contribution is the ability, through the jackknife standard deviation, to set a confidence limit for a reserve estimate of a general shape. Jackknifing the ordinary kriging estimate maximizes sample utilization as well as the information in the samples' spatial correlation. The jackknife kriging estimator handles the high-grade smearing problem typical of ordinary kriging by assigning more weight to the closest sample(s); the result is a reduction in local bias without sacrificing global unbiasedness. When the data distribution is skewed, log transformation of the data prior to jackknifing is shown to improve the estimate by making the data behave better under jackknifing. The block kriging short-cut, combined with jackknifing, is shown to be an easy-to-use solution to the problem of estimating the grade of a general three-dimensional digitized shape and the uncertainty associated with that estimate. The results are a single jackknife kriging estimate for the shape and its corresponding jackknife variance. This approach solves the problem of combining independent block estimation variances and provides a simple way to set confidence levels for global estimates. Unlike the ordinary kriging variance, which is a measure of data configuration and is independent of the data values, the jackknife kriging variance reflects the variability of the values being inferred, both at the individual block level and at the global level. Case studies involving two exhaustive (symmetric and highly skewed) data sets indicate the superiority of the jackknife kriging estimator over the original (ordinary kriging) estimator. Some instability of the log-transformed jackknife estimate is noted in the highly skewed case, where the data do not generally behave well under standard jackknifing. A promising direction for future investigation lies in a weighted jackknife formulation, which should handle a wider spectrum of data distributions.
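
The generic delete-one jackknife sketched below illustrates how a point estimate and its jackknife standard deviation are obtained; the inverse-distance-weighted estimator, sample coordinates, and grade values are illustrative stand-ins, not the thesis's jackknife-kriging formulation.

```python
# Minimal sketch of the generic delete-one jackknife around a spatial estimator.
# The inverse-distance weighting below is only a stand-in for an ordinary kriging
# estimate; the thesis applies the same resampling idea to kriging.
import numpy as np

def idw_estimate(coords, values, target, power=2.0):
    """Inverse-distance-weighted estimate at `target` (stand-in for kriging)."""
    dist = np.linalg.norm(coords - target, axis=1)
    w = 1.0 / np.maximum(dist, 1e-9) ** power
    return np.sum(w * values) / np.sum(w)

def jackknife(coords, values, target):
    """Delete-one jackknife estimate and standard error of the point estimate."""
    n = len(values)
    theta_all = idw_estimate(coords, values, target)
    theta_del = np.array([
        idw_estimate(np.delete(coords, i, axis=0), np.delete(values, i), target)
        for i in range(n)
    ])
    pseudo = n * theta_all - (n - 1) * theta_del          # pseudo-values
    jk_estimate = pseudo.mean()
    jk_std = np.sqrt(np.sum((pseudo - jk_estimate) ** 2) / (n * (n - 1)))
    return jk_estimate, jk_std

rng = np.random.default_rng(2)
coords = rng.uniform(0, 100, size=(30, 2))                   # sample locations
values = 5.0 + 0.02 * coords[:, 0] + rng.normal(0, 0.5, 30)  # e.g. ore grades
est, se = jackknife(coords, values, target=np.array([50.0, 50.0]))
print(f"jackknife estimate {est:.3f} +/- {se:.3f}")
```

The same resampling recipe, applied to kriging estimates rather than this stand-in, is what yields the jackknife kriging estimate and the variance used to set confidence limits in the abstract.
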
69

Investigation of the supported liquid membrane process for metal extraction

Mead, D. A. January 1987 (has links)
No description available.
70

Studies of the IgE receptor using sequence-specific peptides, antibodies and other probes

Gao, Bin January 1995 (has links)
No description available.
