231

Most přes Mordovu rokli / Viaduct Mordava rokle

Ondřej, Václav January 2022 (has links)
The aim of this thesis is the design of the load-bearing structure of a bridge. Of the two proposed variants, a prestressed girder deck with five spans was chosen. The bridge is built span by span in formwork supported by launching girders. Load cases are calculated in Midas Civil 2021 and Scia Engineer 18.1. The structure is assessed at both the ultimate and serviceability limit states, and the design and assessment follow the valid standards.
232

nádrž ČOV / Cast-in-place tank of sewage plant

Sivčák, Jozef Unknown Date (has links)
This master's thesis presents the design and assessment of the reinforced concrete tanks of a sewage treatment plant; drawings form part of the thesis. The tanks are designed as a watertight underground structure with regard to the relevant standards and the watertightness requirement. The foundation slab and concrete walls were designed according to the ultimate and serviceability limit states. The thesis also covers the design of reinforcement for restrained (non-force) effects at an early age, and the structure is checked against loss of equilibrium due to uplift caused by water pressure.
233

Structural adaptive models in financial econometrics

Mihoci, Andrija 05 October 2012 (has links)
Modern methods in statistics and econometrics successfully deal with stylized facts observed on financial markets. The presented techniques aim to understand the dynamics of financial market data more accurately than traditional approaches, and economic and financial benefits are achievable. The results are evaluated here in practical examples that mainly focus on forecasting financial data. Our applications include: (i) modelling and forecasting of liquidity supply, (ii) localizing multiplicative error models and (iii) providing evidence for the empirical pricing kernel paradox across countries.
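The multiplicative error model named in application (ii) treats a non-negative series (volume, duration, volatility) as a conditional mean times a unit-mean innovation. The sketch below simulates the basic recursion; the parameter values and the exponential innovation are illustrative assumptions, not estimates or choices from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Multiplicative error model (MEM) for a non-negative series:
#   x_t = mu_t * eps_t,  mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1},
# with E[eps_t] = 1.  Parameters below are illustrative only.
omega, alpha, beta = 0.1, 0.2, 0.7
T = 1000
x = np.empty(T)
mu = np.empty(T)
mu[0] = omega / (1.0 - alpha - beta)     # unconditional mean as start value
x[0] = mu[0]
for t in range(1, T):
    mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
    x[t] = mu[t] * rng.exponential(1.0)  # unit-mean exponential innovations

# The series fluctuates around the unconditional mean
# omega / (1 - alpha - beta) = 1.0 for these parameters.
print(round(float(x.mean()), 2))
```

A "localized" version, as studied in the thesis, would re-estimate (omega, alpha, beta) over adaptively chosen recent windows rather than over the full sample.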
234

Ineliminable idealizations, phase transitions, and irreversibility

Jones, Nicholaos John 21 November 2006 (has links)
No description available.
235

Multiple imputation in the presence of a detection limit, with applications : an empirical approach / Shawn Carl Liebenberg

Liebenberg, Shawn Carl January 2014 (has links)
Scientists often encounter unobserved or missing measurements that are typically reported as less than a fixed detection limit. This especially occurs in the environmental sciences, where detection of low exposures is not possible due to limitations of the measuring instrument; the resulting data are often referred to as type I and type II left-censored data. Observations lying below the detection limit are therefore often ignored or `guessed', because they cannot be measured accurately. However, reliable estimates of the population parameters are nevertheless required to perform statistical analysis. Dealing with values below a detection limit becomes increasingly complex when a large number of observations lie below this limit. Researchers are therefore interested in developing statistically robust estimation procedures for left- or right-censored data sets (Singh and Nocerino, 2002). This study focuses on several components of the problems mentioned above. The imputation of censored data below a fixed detection limit is studied, particularly using the maximum likelihood procedure of Cohen (1959), and several variants thereof, in combination with four new variations of the multiple imputation concept found in the literature. The focus also falls strongly on estimating the density of the resulting imputed, `complete' data set by applying various kernel density estimators. Bandwidth selection issues are not of importance in this study and are left for further research. The maximum likelihood estimation method of Cohen (1959) is compared with several variant methods to establish which of these procedures for censored data estimates the population parameters of three chosen Lognormal distributions most reliably in terms of well-known discrepancy measures.
These methods are implemented in combination with four new multiple imputation procedures, respectively, to assess which of these nonparametric methods is most effective at imputing the 12 censored values below the detection limit with regard to the global discrepancy measures mentioned above. Several variations of the Parzen-Rosenblatt kernel density estimate are fitted to the complete, filled-in data sets obtained from the previous methods, to establish which is the preferred data-driven method for estimating these densities. The primary focus of the study is therefore the performance of the four chosen multiple imputation methods, together with recommendations of methods and procedural combinations for dealing with data in the presence of a detection limit. An extensive Monte Carlo simulation study was performed to compare the various methods and procedural combinations, and conclusions and recommendations regarding the best of these are made on the basis of the study's results. / MSc (Statistics), North-West University, Potchefstroom Campus, 2014
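The core idea — fit a lognormal by censored maximum likelihood, then impute values below the detection limit from the fitted truncated distribution — can be sketched as follows. This is a minimal illustration of the general approach, not the thesis's actual Cohen (1959) variant or its multiple imputation procedures; the simulated parameters and detection limit are assumptions.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)

# Simulate lognormal data with a fixed detection limit (type I left censoring).
mu, sigma, dl = 0.0, 1.0, 0.5          # illustrative values, not from the thesis
x = rng.lognormal(mu, sigma, size=500)
observed = x[x >= dl]
n_cens = int(np.sum(x < dl))           # censored count is known, values are not

def neg_loglik(theta):
    """Censored lognormal negative log-likelihood on the log scale."""
    m, s = theta
    if s <= 0:
        return np.inf
    z = (np.log(observed) - m) / s
    ll_obs = np.sum(stats.norm.logpdf(z) - np.log(s * observed))
    ll_cens = n_cens * stats.norm.logcdf((np.log(dl) - m) / s)
    return -(ll_obs + ll_cens)

res = optimize.minimize(neg_loglik, x0=[0.0, 1.0], method="Nelder-Mead")
m_hat, s_hat = res.x

# Impute censored values by inverse-CDF sampling from the fitted lognormal
# truncated to (0, dl); repeating this M times yields multiple imputations,
# each giving a "complete" data set for kernel density estimation.
p_dl = stats.norm.cdf((np.log(dl) - m_hat) / s_hat)
u = rng.uniform(0.0, p_dl, size=n_cens)
imputed = np.exp(m_hat + s_hat * stats.norm.ppf(u))
```

Drawing several independent sets of `imputed` values and pooling the downstream estimates is what distinguishes multiple imputation from a single fill-in.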
237

Restricting dry matter intake of stocker calves and its subsequent effects on grazing, feedlot performance, and carcass characteristics

Anglin, Chad O'Neal January 1900 (has links)
Master of Science / Department of Animal Sciences and Industry / Dale A. Blasi / An experiment was conducted to evaluate the effects of dry matter intake (DMI) restriction on early receiving performance by steers in a drylot and subsequent grazing performance, feedlot performance, and carcass characteristics. During the backgrounding period, crossbred, weanling steers (n = 329; initial BW = 191 ± 5.52 kg) were randomly assigned to 1 of 4 DMI levels corresponding to ad libitum, 2.50% of BW (2.50%), 2.25% of BW (2.25%), and 2.00% of BW (2.00%) for 62 d. During the subsequent grazing period, the same steers were randomly assigned to 13 paddocks to graze for 90 d. Paddocks were stocked at 281 kg live weight per hectare. Initial steer BW was similar on each pasture, and each backgrounding treatment was equally represented within a paddock. During the feedlot period, steers were finished at a commercial feedlot and were assigned to 1 of 4 pens according to their rank in BW. Entire pens were harvested when average steer BW reached 545 kg. During the backgrounding period, ad libitum-fed steers had greater (P < 0.001) ADG and final BW than other treatments; steers fed at 2.50 and 2.25% of BW had similar ADG and final BW, both greater (P < 0.001) than those of steers fed 2.00% of BW. During the grazing period, compensatory gain was observed in the restricted DMI treatments. Steers fed at 2.00% of BW had greater (P = 0.006) ADG than ad libitum-fed steers but an ADG similar to that of the other restricted DMI treatments. Steers fed ad libitum, 2.50% of BW, and 2.25% of BW had similar final BW, and steers fed 2.00% of BW had lower (P < 0.001) final BW than other treatments. During the feedlot phase, steers fed 2.00% of BW were on feed longer (P < 0.05) than other treatments. Growth compensation during grazing illustrated that restricted feeding immediately prior to pasture grazing can reduce backgrounding costs.
238

Nutrition and management strategies for confinement fed cattle: step-up programs, alternative feed ingredients, and health programs

Wallace, Justin Oliver January 1900 (has links)
Master of Science / Department of Animal Sciences and Industry / Christopher D. Reinhardt / Three experiments were conducted to examine nutritional and management strategies for different segments of the beef industry. The first experiment examined the effects of feeding traditional step-up diets (STEP) vs. limit-feeding (LIMIT) the finishing diet to adapt cattle to high-concentrate diets. When all cattle reached ad libitum intake of the finishing diet there was a trend (P = 0.09) for DMI to be different between treatments. During week 1, STEP cattle had higher total VFA concentrations (P = 0.02), while LIMIT cattle had higher valerate absorption (P = 0.02) and disappearance (P = 0.08). During week 4, LIMIT cattle had higher total VFA concentrations (P = 0.03) and lower valerate disappearance and absorption (P = 0.05) than STEP cattle. These results indicate that limit-feeding the finishing diet may inhibit nutrient absorption from the rumen or this method may cause increased production of valerate by lactate utilizing bacteria due to a more acidotic rumen environment. The second experiment examined the effects of feeding 5% (DM basis) dried, full-fat corn germ (GERM) on feedlot performance and carcass characteristics of naturally raised yearling steers and heifers. Carcass-adjusted ADG was higher for GERM cattle (P = 0.04). There were no other differences in performance or carcass characteristics. Total incidence of liver abscesses and the incidence of severe liver abscesses were decreased by 12 and 8.2% (P = 0.01 and 0.02, respectively) when GERM was added to the diet. Corn germ can be added to finishing diets at 5% without affecting performance and carcass characteristics. Producers raising natural cattle may also be able to benefit from the reduced incidence of liver abscesses. The third experiment examined concurrent metaphylactic treatment of high-risk calves with tulathromycin and chlortetracycline. 
Calves were placed on 1 of 3 treatments: 1) no top-dress pellets; 2) diet top-dressed with pellets containing chlortetracycline; or 3) diet top-dressed with pellets containing no chlortetracycline. There were no differences in the performance or health of these calves (P > 0.25), indicating no additive benefit of concurrent metaphylaxis using both tulathromycin and chlortetracycline. This information could assist producers when designing receiving health protocols for high-risk calves.
239

Limits to temporal synchronization in fundamental hand and finger actions

Gu, Yanjia January 2014 (has links)
Coordinated movement is critical not only to sports technique and performance but also to daily living, and as such represents a fundamental area of research. Coordination requires producing the right actions at the right time and has to incorporate perception, cognition, and forceful neuromuscular interaction with the environment. Coordinated movements of the hands and fingers are among the most complex activities undertaken, involving continuous learning and adaptation, yet the temporal variability of the most basic movement components is still unknown. This thesis investigates the extent of temporal variability in the execution of four different simple hand and finger coordination tasks, with the purpose of identifying the intrinsic temporal variabilities that limit the ability to coordinate the hands in space and time. Study one showed that in a synchronized bilateral two-finger tapping test (<<1 cm movement to target) the best participant had a temporal timing variability of 4.8 ms, whereas the largest time variability could be as high as 24.8 ms. No obvious improvement was found after transfer practice, whereas the average time variability for asynchronous tapping decreased from 62.1 ms to 30.3 ms after instructed practice, indicating a likely change in task grouping. Study two showed that in a unilateral thumb-index finger pinch-and-release test, the largest mean timing variability was 12 ms for pinching, irrespective of whether the task was performed in a slow, alert manner or at a faster speed. However, the mean temporal variability for release was only 6.3 ms when the task was performed in a more alert manner, indicating that release is controlled more accurately in time than grip.
Study three suggested that, in a unilateral sagittal-plane throwing action of the lower arm and hand, elbow and wrist coordination for dynamic index fingertip location was better with a radial-ulnar deviation (darts-type) throwing action than with a wrist flexor-extensor (basketball free-throw type) action (mean variability of 37.5 ms and 27.2 ms, respectively). Study four compared the variability in bilateral finger tapping between voluntary tapping and involuntary finger-contraction tapping. Electrically stimulated neural contractions had significantly lower force-onset variability than voluntary contractions or direct magnetic stimulation of the muscles (6 ms, 9.5 ms, and 10.3 ms for electrically stimulated, voluntary, and transcranial magnetic stimulation contractions, respectively). This work provides a comprehensive analysis of the temporal variability in various fundamental digital movement tasks, which can aid the understanding of basic human coordination in sporting, daily living, and clinical contexts.
240

Terahertz Near-field Investigation of a Plasmonic GaAs Superlens

Fehrenbacher, Markus 26 April 2016 (has links)
This work presents the first demonstration of a semiconductor-based plasmonic near-field superlens, utilizing highly doped GaAs to generate infrared optical images with a spatial resolution beyond the diffraction limit. Being easily transferable to other semiconductor materials, the concept described in this thesis can be exploited to realize spectrally adjustable superlenses in a wide spectral range. The idea of superlensing was introduced theoretically in 2000 and was followed by numerous publications, including experimental studies. The effect initiated great interest in optics since, in contrast to diffraction-limited conventional optical microscopy, it enables subwavelength-resolved imaging by reconstructing the evanescent waves emerging from an object. With techniques like scanning near-field optical microscopy (SNOM) and stimulated emission depletion (STED) already successfully established to overcome the conventional restrictions, the concept of superlensing provides a novel, different route towards high resolution. Superlensing is a resonant phenomenon, relying either on the excitation of surface plasmons in metallic systems or on phonon resonances in dielectric structures. In this respect, a superlens based on a doped semiconductor benefits from the potential to control its operational wavelength by shifting the plasma frequency through adjustment of the free-carrier concentration. For a proof-of-principle demonstration, we investigate a superlens consisting of a highly n-doped GaAs layer (n = 4 × 10^18 cm^-3) sandwiched between two intrinsic layers. Recording near-field images of subwavelength-sized gold stripes through the trilayer structure by means of SNOM in combination with a free-electron laser, we observe both enhanced signal and improved spatial resolution at radiation wavelengths close to λ = 22 µm, in excellent agreement with simulations based on the Drude-Lorentz model of free electrons.
Here, comparative investigations of a purely intrinsic reference sample confirm that the effect is mediated by the charge carriers within the doped layer. Furthermore, slightly differently doped samples provide indications of the expected spectral shift of the resonance. According to our calculations, the wavelength range to be exploited by n-GaAs based superlenses reaches far into the terahertz region, whereas other semiconductor materials are required to explore the near infrared.
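The quoted operating wavelength can be roughly checked from the Drude model: the sketch below computes the plasma frequency for n = 4 × 10^18 cm^-3 and the wavelength at which a lossless Drude layer matches the surrounding intrinsic GaAs (Re ε = -ε_inf), a common superlensing condition. The material constants (ε_inf ≈ 10.9, effective mass ≈ 0.067 m_e for GaAs) and the neglect of damping are textbook assumptions, not values taken from the thesis.

```python
import math

# Physical constants (SI units)
e = 1.602176634e-19          # elementary charge (C)
eps0 = 8.8541878128e-12      # vacuum permittivity (F/m)
m_e = 9.1093837015e-31       # electron rest mass (kg)
c = 2.99792458e8             # speed of light (m/s)

# Assumed GaAs parameters (textbook values, not from the thesis)
eps_inf = 10.9               # high-frequency permittivity
m_eff = 0.067 * m_e          # conduction-band effective mass
n = 4e18 * 1e6               # carrier concentration, cm^-3 -> m^-3

# Drude plasma frequency: omega_p^2 = n e^2 / (eps0 * eps_inf * m_eff)
omega_p = math.sqrt(n * e**2 / (eps0 * eps_inf * m_eff))

# Lossless Drude layer: eps(omega) = eps_inf * (1 - omega_p^2 / omega^2).
# Matching the intrinsic cladding (eps = -eps_inf) occurs at omega_p / sqrt(2).
lam_res_um = 2 * math.pi * c / (omega_p / math.sqrt(2)) * 1e6
print(f"estimated resonance wavelength: {lam_res_um:.1f} um")
```

Under these assumptions the estimate lands around 20 µm, in the vicinity of the ~22 µm resonance reported above; damping and the exact permittivities shift the observed value.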
