51

Corruption: the Erosion of African Economic Standards

Persson, David January 2005 (has links)
Africa has during the past decades experienced vast difficulties in inducing greater levels of economic growth, which in turn has stirred intensive debate over its causes. A debate to surface during recent years identifies corruption as a potent obstacle impeding African economic progress. Combining theoretical and regression analysis, this thesis sets out to unravel the causes of African corruption, its implications, and its effects upon the economic standards of a number of selected countries. The findings reveal that corruption, across all time periods analyzed, has a strong deleterious impact upon GNI per capita, primarily by damaging and undermining the African institutional framework, which in turn is unable to function optimally. The outcome is that less economic progress, and thus lower levels of income, is generated as resources are allocated and squandered in a non-optimal way. It is also substantiated that Protestantism and a high degree of homogeneity are factors that exercise a positive influence upon corruption and economic standards. The thesis finally illuminates the intricate and ubiquitous impediments that obscure Africa's economic progress. It is concluded that inept governments and institutions too often lie at the core of the quandary; the current standard of Africa's governments and institutions thus often leaves much to be desired.
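The core specification the abstract describes is a cross-country regression of income on a corruption measure plus controls. A minimal sketch of the generic form such a model might take is below; all variable names and the synthetic data are illustrative assumptions, not the thesis's dataset.

```python
# Hypothetical sketch of the cross-country regression described above.
# Variable names and synthetic data are illustrative, not the thesis's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 40  # hypothetical sample of African countries
df = pd.DataFrame({
    "corruption_index": rng.uniform(0, 10, n),   # higher = more corrupt
    "protestant_share": rng.uniform(0, 1, n),
    "homogeneity": rng.uniform(0, 1, n),
})
# Simulate log income falling with corruption, for illustration only.
df["log_gni_pc"] = 8 - 0.25 * df["corruption_index"] + rng.normal(0, 0.5, n)

X = sm.add_constant(df[["corruption_index", "protestant_share", "homogeneity"]])
fit = sm.OLS(df["log_gni_pc"], X).fit(cov_type="HC1")  # robust standard errors
print(fit.summary())
```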
52

An Experimental Study of a Liquid Steel Sampling Process

Ericsson, Ola January 2010 (has links)
During the steelmaking process, samples are taken from the liquid steel, mainly to assess its chemical composition. Recently, methods for rapid determination of inclusion characteristics (size and composition) have progressed to the level where they can be implemented in process control. Inclusions in steel can have either beneficial or detrimental effects depending on their characteristics (size, number, composition and morphology). By determining the inclusion characteristics during the steelmaking process, it is therefore possible to steer them in order to increase the quality of the steel. However, to implement these methods successfully, it is critical that the samples taken from the liquid steel represent the inclusion characteristics in the liquid steel at the sampling moment.

The purpose of this study is to investigate the changes in inclusion characteristics during the liquid steel sampling process. Experimental studies were carried out at steel plants to measure filling velocity and solidification rate in real industrial samples. The sampling conditions for three sample geometries and two slag protection types were determined. Furthermore, the dispersion of the total oxygen content in the samples was evaluated as a function of sample geometry and type of slag protection. In addition, the effects of cooling rate as well as oxygen and sulfur content on the inclusion characteristics were investigated in laboratory and industrial samples. Possibilities to separate primary inclusions (existing in the liquid steel at the sampling moment) from secondary inclusions (formed during cooling and solidification) based on size and composition were investigated. Finally, in order to evaluate the homogeneity and representativeness of the industrial samples, the dispersion of inclusion characteristics in different zones and layers of the samples was investigated.

It was concluded that the type of slag protection has a significant effect on the filling velocity and the sampling repeatability, and that the thickness of the samples is the main factor controlling the solidification rate. It was shown that top slag can contaminate the samples; the choice of slag protection type is therefore critical to obtain representative samples. The cooling rate has a significant effect on the number of secondary precipitated inclusions, whereas the number of primary inclusions was almost constant and independent of the cooling rate. In most cases it is possible to roughly separate the secondary and primary oxide inclusions based on their particle size distributions. However, in high-sulfur steels a significant amount of sulfides precipitates heterogeneously during cooling and solidification, which makes separation of secondary and primary inclusions very difficult. Moreover, the secondary sulfides which precipitate heterogeneously significantly change the characteristics (size, composition and morphology) of the primary inclusions. The study revealed that both secondary and primary inclusions are heterogeneously dispersed in the industrial samples. In general, the middle zone of the surface layer is recommended for investigation of primary inclusions.
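The rough size-based separation of primary and secondary oxide inclusions described above can be illustrated with a simple threshold on the particle size distribution. The sketch below uses a synthetic bimodal population and a hypothetical cut-off; the threshold value and the data are placeholders, not the thesis's calibration.

```python
# Illustrative sketch: splitting an inclusion population into
# "secondary" (small, precipitated during cooling) and "primary"
# (larger, present in the liquid steel) by a size cut-off.
# The diameters and the 2-micron threshold are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
# Fake bimodal population: many fine secondary inclusions,
# fewer coarse primary ones (diameters in microns).
diameters = np.concatenate([
    rng.lognormal(mean=0.0, sigma=0.3, size=900),  # secondary mode ~1 um
    rng.lognormal(mean=1.6, sigma=0.4, size=100),  # primary mode ~5 um
])

threshold_um = 2.0  # assumed cut-off between the two modes
secondary = diameters[diameters < threshold_um]
primary = diameters[diameters >= threshold_um]

print(f"secondary: {secondary.size} inclusions, mean {secondary.mean():.2f} um")
print(f"primary:   {primary.size} inclusions, mean {primary.mean():.2f} um")
```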
53

A Study of Lexical Availability Among Monolingual-Bilingual Speakers of Spanish and English

Victery, John Bailey Jr. January 1971 (has links)
The purpose of this thesis has been to study vocabulary elicited from ten different areas of subject matter by means of limited-time testing on mixed control groups comprising ninety-nine students in the 16-18 year-old age range, of whom 33 were monolingual in Spanish (from Monterrey, Mexico), 33 were monolingual in English (from Houston, Texas), and 33 were bilingual (also from Houston). The study also covers the history of such testing and its recent evolution, important pre-testing discoveries, the manual and technological methods used in carrying out the actual testing, its rationale, and how vocabulary studies may be classified. The practical applications of lexical availability are seen in relation to what lexical availability is and how it may be measured. The testing was divided into distinct areas of subject matter; for each, the students gave, by means of a free-association response, words which they related directly or indirectly to one stimulus at a time. The participants were required to write down their responses, as pre-testing experimentation uncovered some severe disadvantages in using oral recording devices. Each stimulus was allowed two minutes. Lexical homogeneity to the highest degree possible was desirable; therefore, subject matters were selected on a basis of universality. The socioeconomic and sociocultural background of the Spanish-speaking monolinguals was seen to be advantaged over that of the English-speaking monolinguals and the bilinguals, based on the occupational statuses of their families and the types of city districts wherein their homes are found. The analytical development of the results brought to light some surprising findings. The English-speaking monolinguals ranked first in production of total lexical items (6,440); the Spanish-speaking bilinguals ranked second in total production (5,672); English-speaking bilinguals ranked third (5,572); and Spanish-speaking monolinguals totaled 4,696 items, ranking fourth. As to different items, Spanish-speaking bilinguals produced 2,539, ranking first; English-speaking monolinguals elicited 2,454, ranking second; English-speaking bilinguals ranked third with 2,384; and Spanish-speaking monolinguals yielded 1,904 different items, ranking fourth. Females consistently outranked the males in lexical production: by 11.59% in total items and 10.89% in different items. Of the 22,380 total items produced, girls elicited 13,404 to the males' production of 8,976 (weight-corrected figure: 10,145). That portion of the entire corpus which yielded items of 8 occurrences or more comprised 44.81%. Significant to the study of lexical availability is cognitive concomitance; that is, the degree of universal agreement to be found concurrent to the participating group. In the case of this study, the fact that roughly 45% of the lexical items were shared by and dispersed to such a substantial degree among all informants was confirmation of lexical homogeneity.
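The quoted production figures can be cross-checked in a few lines; the short sketch below simply re-derives the grand total and the female share from the group and gender counts given above.

```python
# Re-deriving the totals quoted in the abstract.
totals = {
    "English monolinguals": 6440,
    "Spanish bilinguals": 5672,
    "English bilinguals": 5572,
    "Spanish monolinguals": 4696,
}
grand_total = sum(totals.values())
print(grand_total)  # 22380, matching the stated corpus size

girls, boys = 13404, 8976
assert girls + boys == grand_total
print(f"female share of total items: {girls / grand_total:.2%}")  # ~59.9%
```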
54

Statistical tests based on N-distances

Bakšajev, Aleksej 09 April 2010 (has links)
The thesis is devoted to the application of a new class of probability metrics, N-distances, introduced by Klebanov (Klebanov, 2005; Zinger et al., 1989), to the verification of the classical statistical hypotheses of goodness of fit, homogeneity, symmetry and independence. First, a construction of statistics based on N-distances for testing the mentioned hypotheses is proposed. Then the problem of determining the critical region of the criteria is investigated. The main results of the thesis concern the asymptotic behavior of the test statistics under the null and alternative hypotheses. In the general case, the limiting null distribution of the test statistics proposed in the thesis is established in terms of the distribution of an infinite quadratic form in Gaussian random variables, with coefficients depending on the eigenvalues and eigenfunctions of a certain integral operator. It is proved that under the alternative hypothesis the test statistics are asymptotically normal. In the case of a parametric goodness-of-fit hypothesis, particular attention is devoted to normality and exponentiality criteria. For the hypothesis of homogeneity, a construction of a multivariate distribution-free two-sample test is proposed. In testing the hypothesis of uniformity on the hypersphere, the S1 and S2 cases are investigated in more detail. In conclusion, a comparison of N-distance tests with some classical criteria is provided. For the simple goodness-of-fit hypothesis in the univariate case, Bahadur asymptotic relative efficiency (Bahadur, 1960; Nikitin, 1995) is considered as a measure of comparison, and alongside the theoretical results the power of the proposed N-distance tests is investigated using the Monte Carlo method. Besides the simple and composite goodness-of-fit hypotheses, homogeneity tests are also analyzed... [to full text]
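For the kernel L(x, y) = ||x - y||, the two-sample N-distance statistic coincides with the classical energy statistic, and its null distribution can be approximated by permutation. A minimal sketch (not the thesis's own implementation) is:

```python
# Two-sample N-distance (energy) test with a permutation p-value.
# For L(x, y) = ||x - y|| this is the classical energy statistic;
# a sketch for illustration, not the thesis's implementation.
import numpy as np
from scipy.spatial.distance import cdist

def energy_statistic(x, y):
    """n*m/(n+m) * (2*E|X-Y| - E|X-X'| - E|Y-Y'|), V-statistic form."""
    n, m = len(x), len(y)
    dxy = cdist(x, y).mean()
    dxx = cdist(x, x).mean()
    dyy = cdist(y, y).mean()
    return n * m / (n + m) * (2 * dxy - dxx - dyy)

def permutation_pvalue(x, y, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    observed = energy_statistic(x, y)
    pooled = np.vstack([x, y])
    n = len(x)
    exceed = 0
    for _ in range(n_perm):
        perm = rng.permutation(len(pooled))
        if energy_statistic(pooled[perm[:n]], pooled[perm[n:]]) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
x = rng.normal(0, 1, size=(50, 2))
y = rng.normal(0.5, 1, size=(60, 2))  # shifted mean: H0 should be rejected
print(permutation_pvalue(x, y))
```

The permutation approach sidesteps the infinite quadratic form of the limiting null distribution entirely, at the cost of extra computation.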
56

Uncertainty Quantification of the Homogeneity of Granular Materials through Discrete Element Modeling and X-Ray Computed Tomography

Noble, Patrick August 2012 (has links)
Previous research has shown that the sample preparation method used to reconstitute specimens of granular materials can have a significant impact on their mechanical behavior. As the Discrete Element Method becomes a more popular choice for modeling multiphysics problems involving granular materials, sample heterogeneity should be correctly characterized in order to obtain accurate results. To capture the effect of sample preparation on the homogeneity of the sample, standard procedures were used to reconstitute samples composed of a homogeneous granular material. X-ray computed tomography and image analysis techniques were then used to characterize the spatial heterogeneity of a typical sample. The sample preparation method was modeled numerically using the Discrete Element program PFC3D. The resulting microstructure of the numerical sample was compared to the results of the image analysis to determine whether the heterogeneity of the sample could be reproduced correctly for use in Discrete Element Modeling.
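One common way to quantify spatial heterogeneity from a binarized CT volume is to compare local solid fractions across subvolumes. The sketch below illustrates that idea on a synthetic voxel array; the partition size and the synthetic data are assumptions, not the study's actual image-analysis pipeline.

```python
# Sketch: quantify packing heterogeneity as the spread of local
# solid fractions over subvolumes of a binarized CT image.
# The synthetic volume and the 4x4x4 partition are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
volume = rng.random((64, 64, 64)) < 0.6  # fake binarized scan, ~60% solid

def local_solid_fractions(vol, blocks=4):
    """Split the volume into blocks^3 subvolumes; return each solid fraction."""
    n = vol.shape[0] // blocks
    fractions = []
    for i in range(blocks):
        for j in range(blocks):
            for k in range(blocks):
                sub = vol[i*n:(i+1)*n, j*n:(j+1)*n, k*n:(k+1)*n]
                fractions.append(sub.mean())
    return np.array(fractions)

f = local_solid_fractions(volume)
# A perfectly homogeneous sample would have near-zero spread.
print(f"mean solid fraction {f.mean():.3f}, std {f.std():.4f} (heterogeneity index)")
```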
57

Modification of the least-squares collocation method for non-stationary gravity field modelling

Darbeheshti, Neda January 2009 (has links)
Geodesy deals with the accurate analysis of spatial and temporal variations in the geometry and physics of the Earth at local and global scales. In geodesy, least-squares collocation (LSC) is a bridge between the physical and statistical understanding of different functionals of the Earth's gravitational field. This thesis focuses specifically on the (incorrect) implicit LSC assumptions of isotropy and homogeneity that limit the application of LSC in non-stationary gravity field modelling. In particular, the work seeks to derive expressions for local and global analytical covariance functions that account for the anisotropy and heterogeneity of the Earth's gravity field.

Standard LSC assumes 2D stationarity and 3D isotropy, and relies on a covariance function to account for spatial dependence in the observed data. However, the assumption that the spatial dependence is constant throughout the region of interest may sometimes be violated. Assuming a stationary covariance structure can result in over-smoothing of, e.g., the gravity field in mountains and under-smoothing in great plains. The kernel convolution method from spatial statistics is introduced for non-stationary covariance structures, and its advantage in dealing with non-stationarity in geodetic data is demonstrated.

Tests of the new non-stationary solutions were performed over the Darling Fault, Western Australia, where the anomalous gravity field is anisotropic and non-stationary. Stationary and non-stationary covariance functions are compared in 2D LSC on the empirical example of gravity anomaly interpolation. The results with non-stationary covariance functions are better than standard LSC in terms of formal errors and cross-validation. Non-stationarity of both the mean and the covariance is considered in planar geoid determination by LSC, to test how each affects the LSC result compared with GPS-levelling points in this area. Non-stationarity of the mean was not very considerable in this case, but non-stationary covariances were very effective when optimising the gravimetric quasigeoid to agree with the geometric quasigeoid.

In addition, the importance of the choice of the parameters of the non-stationary covariance functions within a Bayesian framework, and the improvement of the new method for different functionals on the globe, are pointed out.
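The basic LSC predictor has the form s_hat = C_sl (C_ll + D)^-1 l, with signal and cross-covariances built from an analytical covariance function. A minimal stationary sketch using a Gaussian covariance model is below; the point set, parameter values, and the Gaussian model itself are illustrative assumptions, not the thesis's non-stationary formulation.

```python
# Minimal least-squares collocation sketch with a stationary Gaussian
# covariance model: s_hat = C_sl @ inv(C_ll + D) @ l.
# Points, variance, correlation length and noise level are illustrative.
import numpy as np

def gauss_cov(a, b, c0=1.0, length=10.0):
    """Isotropic Gaussian covariance between point sets a (n,2) and b (m,2)."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return c0 * np.exp(-(d / length) ** 2)

rng = np.random.default_rng(0)
obs_xy = rng.uniform(0, 100, size=(50, 2))        # observation locations (km)
l = rng.normal(0, 1, size=50)                     # observed anomalies (fake)
pred_xy = np.array([[50.0, 50.0], [20.0, 80.0]])  # prediction points

C_ll = gauss_cov(obs_xy, obs_xy)
D = 0.05 * np.eye(len(l))                         # observation noise covariance
C_sl = gauss_cov(pred_xy, obs_xy)

s_hat = C_sl @ np.linalg.solve(C_ll + D, l)       # LSC prediction
print(s_hat)
```

The non-stationary methods of the thesis replace the single fixed covariance function above with one whose parameters vary over the region, e.g. via kernel convolution.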
58

Asymptotic methods for tests of homogeneity for finite mixture models

Stewart, Michael Ian January 2002 (has links)
We present limit theory for tests of homogeneity for finite mixture models. More specifically, we derive the asymptotic distribution of certain random quantities used for testing that a mixture of two distributions is in fact just a single distribution. Our methods apply to cases where the mixture component distributions come from one of a wide class of one-parameter exponential families, both continuous and discrete. We consider two random quantities, one related to testing simple hypotheses, the other composite hypotheses. For simple hypotheses we consider the maximum of the standardised score process, which is itself a test statistic. For composite hypotheses we consider the maximum of the efficient score process, which is not itself a statistic (it depends on the unknown true distribution) but is asymptotically equivalent to certain common test statistics in a certain sense. We show that both quantities can be approximated by the maximum of a certain Gaussian process depending on the sample size and the true distribution of the observations, which when suitably normalised has a limiting distribution of the Gumbel extreme-value type. Although the limit theory is not practically useful for computing approximate p-values, we use Monte Carlo simulations to show that another method suggested by the theory, using a Studentised version of the maximum-score statistic and simulating a Gaussian process to compute approximate p-values, is remarkably accurate and uses a fraction of the computing resources that a straight Monte Carlo approximation would.
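The practical recipe the abstract points to (referring a maximum-score statistic to the simulated distribution of the maximum of a Gaussian process) can be sketched as follows. The grid, the covariance choice, and the observed value are hypothetical stand-ins for the paper's specific score process.

```python
# Sketch: approximate p-value for a max-type statistic by simulating
# the maximum of a Gaussian process over a parameter grid.
# The covariance function, grid and observed value are placeholders.
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(-3, 3, 200)  # mixing-parameter grid
# Hypothetical stationary covariance for the limiting score process.
C = np.exp(-np.abs(grid[:, None] - grid[None, :]))

def simulate_max(n_sim=10_000):
    """Simulate max_t Z(t) for a zero-mean Gaussian process with covariance C."""
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(grid)))
    z = L @ rng.standard_normal((len(grid), n_sim))
    return z.max(axis=0)

max_dist = simulate_max()
observed = 2.9  # hypothetical observed maximum-score statistic
pvalue = (max_dist >= observed).mean()
print(f"approximate p-value: {pvalue:.4f}")
```

Simulating the process once per grid costs far less than re-fitting the mixture model in a full Monte Carlo study, which is the efficiency the abstract reports.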
59

Semi-Markov model for credit risk management

Benková, Markéta January 2011 (has links)
With the arrival of the New Basel Capital Accord, adopted by most Czech banks during 2007 and 2008, the importance of internal ratings for assessing the health of the whole financial sector has grown tremendously. Internal ratings are now used for the calculation and allocation of capital, as well as for the determination of interest rates and margins. Changes of internal ratings are an obvious application of multi-state models. Using methods common in the analysis of semi-Markov chains, it is possible to analyze the structure of internal rating changes, to monitor the periods between successive changes, and to focus on the transition matrices themselves. An important part of this work is the comparison of the given parameters as observed during stable times and during the financial crisis, dated here from the fall of Lehman Brothers in September 2008.
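The transition-matrix side of such an analysis can be illustrated by estimating an empirical rating migration matrix from observed transitions. The rating scale and the sample transitions below are invented for illustration.

```python
# Sketch: empirical rating transition matrix from observed migrations.
# The three-grade scale (A, B, D = default) and transitions are invented.
import numpy as np

ratings = ["A", "B", "D"]
index = {r: i for i, r in enumerate(ratings)}

# Hypothetical observed (from_rating, to_rating) pairs between two dates.
transitions = [("A", "A"), ("A", "B"), ("B", "B"), ("B", "A"),
               ("B", "D"), ("A", "A"), ("B", "B"), ("A", "B")]

counts = np.zeros((3, 3))
for frm, to in transitions:
    counts[index[frm], index[to]] += 1

# Row-normalize counts into transition probabilities.
row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
print(P)
```

A full semi-Markov treatment would additionally model the distribution of holding times between migrations, which the embedded-chain matrix above ignores.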
60

The use of TL dosimeters for measuring the inhomogeneity of irradiation

CANDROVÁ, Daniela January 2015 (has links)
Radiotherapy has commonly been utilised to cure cancer for more than a century. It counts among the fundamental branches of medicine and represents an effective local or loco-regional method of curing both cancer and some non-cancerous conditions. It utilises ionising radiation, which unfortunately eliminates tumour cells along with healthy ones, which is why a wide range of harmful effects of the radiation can be observed in humans. Patients treated with radiotherapy are in some cases monitored with dosimeters so that the intended dose can be compared with what is actually absorbed. Nemocnice České Budějovice, a. s. monitors the dose in expected locations of non-homogeneous irradiation using TL dosimeters during rotary irradiation of Mycosis fungoides. Before actually being used in in vivo dosimetry, these dosimeters must be properly calibrated and have their sensitivity determined. Other than that, they are fairly easy to use, do not require much time or money, and are able to continuously monitor the dose received by a patient treated with rotary irradiation.

This diploma thesis discusses the usage of TL dosimeters to measure the non-homogeneity of irradiation. It compares the doses received by patients in various parts of the body during irradiation by the TSEI method. Thorough measurements were taken of the doses 22 irradiated patients received in the so-called black-hole region, their axillae and neck, both while holding their arms up and with arms loosely positioned close to the body. The thesis also suggests the possibility of the dosimeters being used by the integrated emergency service during rescues and demolitions performed in emergency situations when a leak of ionising radiation occurs; using TL dosimeters would allow more precise measurement of the dose received by the involved personnel in various parts of the body.

If a patient is treated with the TSEI method, the dose received is monitored at a reference point, in critical areas, and in areas with residual infiltrations or tumours. Doses measured in critical areas indicate that if the acral parts of the limbs are left uncovered, they absorb larger doses than the rest of the body and thus exceed the intended dose significantly; in the case of fingers, the dose tends to reach as much as 3 Gy. Depending on the clinical picture, the attending physician determines whether special covering should be used, which reduces the doses received to merely a few tenths of a gray. There also tends to be a large difference in absorbed doses in the axillary areas and the neck, depending on whether the patient's arms are held up or close to the body. Test results indicate that holding the arms up or keeping them close to the body on a particular side has always had a profound effect on the dose absorbed in that area; the position of the body thus greatly influences how much is absorbed in its various parts.

TL dosimeters are also used in areas other than in vivo dosimetry in radiotherapy, for example to determine the dose received in the upper limbs of monitored workers wearing thermoluminescent dosimeters in the form of bracelets or rings, and to monitor the external irradiation of persons within the territory of the Czech Republic (the TLD network). In radiation therapy, both skin doses and doses within the body are measured. Depending on the data obtained, it is then possible to assess the course of treatment and ensure the safe use of ionising radiation. Integrated emergency service teams deployed in radiation emergencies do not so far consider using TL dosimeters, although these could most likely be used to ascertain the exact doses absorbed; further research would, however, need to be conducted to confirm or disprove the benefits of using TLD.
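The calibration step mentioned above, converting a raw thermoluminescent readout to absorbed dose via a calibration coefficient and a per-chip sensitivity correction, can be sketched simply; all numbers below are invented for illustration, not the hospital's calibration data.

```python
# Sketch: converting raw TLD readouts to absorbed dose as
# dose = readout * chip_sensitivity_factor * calibration_coefficient.
# All numbers are invented for illustration.
readouts_nC = [152.0, 148.5, 160.2]  # raw TL signal per chip (nC)
sensitivity = [1.02, 0.98, 1.01]     # per-chip sensitivity corrections
cal_coeff_Gy_per_nC = 0.0131         # from irradiating chips to a known dose

doses_Gy = [r * s * cal_coeff_Gy_per_nC for r, s in zip(readouts_nC, sensitivity)]
for i, d in enumerate(doses_Gy, 1):
    print(f"chip {i}: {d:.2f} Gy")
```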
