  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
151

Normal and Δ-Normal Configurations in Toric Algebra

Solus, Liam 17 June 2011 (has links)
No description available.
152

Robustness of normal theory inference when random effects are not normally distributed

Devamitta Perera, Muditha Virangika January 1900 (has links)
Master of Science / Department of Statistics / Paul I. Nelson / The variance of a response in a one-way random effects model can be expressed as the sum of the variability among and within treatment levels. Conventional methods of statistical analysis for these models assume that both sources of variation are normally distributed. Since this assumption is not always satisfied and can be difficult to check, it is important to explore the performance of normal-theory inference when normality does not hold. This report uses simulation to assess the robustness of the F-test for the presence of an among-treatment variance component, and of the normal-theory confidence interval for the intra-class correlation coefficient, under several non-normal distributions. The power function of the F-test was found to be robust for moderately heavy-tailed random error distributions. For very heavy-tailed random error distributions, however, power is relatively low, even for a large number of treatments. Coverage rates of the confidence interval for the intra-class correlation coefficient are far from nominal for very heavy-tailed, non-normal random effect distributions.
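The kind of simulation the report describes can be sketched in a few lines. Everything below (the balanced 10x5 layout, the t(3) error distribution, the replication count) is an illustrative assumption, not the report's actual study design:

```python
import numpy as np
from scipy import stats

def anova_f(y):
    # One-way ANOVA F statistic for a balanced k x n layout.
    k, n = y.shape
    grand = y.mean()
    msb = n * ((y.mean(axis=1) - grand) ** 2).sum() / (k - 1)
    msw = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum() / (k * (n - 1))
    return msb / msw

rng = np.random.default_rng(0)
k, n, reps = 10, 5, 2000
crit = stats.f.ppf(0.95, k - 1, k * (n - 1))   # normal-theory 5% critical value

# Under H0 (no among-treatment variance component) every observation is pure
# error; drawing the errors from a heavy-tailed t(3) instead of a normal lets
# the empirical rejection rate estimate the true type I error.
rej = 0
for _ in range(reps):
    y = rng.standard_t(3, size=(k, n))
    rej += anova_f(y) > crit
print(rej / reps)   # close to the nominal 0.05 when the test is robust
```

Replacing `rng.standard_t(3, ...)` with other error laws, or adding a simulated among-treatment component to the rows, extends the same skeleton to the power and coverage questions the report studies.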
153

A simulation study of the robustness of Hotelling’s T2 test for the mean of a multivariate distribution when sampling from a multivariate skew-normal distribution

Wu, Yun January 1900 (has links)
Master of Science / Department of Statistics / Paul I. Nelson / Hotelling’s T2 test is the standard tool for inference about the mean of a multivariate normal population. However, this test may perform poorly when used on samples from multivariate distributions with highly skewed marginal distributions. The goal of our study was to investigate the type I error rate and power of Hotelling’s one-sample test when sampling from a class of multivariate skew-normal (SN) distributions, which includes the multivariate normal distribution and, in addition to location and scale parameters, has a shape parameter to regulate skewness. Simulations of tests carried out at nominal type I error rate 0.05, over various shape parameters, sample sizes, and numbers of variables with a fixed correlation matrix, showed that Hotelling’s one-sample test provides adequate control of the type I error rate over the entire range of conditions studied. The test also produces suitable power for detecting departures from hypothesized values of a multivariate mean vector when the data are a random sample from a multivariate SN distribution. The shape parameter of the SN family appears to have little effect on the robustness of Hotelling’s test. Surprisingly, however, it does have a positive impact on power.
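A minimal version of such a type I error check is easy to sketch. The sketch below simplifies the thesis's setting by sampling each coordinate from an independent univariate skew-normal rather than a correlated multivariate SN; the shape parameter, sample size, dimension, and replication count are all invented for illustration:

```python
import numpy as np
from scipy import stats

def hotelling_t2_pvalue(x, mu0):
    # One-sample Hotelling's T^2 test of H0: population mean = mu0.
    n, p = x.shape
    diff = x.mean(axis=0) - mu0
    S = np.cov(x, rowvar=False)                # sample covariance matrix
    T2 = n * diff @ np.linalg.solve(S, diff)
    F = T2 * (n - p) / (p * (n - 1))           # T^2 maps to F_{p, n-p} under H0
    return stats.f.sf(F, p, n - p)

rng = np.random.default_rng(1)
a, n, p, reps = 5.0, 50, 3, 1000
delta = a / np.sqrt(1 + a * a)
true_mean = delta * np.sqrt(2 / np.pi)         # mean of a standard skew-normal

# Test H0 at its true value, so rejections are type I errors.
rej = sum(
    hotelling_t2_pvalue(stats.skewnorm.rvs(a, size=(n, p), random_state=rng),
                        np.full(p, true_mean)) < 0.05
    for _ in range(reps)
)
print(rej / reps)   # near 0.05 indicates robustness to this degree of skewness
```

Shifting `mu0` away from the true mean turns the same loop into a power estimate.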
154

Game Theory and Adaptive Modulation for Cognitive Radios

Sharma, Guarav 10 1900 (has links)
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California / In a multi-user cognitive radio network, coordination among the network users is needed for efficient utilization of the available electromagnetic spectrum. While adaptive modulation alone helps cognitive radios actively determine the channel quality metric for the next transmission, game theory combined with an adaptive modulation system helps them achieve mutual coordination among channel users and avoid any possible confusion about transmitting or receiving through a channel in the future. This paper highlights how the concepts of game theory and adaptive modulation can be incorporated into a cognitive radio framework to achieve better communication for telemetry applications.
155

An Analysis on the Coverage Distance of LDPC-Coded Free-Space Optical Links

Luna, Ricardo, Tapse, Hrishikesh 10 1900 (has links)
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California / We design irregular Low-Density Parity-Check (LDPC) codes for free-space optical (FSO) channels for different transmitter-receiver link distances and analyze the error performance for different atmospheric conditions. The design considers atmospheric absorption, laser beam divergence, and random intensity fluctuations due to atmospheric turbulence. It is found that, for the same transmit power, a system using the designed codes works over much longer link distances than a system that employs regular LDPC codes. Our analysis is particularly useful for portable optical transceivers and mobile links.
156

Harm and enhancement : philosophical and ethical perspectives

Hall, Susan 12 1900 (has links)
Thesis (PhD)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: The distinction between treatment and enhancement is often considered to be a morally significant boundary, which, at the very least, marks the limits of our moral obligations. This conviction holds despite the fact that treatment and enhancement are situated along a continuum of interventions that are directed towards the improvement of human functioning. The distinction between these two sorts of interventions is based upon a notion of normative normality, which suggests that we are morally obligated to provide interventions which are directed toward the achievement of normal functioning, but that no obligation exists to improve functioning beyond this point. This dissertation will subject this position to critique by examining the constitution of normal functioning, and by suggesting that this kind of functioning cannot operate as a normative standard which determines the limits of our moral obligations. The moral desirability which we attribute to the achievement of normal functioning is based upon the independent ethical imperative to promote the possibilities for well-being of moral agents. This motivation, however, equally suggests that we will be obligated to provide certain kinds of enhancement interventions which will be likely to promote the welfare interests of moral agents, when these become available. This argument also implies that the development of enhancement technologies will require us to rethink our ethical conception of harmful non-benefits. We currently think of the non-provision of medical treatment and some environmental enhancements, such as education, as harmful to the extent that state intervention is justified to rectify this. 
We recognise that such non-provision, and the resultant failure to promote the welfare interests of moral agents, where such promotion is possible, harms persons by putting them in a worse position than they could have been in, with regards to their chances of leading a good life. The new technological possibilities offered by the prospect of genetic enhancement mean that we might soon have a better alternative, in terms of our chances of leading a good life, to the level of functioning that we have thus far been able to achieve. This implies that the non-provision of these enhancements would be harmful to the extent that intervention to bring about this provision would be justified.
157

Databasdesign: Nulägesanalys av normalisering

Wesslén Weiler, Johannes, Öhrn, Emelie January 2016 (has links)
Normalization was first introduced in 1970 with the purpose of organizing data within relational databases so as to avoid data redundancy and reduce the number of anomalies. As databases face new challenges and requirements, there are indications that a more nuanced view of normalization is needed. This work is the outcome of a case study in which three databases from different organizations are analyzed. With the normal forms as a starting point, an explorative analysis is made to identify the aspects that affect how normalization is conducted in industry. The conclusion is that it is difficult for an outsider to the database to interpret and determine whether the normal forms are fulfilled. The factors affecting normalization of databases are: the developer's intuition, users' impact on data quality, and the technical debt that quick fixes create.
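As a toy illustration of the normal forms the thesis takes as its starting point (not drawn from its three case-study databases; the table, rows, and column names are invented), a relation with a transitive dependency can be decomposed toward third normal form:

```python
# Hypothetical order rows with the transitive dependency
# order_id -> customer_id -> customer_city, which violates 3NF:
# each customer's city is stored once per order, so updating a city
# in one row but not another creates an update anomaly.
orders = [
    {"order_id": 1, "customer_id": "c1", "customer_city": "Lund", "total": 120},
    {"order_id": 2, "customer_id": "c1", "customer_city": "Lund", "total": 80},
    {"order_id": 3, "customer_id": "c2", "customer_city": "Umeå", "total": 45},
]

# Decompose: customer attributes move to their own relation keyed by
# customer_id, so each city is stored exactly once.
customers = {r["customer_id"]: {"customer_city": r["customer_city"]}
             for r in orders}
orders_3nf = [{k: r[k] for k in ("order_id", "customer_id", "total")}
              for r in orders]

print(customers)
print(orders_3nf)
```

Judging from outside whether a production schema satisfies such decompositions is exactly the difficulty the case study reports.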
158

Spatial regression-based model specifications for exogenous and endogenous spatial interaction

LeSage, James P., Fischer, Manfred M. 03 September 2014 (has links) (PDF)
Spatial interaction models are a class of models used for modeling origin-destination flow data. Interest in such models is motivated by the need to understand and explain the flows of tangible entities, such as persons or commodities, or intangible ones, such as capital, information or knowledge, between regions. The focus here is on the log-normal version of the model. In this context, we consider spatial econometric specifications that can be used to accommodate two types of dependence scenarios, one involving endogenous interaction and the other exogenous interaction. These specifications replace the conventional assumption of independence between origin-destination flows with formal approaches that allow for two different types of spatial dependence in flow magnitudes. (authors' abstract) / Series: Working Papers in Regional Science
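The log-normal version of the model is conventionally fit as a gravity regression in logs. The sketch below shows only that independence baseline, without the spatial-dependence terms the paper adds; all region sizes, coordinates, and coefficients are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8                                        # hypothetical regions
pop = rng.uniform(1e5, 1e6, n)               # invented origin/destination sizes
coords = rng.uniform(0, 100, (n, 2))
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)

# Generate flows from the log-normal gravity specification:
# log y_ij = a + b*log(pop_i) + c*log(pop_j) - g*log(d_ij) + noise
rows, y = [], []
for i in range(n):
    for j in range(n):
        if i == j:
            continue                          # no intra-regional flows
        rows.append([1.0, np.log(pop[i]), np.log(pop[j]), np.log(dist[i, j])])
        y.append(1.0 + 0.8 * np.log(pop[i]) + 0.7 * np.log(pop[j])
                 - 1.5 * np.log(dist[i, j]) + rng.normal(0, 0.1))
X, y = np.array(rows), np.array(y)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares in logs
print(coef)   # origin/destination elasticities and distance decay recovered
```

The paper's endogenous and exogenous dependence scenarios amount to augmenting this design matrix with spatial lags of the flows or of the explanatory variables, replacing the independence assumption built into plain least squares.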
159

Fetal cardiac defects and increased nuchal translucency at 10-14 weeks of gestation

Hyett, Jonathan A. January 2000 (has links)
No description available.
160

Algorithms for Normal Forms for Matrices of Polynomials and Ore Polynomials

Cheng, Howard January 2003 (has links)
In this thesis we study algorithms for computing normal forms for matrices of Ore polynomials while controlling coefficient growth. By formulating row reduction as a linear algebra problem, we obtain a fraction-free algorithm for row reduction for matrices of Ore polynomials. The algorithm allows us to compute the rank and a basis of the left nullspace of the input matrix. When the input is restricted to matrices of shift polynomials and ordinary polynomials, we obtain fraction-free algorithms for computing row-reduced forms and weak Popov forms. These algorithms can be used to compute a greatest common right divisor and a least common left multiple of such matrices. Our fraction-free row reduction algorithm can be viewed as a generalization of subresultant algorithms. The linear algebra formulation allows us to obtain bounds on the size of the intermediate results and to analyze the complexity of our algorithms. We then make use of the fraction-free algorithm as a basis to formulate modular algorithms for computing a row-reduced form, a weak Popov form, and the Popov form of a polynomial matrix. By examining the linear algebra formulation, we develop criteria for detecting unlucky homomorphisms and determining the number of homomorphic images required.
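The fraction-free idea the thesis generalizes can be seen in its classical scalar form: Bareiss elimination on an ordinary integer matrix, where every intermediate division is exact so entries stay integers and coefficient growth is controlled. The sketch below is only that integer analogy, not the Ore-polynomial algorithms the thesis develops:

```python
def bareiss_det(M):
    # Fraction-free (Bareiss) elimination: the division by the previous
    # pivot is always exact, so all intermediate entries remain integers.
    A = [list(row) for row in M]
    n, sign, prev = len(A), 1, 1
    for k in range(n - 1):
        if A[k][k] == 0:                       # find a nonzero pivot below
            for r in range(k + 1, n):
                if A[r][k] != 0:
                    A[k], A[r] = A[r], A[k]
                    sign = -sign               # row swap flips the sign
                    break
            else:
                return 0                       # singular matrix
        for i in range(k + 1, n):
            for j in range(k + 1, n):
                A[i][j] = (A[i][j] * A[k][k] - A[i][k] * A[k][j]) // prev
            A[i][k] = 0
        prev = A[k][k]
    return sign * A[n - 1][n - 1]              # last pivot is the determinant

print(bareiss_det([[2, 3, 1], [4, 1, -1], [0, 5, 2]]))  # → 10
```

Row reduction of Ore-polynomial matrices replaces these scalar cross-multiplications with non-commutative polynomial arithmetic, which is why the thesis formulates it as a linear algebra problem to obtain the analogous exactness and size bounds.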
