11

Selected problems in lattice statistical mechanics

Chan, Yao-ban. January 2005 (has links)
Thesis (Ph.D.)--University of Melbourne, Dept. of Mathematics and Statistics, 2005. / Typescript. Includes bibliographical references (leaves [237]-245).
12

An examination of the statistical mechanics of one-dimensional chains on a periodic substrate

Hartnett, Allison Sarah, Phillips, James M., January 2008 (has links)
Thesis (Ph. D.)--Dept. of Physics and Dept. of Mathematics, University of Missouri--Kansas City, 2008. / "A dissertation in physics and mathematics." Advisor: James M. Phillips. Typescript. Vita. Title from "catalog record" of the print edition. Description based on contents viewed Sept. 12, 2008. Includes bibliographical references (leaves 79-83). Online version of the print edition.
13

The emergence of macroscopic determinism and irreversibility

La Cour, Brian Robert, January 2000 (has links)
Thesis (Ph. D.)--University of Texas at Austin, 2000. / Vita. Includes bibliographical references (leaves 228-242). Available also in a digital version from Dissertation Abstracts.
14

The polygonograph: a substitute for the pie graph

Balogun, Olayinka Yaro, January 1976 (has links)
Thesis--Wisconsin. / Includes bibliographical references (leaves 156-158).
15

The Boltzmann equation in non-equilibrium statistical mechanics

Hollinger, Henry B., January 1960 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1960. / Typescript. Vita. Description based on print version record. Includes bibliographical references.
16

Master equations and memory effects

Carlson, Brett V. January 1981 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1981. / Typescript. Vita. Description based on print version record. Includes bibliographical references (leaves 243-246).
17

The application of statistical decision theory to a perceptual decision-making problem

Papsdorf, James Daniel January 1962 (has links)
The object of this study was to determine whether statistical decision theory, or a special application of it, the theory of signal detection, could be of value in accounting for the behaviour of subjects in a perceptual decision-making task. The amount of information in these tasks was varied to see if the theory could predict changes in subject performance. Five subjects were required to distinguish between fifty percent time-compressed recordings of the stimulus words "commination" and "comminution" embedded in "white" noise. Under one treatment, compression was gained by discarding many small letter segments, while in the other the same compression value was obtained by discarding a few large letter segments. It was hypothesized that large-discard-interval compression would be more detrimental to stimulus intelligibility than small-discard-interval compression. Five other subjects were asked to distinguish between the two noise-embedded stimulus words which had been time-compressed sixty and seventy-four percent. It was predicted that sixty percent compression would be less detrimental to the intelligibility of the stimulus words than seventy-four percent compression. Concurrently, in both groups, an attempt was made to manipulate the degree of cautiousness, or decision criteria, of all ten subjects. Such manipulation was attempted in order to permit the separation of each subject's actual sensitivity from his variable decision criterion. This manipulation involved varying the costs and fines associated with correct and incorrect decisions, as well as the probabilities of each stimulus word's occurrence. Large-discard-interval compression was found to be less detrimental to intelligibility, as inferred from subject performance, than small-discard-interval compression. This finding was contrary to the first hypothesis. Sixty percent compression, as predicted, was less detrimental to intelligibility than seventy-four percent compression. It was observed that the theory of signal detection permitted separation of each subject's sensitivity from his monetary degree of cautiousness. This cautiousness was also found to be accessible to manipulation. It is suggested that since the approach of statistical decision theory detected changes in subject performance in response to varying amounts of information, it can be profitably applied to the study of perception. / Arts, Faculty of / Psychology, Department of / Graduate
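The record does not reproduce the thesis's computations, but the separation the abstract describes is, in modern terms, the standard equal-variance Gaussian decomposition of performance into a sensitivity index (d') and a criterion index (c). A minimal Python sketch, with illustrative hit and false-alarm rates that are not taken from the thesis:

    from statistics import NormalDist

    def sdt_indices(hit_rate: float, fa_rate: float) -> tuple[float, float]:
        # Equal-variance Gaussian signal detection: d' (sensitivity)
        # and c (response criterion, i.e. degree of cautiousness).
        z = NormalDist().inv_cdf  # inverse standard-normal CDF
        d_prime = z(hit_rate) - z(fa_rate)
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))
        return d_prime, criterion

    # Illustrative rates only; a positive c indicates a cautious subject,
    # independently of how sensitive that subject is.
    d, c = sdt_indices(hit_rate=0.80, fa_rate=0.10)
    print(f"d' = {d:.2f}, c = {c:.2f}")

Varying payoffs and stimulus probabilities, as the thesis did, shifts c while leaving d' in principle unchanged, which is what makes the separation observable.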
18

An analysis of multidimensional contingency tables

Mast, Lilian G. (Feuerverger) January 1973 (has links)
In this thesis we consider the following model for a three-dimensional r × s × t contingency table: [formula omitted]. A dot indicates summation over the replaced subscript. The f[formula omitted]'s represent the frequencies and the P[formula omitted]'s represent the proportions. The problem we are concerned with is testing the hypothesis H₀: [formula omitted] = 0 for all i, j, k, i.e., no second-order interaction is present. We then seek to extend the model and problem to a w-way table. We use the method of the likelihood ratio. To assist us in determining the numerator of the likelihood ratio, we reformulate a theorem about constrained extrema and Lagrange multipliers and prove this reformulation. Some general conclusions we draw are: there are two extensions to our 3-way model; results we obtain using our model and methods are in close agreement with results obtained using the models and methods of other statisticians. / Science, Faculty of / Mathematics, Department of / Graduate
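The "[formula omitted]" placeholders stand for the thesis's model equations, which this record does not reproduce. For orientation only, the standard modern log-linear statement of "no second-order interaction" for an r × s × t table (not necessarily the thesis's exact parameterization) reads:

    \log m_{ijk} = \mu + \lambda^{A}_{i} + \lambda^{B}_{j} + \lambda^{C}_{k}
                 + \lambda^{AB}_{ij} + \lambda^{AC}_{ik} + \lambda^{BC}_{jk}
                 + \lambda^{ABC}_{ijk},
    \qquad H_0 : \lambda^{ABC}_{ijk} = 0 \quad \text{for all } i, j, k,

with the likelihood-ratio statistic $G^2 = 2 \sum_{i,j,k} f_{ijk} \log\left( f_{ijk} / \hat{m}_{ijk} \right)$, where the $\hat{m}_{ijk}$ are the expected counts fitted under $H_0$.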
19

Two representation theorems and their application to decision theory

Chew, Soo Hong 11 1900 (has links)
This dissertation consists of two parts. Part I contains the statements and proofs of two representation theorems. The first theorem, proved in Chapter 1, generalizes the quasilinear mean of Hardy, Littlewood and Pólya by weakening their axiom of quasilinearity. Given two distributions with the same means, quasilinearity requires that mixtures of these distributions with another distribution in the same proportions share the same mean, regardless of the distribution that they are mixed with. We weaken the quasilinearity axiom by allowing the proportions that give rise to the same means to be different. This leads to a more general mean, denoted by $M_{\alpha\Phi}$, which has the form $M_{\alpha\Phi}(F) = \Phi^{-1}\left( \int_{\mathbb{R}} \alpha\Phi \, dF \big/ \int_{\mathbb{R}} \alpha \, dF \right)$, where $\Phi$ is continuous and strictly monotone, $\alpha$ is continuous and strictly positive (or negative), and F is a probability distribution. The quasilinear mean, denoted by $M_{\Phi}$, results when the $\alpha$ function is constant. We showed, in addition, that the $M_{\alpha\Phi}$ mean has the intermediate value property and can be consistent with the stochastic dominance partial order (including higher-degree ones). We also generalized a well-known inequality among quasilinear means, via the observation that the $M_{\alpha\Phi}$ mean of a distribution F can be written as the quasilinear mean of a distribution $F^{\alpha}$, where $F^{\alpha}$ is derived from F via $\alpha$ as the Radon–Nikodym derivative of $F^{\alpha}$ with respect to F. We noted that the $M_{\alpha\Phi}$ mean induces an ordering among probability distributions via the maximand $\int_{\mathbb{R}} \alpha\Phi \, dF \big/ \int_{\mathbb{R}} \alpha \, dF$, which contains the (expected utility) maximand $\int_{\mathbb{R}} \Phi \, dF$ of the quasilinear mean as a special case. Chapter 2 provides an alternative characterization of the above representation for simple probability measures on a more general outcome set where mean values may not be defined. In this case, axioms are stated directly in terms of properties of the underlying ordering. We retained several standard properties of expected utility, namely weak order, solvability and monotonicity, but relaxed the substitutability axiom of Pratt, Raiffa and Schlaifer, which is essentially a restatement of quasilinearity in the context of an ordering. Part II of the dissertation concerns one specific area of application: decision theory. Interpreting the $M_{\alpha\Phi}(F)$ mean of Chapter 1 as the certainty equivalent of a monetary lottery F, the corresponding induced binary relation has the natural interpretation of 'strict preference' between lotteries. For non-monetary (finite) lotteries, we apply the representation theorem of Chapter 2. The hypothesis that a choice agent's preference among lotteries can be represented by a pair of $\alpha$ and $\Phi$ functions through the induced ordering is referred to as alpha utility theory. This is logically equivalent to saying that the choice agent obeys either the mean value (certainty equivalent) axioms or the axioms on his strict-preference binary relation. Alpha utility theory is a generalization of expected utility theory in the sense that the expected utility representation is a special case of the alpha utility representation. The motivation for generalizing expected utility comes from difficulties it faced in the description of certain choice phenomena, especially the Allais paradox. These are summarized in Chapter 3. Chapter 4 contains the formal statements of assumptions and the derivations of normative and descriptive implications of alpha utility theory.
We stated conditions, taken from Chapter 1, for consistency with stochastic dominance and global risk aversion, and derived a generalized Arrow–Pratt index of local risk aversion. We also demonstrated how alpha utility theory can be consistent with those choice phenomena that contradict the implications of expected utility, without violating either stochastic dominance or local risk aversion. The chapter ended with a comparison of alpha utility with two other theories that have attracted attention, namely Allais' theory and prospect theory. Several other applications of the representation theorems of Part I are considered in the Conclusion of this dissertation. These include the use of the $M_{\alpha\Phi}$ mean as a model of the equally-distributed-equivalent level of income (Atkinson, 1970), and as a measure of asymmetry of a distribution (Canning, 1934). The alpha utility representation can also be used to rank social situations in the sense of Harsanyi (1977). We ended by pointing out an open question regarding conditions for comparative risk aversion, and stated an extension of Samuelson's (1967) conjecture that Arrow's impossibility theorem would hold if individuals and society express their preferences by von Neumann–Morgenstern utility functions. / Graduate and Postdoctoral Studies / Graduate
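As a concrete illustration of the $M_{\alpha\Phi}$ certainty equivalent for a discrete monetary lottery, here is a minimal Python sketch; the particular $\alpha$ and $\Phi$ below are arbitrary choices for the example and are not taken from the dissertation:

    import math

    def m_alpha_phi(outcomes, probs, alpha, phi, phi_inv):
        # Chew's weighted mean: V = sum p*alpha(x)*phi(x) / sum p*alpha(x),
        # with certainty equivalent phi_inv(V). When alpha is constant this
        # reduces to the quasilinear mean phi_inv(E[phi(X)]).
        num = sum(p * alpha(x) * phi(x) for x, p in zip(outcomes, probs))
        den = sum(p * alpha(x) for x, p in zip(outcomes, probs))
        return phi_inv(num / den)

    # A 50-50 lottery over $0 and $100 with a decreasing weight function,
    # which down-weights better outcomes relative to expected utility.
    ce = m_alpha_phi(
        outcomes=[0.0, 100.0], probs=[0.5, 0.5],
        alpha=lambda x: 1.0 / (1.0 + x),  # continuous, strictly positive
        phi=math.sqrt,                    # continuous, strictly increasing
        phi_inv=lambda v: v * v,
    )
    print(f"certainty equivalent: {ce:.2f}")

With a constant $\alpha$ the same call returns the quasilinear certainty equivalent, here $(0.5 \sqrt{100})^2 = 25$; the decreasing $\alpha$ pulls the certainty equivalent well below that, which is the kind of flexibility the theory uses to accommodate Allais-type behaviour.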
20

Modelling hepatotoxicity in HIV/TB co-infected patients: extensions of the Cox Proportional Hazards Model

Mlotshwa, Vintia Philile 10 February 2021 (has links)
Hepatotoxicity, also known as liver damage, is mainly caused by the intake of medicine. It is common among patients who are co-administering tuberculosis (TB) treatment and antiretroviral therapy (ART) for the human immunodeficiency virus (HIV). If severe, hepatotoxicity sometimes necessitates cessation or interruption of treatment. Therefore, understanding, monitoring and managing hepatotoxicity in patients co-infected with TB and HIV is crucial for optimal treatment outcomes. Hepatotoxicity has been investigated in patients co-infected with TB and HIV; however, most studies have analyzed only the first occurrence of hepatotoxicity and discarded information relating to its resolution and recurrence. Data from the 'Starting Antiretroviral therapy at three Points in Tuberculosis' (SAPiT) trial are used in this project. This trial was instrumental in finalizing treatment guidelines for patients co-infected with HIV and TB in South Africa. The clinical objectives of this project are to estimate incidence rates and determine risk factors associated with hepatotoxicity. The statistical objectives are to fit a Cox regression model, a resolution model of hepatotoxicity, and the extended Cox models for recurring events, including the Andersen-Gill (AG) model, the shared frailty model, the Prentice, Williams and Peterson (PWP) total time (TT) model, the PWP gap time (GT) model, as well as a Cox-based recurrent model that models only the second occurrence of hepatotoxicity. There were 593 patients assessed for hepatotoxicity in the study; 30% (179/593) developed a first occurrence of hepatotoxicity (grade ≥ 1) and 2% (13/593) developed severe hepatotoxicity (grade ≥ 3). Resolved cases (grade = 0) accounted for 76% (136/179) and recurring cases (grade ≥ 1) for 24% (32/136). In the Cox multivariable analyses, the time-varying treatment arm, older age, alcohol consumption, low baseline total bilirubin and a positive baseline hepatitis B surface antigen status were associated with a higher risk of developing a first occurrence of hepatotoxicity. The extended Cox models (AG, shared frailty, PWP TT and PWP GT) in combination identified the time-varying treatment arm, older age, alcohol consumption, a baseline CD4 count greater than 50 cells per mm³, low baseline total bilirubin, and a positive baseline hepatitis B surface antigen status as associated with an increased risk of developing recurrent hepatotoxicity. In the resolution model multivariable analyses, non-consumption of alcohol and abnormal liver function tests at baseline were associated with an increased chance of resolving the first occurrence of hepatotoxicity. In the multivariable analyses of the recurrent model, younger age and the time-varying treatment arm were associated with the development of a second occurrence of hepatotoxicity. Since the Cox regression model utilized data only up to the first occurrence of hepatotoxicity, the time-varying treatment effect based on that model was, in some instances, closer to unity and only marginally significant, whereas the corresponding effect based on the recurrent-event models (AG, shared frailty, PWP TT, PWP GT and the recurrent model), which utilized data on both the first and second occurrences of hepatotoxicity, was generally further from unity and strongly statistically significant. This trend was similar for other predictors of hepatotoxicity, such as CD4 count and alcohol consumption. In conclusion, hepatotoxicity was common in this study; however, it was often transient or mild and did not necessitate treatment interruption. Close monitoring of patients, especially in the first 5 months of TB treatment, is nevertheless recommended. The PWP TT model appeared to be the best model for recurring hepatotoxicity, since the risk factors associated with hepatotoxicity changed from the first occurrence to the second.
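A minimal sketch of how two of these recurrent-event extensions could be fit with the Python lifelines package, assuming hypothetical long-format counting-process data with one row per at-risk interval (the file and column names below are invented for illustration, not the SAPiT variables):

    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    # Hypothetical columns: id, start, stop, event (0/1), episode (event
    # number), plus covariates such as age and alcohol consumption.
    df = pd.read_csv("hepatotoxicity_long.csv")

    # Andersen-Gill: common baseline hazard across episodes, with a robust
    # sandwich variance to account for within-subject correlation.
    ag = CoxTimeVaryingFitter()
    ag.fit(df.drop(columns=["episode"]), id_col="id", event_col="event",
           start_col="start", stop_col="stop", robust=True)

    # PWP total time: stratify the baseline hazard by event number, so a
    # subject is at risk for occurrence k only after occurrence k-1.
    pwp_tt = CoxTimeVaryingFitter()
    pwp_tt.fit(df, id_col="id", event_col="event", start_col="start",
               stop_col="stop", robust=True, strata=["episode"])
    pwp_tt.print_summary()

The PWP gap-time variant uses the same stratification but resets the clock at each occurrence, so start and stop are measured from the previous event rather than from study entry.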
