1

Critical evaluation of methods for estimation of increase in systemic drug exposure for renally impaired patients

Svensson, Robin. January 2013
Introduction: The effect of renal impairment (RI) on systemic drug exposure is assessed in phase I with dedicated RI studies and/or in phase III with population pharmacokinetic analysis. Regulatory review has indicated that the effect of RI estimated by the two methods may differ.
Aim: To map the estimated effect of RI on systemic exposure based on phase I and phase III data, to investigate whether the estimates based on the two data sources differ, and to investigate causes of this potential discrepancy.
Methods: Marketing authorisation applications (MAAs) were scrutinised with a focus on the impact of RI on systemic exposure estimated from phase I and phase III data. In addition, a simulation-estimation study was performed to explore causes of discrepancies. Phase I and phase III data were simulated and analysed with non-compartmental analysis (NCA) and population analysis, respectively. The phase III data were simulated under several alternative conditions thought to be potential sources of discrepancies, such as uncertainty in creatinine clearance (CLCR) measurements and varying numbers of subjects.
Results: Six examples were found in MAAs in which a discrepancy was observed; phase III tended to estimate a lower effect of RI than phase I. In the simulation-estimation study, the NCA of phase I data over-predicted the effect of RI on systemic exposure, while the population analysis of phase III data estimated the effect of RI without bias. Uncertainty in the CLCR measurements in the phase III data resulted in under-prediction of the effect of RI on systemic exposure.
Conclusions: A discrepancy in the estimated effect of RI on systemic exposure between phase I and phase III was observed in existing MAAs. The NCA of the phase I RI study and uncertain CLCR measurements were identified as possible reasons for the discrepancy.
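The NCA step referred to above reduces each subject's concentration-time profile to a summary exposure metric such as AUC. A minimal sketch of that computation, using the linear trapezoidal rule on hypothetical concentration-time data (all values and names below are illustrative, not taken from the thesis):

```python
import numpy as np

def auc_trapezoidal(times, conc):
    """Non-compartmental AUC(0-tlast) via the linear trapezoidal rule."""
    t = np.asarray(times, dtype=float)
    c = np.asarray(conc, dtype=float)
    return float(np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(t)))

# Hypothetical phase-I-style comparison: exposure ratio between one
# renally impaired and one healthy subject after the same dose.
t = [0, 0.5, 1, 2, 4, 8, 12, 24]                         # hours post-dose
conc_healthy  = [0, 1.8, 2.4, 2.0, 1.2, 0.5, 0.2, 0.03]  # mg/L
conc_impaired = [0, 1.9, 2.7, 2.5, 1.9, 1.1, 0.6, 0.15]  # mg/L

ratio = auc_trapezoidal(t, conc_impaired) / auc_trapezoidal(t, conc_healthy)
print(f"AUC ratio (impaired / healthy): {ratio:.2f}")
```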
2

The role of higher moments in high-frequency data modelling

Schmid, Manuel. 24 November 2021
This thesis studies the role of higher moments, that is, moments beyond the mean and variance, in continuous-time (diffusion) processes, which are commonly used to model so-called high-frequency data. The first part is devoted to the derivation of closed-form expressions for general (un)conditional (co)moments of the CIR process's solution. A byproduct of this derivation is a novel way of proving that the process's transition density is a noncentral chi-square distribution and that its steady-state law is a Gamma distribution. In the second part, we use these moment formulas to derive a near-exact simulation algorithm for the Heston model, in the sense that our algorithm generates pseudo-random numbers that have the same first four moments as the theoretical diffusion process. We conduct several in-depth Monte Carlo studies to determine which existing simulation algorithm performs best with respect to these higher moments under various circumstances. We conduct the same study for the CIR process, which serves as the diffusion for the latent spot variance in the Heston model. The third part discusses several estimation approaches for the Heston model based on high-frequency data, such as MM, GMM, and (pseudo/quasi) ML. For the GMM approach, we use the moments derived in the first part as moment conditions. We apply the best methodology to actual high-frequency price series of cryptocurrencies and fiat stocks to provide benchmark parameter estimates.
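Because the CIR transition density is noncentral chi-square (as the abstract notes), the process can be sampled exactly step by step; this is the natural baseline against which any near-exact scheme is judged. A minimal sketch of the textbook exact-transition sampler, not the thesis's own algorithm, with illustrative parameter values:

```python
import numpy as np

def simulate_cir_exact(x0, kappa, theta, sigma, dt, n_steps, seed=0):
    """Sample a path of dX = kappa*(theta - X) dt + sigma*sqrt(X) dW
    exactly, using the noncentral chi-square transition density."""
    rng = np.random.default_rng(seed)
    c = sigma**2 * (1.0 - np.exp(-kappa * dt)) / (4.0 * kappa)
    df = 4.0 * kappa * theta / sigma**2           # degrees of freedom
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        nonc = x[i] * np.exp(-kappa * dt) / c     # noncentrality parameter
        x[i + 1] = c * rng.noncentral_chisquare(df, nonc)
    return x

# Illustrative parameters: a mean-reverting variance path over one year.
path = simulate_cir_exact(x0=0.04, kappa=2.0, theta=0.04, sigma=0.3,
                          dt=1 / 252, n_steps=252)
print(path[:5])
```

Unlike an Euler discretisation, this sampler has no time-discretisation bias and cannot produce negative variance values.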
3

On specification and inference in the econometrics of public procurement

Sundström, David. January 2016
In Paper [I] we use data on Swedish public procurement auctions for internal regular cleaning service contracts to provide novel empirical evidence regarding green public procurement (GPP) and its effect on the potential suppliers' decision to submit a bid and their probability of being qualified for supplier selection. We find only a weak effect on supplier behavior, which suggests that GPP does not live up to its political expectations. However, several environmental criteria appear to be associated with increased complexity, as indicated by the reduced probability of a bid being qualified in the post-qualification process. As such, GPP appears to have limited or no potential to function as an environmental policy instrument.

In Paper [II] the observation is made that empirical evaluations of the effect of policies transmitted through public procurements on bid sizes are made using linear regressions or by more involved non-linear structural models. The aspiration is typically to determine a marginal effect. Here, I compare marginal effects generated under both types of specifications. I study how a political initiative to make firms less environmentally damaging, implemented through public procurement, influences Swedish firms' behavior. The collected evidence shows a statistically as well as economically significant effect on firms' bids and costs.

Paper [III] starts by noting that auction theory suggests that as the number of bidders (competition) increases, the sizes of the participants' bids decrease. An issue in the empirical literature on auctions is which measurement(s) of competition to use. Utilizing a dataset on public procurements containing measurements of both the actual and the potential number of bidders, I find that a workhorse model of public procurements is best fitted to data using only actual bidders as the measurement of competition. Acknowledging that all measurements of competition may be erroneous, I propose an instrumental variable estimator (see the sketch after this abstract) that, given my data, yields a competition effect bounded by those generated by specifications using the actual and the potential number of bidders, respectively. Also, some asymptotic results are provided for non-linear least squares estimators obtained from a dependent variable transformation model.

Paper [IV] introduces a novel method to measure bidders' costs (valuations) in descending (ascending) auctions. Based on two bounded rationality constraints, bidders' costs (valuations) are given an imperfect-measurements interpretation that is robust to behavioral deviations from traditional rationality assumptions. Theory provides no guidance as to the shape of the cost (valuation) distributions, while empirical evidence suggests them to be positively skewed. Consequently, a flexible distribution is employed in an imperfect-measurements framework. An illustration of the proposed method on Swedish public procurement data is provided, along with a comparison to a traditional Bayesian Nash equilibrium approach.
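A minimal sketch of the instrumenting idea from Paper [III], assuming a simplified linear bid equation in which the measured number of actual bidders is error-prone and the potential number of bidders serves as its instrument. The variable names, data, and the linear form are illustrative; the paper's actual estimator is built for a non-linear procurement model:

```python
import numpy as np

def two_stage_least_squares(y, x_endog, z_instr):
    """2SLS for a just-identified linear model: instrument an
    error-prone competition measure with an alternative measure."""
    n = y.shape[0]
    ones = np.ones(n)
    X = np.column_stack([ones, x_endog])  # constant + endogenous regressor
    Z = np.column_stack([ones, z_instr])  # constant + instrument
    # First stage: fitted values of X from a regression on Z.
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    # Second stage: regress the outcome on the fitted values.
    beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]
    return beta  # [intercept, competition effect]

# Hypothetical data: log bids, actual and potential bidder counts.
rng = np.random.default_rng(1)
potential = rng.integers(2, 12, size=500).astype(float)     # instrument
actual = np.maximum(2, potential - rng.poisson(1.0, 500))   # noisy measure
log_bid = 10.0 - 0.05 * actual + rng.normal(0.0, 0.1, 500)
print(two_stage_least_squares(log_bid, actual, potential))
```

The design choice mirrors the paper's logic: when both competition measures are imperfect, using one as an instrument for the other bounds the estimated competition effect between the two naive specifications.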
