1

An Inverse Approach to a Probability Model for Fractured Networks

Vail, Stacy G. 01 May 1994
A common problem in science and engineering is to determine information about a system from limited observations. One example is determining the geometry of an aquifer or oil reservoir based on well tests taken at the site. The Conditional Coding Method attacks this type of problem. It uses the Simulated Annealing Algorithm in conjunction with a probability model that generates possible solutions from a list of uniform random numbers. The annealing algorithm induces a conditional probability distribution over all solutions the probability model can generate, conditioned on the observed data set, and the problem is attacked by sampling from this distribution. This method accounts for the noise inherent in the data set as well as the uncertainty due to the limited amount of data available.
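The conditional coding idea lends itself to a compact sketch. Below is a minimal, hypothetical Python illustration of the approach described in the abstract: a candidate solution is encoded as a list of uniform random numbers, a user-supplied probability model (`decode`, a placeholder, not the thesis's fracture-network model) turns that list into a candidate network, and simulated annealing with the Metropolis acceptance rule searches for codings whose predicted response matches the observed well-test data (`misfit` is likewise a placeholder).

```python
import math
import random

def anneal(observed, decode, misfit, n_codes=100, steps=5000, t0=1.0, cooling=0.999):
    """Sample network configurations conditioned on observed data.

    decode : maps a list of uniform random numbers to a candidate network
    misfit : measures disagreement between a network's predicted response
             and the observed well-test data
    """
    code = [random.random() for _ in range(n_codes)]        # uniform "coding" of a solution
    energy = misfit(decode(code), observed)
    temp = t0
    for _ in range(steps):
        trial = code[:]
        trial[random.randrange(n_codes)] = random.random()  # perturb one coordinate
        e_new = misfit(decode(trial), observed)
        # Metropolis rule: always accept improvements, sometimes accept worse ones
        if e_new < energy or random.random() < math.exp((energy - e_new) / temp):
            code, energy = trial, e_new
        temp *= cooling                                     # cool the temperature
    return decode(code), energy
```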
2

Do different groups of car buyers react differently to an economic policy change? : A study of the adjustment of consumption decisions around the entry into force of the bonus malus system

Broman, Julius, Olausson, Erik January 2019
This thesis examines whether different groups of car buyers differed in how they adjusted the timing of their passenger-car purchases around the entry into force of the bonus malus system on 1 July 2018. Starting from a theoretical framework of price sensitivity and rational choice, a linear probability model is applied to data on new-car buyers in June and July 2018. We estimate, for each group, the probability of acquiring a passenger car in the month associated with the lower cost. When the chosen control variables are included, we find, with statistical significance, that legal entities and residents of non-densely-populated counties adjusted the timing of their purchases to a greater extent than other groups. More specific data would have benefited the study, so caution should be exercised before drawing definitive conclusions, and further research is called for. The results nevertheless contribute to an understanding of the examined differences between groups and may be of interest for the design of similar economic policies.
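For readers unfamiliar with the method, here is a minimal sketch of a linear probability model of the kind the thesis applies, run on synthetic data; the variable names (`legal_entity`, `rural`, `bought_cheap_month`) are illustrative stand-ins, not the authors' actual variables, and all coefficients are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

# Synthetic stand-ins for the thesis's groups (hypothetical, for illustration):
# legal_entity = 1 for juridical persons, rural = 1 for non-densely-populated counties
legal_entity = rng.integers(0, 2, n)
rural = rng.integers(0, 2, n)
# Outcome: 1 if the car was acquired in the month with the lower cost
p = 0.4 + 0.15 * legal_entity + 0.10 * rural
bought_cheap_month = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([legal_entity, rural]))
# Linear probability model: OLS on a binary outcome with
# heteroskedasticity-robust (HC1) standard errors
lpm = sm.OLS(bought_cheap_month, X).fit(cov_type="HC1")
print(lpm.summary(xname=["const", "legal_entity", "rural"]))
```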
3

A Model of the Probability of Informed Trading and its Application

Hung, Jung-Yao 17 October 2005
This paper first constructs an order-driven market probability model of informed trading to analyze the correlation between informed trading and asset returns, and the trade-price effect. Second, using the probability model of informed trading, we construct a probability model of arbitrage trading in an order-driven call market, which can analyze the stabilization fund and arbitrage trading, in order to investigate whether the government's intervention measures were necessary and whether the intervention time points conformed to the founding spirit of the stabilization fund: to intervene while the market is falling and not while it is rising. Finally, we set up an empirical model of the informed trading ratio that can analyze the intraday trade scale of each trade session for informed and uninformed traders, to analyze how the intraday trade scale of each type of investor changes as trade frequency changes and thereby explore the factors behind market performance. The main results are as follows. Regarding the correlation between informed trading and asset returns and the trade-price effect, we found that (1) in the short term (intraday, daily) there was no relationship between the probability of informed trading and asset returns, whereas in the mid-term the probability of informed trading was correlated with asset returns, although the impact was not as high as prior research (Hasbrouck (1991a, b), Glosten and Harris (1988)) expected. (2) The intraday probability of informed trading on good-news days was clearly higher than on bad-news days, indicating that an unbalanced buy-sell informed trading phenomenon existed in the market. Regarding whether the stabilization fund's intervention time points conformed to its founding spirit (to intervene while falling and not while rising), the main results are: (1) the individual stocks in which the stabilization fund intervened had slightly smaller volatility, slightly worse efficiency, better returns, and significantly larger liquidity. (2) There was no significant difference in the probability of arbitrage trading between the targets of stabilization-fund intervention and other companies, nor in performance (including volatility, efficiency, liquidity, and return) between the two. (3) The stabilization fund and arbitrageurs tended to trade in the opening period, which corresponds with the proposition of Schwartz (1988). (4) We also found that, compared with other arbitrage trading, the trading of the stabilization fund was more correlated with market price movements than with those of individual stocks. In the analysis of how each type of investor's intraday trade scale changed with trade frequency, the main findings are: (1) a slowdown of trade frequency reduced the intraday trade ratio and performance at the opening but increased the intraday trade ratio and performance in the closing period, an effect especially significant for high-liquidity companies. (2) An increase in trade frequency could raise the liquidity of high- and middle-liquidity companies. For low-liquidity companies, although an increase in trade frequency increased liquidity, it raised volatility and slowed price discovery. The main contributions of this paper's models are as follows.
Regarding the probability model of informed trading: first, it improves on prior models by introducing an order-driven call market model; second, allowing informed traders to use limit orders in the model set-up corresponds better to the real market; third, the model can calculate the probability of informed trading for each intraday trade session and thus can analyze the intraday and intraweek behavior of informed traders and the market; fourth, the model estimates the probability of informed trading from trade data rather than order data, and thus avoids the estimation error caused by the execution risk of orders; fifth, the model calculates the probability of informed trading for individual stocks after separating good and bad news and thus can analyze buy-side versus sell-side informed trading behavior. Regarding the probability model of arbitrage trading, it provides a method to analyze whether a self-stabilization mechanism (arbitrage trading) exists in the market, in order to assess the necessity of the stabilization fund and its intraday trade behavior. Finally, regarding the empirical model of the informed trading ratio: since this paper calculates the session-level informed and uninformed trading ratios by simulating uninformed traders' intraday trading strategy and by using regression analysis to extract the share of intraday trade-volume variation explained by variation in uninformed traders' intraday behavior, it avoids the deficiency of prior order-based empirical models of informed trading, which treated every trade as if it came from a single type of trader.
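The thesis builds its own order-driven, trade-data-based model; as a point of reference only, the classic Easley-Kiefer-O'Hara-Paperman (EKOP) probability of informed trading is PIN = αμ / (αμ + εb + εs), computed from estimated arrival rates. A minimal sketch of that reference formula (the thesis's model differs from it in the ways listed above; parameter values below are hypothetical):

```python
def pin(alpha, mu, eps_buy, eps_sell):
    """Classic EKOP probability of informed trading.

    alpha    : probability an information event occurs on a given day
    mu       : arrival rate of informed traders on event days
    eps_buy  : arrival rate of uninformed buyers
    eps_sell : arrival rate of uninformed sellers
    """
    informed = alpha * mu
    return informed / (informed + eps_buy + eps_sell)

# Illustrative parameter values (hypothetical):
print(f"PIN = {pin(alpha=0.4, mu=50, eps_buy=30, eps_sell=30):.3f}")  # 0.250
```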
4

Tumor Control Probability Models

Gong, Jiafen Unknown Date
No description available.
5

German labour market outcomes of cohorts of immigrants over time : A forecast of the employment of recent cohorts based on earlier newcomers

Ottou, Estelle January 2019
It has been observed that, over the years, immigrants' skills, knowledge, and experience have declined. Researchers have noted the presence of cohort effects: differences in quality and skills across immigrant arrival cohorts. Using data from the German Socio-Economic Panel and an out-of-sample forecast of the employment of recent cohorts based on how earlier newcomers performed, I can confirm that, over time, immigrants see their probability of being employed decrease. For instance, employment decreased from 99% for immigrants who arrived in Germany in 2010 to 92% for those who arrived in 2015. The linear probability model also highlights that human capital is not the only direct influence on immigrants' employment levels: the region of origin and the duration of residence in Germany also affect the likelihood of finding a paid job. Cohort effects therefore cannot be explained solely by the fact that newly arrived immigrants are very different from those who arrived some years ago.
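The forecasting step can be sketched as follows, as a minimal illustration under invented assumptions (hypothetical variables `ysm` for years since migration and `region`; all rates and coefficients made up, not the thesis's GSOEP specification): fit a linear probability model on earlier arrivals, then predict employment for a recent cohort out of sample.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

def cohort_data(n, base_rate):
    """Synthetic cohort: employment depends on years since migration and region."""
    ysm = rng.uniform(0, 10, n)               # years since migration
    region = rng.integers(0, 2, n)            # region-of-origin dummy
    p = np.clip(base_rate + 0.02 * ysm - 0.05 * region, 0, 1)
    employed = rng.binomial(1, p)
    return sm.add_constant(np.column_stack([ysm, region])), employed

X_old, y_old = cohort_data(3000, base_rate=0.80)   # earlier newcomers
X_new, _ = cohort_data(500, base_rate=0.70)        # recent cohort

lpm = sm.OLS(y_old, X_old).fit(cov_type="HC1")
# Out-of-sample forecast: predicted employment probability of the recent
# cohort, using coefficients estimated on earlier arrivals
print("forecast employment rate:", lpm.predict(X_new).mean())
```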
6

An Empirical Analysis of Paper Selection by Digital Printers

Jonen, Benjamin Philipp 16 May 2007
The Printing Industry is undergoing a "Digital Revolution". The importance of digital printing has been increasing substantially over the last decade. How has this development affected the paper selection of printing firms? Only paper suppliers who successfully anticipate the changing needs of the printing firms will be able to benefit from the industry trend. This paper employs a probability model to analyze a survey data set of 103 digital printing firms in the USA and Canada. The research idea is to link the firm's paper selection with the firm's characteristics in order to gain insights into the printing firm's paper purchase behavior and the overall industry structure. The first part of this work investigates the importance of certain paper aspects, such as price, runnability and print quality. Strikingly, a company's involvement in digital printing, measured by the percentage of digital printers among the total number of printers in the firm, is a central determinant of the importance of all paper aspects analyzed. This finding underscores the tremendous importance of the printing firms' transition to digital printing for the Paper Industry. Paper runnability is found to become more important the faster the firm grows, which can be explained by the fact that more successful firms incur higher opportunity costs from downtime. Another key finding is that the importance of paper price is lower for firms who collaborate with their customers on the paper selection and are able to pass on increases in the paper price. The second part involves a more direct assessment of paper selection. Here, the firm's characteristics are utilized to explain the choice of coated versus uncoated paper for the printing job. The analysis shows that firms involved in sophisticated print services, such as Digital Asset Management or Variable Data Printing, are more likely to use the high-quality coated paper. Further, it is found that the usage of coated paper increases with catalog printing whereas it decreases with book and manual printing.
7

A comparison of regressions with binary outcomes

Pettersson, Fredrik January 2020
The purpose of this bachelor thesis was to compare three different methods for regression with binary outcomes: the Linear Probability Model, Logit, and Probit. To compare the methods, data from the World Values Survey, most recently conducted in Sweden in 2011, was used. The outcome variable in the models was whether the respondent preferred protecting the environment or economic growth. A Monte Carlo simulation was also performed to strengthen the arguments in the comparison. The differences in results between the models were very small, but differences remain; two examples are the ease of interpretation of each model and errors that argue against using the Linear Probability Model under certain circumstances.
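A minimal sketch of the kind of comparison the thesis performs, on synthetic data generated from a logit link (the World Values Survey variables are not reproduced here): average marginal effects put the three estimators on a comparable scale, and a small Monte Carlo loop shows how close they typically are.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

def simulate(n=1000):
    """One draw: binary outcome generated from a logit link."""
    x = rng.normal(size=n)
    y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 1.0 * x))))
    return sm.add_constant(x), y

X, y = simulate()
lpm = sm.OLS(y, X).fit(cov_type="HC1")   # linear probability model
logit = sm.Logit(y, X).fit(disp=0)
probit = sm.Probit(y, X).fit(disp=0)

# Average marginal effects put all three models on a comparable scale
print("LPM slope :", lpm.params[1])
print("Logit AME :", logit.get_margeff().margeff[0])
print("Probit AME:", probit.get_margeff().margeff[0])

# A small Monte Carlo: the three estimates rarely differ by much
gaps = []
for _ in range(200):
    X, y = simulate()
    ame_l = sm.Logit(y, X).fit(disp=0).get_margeff().margeff[0]
    ame_p = sm.Probit(y, X).fit(disp=0).get_margeff().margeff[0]
    slope = sm.OLS(y, X).fit().params[1]
    gaps.append((slope - ame_l, ame_p - ame_l))
print("mean gaps (LPM-logit, probit-logit):", np.mean(gaps, axis=0))
```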
8

A Comparative Analysis of the Use of a Markov Chain Versus a Binomial Probability Model in Estimating the Probability of Consecutive Rainless Days

Homeyer, Jack Wilfred 01 May 1974
The Markov chain process for predicting the occurrence of a sequence of rainless days, a standard technique, is critically examined in light of the basic underlying assumptions that must be made each time it is used. It is then compared to a simple binomial model in which an event is defined as a series of rainless days of the desired length. Computer programs to perform the required calculations are then presented and compared as to complexity and operating characteristics. Finally, an example of applying both programs to real data is presented and further comparisons are drawn between the two techniques.
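The contrast between the two models can be made concrete in a short sketch (illustrative parameter values, not the thesis's data). Under independence, a run of n dry days has probability p_dry^n; under a two-state Markov chain, it is p_dry * p_dd^(n-1), where p_dd = P(dry tomorrow | dry today). Because dry days cluster, the Markov model assigns a noticeably higher probability to long dry runs:

```python
def prob_dry_run_binomial(p_dry, n):
    """Independence (binomial) model: each day is dry with probability p_dry."""
    return p_dry ** n

def prob_dry_run_markov(p_dry, p_dd, n):
    """Two-state Markov chain: the first day is dry with marginal probability
    p_dry; each following day is dry with persistence probability p_dd."""
    return p_dry * p_dd ** (n - 1)

# Illustrative values (hypothetical, not from the thesis):
p_dry, p_dd, n = 0.7, 0.85, 7
print("binomial:", prob_dry_run_binomial(p_dry, n))       # ~0.082
print("markov:  ", prob_dry_run_markov(p_dry, p_dd, n))   # ~0.264
```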
9

Bayesian approaches for the analysis of sequential parallel comparison design in clinical trials

Yao, Baiyun 07 November 2018
Placebo response, an apparent improvement in the clinical condition of patients randomly assigned to the placebo treatment, is a major issue in clinical trials on psychiatric and pain disorders. Properly addressing the placebo response is critical to an accurate assessment of the efficacy of a therapeutic agent. The Sequential Parallel Comparison Design (SPCD) is one approach for addressing the placebo response. An SPCD trial runs in two stages, re-randomizing placebo patients in the second stage, and the analysis pools the data from both stages. In this thesis, we propose a Bayesian approach for analyzing SPCD data. Our primary proposed model overcomes some of the limitations of existing methods and offers greater flexibility in performing the analysis. We find that our model is on par with, and under certain conditions better than, existing methods in preserving the type I error and minimizing mean squared error. We further develop our model in two ways. First, through prior specification we provide three approaches to modeling the relationship between the treatment effects from the two stages, as opposed to arbitrarily specifying the relationship as was done in previous studies. Under proper specification these approaches have greater statistical power than the initial analysis and give accurate estimates of this relationship. Second, we revise the model to treat the placebo response as a continuous rather than a binary characteristic. The binary classification, which groups patients into “placebo-responders” or “placebo non-responders”, can lead to misclassification, which can adversely impact the estimate of the treatment effect. As an alternative, we propose to view the placebo response in each patient as an unknown continuous characteristic. This characteristic is estimated and then used to measure the contribution (or the weight) of each patient to the treatment effect. Building upon this idea, we propose two different models that weight the contribution of placebo patients to the estimated second-stage treatment effect. We show that this method is more robust against the potential misclassification of responders than previous methods. We demonstrate our methodology using data from the ADAPT-A SPCD trial.
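To fix ideas, here is a minimal Bayesian sketch of an SPCD-style pooled analysis with binary outcomes, using conjugate Beta posteriors and a fixed pooling weight w between the stage-1 and stage-2 treatment effects. The counts are hypothetical (not ADAPT-A data) and the model is far simpler than the thesis's proposals; it only illustrates the two-stage pooling idea.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical SPCD outcome counts (responders / randomized):
# Stage 1: drug vs placebo; Stage 2: placebo non-responders re-randomized.
s1_drug, n1_drug = 30, 100
s1_plac, n1_plac = 25, 100
s2_drug, n2_drug = 18, 35
s2_plac, n2_plac = 10, 35

draws = 100_000
# Conjugate Beta(1, 1) priors on each arm's response rate
d1 = (rng.beta(1 + s1_drug, 1 + n1_drug - s1_drug, draws)
      - rng.beta(1 + s1_plac, 1 + n1_plac - s1_plac, draws))
d2 = (rng.beta(1 + s2_drug, 1 + n2_drug - s2_drug, draws)
      - rng.beta(1 + s2_plac, 1 + n2_plac - s2_plac, draws))

w = 0.6                          # pooling weight between the two stages
pooled = w * d1 + (1 - w) * d2   # SPCD-style pooled treatment effect
print("posterior mean:", pooled.mean())
print("95% credible interval:", np.percentile(pooled, [2.5, 97.5]))
print("P(effect > 0):", (pooled > 0).mean())
```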
10

Validation of high density electrode arrays for cochlear implants: a computational and structural approach

Falcone, Jessica Dominique 06 April 2011
Creating high-resolution, or high-density, electrode arrays may be the key to improving cochlear implant users' speech perception in noise, comprehension of lexical languages, and music appreciation. Contemporary electrode arrays use multipolar stimulation techniques such as current steering (shifting the spread of neural excitation in between two physical electrodes) and current focusing (narrowing the neural spread of excitation) to increase resolution and more specifically target the neural population. Another approach to increasing resolution incorporates microelectromechanical systems (MEMS) fabrication to create a thin-film microelectrode (TFM) array with a series of high-density electrodes. Validating the benefits of high-density electrode arrays requires a systems-level approach: the hypothesis is tested computationally via cochlea and auditory nerve simulations, while in vitro studies provide structural proof of concept. By employing Rattay's activating function and entering it into Litvak's neural probability model, a first-order estimation model of the auditory nerve's response to electrical stimulation was obtained. Two different stimulation scenarios were evaluated: current steering versus a high-density electrode, and current focusing of contemporary electrodes versus current focusing of high-density electrodes. The results revealed that a high-density electrode is more localized than current steering and requires less current. A second-order estimation model was also created in COMSOL, which provided the resulting potential and current flow when the electrodes were electrically stimulated. The structural tests were conducted to provide a proof of concept for the TFM arrays' ability to contour to the shape of the cochlea. The TFM arrays were integrated with a standard insertion platform (IP). In vitro tests were performed on human cadaver cochleae using the TFM/IP devices. Fluoroscopic images recorded the insertion, and post-analysis 3D CT scans and histology were conducted on the specimens. Only three of the ten implanted TFM/IPs suffered severe delamination; this statistic for scala vestibuli excursion is not an outlier when compared to previous data recorded for contemporary cochlear electrode arrays.
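Rattay's activating function, the second spatial derivative of the extracellular potential along the fiber, underlies the first-order model above. A minimal numerical sketch (ideal point-source electrode in an infinite homogeneous medium, with illustrative geometry and conductivity, not the thesis's cochlear model) shows why a contact closer to the neurons, as a high-density array permits, yields a more localized excitation region:

```python
import numpy as np

def activating_function(x, z, current, sigma=0.3):
    """Second spatial derivative of the extracellular potential along a
    straight fiber (Rattay's activating function) for a point source at
    height z above the fiber, in a homogeneous medium of conductivity
    sigma (S/m)."""
    r = np.sqrt(x**2 + z**2)                  # distance electrode -> node
    v_e = current / (4 * np.pi * sigma * r)   # extracellular potential
    dx = x[1] - x[0]
    return np.gradient(np.gradient(v_e, dx), dx)  # d2Ve/dx2

# Fiber sampled every 10 um; two hypothetical electrode heights
x = np.arange(-5e-3, 5e-3, 10e-6)
near = activating_function(x, z=0.25e-3, current=-1e-3)  # closer contact
far = activating_function(x, z=1.0e-3, current=-1e-3)
# A closer contact produces a narrower depolarizing peak, i.e. a more
# localized region where the activating function is positive.
print("peak width ratio (far/near):",
      (far > far.max() / 2).sum() / (near > near.max() / 2).sum())
```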
