131

Homeowner's Equity, Rental Cash Flow, and Recourse as Predictors of Default Mortgage Status

Callian III, William 01 January 2018 (has links)
In the aftermath of the Great Recession of 2007-2009, banking executives feared the impact of increased capital reserve requirements for losses from mortgage defaults. One reason was that home price declines during the Great Recession precipitated mortgage defaults, which increased the percentage of foreclosures and accelerated negative equity and default. The purpose of this correlational study, grounded in Fishbein's expectancy-value and Vroom's expectancy theories, was to examine the relationship between the independent variables of homeowner's equity, rental cash flow value, and recourse, and the dependent variable, default mortgage status. Archival data comprised a sample of 408 single-family residences in Alameda County, California, and Shelby, Fayette, and Tipton Counties in Tennessee. The results of the binary logistic regression model indicated the model was a good fit to predict a significant relationship between the variables (χ² = 3.490, p = 0.322, df = 3). The findings did not reveal a significant relationship between homeowner's equity, rental cash flow value, recourse, and default mortgage status. Therefore, the independent variables did not predict mortgage default status. However, a minor relationship was found between homeowner's equity (p = 0.215), rental cash flow value (p = 0.215), and default mortgage status. A non-significant relationship between the independent variables and default mortgage status indicated that factors other than the study variables influenced default mortgage status. Advocates for fair housing laws may use the study findings to encourage lenders to change lending policies to reduce the risk of default and increase stability in local communities, which may result in positive social change.
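The study's binary logistic regression setup can be illustrated with a short sketch. This is a hedged illustration only: the variable names, simulated data, and coefficients below are assumptions for demonstration, not the archival Alameda/Tennessee sample or the study's actual model output.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 408  # same sample size as the study, but the data below are simulated

# Hypothetical predictors: equity share of home value, monthly rental cash
# flow (USD), and a recourse indicator (1 = recourse loan).
equity = rng.normal(0.15, 0.20, n)
rental_cash_flow = rng.normal(250.0, 400.0, n)
recourse = rng.integers(0, 2, n)

# Simulated binary default outcome with a weak assumed relationship.
logit_p = -1.0 - 2.0 * equity - 0.001 * rental_cash_flow - 0.3 * recourse
default = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([equity, rental_cash_flow, recourse]))
result = sm.Logit(default, X).fit(disp=False)

# Likelihood-ratio (model) chi-square and its p-value, analogous to the
# chi-square(df = 3) statistic reported in the abstract.
print(result.summary())
print("LR chi2:", result.llr, "df:", result.df_model, "p:", result.llr_pvalue)
```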
132

Towards an integration of theories of achievement motivation.

Wellman, David Allen, mikewood@deakin.edu.au January 2001 (has links)
This thesis investigated children's school achievement in terms of an integration of three theories of achievement motivation. The three theoretical outlooks were expectancy-value theory (EVT), implicit theories of intelligence (ITI), and flow theory (FT). The first of two studies was an exploratory investigation of the effectiveness of each theory, independently and combined, to predict children's achievement in four school subjects. The subject areas were maths, reading, instrumental music and sport. Participants were 84 children (40 females and 44 males) aged 9 to 10 years, one of each child's parents, and school teachers of each child in the four subject areas. All data were collected through questionnaires based on the three models. The results indicated that EVT and FT, but not ITI, accounted for a significant amount of the variance in children's achievement, including effects for subject area and gender. A second, confirmatory study tested EVT, FT and an integrated model for the prediction of achievement in maths, reading and instrumental music. The participants were a further 141 children (74 females and 67 males) aged 10 to 11 years, and a parent and teachers of each child. Data collection using questionnaires occurred early in the school year (Time 1) and approximately five months later (Time 2). For EVT, children's and parents' competence beliefs were significant predictors of children's achievement in each subject area. Females tended to believe themselves more competent at reading and instrumental music and also valued these subjects more highly than boys did. Modeling results for flow theory indicated that children's emotional responses to classes (happiness and confusion) were significant predictors of achievement, the type of emotion varying between subject areas and time periods. Females generally had a more positive emotional reaction to reading and instrumental music classes than males did. The integrated model results indicated significant relationships between EVT and flow theories for each subject area, with EVT explaining most achievement variance in the integrated model. Children's and parents' competence beliefs were the main predictors of achievement at Time 1 and Time 2. Subject area and gender differences were found which provide direction for future research. Anecdotal reports of parents and teachers often attest to individual differences in children's involvement in various school domains. Even among children of apparently similar intelligence, it is not uncommon to find one who likes nothing better than to work on a mathematics problem while another much prefers to read a novel or play a musical instrument. Some children appear to achieve good results for most of the activities in which they are engaged while others achieve in a less consistent manner, sometimes particularly excelling in one activity. Some children respond to failure experiences with a determination to improve their performance in the future while others react with resignation and acceptance of their low ability. Some children appear to become totally absorbed in the activity of playing sport while others cannot wait for the game to end. The primary research question guiding the current thesis is how children's thoughts and feelings about school subjects differ and are related to their school achievement. A perusal of the achievement motivation literature indicates several possible models and concepts that can be applied to explain individual differences in children's school achievement. 
Concepts such as academic self-concept, multiple intelligences, intrinsic and extrinsic motivation, self-beliefs, competence beliefs, subjective task values, mastery and performance goals, ‘Flow’ experiences and social motivation are just some of the constructs used to explain children's achievement motivation, both within and between various activity domains. These constructs are proposed by researchers from different theoretical perspectives on achievement motivation. Although there is much literature relevant to each perspective, there is little research indicating how the various perspectives may relate to each other. The current thesis will begin by reviewing three currently popular theoretical orientations cited in achievement motivation research: subjective beliefs and values; implicit theories of intelligence; and flow experience and family complexity. Following this review, a framework will be proposed for testing the determinants of children's school achievement, both within each of the three theoretical perspectives and also in combination.
133

Extreme behavior and VaR of short-term interest rate of Taiwan

Chiang, Ming-Chu 21 July 2008 (has links)
This study empirically analyzes the extreme behavior of changes in Taiwan's short-term interest rate and the impact of deregulation policies and financial turmoil on that behavior. A better knowledge of short-term interest rate properties, such as heavy tails, asymmetry, and uneven tail fatness between the right and left tails, provides insight into the extreme behavior of the short-term interest rate as well as a more accurate estimation of interest rate risk. The predictive performances of filtered and unfiltered VaR (Value at Risk) models are also examined to suggest appropriate models for the management of interest rate risk. By applying extreme value theory (EVT), tail behavior is analyzed and tested, and VaR based on parametric and non-parametric EVT models is calculated. The empirical findings show that, first, the distribution of rate changes is heavy-tailed, indicating that the actual risk would be underestimated under a normality assumption. Second, the unconditional distribution is consistent with heavier-tailed distributions such as an ARCH process or Student's t. Third, the right tail of the distribution of rate changes is significantly heavier than the left one, indicating that the probabilities and magnitudes of rises in the rate could be higher than those of drops. Fourth, the amount of tail fatness in the distribution of rate changes increased after 1999, and the key factors causing the structural break in the tail index are the interest rate policies taken by the central bank of Taiwan rather than the deregulation policies in the money market. Fifth, based on the two break points found in the tail indices of the right and left tails, a long sample of CP rates should not be treated as a sample from a single distribution. Sixth, the dependence and heteroscedasticity of the data series should be considered when applying EVT to improve the accuracy of VaR forecasts. Finally, EVT models predict VaR accurately before 2001, while the benchmark models, HS and GARCH, are generally superior to EVT models after 2001. Among the EVT models, MRE and CHE are relatively consistent and reliable in VaR prediction.
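The unconditional EVT approach described above can be sketched with a minimal peaks-over-threshold calculation. The interest rate changes below are simulated and the threshold choice is an assumption; the filtered (AR/GARCH-based) variants discussed in the abstract are not shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical daily changes in a short-term rate (basis points), heavy-tailed.
changes = stats.t.rvs(df=3, scale=2.0, size=5000, random_state=rng)

# Right tail (rate rises): exceedances over a high threshold.
u = np.quantile(changes, 0.95)
exceedances = changes[changes > u] - u
xi, _, beta = stats.genpareto.fit(exceedances, floc=0.0)

def pot_var(q, u, xi, beta, n, n_u):
    """POT VaR: u + (beta/xi) * (((n/n_u) * (1 - q)) ** (-xi) - 1), for xi != 0."""
    return u + (beta / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)

var_99 = pot_var(0.99, u, xi, beta, len(changes), len(exceedances))
print(f"threshold={u:.2f} bp, xi={xi:.3f}, beta={beta:.3f}, 99% VaR={var_99:.2f} bp")
```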
134

An assessment of uncertainties and limitations in simulating tropical cyclone climatology and future changes

Suzuki-Parker, Asuka 04 May 2011 (has links)
The recent elevated North Atlantic hurricane activity has generated considerable interest in the interaction between tropical cyclones (TCs) and climate change. The possible connection between TCs and the changing climate has been indicated by observational studies based on historical TC records; they indicate emerging trends in TC frequency and intensity in some TC basins, but the detection of trends has been hotly debated due to TC track data issues. Dynamical climate modeling has also been applied to the problem, but brings its own set of limitations owing to limited model resolution and uncertainties. The final goal of this study is to project the future changes of North Atlantic TC behavior with global warming for the next 50 years using the Nested Regional Climate Model (NRCM). Throughout the course of reaching this goal, various uncertainties and limitations in simulating TCs with the NRCM are identified and explored. First we examine the TC tracking algorithm used to detect and track simulated TCs in model output. The criteria and thresholds used in the tracking algorithm control the simulated TC climatology, making it difficult to objectively assess the model's ability to simulate TC climatology. Existing tracking algorithms used in previous studies are surveyed, and it is found that their criteria and thresholds are very diverse. The simulated TC climatology is highly sensitive to the criteria and thresholds in the tracking algorithm, especially the intensity and duration thresholds. It is found that the commonly used criteria may not be strict enough to filter out intense extratropical and hybrid systems. We propose that a better distinction between TCs and other low-pressure systems can be achieved by adding the cyclone phase technique. Two sets of NRCM simulations are presented in this dissertation: one in hindcasting mode, and the other forced by the Community Climate System Model (CCSM) to project into the future with global warming. Both of these simulations are assessed using the tracking algorithm with the cyclone phase technique. The NRCM is run in hindcasting mode for the global tropics in order to assess its ability to simulate the currently observed TC climatology. It is found that the NRCM is capable of capturing the general spatial and temporal distributions of TCs, but tends to overproduce TCs, particularly in the Northwest Pacific. The overprediction of TCs is associated with the model's overall convective tendency, combined with an existing theory of wave energy accumulation leading to TC genesis. On the other hand, TC frequency in the tropical North Atlantic is underpredicted due to the lack of moist African easterly waves. The importance of high resolution is shown with an additional simulation using two-way nesting. The NRCM is then forced by the CCSM to project the future changes in North Atlantic TCs. An El Niño-like SST bias in the CCSM induced high vertical wind shear in the tropical North Atlantic, preventing TCs from forming in this region. A simple bias correction method is applied to remove this bias. The model projected an increase in both TC frequency and intensity owing to enhanced TC genesis in the main development region, where the model projects an increased favorability of the large-scale environment for TC genesis. However, the model is not capable of explicitly simulating intense (Category 3-5) storms due to the limited model resolution. 
To extrapolate the prediction to intense storms, we propose a hybrid approach that combines the model results with statistical modeling based on extreme value theory. Specifically, the currently observed TC intensity is statistically modeled with the Generalized Pareto distribution, and the simulated intensity changes from the NRCM are applied to the statistical model to project the changes in intense storms. The results suggest that the occurrence of Category 5 storms may increase by approximately 50% by 2055.
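The hybrid extrapolation idea can be sketched as follows. Everything here is an assumption for illustration (synthetic wind speeds, the 50 m/s threshold, and a 3 m/s projected intensity shift); the sketch only shows the mechanics of fitting a Generalized Pareto tail and shifting it by a model-projected change.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical observed TC maximum wind speeds (m/s).
winds = 33.0 + stats.expon.rvs(scale=15.0, size=800, random_state=rng)

u = 50.0                                    # threshold near Category 3 onset
exc = winds[winds > u] - u
xi, _, beta = stats.genpareto.fit(exc, floc=0.0)
rate = np.mean(winds > u)                   # P(wind > u) in the sample

cat5 = 70.0                                 # approximate Category 5 threshold (m/s)
p_now = rate * stats.genpareto.sf(cat5 - u, xi, scale=beta)
# Assumed projected mean intensity increase of 3 m/s applied to threshold excesses.
p_future = rate * stats.genpareto.sf(cat5 - u - 3.0, xi, scale=beta)

print(f"P(Cat 5) now={p_now:.4f}, future={p_future:.4f}, ratio={p_future / p_now:.2f}")
```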
135

Ekstremumų asimptotinė analizė, kai imties didumo skirstinys yra neigiamas binominis / Asymptotic Analysis of Extremes when the Sample Size Follows a Negative Binomial Distribution

Sidekerskienė, Tatjana 05 June 2006 (has links)
This work considers maximum and minimum structures in which the number of observations is random and follows a negative binomial distribution. Theorems proved in the work yield the limiting distribution functions of these standard structures; they generalize earlier propositions for the case in which the sample size is a geometric random variable. A concrete analysis is also carried out for selected underlying distributions: exponential, generalized logistic, and uniform.
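A small simulation sketch of the structure studied in this thesis follows: the maximum of a random, negative-binomially distributed number of observations. The parameter values and the exponential underlying distribution are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def random_size_maxima(n_trials, r, p, sample):
    """Maxima of N draws from `sample`, where N ~ NegativeBinomial(r, p) + 1."""
    maxima = np.empty(n_trials)
    for i in range(n_trials):
        n = rng.negative_binomial(r, p) + 1    # at least one observation
        maxima[i] = sample(n).max()
    return maxima

# Exponential underlying distribution, one of the cases considered in the thesis.
maxima = random_size_maxima(10_000, r=5, p=0.1,
                            sample=lambda n: rng.exponential(1.0, n))
print("mean of maxima:", round(maxima.mean(), 3),
      "| 99th percentile:", round(np.quantile(maxima, 0.99), 3))
```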
136

極值理論與整合風險衡量 / Extreme Value Theory and Integrated Risk Measurement

黃御綸 Unknown Date (has links)
Since the 1990s, the mishandling of financial instruments and the shocks of financial crises have repeatedly destabilized global financial markets, making risk management ever more important and the accuracy of quantitative risk models an increasing concern. Motivated by properties of financial data such as heteroskedasticity and heavy tails, this thesis combines three approaches, namely the AR(1)-GARCH(1,1) model, extreme value theory, and copula functions, to estimate Value at Risk. The assumptions on the return distribution fall into three categories: a non-parametric historical simulation method; a parametric model that accounts for stochastic volatility under a normality assumption; and an extreme value theory method that fits the tail of the distribution to historical data. One-day VaR, ten-day VaR, and portfolio VaR are tested on four individual stocks (UMC, Hon Hai, Cathay Financial Holdings, and China Steel) and three exchange rates (TWD/USD, JPY/USD, and GBP/USD). The empirical results show that, for one-day VaR, the dynamic VaR methods perform relatively well at the 95% confidence level, while at the 99% level both the dynamic EVT method and the dynamic historical simulation give good estimates. Ten-day VaR is harder to estimate because asset returns over the next ten days may be affected by specific events; overall, at the 99% confidence level the conditional GPD combined with Monte Carlo simulation performs relatively best. For portfolio VaR, simulating the joint distribution of the stock or foreign exchange portfolios with copula functions (a Clayton copula with GPD marginals) yields the best VaR estimates at both the 95% and 99% confidence levels. Although Taiwanese stock prices are subject to a 7% daily price limit and the TWD/USD exchange rate is influenced by central bank intervention, describing the tail of the asset distribution with extreme value theory still provides better estimates than the other two distributional assumptions.
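A conditional (filtered) one-day VaR calculation in the spirit of the AR(1)-GARCH(1,1) plus EVT combination described above might look like the sketch below. The return series is simulated, the threshold and confidence levels are assumptions, and the `arch` package is used only as one possible implementation of the GARCH filter.

```python
import numpy as np
import pandas as pd
from arch import arch_model
from scipy import stats

rng = np.random.default_rng(4)
# Simulated daily returns in percent, heavy-tailed for illustration.
returns = pd.Series(stats.t.rvs(df=5, scale=1.0, size=2000, random_state=rng))

# Step 1: filter the returns with an AR(1)-GARCH(1,1) model.
res = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1).fit(disp="off")
z = np.asarray(res.resid)[1:] / np.asarray(res.conditional_volatility)[1:]

# Step 2: fit a GPD to the lower tail of the standardized residuals.
losses = -z
u = np.quantile(losses, 0.95)
exc = losses[losses > u] - u
xi, _, beta = stats.genpareto.fit(exc, floc=0.0)
q = 0.99
z_q = u + (beta / xi) * (((len(losses) / len(exc)) * (1 - q)) ** (-xi) - 1)

# Step 3: one-day-ahead VaR = -(conditional mean - conditional sigma * loss quantile).
fc = res.forecast(horizon=1)
mu_next = fc.mean.values[-1, 0]
sigma_next = np.sqrt(fc.variance.values[-1, 0])
var_99 = sigma_next * z_q - mu_next
print(f"one-day 99% VaR (in %): {var_99:.2f}")
```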
137

The Relationship Between Learning Persistence and Equipment Design Through the Lens of Expectancy-Value Theory

January 2016 (has links)
abstract: Learners' attitudes and beliefs during the initial stages of learning have a profound impact on their future decisions, practice habits, and persistence. In music education, however, surprisingly little research has explored how physical equipment design might influence novices' attitudes and beliefs. The current study addresses this gap by examining how novices' motivation and perception differ based on the physical design of the musical instrument they interact with while learning. Fifty-two adult participants completed an online survey measuring their expectancies (e.g., confidence), value beliefs (e.g., enjoyment, interest, and social merit), and anticipated persistence while attempting to learn the electric guitar. Afterward, participants attempted to learn and perform several beginner-level tasks while using a conventionally designed or an ergonomically designed guitar. The conventionally designed guitar was a commercially available model marketed toward beginner and intermediate-level guitarists. In contrast, the ergonomic guitar was a custom model based on expert design recommendations to improve ease of use, comfort, and user experience. Participant learning expectations and values were assessed before and after a one-hour practice session. Results revealed that novices who used the ergonomic guitar reported significant gains in anticipated learning enjoyment, whereas novices who used the conventional guitar exhibited no such change. Beyond this relationship, however, the ergonomic guitar was not found to meaningfully affect participants' confidence, interest, physical discomfort, or perceptions of task difficulty. Additionally, the ergonomic guitar did not have a statistically significant influence on learning persistence ratings. One important implication of this study is that a single practice session may not provide enough time or experience to affect a novice's attitudes and beliefs toward learning. Future studies may address this limitation by using a longitudinal design or longer practice task trials. Despite this limitation, however, this exploratory study highlights the need for researchers, music educators, and instrument manufacturers to carefully consider how the physical design of a musical instrument may impact learning attitudes, choices, and persistence over time. Additionally, this study offers the first attempt at extending the equipment design literature to music education and Expectancy-Value Theory. / Dissertation/Thesis / Masters Thesis Applied Psychology 2016
138

Enhanced Power System Operational Performance with Anticipatory Control under Increased Penetration of Wind Energy

January 2016 (has links)
abstract: As the world embraces a sustainable energy future, alternative energy resources, such as wind power, are increasingly being seen as an integral part of the future electric energy grid. Ultimately, integrating such a dynamic and variable mix of generation requires a better understanding of renewable generation output, in addition to power grid systems that improve power system operational performance in the presence of anticipated events such as wind power ramps. Because of the stochastic, uncontrollable nature of renewable resources, a thorough and accurate characterization of wind activity is necessary to maintain grid stability and reliability. Wind power ramps from an existing wind farm are studied to characterize persistence forecasting errors using extreme value analysis techniques. In addition, a novel metric that quantifies the amount of non-stationarity in time series wind power data was proposed and used in a real-time algorithm to provide a rigorous method that adaptively determines training data for forecasts. Lastly, large swings in generation or load can cause system frequency and tie-line flows to deviate from nominal, so an anticipatory MPC-based secondary control scheme was designed and integrated into an automatic generation control loop to improve the ability of an interconnection to respond to anticipated large events and fluctuations in the power system. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2016
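As a hedged illustration of characterizing persistence-forecast errors with extreme value analysis (one ingredient of the work described above), the sketch below fits a GEV distribution to daily block maxima of synthetic forecast errors; the data, block length, and quantile level are all assumptions rather than the dissertation's actual data or choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical hourly wind farm output (MW) with occasional large swings.
power = np.clip(np.cumsum(rng.normal(0.0, 5.0, 5000)) % 200.0, 0.0, 160.0)

# Persistence forecast: next hour equals current hour; error = actual - forecast.
errors = np.diff(power)

# Daily (24 h) block maxima of the absolute forecast error, fitted with a GEV.
n_full = len(errors) // 24 * 24
block_max = np.abs(errors[:n_full]).reshape(-1, 24).max(axis=1)
shape, loc, scale = stats.genextreme.fit(block_max)

ramp_99 = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(f"GEV fit: shape={shape:.3f}, loc={loc:.2f}, scale={scale:.2f}; "
      f"99th-percentile daily ramp error ~ {ramp_99:.1f} MW")
```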
139

Inferência Bayesiana para valores extremos / Bayesian inference for extremes

Bernardini, Diego Fernando de, 1986- 15 August 2018 (has links)
Orientador: Laura Leticia Ramos Rifo / Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica / Previous issue date: 2010 / Abstract: We begin this work by presenting a brief introduction to extreme value theory, focusing on the behavior of the random variable that represents the maximum of a sequence of independent and identically distributed random variables. The Extremal Types Theorem (or Fisher-Tippett Theorem) is the fundamental tool for studying the asymptotic behavior of these maxima: it allows data consisting of a sequence of observed maxima of a given phenomenon or random process to be modeled with the class of distributions known as the Generalized Extreme Value (GEV) family. The Gumbel distribution, associated with the maximum of distributions such as the Normal or Gamma, among others, is a particular case of this family. We are interested in making inference about the parameters of this family. Specifically, the comparison between the Gumbel and GEV models is the main focus of this work. In Chapter 1 we study, in the context of classical inference, maximum likelihood estimation for these parameters and a likelihood ratio test procedure suitable for testing the null hypothesis associated with the Gumbel model against the hypothesis representing the complete GEV model. We proceed, in Chapter 2, with a brief review of Bayesian inference theory. We also study the predictive distribution for future values. With respect to model comparison, we first study the Bayes factor and the posterior Bayes factor in the Bayesian context. Next we study the Full Bayesian Significance Test (FBST), a significance test particularly suitable for testing precise hypotheses, such as the hypothesis characterizing the Gumbel model. Furthermore, we study two other criteria for comparing models, the BIC (Bayesian Information Criterion) and the DIC (Deviance Information Criterion). We examine the evidence measures specifically in the context of the comparison between the Gumbel and GEV models, as well as the predictive distribution, credible intervals, and posterior inference for the return levels associated with fixed return periods. Chapter 1 and part of Chapter 2 provide the basic theoretical foundations of this work, and are strongly based on Coles (2001) and O'Hagan (1994). In Chapter 3 we present the well-known Metropolis-Hastings algorithm for simulating probability distributions, and the particular algorithm used in this work to obtain simulated samples from the posterior distribution of the parameters of interest. In the next chapter we formulate the modeling of the observed maxima, presenting the likelihood function and establishing the prior distribution for the parameters. Two applications are presented in Chapter 5. The first deals with observations of the quarterly maxima of unemployment rates in the United States of America, between the first quarter of 1994 and the first quarter of 2009. In the second application we study the semiannual maxima of sea levels at Newlyn, in southwest England, between 1990 and 2007. Finally, a brief discussion is presented in Chapter 6. / Mestrado / Estatística / Mestre em Estatística
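A minimal random-walk Metropolis sampler for a GEV posterior, of the general kind presented in Chapter 3, could look like the following sketch. The synthetic annual maxima, the weakly informative priors, and the proposal scales are assumptions, not the choices made in the dissertation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Synthetic "annual maxima"; note scipy's genextreme uses shape c = -xi.
maxima = stats.genextreme.rvs(-0.1, loc=10.0, scale=2.0, size=60, random_state=rng)

def log_post(mu, log_sigma, xi):
    """Log posterior: GEV likelihood plus weakly informative normal priors (assumed)."""
    sigma = np.exp(log_sigma)
    ll = np.sum(stats.genextreme.logpdf(maxima, -xi, loc=mu, scale=sigma))
    lp = (stats.norm.logpdf(mu, 10, 10)
          + stats.norm.logpdf(log_sigma, 0, 2)
          + stats.norm.logpdf(xi, 0, 0.5))
    return ll + lp

theta = np.array([maxima.mean(), np.log(maxima.std()), 0.1])
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0.0, [0.3, 0.1, 0.05])   # random-walk proposal
    if np.log(rng.uniform()) < log_post(*prop) - log_post(*theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples[5_000:])                    # discard burn-in

print("posterior means (mu, sigma, xi):",
      samples[:, 0].mean(), np.exp(samples[:, 1]).mean(), samples[:, 2].mean())
```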
140

The best are never normal: exploring the distribution of firm performance

Buchbinder, Felipe 08 June 2011 (has links)
Competitive strategy literature predicts three different mechanisms of performance generation, distinguishing between firms that have a competitive advantage, firms that have a competitive disadvantage, and firms that have neither. Nonetheless, previous works in the field have fitted a single normal distribution to model firm performance. Here, we develop a new approach that distinguishes among performance-generating mechanisms and allows the identification of firms with competitive advantage or disadvantage. Theorizing on the positive feedback loops by which firms with a competitive advantage have facilitated access to new resources, we propose a distribution that we believe data on firm performance should follow. We illustrate our model by assessing its fit to data on firm performance, addressing its theoretical implications and comparing it to previous works.
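One way to make the contrast with a single normal distribution concrete is to compare it against a finite mixture with one component per performance-generating mechanism. This is purely an illustrative sketch on synthetic data; the dissertation does not necessarily propose this exact model.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
# Synthetic "firm performance" (e.g., ROA) from three hypothetical groups.
perf = np.concatenate([
    rng.normal(-0.05, 0.03, 300),    # competitive disadvantage
    rng.normal(0.05, 0.02, 1200),    # competitive parity
    rng.normal(0.18, 0.04, 150),     # competitive advantage
]).reshape(-1, 1)

single = GaussianMixture(n_components=1, random_state=0).fit(perf)
mixture = GaussianMixture(n_components=3, random_state=0).fit(perf)

# A lower BIC indicates a better trade-off between fit and complexity.
print("BIC, single normal      :", round(single.bic(perf), 1))
print("BIC, 3-component mixture:", round(mixture.bic(perf), 1))
```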
