
MONITORING AUTOCORRELATED PROCESSES

Tang, Weiping (2011)
Submitted by Weiping Tang on August 2, 2011.

Several control schemes for monitoring process mean shifts, including cumulative sum (CUSUM), weighted cumulative sum (WCUSUM), adaptive cumulative sum (ACUSUM) and exponentially weighted moving average (EWMA) control schemes, display high performance in detecting constant process mean shifts. However, a variety of dynamic mean shifts frequently occur, and few control schemes work efficiently in these situations because of the limited window for catching shifts, particularly when the mean decreases rapidly. This is precisely the case when one uses the residuals from autocorrelated data to monitor the process mean, a feature often referred to as forecast recovery. This thesis focuses on detecting a shift in the mean of a time series when a forecast-recovery dynamic pattern in the mean of the residuals is observed. Specifically, we examine in detail several particular cases of the Autoregressive Integrated Moving Average (ARIMA) time series models. We introduce a new upper-sided control chart based on the Exponentially Weighted Moving Average (EWMA) scheme combined with the Fast Initial Response (FIR) feature. To assess chart performance we use the well-established Average Run Length (ARL) criterion. A non-homogeneous Markov chain method is developed for ARL calculation for the proposed chart. We show numerically that the proposed procedure performs as well as or better than the Weighted Cumulative Sum (WCUSUM) chart introduced by Shu, Jiang and Tsui (2008), and better than the conventional CUSUM, the ACUSUM and the Generalized Likelihood Ratio Test (GLRT) charts. The methods are illustrated on molecular weight data from a polymer manufacturing process.

Master of Science (MSc)
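The chart proposed in the abstract can be sketched in outline. The following is a minimal illustration of an upper-sided EWMA statistic with a Fast Initial Response (FIR) headstart; the parameter values, the reset-at-zero form of the one-sided statistic, and the steady-state control limit are illustrative assumptions, not the thesis's exact design, and the non-homogeneous Markov chain ARL calculation is not reproduced.

```python
import numpy as np

def ewma_fir_chart(x, lam=0.1, L=2.7, headstart=0.5, sigma=1.0):
    """Upper-sided EWMA chart with a Fast Initial Response (FIR) headstart.

    lam, L, headstart and sigma are illustrative choices, not the thesis's
    tuned values; the statistic is reset at zero to keep it one-sided, and
    the limit used is the steady-state EWMA control limit.
    """
    ucl = L * sigma * np.sqrt(lam / (2.0 - lam))   # steady-state upper limit
    z = headstart * ucl                            # FIR: start part-way to the limit
    signals = []
    for xt in x:
        z = max(0.0, (1.0 - lam) * z + lam * xt)   # one-sided EWMA recursion
        signals.append(z > ucl)
    return ucl, signals
```

With a sustained upward shift in the monitored residuals, the headstart lets the statistic reach the limit within the first few observations, which is the point of the FIR feature.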

Building the Enhanced Index Fund

王世方 (date unknown)
This study analyzed the component stocks of the MSCI Taiwan Index over 2008 to 2010, a three-year sample that spans one full bull-bear cycle. Technical indicators covering both price and volume were used to judge whether the market was bullish or bearish: for price, the trend indicator MA (moving average) and the oscillators KD (stochastic) and MACD (moving average convergence divergence); for volume, OBV (on-balance volume). Parameter values for each indicator were chosen by optimization. Risk was controlled by limiting how far each stock could deviate from the index: the larger the permitted deviation, the better the model could distinguish strong stocks from weak ones. Risk was measured by annualized tracking error, with the constraint that the maximum cumulative annualized tracking error not exceed 6%. The empirical results show that the larger the model's annualized tracking-error setting, the larger the individual-stock deviations and the better the portfolio's return, but also the larger the risk, i.e. the realized tracking error. Setting the model's annualized tracking error to 24% and combining the MA, MACD and OBV indicators produced the best performance while still keeping the risk under the 6% limit.
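The indicators and the risk measure named in the abstract are standard constructions; a minimal sketch of how they might be computed follows. The window lengths and smoothing parameters are common defaults, not the study's optimized values, and 252 trading periods per year is an assumption.

```python
import numpy as np

def sma(prices, n):
    """Simple moving average over a trailing window of n observations."""
    p = np.asarray(prices, float)
    return np.convolve(p, np.ones(n) / n, mode="valid")

def macd(prices, fast=12, slow=26):
    """MACD line: difference of a fast and a slow EMA (conventional defaults)."""
    def ema(p, n):
        alpha, out = 2.0 / (n + 1), [p[0]]
        for x in p[1:]:
            out.append(alpha * x + (1 - alpha) * out[-1])
        return np.array(out)
    p = np.asarray(prices, float)
    return ema(p, fast) - ema(p, slow)

def obv(prices, volumes):
    """On Balance Volume: cumulative volume signed by the price change."""
    p, v = np.asarray(prices, float), np.asarray(volumes, float)
    sign = np.sign(np.diff(p))
    return np.concatenate([[0.0], np.cumsum(sign * v[1:])])

def annualized_tracking_error(port_ret, bench_ret, periods=252):
    """Annualized tracking error: std of active returns scaled by sqrt(periods)."""
    active = np.asarray(port_ret, float) - np.asarray(bench_ret, float)
    return active.std(ddof=1) * np.sqrt(periods)
```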

Natural gas storage level forecasting using temperature data

Sundin, Daniel (January 2020)
Even though the theory of storage is historically a popular view for explaining commodity futures prices, many authors focus on the oil price link. Past studies have shown increased futures price volatility on Mondays and on days when natural gas storage levels are released, both of which suggest that storage levels and temperature data are incorporated into prices. In this thesis, the change in the U.S. natural gas storage level is studied as a function of consumption and production. Consumption and production are further segmented and separately forecasted by modelling inverse problems that are solved by least-squares regression using temperature data and time-series analysis. The results indicate that each consumer consumption segment is highly dependent on temperature, with R2 values above 90%. However, modelling each segment entirely by time-series analysis proved to be more efficient, owing to the lack of flexibility in the polynomials, the limited number of weather stations used, and seasonal patterns beyond temperature. Although the forecasting models could not beat analysts' consensus estimates, they capture the drivers of natural gas storage levels and can thus be used to incorporate temperature forecasts when estimating futures prices.
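The segment-level regressions described above are least-squares fits of consumption on temperature-derived predictors. A toy sketch, using heating degree days (HDD) as a hypothetical predictor; the thesis's actual predictors, segmentation and weather-station weighting are not reproduced.

```python
import numpy as np

def fit_consumption_vs_hdd(hdd, consumption):
    """OLS fit of consumption on heating degree days; returns (beta, R^2).

    A stand-in for the per-segment inverse problems in the thesis:
    beta[0] is the intercept, beta[1] the HDD sensitivity.
    """
    X = np.column_stack([np.ones_like(hdd), hdd])      # intercept + HDD
    beta, *_ = np.linalg.lstsq(X, consumption, rcond=None)
    fitted = X @ beta
    ss_res = np.sum((consumption - fitted) ** 2)
    ss_tot = np.sum((consumption - consumption.mean()) ** 2)
    return beta, 1.0 - ss_res / ss_tot

# Synthetic, noise-free example: consumption = 100 + 3 * HDD
hdd = np.array([0.0, 5.0, 10.0, 20.0, 30.0])
cons = 100.0 + 3.0 * hdd
beta, r2 = fit_consumption_vs_hdd(hdd, cons)
```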

Development of a computational tool to evaluate hospital performance through inpatient quality indicators

Souza, Júlio César Botelho de (25 February 2015)
Inpatient quality indicators are measures that provide relevant information on the quality of care delivered by hospitals and healthcare services, and can signal problems or successful practices associated with that care. This project aimed to create an instrument to assess the quality of hospital care by developing a web application for monitoring and analysing a subset of the inpatient quality indicators (IQIs) published by the Agency for Healthcare Research and Quality (AHRQ). Based on a literature review and on the process and outcome components of the Donabedian model, twenty-two AHRQ inpatient quality indicators were selected; they evaluate mortality associated with certain conditions and surgical procedures, as well as the quantity and quality of procedures performed in health institutions. The software comprises two modules: one calculates the indicators from admission data extracted from a relational database; the other supports the study and analysis of the indicators' time series, allowing their evolution to be tracked over time. The indicators were calculated with administrative data from the database of the Observatory for Hospital Care (ORAH, from the acronym in Portuguese "Observatório Regional de Atenção Hospitalar"), which processes admission data from forty public and private hospitals across the twenty-six municipalities of the Ribeirão Preto region, in the Brazilian state of São Paulo, that compose the Regional Department of Health XIII (DRS-XIII). The application's services were made available to health-service administrators and academics through the ORAH web portal. Its results were also used to analyze hospital care in the Ribeirão Preto region by comparing the historical values of the indicators across the three health microregions that compose DRS-XIII: Aquífero Guarani, Vale das Cachoeiras and Horizonte Verde. This analysis was also essential to verify whether the application provides information relevant to hospital management. From the results, we conclude that the tool gives a general panorama of hospital care in the Ribeirão Preto region, and that the AHRQ inpatient quality indicators fulfilled their role as sentinel measures, identifying certain aspects of that reality. However, the analysis also pointed to the need for new variables describing patients' actual clinical condition and the structural conditions of the different institutions, since the selected indicators by themselves do not give health managers a final assessment of hospital quality.
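As a rough illustration of the kind of measure such a tool computes per period, a mortality-type indicator might be derived as deaths per 100 admissions. The function and its inputs are hypothetical; the real AHRQ IQI definitions involve specific diagnosis and procedure inclusion criteria not reproduced here.

```python
import numpy as np

def indicator_series(admissions, deaths):
    """Crude mortality-type indicator per period: deaths per 100 admissions.

    Periods with no admissions yield NaN rather than a division error,
    so gaps in the series are explicit when plotted over time.
    """
    a = np.asarray(admissions, float)
    d = np.asarray(deaths, float)
    out = np.full(a.shape, np.nan)
    np.divide(100.0 * d, a, out=out, where=a > 0)
    return out
```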

A Study On The Predictive Optimal Active Control Of Civil Engineering Structures

Keyhani, Ali
Uncertainty involved in the safe and comfortable design of structures is a major concern of civil engineers. Traditionally, this uncertainty has been overcome by applying various, relatively large safety factors to loads and structural properties. As a result, in the conventional design of, for example, tall buildings, the designed structural elements have unnecessary dimensions that are sometimes more than double those needed to resist normal loads. On the other hand, the requirements of strength, safety and comfort can conflict. Consequently, an alternative approach to the design of structures may be of great interest for designing safe and comfortable structures, one that also offers economic advantages. Recently, there has been growing interest among researchers in the concept of structural control as an alternative or complementary approach to existing approaches to structural design. A few buildings have been designed and built based on this concept. The concept is to utilize a device for applying a force (known as the control force) to counter the effects of disturbing forces such as earthquake forces. However, the concept has still not found its rightful place among practising engineers, and more research is needed on the subject. One of the main problems in structural control is to find a proper algorithm for determining the optimum control force that should be applied to the structure. The investigation reported in this thesis is concerned with the application of active control to civil engineering structures. From the literature on control theory (particularly literature on the control of civil engineering structures), problems faced in applying control theory were identified and classified into two categories: 1) problems common to the control of all dynamical systems, and 2) problems which are especially important in the control of civil engineering structures.
It was concluded that while many control algorithms are suitable for the control of dynamical systems, considering the special problems in controlling civil structures and the unique features of structural control, many otherwise useful control algorithms face practical problems in application to civil structures. Consequently, a set of criteria was established for judging the suitability of control algorithms for use in the control of civil engineering structures. Various types of existing control algorithms were investigated, and it was finally concluded that predictive optimal control algorithms possess good characteristics for the purpose of controlling civil engineering structures. Among predictive control algorithms, those that use ARMA stochastic models for predicting the ground acceleration are better fitted to the structural control environment, because all the past measured excitation is used to estimate the trends of the excitation and make qualified guesses about its coming values. However, existing ARMA-based predictive algorithms are devised specifically for earthquakes and require on-line measurement of the external disturbing load, which is not possible for dynamic loads like wind or blast. So the algorithms are not suitable for tall buildings that experience both earthquake and wind loads during their life. Consequently, it was decided to establish a new closed-loop predictive optimal control based on ARMA models as the first phase of the study. In this phase it was initially established that ARMA models are capable of predicting the response of a linear SDOF system to earthquake excitation a few steps ahead. The results of the predictions encouraged the search for a new closed-loop optimal predictive control algorithm for linear SDOF structures based on prediction of the response by ARMA models.
The second part of phase I was devoted to developing and testing the proposed algorithm. The newly developed algorithm differs from other ARMA-based optimal controls in that it uses ARMA models to predict the structure's response, while existing algorithms predict the input excitation. Modelling the structure's response as an AR or ARMA stochastic process is an effective means of predicting the response while avoiding measurement of the input excitation. The ARMA models used in the algorithm enable it to avoid or reduce the time-delay effect by predicting the structure's response a few steps ahead. Being a closed-loop control, the algorithm is suitable for all structural control conditions and can be used in a single control mechanism for vibration control of tall buildings against wind, earthquake or other random dynamic loads. Consequently, the standby time is less than that of existing ARMA-based algorithms devised only for earthquakes. This makes the control mechanism more reliable. The proposed algorithm utilizes and combines two different mathematical models. The first is an ARMA model representing the environment and the structure as a single system subjected to the unknown random excitation; the second is a linear SDOF system representing the structure subjected only to the known past history of the applied control force. The principle of superposition is then used to combine the results of these two models and predict the total response of the structure as a function of the control force. Using the predicted responses, the performance index is minimized with respect to the control force to find the optimal control force. In phase II, the proposed predictive control algorithm was extended to structures more complicated than linear SDOF structures. Initially, the algorithm was extended to linear MDOF structures.
Although the development of the algorithm for MDOF structures was relatively straightforward, during testing it was found that prediction of the response by ARMA models could not be done as in the SDOF case. In the SDOF case, each of the two components of the state vector (i.e. displacement and velocity) was treated separately as an ARMA stochastic process. However, applying the same approach to each component of the state vector of an MDOF structure did not yield satisfactory results in predicting the response. Considering the whole state vector as a multi-variable ARMA stochastic vector process yielded the desired results in predicting the response a few steps ahead. In the second part of this phase, the algorithm was extended to non-linear MDOF structures. Since the algorithm had been developed on the principle of superposition, it was not possible to extend it directly to non-linear systems. Instead, a generalized response was defined, the credibility of ARMA models in predicting the generalized response was verified, and on this basis the algorithm was extended to non-linear MDOF structures. Also in phase II, the stability of a controlled MDOF structure was proved; both the internal and external stability of the system were described and verified. In phase III, some problems of special interest, i.e. soil-structure interaction and control time delay, were investigated and compensated for in the framework of the developed predictive optimal control. In the first part of phase III, soil-structure interaction (SSI) was studied. The half-space solution of the SSI effect leads to a frequency-dependent representation of the structure-footing system, which is not fit for control purposes. Consequently, an equivalent frequency-independent system was proposed, defined as a system whose frequency response equals that of the original structure-footing system in the mean-squares sense.
This equivalent frequency-independent system was then used in the control algorithm. In the second part of this phase, an analytical approach was used to tackle the time-delay phenomenon in the context of the predictive algorithm described in previous chapters. A generalized performance index was defined considering time delay, and its minimization resulted in a modified version of the algorithm in which time delay is compensated for explicitly. Unlike the time-delay compensation technique used in the previous phases of this investigation, which restricts time delay to an integer multiple of the sampling period, the modified algorithm allows time delay to be any non-negative number; the two approaches produce the same results if time delay is an integer multiple of the sampling period. For evaluating the proposed algorithm and comparing it with other algorithms, several numerical simulations were carried out during the research using MATLAB and its toolboxes. A few interesting results of these simulations are enumerated below. ARMA models are able to predict the response of both linear and non-linear structures to random inputs such as earthquakes. The proposed predictive optimal control based on ARMA models produced better results, in terms of reducing velocity, displacement, total energy and operational cost, than classic optimal control. The proposed active control algorithm is very effective in increasing safety and comfort, and its performance is not much affected by errors in the estimation of system parameters (e.g. damping). The effect of soil-structure interaction on the response to the control force is considerable: ignoring SSI will cause a significant change in the magnitude of the frequency response and a shift in the frequencies of maximum response (resonant frequencies).
Compensating for the time-delay effect with the modified version of the proposed algorithm improves the performance of the control system in achieving the control goal and reducing the structural response.
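The core prediction step in the abstract, fitting a stochastic model to the measured response and iterating it a few steps ahead, can be sketched with a plain AR(p) model fitted by least squares. This is an assumption standing in for the thesis's full ARMA machinery, and the superposition with the control-force model is not reproduced.

```python
import numpy as np

def fit_ar(x, p):
    """Fit an AR(p) model x[t] = a1*x[t-1] + ... + ap*x[t-p] by least squares."""
    x = np.asarray(x, float)
    # column i holds x[t-1-i] for rows t = p .. len(x)-1
    X = np.column_stack([x[p - i - 1 : len(x) - i - 1] for i in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

def predict_ahead(x, a, steps):
    """Iterate the fitted AR recursion to predict `steps` values ahead."""
    hist = list(np.asarray(x, float))
    for _ in range(steps):
        # most recent p values, newest first, matching a[0]..a[p-1]
        hist.append(float(np.dot(a, hist[-1 : -len(a) - 1 : -1])))
    return hist[-steps:]
```

The multi-step forecast is what lets a predictive controller act before the time-delayed control force takes effect.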

A Study of Applying Public Opinion Mining to Predict the Housing Market Near the Taipei MRT Stations

Wu, Chia Yun (吳佳芸). Date unknown.
With the convenience and immediacy of the Internet, online news has become an important channel through which the public receives and transmits information, and the accumulated volume of news reflects the public's immediate reactions to particular topics, their popularity, and their sentiment trends. This study applies opinion mining and sentiment analysis to domain-specific news to extract valuable associations, and combines them with machine learning to build a prediction model for the housing market that can support home-buying decisions. We collected 11,150 real-estate news articles from January 1, 2010 to June 30, 2014, together with 8,165 transaction records for housing within 250 meters of Taipei MRT stations. Opinion mining was used to extract sentiment words, and time series were built for housing-market sentiment and for transaction price and volume. Using six-month moving averages, second-order moving averages and their slopes, we gauge whether public opinion on the housing market is optimistic or pessimistic, analyze the association between public sentiment and actual transactions, and look for entry points into the market. Combining sentiment with environmental factors, we then use a support vector machine (SVM) to build a prediction model for station-area housing markets. The empirical results show a clear cyclical correlation between housing-market sentiment and transaction price and volume, and that the year before a new MRT line opens affects the housing market along the whole line. A point where the transaction line crosses above the sentiment line while both slopes are rising serves as a suitable entry point. The SVM model built on station sentiment and environmental variables predicts stations on new MRT lines with an average accuracy of 69.2%, and popular (hotspot) stations with 78% accuracy, indicating good predictive ability for hotspots.
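The entry rule described, the transaction moving average crossing above the sentiment moving average while both slopes are rising, can be sketched as follows. The inputs are assumed to be already-smoothed series, and the rule is a simplification of the study's.

```python
import numpy as np

def entry_points(price_ma, sentiment_ma):
    """Flag entry points: the transaction (price) moving-average series
    crosses above the sentiment moving-average series while both series
    are rising at the crossing. Returns one boolean per step after the first."""
    p = np.asarray(price_ma, float)
    s = np.asarray(sentiment_ma, float)
    entries = []
    for t in range(1, min(len(p), len(s))):
        crossed = p[t - 1] <= s[t - 1] and p[t] > s[t]
        rising = p[t] > p[t - 1] and s[t] > s[t - 1]
        entries.append(bool(crossed and rising))
    return entries
```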

Forecasting Mid-Term Electricity Market Clearing Price Using Support Vector Machines

May 2014
In a deregulated electricity market, offering the appropriate amount of electricity at the right time with the right bidding price is of paramount importance. Forecasting the electricity market clearing price (MCP) is a prediction of the future electricity price based on given forecasts of electricity demand, temperature, sunshine, fuel cost, precipitation and other related factors. Currently, many techniques are available for short-term electricity MCP forecasting, but very little has been done in the area of mid-term electricity MCP forecasting, which focuses on a time frame from one month to six months. Developing mid-term electricity MCP forecasting is essential for mid-term planning and decision making, such as generation plant expansion and maintenance schedules, reallocation of resources, bilateral contracts and hedging strategies. Six mid-term electricity MCP forecasting models are proposed and compared in this thesis: 1) a single support vector machine (SVM) forecasting model, 2) a single least-squares support vector machine (LSSVM) forecasting model, 3) a hybrid SVM and auto-regressive moving average with exogenous input (ARMAX) forecasting model, 4) a hybrid LSSVM and ARMAX forecasting model, 5) a multiple SVM forecasting model and 6) a multiple LSSVM forecasting model. PJM Interconnection data are used to test the proposed models. Cross-validation was used to optimize the control parameters and the selection of training data for the six proposed models. Three evaluation metrics, mean absolute error (MAE), mean absolute percentage error (MAPE) and mean square root error (MSRE), are used to analyze forecasting accuracy. According to the experimental results, the multiple SVM forecasting model worked best among all six proposed models.
The proposed multiple-SVM-based mid-term electricity MCP forecasting model contains a data classification module and a price forecasting module. The data classification module first pre-processes the input data into corresponding price zones, and the forecasting module then forecasts the electricity price using four SVMs designed in parallel. Compared with the other five forecasting models proposed in this thesis, this model achieves the best forecasting accuracy for both peak prices and the overall system.
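The three evaluation metrics named in the abstract are straightforward to compute; a minimal sketch follows. The abstract does not give the thesis's exact definition of MSRE, so it is interpreted here as the root of the mean squared error, which is an assumption.

```python
import numpy as np

def mae(actual, pred):
    """Mean absolute error."""
    a, p = np.asarray(actual, float), np.asarray(pred, float)
    return float(np.mean(np.abs(a - p)))

def mape(actual, pred):
    """Mean absolute percentage error, in percent (undefined for zero actuals)."""
    a, p = np.asarray(actual, float), np.asarray(pred, float)
    return float(np.mean(np.abs((a - p) / a)) * 100.0)

def msre(actual, pred):
    """Interpreted here as the root of the mean squared error (assumption)."""
    a, p = np.asarray(actual, float), np.asarray(pred, float)
    return float(np.sqrt(np.mean((a - p) ** 2)))
```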

Applications of Spatio-temporal Analytical Methods in Surveillance of Ross River Virus Disease

Hu, Wenbiao (January 2005)
The incidence of many arboviral diseases is largely associated with social and environmental conditions. Ross River virus (RRV) is the most prevalent arboviral disease in Australia. It has long been recognised that the transmission pattern of RRV is sensitive to socio-ecological factors including climate variation, population movement, mosquito density and vegetation types. This study aimed to assess the relationships between socio-environmental variability and the transmission of RRV using spatio-temporal analytic methods. Computerised data files of daily RRV disease cases and daily climatic variables in Brisbane, Queensland during 1985-2001 were obtained from the Queensland Department of Health and the Australian Bureau of Meteorology, respectively. Available information on other socio-ecological factors was also collected from relevant government agencies as follows: 1) socio-demographic data from the Australian Bureau of Statistics; 2) information on vegetation (littoral wetlands, ephemeral wetlands, open freshwater, riparian vegetation, melaleuca open forests, wet eucalypt, open forests and other bushland) from Brisbane City Council; 3) tidal activities from the Queensland Department of Transport; and 4) mosquito density from Brisbane City Council. Principal components analysis (PCA) was used as an exploratory technique for discovering spatial and temporal patterns of RRV distribution. The PCA results show that the first principal component accounted for approximately 57% of the information; it contained the four seasonal rates and loaded highest, and positively, for autumn. K-means cluster analysis indicates that the seasonality of RRV is characterised by three groups with high, medium and low incidence of disease, suggesting that there are at least three different disease ecologies.
The variation in spatio-temporal patterns of RRV indicates a complex ecology that is unlikely to be explained by a single dominant transmission route across these three groupings. Therefore, there is need to explore socio-economic and environmental determinants of RRV disease at the statistical local area (SLA) level. Spatial distribution analysis and multiple negative binomial regression models were employed to identify the socio-economic and environmental determinants of RRV disease at both the city and local (ie, SLA) levels. The results show that RRV activity was primarily concentrated in the northeast, northwest and southeast areas in Brisbane. The negative binomial regression models reveal that RRV incidence for the whole of the Brisbane area was significantly associated with Southern Oscillation Index (SOI) at a lag of 3 months (Relative Risk (RR): 1.12; 95% confidence interval (CI): 1.06 - 1.17), the proportion of people with lower levels of education (RR: 1.02; 95% CI: 1.01 - 1.03), the proportion of labour workers (RR: 0.97; 95% CI: 0.95 - 1.00) and vegetation density (RR: 1.02; 95% CI: 1.00 - 1.04). However, RRV incidence for high risk areas (ie, SLAs with higher incidence of RRV) was significantly associated with mosquito density (RR: 1.01; 95% CI: 1.00 - 1.01), SOI at a lag of 3 months (RR: 1.48; 95% CI: 1.23 - 1.78), human population density (RR: 3.77; 95% CI: 1.35 - 10.51), the proportion of indigenous population (RR: 0.56; 95% CI: 0.37 - 0.87) and the proportion of overseas visitors (RR: 0.57; 95% CI: 0.35 - 0.92). It is acknowledged that some of these risk factors, while statistically significant, are small in magnitude. However, given the high incidence of RRV, they may still be important in practice. The results of this study suggest that the spatial pattern of RRV disease in Brisbane is determined by a combination of ecological, socio-economic and environmental factors. 
The possibility of developing an epidemic forecasting system for RRV disease was explored using the multivariate Seasonal Auto-regressive Integrated Moving Average (SARIMA) technique. The results of this study suggest that climatic variability, particularly precipitation, may have played a significant role in the transmission of RRV disease in Brisbane. This finding cannot entirely be explained by confounding factors such as other socio-ecological conditions, because these are unlikely to have changed dramatically on a monthly time scale in this city over the past two decades. SARIMA models show that monthly precipitation at a lag of 2 months (β = 0.004, p = 0.031) was statistically significantly associated with RRV disease, suggesting that there may be 50 more cases a year for an average increase of 100 mm of precipitation in Brisbane. The predictive values of the model were generally consistent with actual values (root-mean-square error (RMSE): 1.96). Therefore, this model may have applications as a decision support tool in disease control and risk-management planning programs in Brisbane. Polynomial distributed lag (PDL) time series regression models were fitted to examine the associations between rainfall, mosquito density and the occurrence of RRV after adjusting for season and auto-correlation. The PDL model was used because rainfall and mosquito density can affect RRV occurrence not merely in the same month, but in several subsequent months; the rationale for the PDL technique is that it increases the precision of the estimates. We developed an epidemic forecasting model to predict the incidence of RRV disease. The results show that 95% and 85% of the variation in RRV disease was accounted for by mosquito density and rainfall, respectively. The predictive values of the model were generally consistent with actual values (RMSE: 1.25). The model diagnostics reveal that the residuals were randomly distributed with no significant auto-correlation.
These results suggest that the PDL models may be better than the SARIMA models (R-squared increased and RMSE decreased). The findings may facilitate the development of early warning systems for the control and prevention of this widespread disease. Further analyses using classification trees were conducted to identify the major mosquito species involved in Ross River virus (RRV) transmission and to explore the threshold of mosquito density for RRV disease in Brisbane, Australia. The results show that Ochlerotatus vigilax (RR: 1.028; 95% CI: 1.001-1.057) and Culex annulirostris (RR: 1.013; 95% CI: 1.003-1.023) were significantly associated with RRV disease cycles at a lag of 1 month. The presence of RRV was associated with average monthly mosquito densities of 72 Ochlerotatus vigilax and 52 Culex annulirostris per light trap. These results may also have applications as a decision-support tool in disease control and risk-management planning programs. As RRV has a significant impact on population health, industry and tourism, it is important to develop an epidemic forecasting system for this disease. This study shows that disease surveillance data can be integrated with social, biological and environmental databases, providing additional input into the development of epidemic forecasting models. These attempts may have significant implications for environmental health decision-making and practice, and may help health authorities determine public health priorities more wisely and use resources more effectively and efficiently.
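The density thresholds quoted above are the kind of cut-points a classification tree recovers from its split rules. A minimal sketch on simulated trap counts (the data-generating thresholds below are assumptions chosen to echo the quoted figures, not the thesis data):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
n = 300

# Hypothetical monthly light-trap counts for the two candidate vector species.
oc_vigilax = rng.gamma(shape=2.0, scale=40.0, size=n)
cx_annulirostris = rng.gamma(shape=2.0, scale=30.0, size=n)

# Simulated RRV presence, more likely above assumed density thresholds
# (72 and 52 per trap, echoing the figures quoted above).
risk = 0.15 + 0.5 * (oc_vigilax > 72) + 0.3 * (cx_annulirostris > 52)
presence = (rng.uniform(size=n) < np.clip(risk, 0.0, 0.95)).astype(int)

# A shallow classification tree; its root split estimates a density threshold.
X = np.column_stack([oc_vigilax, cx_annulirostris])
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, presence)
root_feature = int(tree.tree_.feature[0])      # 0 = Oc. vigilax, 1 = Cx. annulirostris
root_threshold = float(tree.tree_.threshold[0])
```

Reading the fitted tree's split values in this way turns the model into an operational rule ("act when trap counts exceed X"), which is what makes it usable as a decision-support tool.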
109

Estratégias de momentum no mercado cambial

Silva, Kesley Leandro da 15 February 2016 (has links)
I use weekly data to investigate the profitability of momentum strategies in the currency market based on two different, possibly nonlinear, methods of trend extraction. I compare their performance with traditional moving-average rules, a linear method widely used by market practitioners. I find that the performance of all strategies is extremely sensitive to the choice of currency, the lag parameters and the evaluation criterion. Nevertheless, the G10 currencies show better average results with the nonlinear methods, while emerging-market currencies show mixed results. I also adopt a methodology for managing the risk of momentum strategies in order to minimize the worst crashes. It succeeds in lowering the maximum weekly losses, the standard deviation, the skewness and the kurtosis for most currencies in both strategies. In terms of performance, the risk-managed strategies based on the HP filter show higher returns and Sharpe ratios for about 70% of the strategies, while those based on nonparametric regression show better results for about 60% of the strategies.
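The two signal families compared above can be sketched side by side: a moving-average crossover (the linear benchmark) against a trend extracted with the Hodrick-Prescott filter. The price series is simulated and the smoothing parameter is an illustrative choice, not the thesis's calibration:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(3)

# Hypothetical weekly exchange-rate series (random walk with drift).
price = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.001, 0.01, 520))))

# Linear benchmark: long when the short moving average is above the long one.
ma_signal = np.sign(price.rolling(4).mean() - price.rolling(26).mean())

# Nonlinear alternative: extract a smooth trend with the HP filter and
# trade the sign of its slope (lamb=1600 is an illustrative choice).
_, trend = hpfilter(price, lamb=1600)
hp_signal = np.sign(trend.diff())

# Strategy returns: last week's signal applied to this week's price change.
ret = price.pct_change()
ma_ret = (ma_signal.shift(1) * ret).dropna()
hp_ret = (hp_signal.shift(1) * ret).dropna()

# Annualized Sharpe ratio for weekly data.
def sharpe(r):
    return float(r.mean() / r.std() * np.sqrt(52))
```

Note that filtering the full sample at once introduces look-ahead bias; a faithful backtest would re-estimate the trend recursively on data available at each date.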
110

Analýza cenných papírů na kapitálových trzích (meziodvětvová komparace výše a struktury jednotlivých typů rizika a výnosu na vybraných burzách cenných papírů) / Analysis of securities on capital markets (inter-industry comparison of the level and structure of individual types of risk and return on selected stock exchanges)

WEISSOVÁ, Kateřina January 2012 (has links)
The main objective of this thesis is to analyze selected sectors of the European capital market using methods of technical and fundamental analysis. Based on the results obtained, the best-performing exchanges, industry sectors and investment strategies are identified. The first part gives a theoretical description of securities on capital markets, investment strategies, methods for valuing securities on capital markets, the theory of efficient markets, and tests of market efficiency. Ninety-nine companies were randomly selected from European stock market indices, including the German DAX30, with data for the period 2006-2011. On the basis of the confirmed capital market inefficiency, the thesis concludes that an active investment strategy can achieve above-average returns.
