1

Uma abordagem para a previsão de carga crítica do sistema elétrico brasileiro = An approach for critical load forecasting of the Brazilian power system

Barreto, Mateus Neves, 1989- 03 July 2014 (has links)
Advisors: Takaaki Ohishi, Ricardo Menezes Salgado / Master's thesis - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação / Abstract: The Brazilian Power System supplies about 97% of national energy demand. Given the vast Brazilian territory, a large-scale transmission system is required because of the long distances between the hydroelectric generation sites and the main concentration of demand in southeastern Brazil. To ensure secure and economical operation of the Brazilian Power System, the generation and transmission system is analyzed under critical load conditions; the idea is to prepare the system to withstand the most severe load conditions. The critical load curve is computed for each month with hourly (or finer) discretization: it consists of the minimum load observed in the month during the first through eighth hours, and of the maximum load observed in the month for the remaining hours. Using demand histories from the agents of the Brazilian power sector, a five-year (60-month) history of critical load curves was assembled. These data were provided by the National System Operator (ONS) as part of a research project, through a decision support system named SysPrev. This dissertation proposes three models for forecasting the critical load curve: two based on Artificial Neural Networks and one based on Holt-Winters (HW) exponential smoothing. All models produced satisfactory results; the exponential smoothing model stood out, reaching mean absolute errors near 3%. These results are explained by the fact that the historical series of critical load curves exhibit trend and seasonality, and the HW model is designed specifically for time series with these characteristics. / Master's program: Computer Engineering / Degree: Master in Electrical Engineering
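The abstract credits additive Holt-Winters exponential smoothing with the best forecasting results on the critical load curves. A minimal plain-Python sketch of that technique follows; the smoothing constants and the synthetic series are illustrative assumptions, not the dissertation's data or implementation.

```python
def holt_winters_additive(series, m, alpha, beta, gamma, horizon):
    """Additive Holt-Winters smoothing with season length m.
    Returns `horizon` forecasts past the end of `series`."""
    # initialize level, trend, and seasonal indices from the first two seasons
    level = sum(series[:m]) / m
    trend = (sum(series[m:2 * m]) - sum(series[:m])) / (m * m)
    season = [series[i] - level for i in range(m)]
    for t, y in enumerate(series):
        s = season[t % m]
        last_level = level
        level = alpha * (y - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (y - level) + (1 - gamma) * s
    n = len(series)
    return [level + (h + 1) * trend + season[(n + h) % m]
            for h in range(horizon)]
```

On a toy series with a linear trend plus a period-4 seasonal pattern, the forecasts track the true continuation closely, which mirrors why HW suits series with trend and seasonality.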
2

AI-Based Transport Mode Recognition for Transportation Planning Utilizing Smartphone Sensor Data From Crowdsensing Campaigns

Grubitzsch, Philipp, Werner, Elias, Matusek, Daniel, Stojanov, Viktor, Hähnel, Markus 11 May 2023 (has links)
Utilizing smartphone sensor data from crowdsensing (CS) campaigns for transportation planning (TP) requires highly reliable transport mode recognition. To address this, we present our RNN-based AI model MovDeep, which works on GPS, accelerometer, magnetometer and gyroscope data. It was trained on 92 hours of labeled data. MovDeep predicts six transportation modes (TM) on one-second time windows. A novel postprocessing step further improves the prediction results. We present a validation methodology (VM) that simulates unknown context, to obtain a more realistic estimate of real-world performance (RWP). We explain why existing work shows overestimated prediction quality when applied to CS data, and why existing results are not comparable with each other. With the introduced VM, MovDeep still achieves a 99.3% F1-score on six TM. We confirm the very good RWP of our model on unknown context with the Sussex-Huawei Locomotion data set. For future model comparison, both publicly available data sets can be used with our VM. Finally, we compare MovDeep to a deterministic approach as a baseline for an average-performing model (82-88% RWP recall) on a CS data set of 540k tracks, to show the significant negative impact of even small prediction errors on TP.
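The paper compares its RNN against a deterministic baseline operating on windowed sensor data. A toy sketch of that general idea: slide fixed-size windows over accelerometer samples and classify each window by the variance of the acceleration magnitude. The window sizes, thresholds, and mode names below are illustrative assumptions, not the paper's.

```python
import math

def window_features(samples, window, step):
    """samples: list of (ax, ay, az) accelerometer readings.
    Returns (mean, variance) of the acceleration magnitude per window."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    feats = []
    for start in range(0, len(mags) - window + 1, step):
        w = mags[start:start + window]
        mean = sum(w) / window
        var = sum((v - mean) ** 2 for v in w) / window
        feats.append((mean, var))
    return feats

def classify(feats, still_var=0.05, walk_var=1.0):
    """Toy deterministic rule: low variance -> still,
    medium -> vehicle, high -> walking."""
    modes = []
    for _, var in feats:
        if var < still_var:
            modes.append("still")
        elif var < walk_var:
            modes.append("vehicle")
        else:
            modes.append("walking")
    return modes
```

Such hand-tuned rules are brittle under unknown context, which is exactly the gap the paper's learned model and validation methodology target.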
3

Graph Cut Based Mesh Segmentation Using Feature Points and Geodesic Distance

Liu, L., Sheng, Y., Zhang, G., Ugail, Hassan January 2015 (has links)
Both prominent feature points and geodesic distance are key factors for mesh segmentation. Combining these two factors, this paper proposes a graph cut based mesh segmentation method. The mesh is first preprocessed by Laplacian smoothing. Candidate feature points are then selected from the Gaussian curvature by a predefined threshold. With DBSCAN (Density-Based Spatial Clustering of Applications with Noise), the candidate points are separated into clusters, and the point with the maximum curvature in each cluster is taken as a final feature point. We label these feature points and regard the faces of the mesh as nodes for graph cut. Our energy function is constructed from the ratio between the geodesic distance and the Euclidean distance of vertex pairs of the mesh. The final segmentation is obtained by minimizing the energy function using graph cut. The proposed algorithm is pose-invariant and robustly segments the mesh into parts in line with the selected feature points.
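The feature-point selection step (cluster candidate points with DBSCAN, then keep the maximum-curvature point of each cluster) can be sketched in plain Python. The minimal DBSCAN below and the 2-D toy points are simplifications; the paper works on mesh vertices with Gaussian curvature.

```python
import math

def dbscan(pts, eps, min_pts):
    """Minimal DBSCAN: returns a cluster label per point (-1 = noise)."""
    n = len(pts)
    labels = [None] * n
    def neigh(i):
        return [j for j in range(n) if math.dist(pts[i], pts[j]) <= eps]
    cid = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        seeds = neigh(i)
        if len(seeds) < min_pts:
            labels[i] = -1          # provisionally noise
            continue
        cid += 1
        labels[i] = cid
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cid     # noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            nb = neigh(j)
            if len(nb) >= min_pts:  # core point: expand the cluster
                queue.extend(k for k in nb if labels[k] is None)
    return labels

def feature_points(candidates, eps=0.5, min_pts=2):
    """candidates: list of (x, y, curvature). Cluster by position,
    keep the maximum-curvature point of each cluster."""
    labels = dbscan([(x, y) for x, y, _ in candidates], eps, min_pts)
    best = {}
    for lab, pt in zip(labels, candidates):
        if lab == -1:
            continue
        if lab not in best or pt[2] > best[lab][2]:
            best[lab] = pt
    return sorted(best.values())
```

With two well-separated clumps of candidate points, this returns exactly one representative per clump, the highest-curvature point, as the paper describes.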
4

修勻與小區域人口之研究 / A Study of smoothing methods for small area population

金碩, Jin, Shuoh Unknown Date (has links)
Because estimation error grows as population size shrinks, the amount of data affects the stability and reliability of statistical analysis, so methods commonly used to project large-region populations often cannot be applied directly at the county level or below, especially when geographic, social, or economic heterogeneity within a small area is high. This study examines small area populations in Taiwan from two directions. First, Taiwan's population is aging, which will affect government policy and resource allocation; since the speed of aging differs across counties and cities, projection methods suited to local characteristics are needed. Second, with lengthening life expectancy, the world faces longevity risk, affecting public pension design and life insurance pricing; because mortality trends differ across Taiwan, developing small area mortality models is also an urgent task. The difficulties of small area projection fall into four directions: data quality, population size, number of base years, and projection horizon. Data quality depends on databases and institutions and is beyond the scope of this study; for the other three issues, smoothing (graduation) methods are introduced to improve the stability of small area projections and mortality models. For population projection, smoothing is combined with the block bootstrap; for mortality modeling, smoothing is added to the Lee-Carter and age-period-cohort models. Because small area populations are small, information from larger regions is borrowed through the standardized mortality ratio and through coherence between large and small areas, reducing the fluctuations caused by small population counts. We find that borrowing large-region information stabilizes projections and that smoothing reduces projection error. For reliable small area projections we suggest using at least fifteen years of historical data and a projection horizon of no more than twenty years; areas with very small populations should be merged with neighboring areas before projection. Mortality models built on smoothed data perform comparably to corrections based on the more complicated coherent models. These results may serve as a reference for projections by Taiwan's local governments.
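The mortality models above extend Lee-Carter, whose core is the rank-1 structure log m(x,t) ≈ a_x + b_x k_t. A plain-Python sketch of a simplified two-step estimator follows (age means, then one least-squares step under the usual identification constraints Σb = 1, Σk = 0); this is a stand-in for the full SVD fitting and the smoothing machinery the thesis adds, and the synthetic data are made up.

```python
def lee_carter_fit(log_m):
    """log_m: list of lists of log mortality rates, rows = ages, cols = years.
    Fits log m[x][t] ~ a[x] + b[x] * k[t]."""
    A, T = len(log_m), len(log_m[0])
    # a[x]: average log rate of age x over time
    a = [sum(row) / T for row in log_m]
    # center each row
    Z = [[log_m[x][t] - a[x] for t in range(T)] for x in range(A)]
    # initial k[t]: column sums (exact under the constraint sum(b) = 1)
    k = [sum(Z[x][t] for x in range(A)) for t in range(T)]
    # b[x]: least-squares regression of row x on k
    kk = sum(v * v for v in k)
    b = [sum(Z[x][t] * k[t] for t in range(T)) / kk for x in range(A)]
    return a, b, k
```

On data generated exactly from the model with those constraints, the estimator recovers the surface to numerical precision, which makes it a convenient baseline before smoothing or coherence adjustments are layered on.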
5

Influence of the Choice of Disease Mapping Method on Population Characteristics in Areas of High Disease Burdens

Desai, Khyati Sanket 12 1900 (has links)
Disease maps are powerful tools for depicting spatial variations in disease risk and its underlying drivers. However, producing effective disease maps requires careful consideration of the statistical and spatial properties of the disease data. In fact, the choice of mapping method influences the resulting spatial pattern of the disease, as well as the understanding of its underlying population characteristics. New developments in mapping methods and software, along with continuing improvements in data quality and quantity, require map-makers to make a multitude of decisions before a map of disease burdens can be created. The impact of such decisions, including the choice of an appropriate mapping method, has not been addressed adequately in the literature. This research demonstrates how the choice of mapping method and associated parameters influences the spatial pattern of disease. We use four disease-mapping methods (unsmoothed choropleth maps, choropleth maps smoothed with the headbanging method, smoothed kernel density maps, and choropleth maps smoothed with spatial empirical Bayes methods) and five years of zip-code-level HIV incidence data (2007-2011) from Dallas and Tarrant Counties, Texas. For each map, the leading population characteristics and their relative importance with regard to HIV incidence are identified using a regression analysis of a CDC-recommended list of socioeconomic determinants of HIV. Our results show that the choice of mapping method leads to different conclusions regarding the associations between HIV disease burden and the underlying demographic and socioeconomic characteristics. Thus, the choice of mapping method influences the patterns of disease we see or fail to see. Accurate depiction of areas of high disease burden is important for developing and targeting appropriate public health interventions.
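One of the four methods compared is spatial empirical Bayes smoothing. A non-spatial (global) empirical Bayes sketch conveys the core shrinkage idea: unstable rates from small populations are pulled toward the overall mean, while large-population rates barely move. The method-of-moments variance estimate follows Marshall's global estimator; the case counts and populations below are made up.

```python
def empirical_bayes_smooth(cases, pops):
    """Shrink area-level rates toward the global mean rate, with
    weights that depend on each area's population size."""
    rates = [c / p for c, p in zip(cases, pops)]
    total_pop = sum(pops)
    m = sum(cases) / total_pop  # global mean rate
    # method-of-moments estimate of between-area variance
    s2 = sum(p * (r - m) ** 2 for p, r in zip(pops, rates)) / total_pop
    var_a = max(s2 - m / (total_pop / len(pops)), 0.0)
    smoothed = []
    for r, p in zip(rates, pops):
        # weight -> 1 for large populations, -> 0 for tiny ones
        w = var_a / (var_a + m / p)
        smoothed.append(w * r + (1 - w) * m)
    return smoothed
```

This illustrates why smoothed and unsmoothed choropleths can tell different stories: the smoothed map discounts extreme rates in sparsely populated zip codes.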
6

The Influence of Disease Mapping Methods on Spatial Patterns and Neighborhood Characteristics for Health Risk

Ruckthongsook, Warangkana 12 1900 (has links)
This thesis addresses three interrelated challenges of disease mapping and contributes a new approach for improving the visualization of disease burdens to enhance disease surveillance systems. First, it determines an appropriate threshold choice (smoothing parameter) for adaptive kernel density estimation (KDE) in disease mapping. The results show that the appropriate threshold value depends on the characteristics of the data, and that bandwidth selector algorithms can guide such decisions about mapping parameters; similar approaches are recommended for map-makers facing threshold choices for their own data. Second, the study evaluates the relative performance of adaptive KDE and spatial empirical Bayes for disease mapping. The results reveal that while the estimated rates at the state level computed from both methods are identical, those at the zip code level differ slightly. These findings indicate that either method may provide identical rate estimates when mapping disease in urban areas, but caution is necessary when mapping diseases in non-urban (sparsely populated) areas. This study contributes insights on relative performance in terms of accuracy of visual representation and the associated limitations. Lastly, the study contributes a new approach for delimiting spatial units of disease risk using straightforward statistical and spatial methods together with social determinants of health. The results show that the neighborhood risk map helps not only in geographically targeting where to intervene but also in tailoring interventions in those areas to the high-risk populations. Moreover, when health data are limited, the neighborhood risk map alone is adequate for identifying where and which populations are at risk. These findings will benefit public health tasks of planning and targeting appropriate interventions even in areas with limited and poor-quality health data. This study not only fills the identified gaps of knowledge in disease mapping but also has a wide range of broader impacts. The findings improve and enhance the use of the adaptive KDE method in health research, provide better awareness and understanding of disease mapping methods, and offer an alternative method to identify populations at risk in areas with limited health data. Overall, these findings will benefit public health practitioners and health researchers as well as enhance disease surveillance systems.
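The thesis's first challenge is the choice of the KDE smoothing parameter. A fixed-bandwidth 1-D Gaussian KDE with Silverman's rule-of-thumb bandwidth illustrates the role of that parameter; the adaptive, population-weighted 2-D version used in disease mapping is more involved, and the sample data here are made up.

```python
import math

def silverman_bandwidth(xs):
    """Rule-of-thumb bandwidth: 1.06 * sd * n^(-1/5)."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    return 1.06 * sd * n ** (-1 / 5)

def kde(xs, x, h=None):
    """Gaussian kernel density estimate at point x."""
    if h is None:
        h = silverman_bandwidth(xs)
    norm = 1.0 / (len(xs) * h * math.sqrt(2 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in xs)
```

A smaller h produces a spikier surface, a larger h a flatter one; this is exactly the threshold decision the thesis argues should be guided by bandwidth selector algorithms rather than set ad hoc.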
7

Iterativni postupci sa regularizacijom za rešavanje nelinearnih komplementarnih problema

Rapajić Sanja 13 July 2005 (has links)
Iterative methods for nonlinear complementarity problems (NCP) are considered in this doctoral dissertation. Problems of this type arise in optimization theory, engineering, and economics, and the mathematical models of many natural, social, and technical processes also reduce to them, which makes solving NCP highly relevant. Among the many numerical methods used for this purpose, particular attention is given to generalized Newton-type methods and to iterative methods with regularization (smoothing) of the Jacobian matrix. Several new methods for NCP are defined in this dissertation and their local or global convergence is proved. The theoretical results are tested on relevant numerical examples.
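The Newton-type and Jacobian smoothing methods studied here are often built on the Fischer-Burmeister reformulation of the NCP. A scalar sketch follows: Newton's method is applied to the smoothed Fischer-Burmeister equation while the smoothing parameter is driven to zero. The continuation schedule and the numerical derivative are simplifications for illustration, not the dissertation's methods.

```python
import math

def fb_smooth(a, b, mu):
    """Smoothed Fischer-Burmeister function: zero (as mu -> 0)
    iff a >= 0, b >= 0 and a * b = 0."""
    return math.sqrt(a * a + b * b + 2 * mu * mu) - a - b

def solve_ncp_1d(F, x0=1.0, mu=1.0, tol=1e-10, max_iter=100):
    """Solve the scalar NCP (x >= 0, F(x) >= 0, x * F(x) = 0) by
    Newton steps on fb_smooth(x, F(x), mu), shrinking mu each step."""
    x = x0
    for _ in range(max_iter):
        g = fb_smooth(x, F(x), mu)
        if abs(g) < tol and mu < tol:
            break
        h = 1e-7  # forward-difference derivative (a real method would
        dg = (fb_smooth(x + h, F(x + h), mu) - g) / h  # use the generalized Jacobian)
        x -= g / dg
        mu *= 0.5
    return x
```

For F(x) = x - 1 the complementarity solution is x = 1 (F active), and for F(x) = x + 1 it is x = 0 (bound active); the iteration recovers both.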
