71

Three essays on the econometric analysis of high-frequency data

Malec, Peter 27 June 2013 (has links)
In three essays, this thesis deals with the econometric analysis of financial market data sampled at intraday frequencies. Chapter 1 presents a novel approach to modeling serially dependent positive-valued variables that realize a nontrivial proportion of zero outcomes, a typical phenomenon in financial high-frequency time series.
We introduce a flexible point-mass mixture distribution, a tailor-made semiparametric specification test and a new type of multiplicative error model (MEM). Chapter 2 addresses the problem that fixed symmetric kernel density estimators exhibit low precision for positive-valued variables with a large probability mass near zero, which is common in high-frequency data. We show that gamma kernel estimators are superior, while their relative performance depends on the specific density and kernel shape. We suggest a refined gamma kernel and a data-driven method for choosing the appropriate type of gamma kernel estimator. Chapter 3 turns to the debate about the merits of high-frequency data in large-scale portfolio allocation. We consider the problem of constructing global minimum variance portfolios based on the constituents of the S&P 500. We show that forecasts based on high-frequency data can yield a significantly lower portfolio volatility than approaches using daily returns, implying noticeable utility gains for a risk-averse investor.
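The gamma-kernel idea of Chapter 2 can be illustrated with the standard Chen (2000) estimator, in which the kernel's shape adapts to the evaluation point and so avoids the boundary bias of a fixed symmetric kernel near zero. A minimal sketch (not the author's refined estimator), assuming numpy and scipy:

```python
import numpy as np
from scipy.stats import gamma

def gamma_kde(x_grid, data, b):
    """Gamma-kernel density estimate for positive-valued data (Chen, 2000).

    For each evaluation point x, average gamma pdfs with shape x/b + 1 and
    scale b over the sample -- the kernel adapts its shape near zero,
    unlike a fixed symmetric kernel.
    """
    shapes = x_grid / b + 1.0  # one kernel shape per grid point
    # evaluate the gamma pdf at every data point for every grid point
    return np.array([gamma.pdf(data, a=s, scale=b).mean() for s in shapes])

rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0, size=2000)  # heavy mass near zero
grid = np.linspace(0.0, 5.0, 101)
est = gamma_kde(grid, sample, b=0.1)
true = np.exp(-grid)
print(float(np.max(np.abs(est - true))))
```

The bandwidth `b = 0.1` is an arbitrary illustrative choice; a data-driven selection of it and of the kernel variant is precisely what the chapter studies.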
72

Urban Growth Modeling Based on Land-use Changes and Road Network Expansion

Rui, Yikang January 2013 (has links)
A city is considered a complex system. It consists of numerous interactive sub-systems and is affected by diverse factors including governmental land policies, population growth, transportation infrastructure, and market behavior. Land use and transportation systems are considered the two most important subsystems determining urban form and structure in the long term. Meanwhile, urban growth is one of the most important topics in urban studies, and its main driving forces are population growth and transportation development. Modeling and simulation are believed to be powerful tools for exploring the mechanisms of urban evolution and for providing planning support in growth management. The overall objective of the thesis is to analyze and model urban growth based on the simulation of land-use changes and the modeling of road network expansion. Since most previous urban growth models apply fixed transport networks, the evolution of road networks was modeled explicitly. Moreover, urban growth modeling is an interdisciplinary field, so this thesis made substantial efforts to integrate knowledge and methods from other scientific and technical areas to advance geographical information science, especially the aspects of network analysis and modeling. A multi-agent system was applied to model urban growth in Toronto, with population growth considered the main driving factor. Agents were adopted to simulate different types of interacting individuals who promote urban expansion. The multi-agent model with spatiotemporal allocation criteria was shown to be effective in simulation. Then, an urban growth model for long-term simulation was developed by integrating land-use development with procedural road network modeling. The dynamic idealized traffic flow estimated by the space syntax metric was used not only for selecting major roads but also for calculating accessibility in the land-use simulation. The model was applied to the city centre of Stockholm and confirmed the reciprocal influence between land use and the street network during long-term growth. To further study network growth modeling, a novel weighted network model involving nonlinear growth and neighboring connections was built from the perspective of complex networks. Both mathematical analysis and numerical simulation were examined in the evolution process, and the effects of neighboring connections were investigated in particular to study the preferential attachment mechanisms in the evolution. Since a road network is a weighted planar graph, a growth model for urban street networks was subsequently developed. It succeeded in reproducing diverse patterns, and each pattern was examined by a series of measures. The similarity between the properties of the derived patterns and empirical studies implies that there is a universal growth mechanism in the evolution of urban morphology. To better understand the complicated relationship between land use and the road network, centrality indices from different aspects were analyzed in a case study of Stockholm. The correlation coefficients between different land-use types and road network centralities suggest that various centrality indices, reflecting human activities in different ways, can capture land development and consequently influence urban structure. The strength of this thesis lies in its interdisciplinary approach to analyzing and modeling urban growth. The integration of 'bottom-up' land-use simulation and a road network growth model in urban growth simulation is the major contribution. The road network growth model in terms of complex network science is another contribution advancing spatial network modeling within GIScience. The works in this thesis range from a novel theoretical weighted network model to particular models of land use, urban street networks and hybrid urban growth, and to specific applications and statistical analyses of real cases. These models help to improve our understanding of urban growth phenomena and urban morphological evolution through long-term simulations. The simulation results can further support urban planning and growth management. The study of hybrid models integrating methods and techniques from multiple disciplines has attracted much attention and still requires sustained effort in the near future.
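The selection of major roads by idealized flow can be illustrated on a toy graph. This is a hedged sketch, not the thesis's model: it assumes the third-party networkx library and uses betweenness centrality as a stand-in for the space-syntax flow metric:

```python
import networkx as nx

# Toy street network: a 4x4 grid of intersections joined by unit-length streets.
G = nx.grid_2d_graph(4, 4)

# Betweenness centrality as a proxy for idealized through-traffic: the share
# of shortest paths between all intersection pairs passing through each node.
bc = nx.betweenness_centrality(G, weight=None)

# "Major road" heuristic: keep the top quartile of intersections by centrality.
ranked = sorted(bc, key=bc.get, reverse=True)
major = ranked[: len(ranked) // 4]
print(major)  # the four interior intersections rank highest
```

On this grid the interior nodes carry the most shortest paths, mirroring how high-centrality streets attract both traffic and, in the land-use simulation, accessibility-driven development.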
73

Bandwidth Selection in Nonparametric Kernel Estimation / Bandweitenwahl bei nichtparametrischer Kernschätzung

Schindler, Anja 29 September 2011 (has links)
No description available.
74

Détection non supervisée d'évènements rares dans un flot vidéo : application à la surveillance d'espaces publics / Unsupervised detection of rare events in a video stream : application to the surveillance of public spaces

Luvison, Bertrand 13 December 2010 (has links)
This thesis is a collaboration between the LAboratoire des Sciences et Matériaux pour l'Électronique et d'Automatique (LASMEA) in Clermont-Ferrand and the Laboratoire Vision et Ingénierie des Contenus (LVIC) of CEA LIST in Saclay. The first half of the thesis was carried out in the ComSee team of LASMEA and the second at LVIC. The automatic analysis of crowded areas in video sequences is particularly difficult because of the large amount of information to be processed simultaneously and the complexity of the scenes. We propose in this thesis a method for detecting abnormal events in possibly dense scenes observed from a static camera. The approach is based on the automatic classification of motion, requiring no prior information. Motion patterns are encoded in an unsupervised learning framework in order to generate a statistical model of frequently observed (i.e. normal) events. At the detection stage, motion patterns that deviate from the model are classified as unexpected events. The method is particularly suited to scenes with structured movement and a directional flow of objects or people, such as corridors, roads and intersections. No camera calibration is needed, nor image segmentation, object detection or tracking. In contrast to approaches that rely on trajectory analysis of tracked objects, the computational cost of our method is independent of the number of targets present at the same time, and it runs in real time. Our system relies on a local classification of global scene movement; the local analysis is done on each block of a regular grid. We first introduce a new spatio-temporal local descriptor to characterize the movement efficiently. Assuming locally uniform motion of space-time blocks of the image, our approach consists in determining whether there is a linear relationship between the spatial gradients and the temporal gradients. Like optical flow techniques, this descriptor relies on the illumination constancy constraint, but by taking into account the spatial neighborhood and a larger temporal window it gives a smoother characterization of the motion that is more robust to noise. In addition, its low computational complexity is suitable for real-time applications. Secondly, we present two classification frameworks. The first is a static (frame-by-frame) approach based on a Bayesian characterization of the motion, using an approximation of the Parzen windowing method, or kernel density estimation (KDE), to model the probability density function of motion patterns. This new method is a sparse variant of the KDE (SKDE). We show that the SKDE is a very efficient algorithm giving compact representations and good approximations of the density functions. The second approach, based on Bayesian networks, models the dynamics of the movement. Instead of considering motion patterns in each block independently, temporal sequences of motion patterns are learned using hidden Markov models (HMMs). A further contribution consists in modeling the movement in one block by taking into account the observed motion in adjacent blocks, which adds spatial coherence to the propagation of motion and is performed by coupled HMMs. These statistical approaches were evaluated on both synthetic data and very challenging real video sequences captured by surveillance cameras, for road traffic as well as crowd surveillance. These evaluations allow us to draw first, encouraging conclusions concerning the feasibility of intelligent video surveillance of possibly crowded areas.
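The descriptor's core idea, a linear relation between spatial and temporal gradients under locally uniform motion, reduces to a small least-squares problem per block. A minimal numpy sketch under that assumption (illustrative only, not the authors' exact descriptor):

```python
import numpy as np

def block_motion(prev, curr):
    """Least-squares motion estimate for one block, assuming uniform motion.

    Stacks the spatial gradients of every pixel into a matrix A and the
    temporal gradients into a vector, then solves the brightness-constancy
    constraint Ix*u + Iy*v + It = 0 in the least-squares sense.
    """
    Iy, Ix = np.gradient(curr)   # spatial gradients (rows = y, cols = x)
    It = curr - prev             # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v                     # (u, v) displacement estimate

# Synthetic smooth block translated by one pixel to the right.
x = np.arange(16)
X, Y = np.meshgrid(x, x)
frame = np.sin(X / 3.0) + np.cos(Y / 4.0)
shifted = np.sin((X - 1) / 3.0) + np.cos(Y / 4.0)  # same scene moved 1 px
u, v = block_motion(frame, shifted)
print(round(float(u), 2), round(float(v), 2))
```

A single solve per block, over a whole grid of blocks, is what keeps the cost independent of the number of moving targets.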
75

Análise de desempenho de sistemas de comunicação OFDM-TDMA utilizando cadeias de Markov e curva de serviço / Performance analysis of OFDM-TDMA wireless systems based

Costa, Victor Hugo Teles 06 December 2013 (has links)
This work presents a model based on Markov chains, enhanced with kernel density estimation and MMFM (Markov Modulated Fluid Model), to evaluate the performance of the transmission link in OFDM-TDMA systems. For that purpose, traffic models based on the kernel method and on MMFM with an adjusted autocorrelation function are proposed. From the model implemented for the OFDM-TDMA system, equations were derived for estimating QoS parameters such as delay and average queue size in the buffer. The obtained results confirm that the proposed model is efficient in describing the link performance indicators. Using MMFM to model the arrival process improves the QoS parameter estimates of the queueing model, making their values very close to the simulation results. An equation for the OFDM-TDMA system's service curve was also developed. Through this equation and the concept of envelope process, an equation was proposed to estimate the probability of buffer overflow in OFDM-TDMA systems. The results show that the estimates of the overflow probability based on the system's service curve are very close to those obtained by simulation, and that the computational complexity of obtaining them is significantly reduced compared to the Markov chain model, owing to the absence of matrix computation.
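The flavor of a Markov-modulated source feeding a finite buffer can be conveyed in a few lines. This is an illustrative two-state fluid simulation with made-up rates, not the dissertation's calibrated model:

```python
import random

def simulate_mmfm_queue(steps=200_000, rates=(0.2, 1.5), p_switch=0.01,
                        service=0.9, buffer_cap=50.0, seed=42):
    """Fluid queue fed by a two-state Markov-modulated source (sketch).

    Each slot, the source emits fluid at the rate of its current state,
    the server drains `service` units, and slots where the buffer is
    clipped at `buffer_cap` are counted as overflow.
    """
    rng = random.Random(seed)
    state, q, q_sum, overflow = 0, 0.0, 0.0, 0
    for _ in range(steps):
        if rng.random() < p_switch:    # Markov state transition
            state = 1 - state
        q += rates[state] - service    # arrivals minus service
        if q > buffer_cap:
            q, overflow = buffer_cap, overflow + 1
        q = max(q, 0.0)
        q_sum += q
    return q_sum / steps, overflow / steps

mean_q, p_overflow = simulate_mmfm_queue()
print(mean_q, p_overflow)
```

Even though the mean arrival rate (0.85) is below the service rate (0.9), burstiness drives both a sizable mean queue and a nonzero overflow probability, which is the behavior the analytical service-curve bound is designed to capture cheaply.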
76

3D imaging and nonparametric function estimation methods for analysis of infant cranial shape and detection of twin zygosity

Vuollo, V. (Ville) 17 April 2018 (has links)
The use of 3D imaging of craniofacial soft tissue has increased in medical science, and imaging technology has developed greatly in recent years. 3D models are quite accurate, and with imaging devices based on stereophotogrammetry, capturing the data is a quick and easy procedure for the subject. However, analyzing 3D models of the face or head can be challenging, and there is a growing need for efficient quantitative methods. In this thesis, new mathematical methods and tools for measuring craniofacial structures are developed. The thesis is divided into three parts. In the first part, facial 3D data of Lithuanian twins are used for the determination of zygosity. Statistical pattern recognition methodology is used for classification, and the results are compared with DNA testing. In the second part, the distribution of surface normal vector directions of a 3D infant head model is used to analyze skull deformation. The levels of flatness and asymmetry are quantified by functionals of the kernel density estimate of the normal vector directions. Using 3D models from infants at the age of three months and clinical ratings made by experts, this novel method is compared with some previously suggested approaches. The method is also applied to clinical longitudinal research in which 3D images from three different time points are analyzed to find the course of positional cranial deformation and associated risk factors. The final part of the thesis introduces a novel statistical scale space method, SphereSiZer, for exploring the structures of a probability density function defined on the unit sphere. The tools developed in the second part are used in the implementation of SphereSiZer. In SphereSiZer, the scale-dependent features of the density are visualized by projecting the statistically significant gradients onto a planar contour plot of the density function. The method is tested by analyzing samples of surface unit normal vector data of an infant head as well as generated simulated spherical densities. The results and examples of the study show that the proposed novel methods perform well. The methods can be extended and developed in further studies. Cranial and facial 3D models will offer many opportunities for the development of new and sophisticated analytical methods in the future.
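The density of surface-normal directions is a density on the unit sphere, and a standard way to estimate such a density is with a von Mises-Fisher kernel. A sketch under that assumption (illustrative, not the thesis's exact estimator):

```python
import numpy as np

def vmf_kde(query, samples, kappa=50.0):
    """Kernel density estimate on the unit sphere (von Mises-Fisher kernel).

    query:   (m, 3) unit vectors where the density is evaluated
    samples: (n, 3) unit surface-normal directions
    kappa:   concentration parameter (an inverse bandwidth)
    """
    c = kappa / (4.0 * np.pi * np.sinh(kappa))  # vMF normalizing constant
    dots = query @ samples.T                    # cosine similarities
    return c * np.exp(kappa * dots).mean(axis=1)

# Synthetic "flat spot": normals clustered around the z-axis plus noise.
rng = np.random.default_rng(3)
normals = rng.normal(size=(1000, 3)) + np.array([0.0, 0.0, 4.0])
normals /= np.linalg.norm(normals, axis=1, keepdims=True)

poles = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, -1.0]])
dens = vmf_kde(poles, normals)
print(dens)  # much higher density at the cluster pole than opposite it
```

A pronounced peak of this density corresponds to a flattened region of the skull surface, which is what the flatness functionals quantify.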
77

The Influence of Disease Mapping Methods on Spatial Patterns and Neighborhood Characteristics for Health Risk

Ruckthongsook, Warangkana 12 1900 (has links)
This thesis addresses three interrelated challenges of disease mapping and contributes a new approach for improving the visualization of disease burdens to enhance disease surveillance systems. First, it determines an appropriate threshold choice (smoothing parameter) for adaptive kernel density estimation (KDE) in disease mapping. The results show that the appropriate threshold value depends on the characteristics of the data, and that bandwidth selector algorithms can guide such decisions about mapping parameters; similar approaches are recommended for map-makers who face threshold choices for their own data. Second, the study evaluates the relative performance of adaptive KDE and spatial empirical Bayes for disease mapping. The results reveal that while the estimated rates at the state level computed from both methods are identical, those at the zip code level differ slightly. These findings indicate that using either method to map disease in urban areas may provide identical rate estimates, but caution is necessary when mapping diseases in non-urban (sparsely populated) areas. This study contributes insights on relative performance in terms of accuracy of visual representation and the associated limitations. Lastly, the study contributes a new approach for delimiting spatial units of disease risk using straightforward statistical and spatial methods together with social determinants of health. The results show that the neighborhood risk map not only helps to target interventions geographically but also to tailor them to the high-risk populations in those areas. Moreover, when health data are limited, the neighborhood risk map alone is adequate for identifying where and which populations are at risk. These findings will benefit the public health tasks of planning and targeting appropriate interventions even in areas with limited and poor-quality health data. This study not only fills the identified gaps in knowledge of disease mapping but also has a wide range of broader impacts. The findings improve and enhance the use of the adaptive KDE method in health research, provide better awareness and understanding of disease mapping methods, and offer an alternative method to identify populations at risk in areas with limited health data. Overall, these findings will benefit public health practitioners and health researchers as well as enhance disease surveillance systems.
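Adaptive KDE in the sense used above varies the bandwidth with local density: sparse regions get wider kernels and dense regions narrower ones. A one-dimensional numpy sketch of the Silverman-style adaptive bandwidth scheme (illustrative; the thesis applies the idea to spatial disease data):

```python
import numpy as np

def adaptive_kde(grid, data, alpha=0.5):
    """Adaptive (variable-bandwidth) Gaussian KDE sketch (Silverman, 1986).

    A fixed-bandwidth pilot estimate sets local bandwidths: the lower the
    pilot density at a data point, the wider that point's kernel.
    """
    n = data.size
    h0 = 1.06 * data.std() * n ** -0.2            # rule-of-thumb pilot bandwidth
    pilot = np.exp(-0.5 * ((data[:, None] - data[None, :]) / h0) ** 2)
    pilot = pilot.sum(axis=1) / (n * h0 * np.sqrt(2 * np.pi))
    g = np.exp(np.log(pilot).mean())              # geometric mean of pilot
    h_i = h0 * (pilot / g) ** -alpha              # local bandwidths
    z = (grid[:, None] - data[None, :]) / h_i[None, :]
    return (np.exp(-0.5 * z ** 2) / (h_i * np.sqrt(2 * np.pi))).mean(axis=1)

rng = np.random.default_rng(7)
data = np.concatenate([rng.normal(0, 0.3, 800), rng.normal(4, 1.5, 200)])
grid = np.linspace(-2, 9, 221)
dens = adaptive_kde(grid, data)
print(float(dens.sum() * (grid[1] - grid[0])))  # integrates to roughly 1
```

The sensitivity parameter `alpha` plays the role of the threshold/smoothing choice examined in the thesis: it controls how strongly the bandwidth adapts between dense (urban-like) and sparse (rural-like) regions.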
78

Regeringars taktiska användning av de allmänna statsbidragen till kommunerna / The tactical use of inter-governmental grants from the central government to local governments at the municipality level

Juutinen, Gabriel, Jiang, Junhao January 2021 (has links)
A version of the model of vote-share-maximizing political parties in a proportional electoral system, developed by Lindbeck and Weibull (1987) and Dixit and Londregan (1996) respectively, is presented. This model is used to test whether there is evidence of tactical use of intergovernmental grants from the central incumbent government to local governments at the municipal level during the first term of office (2007-2010) of the conservative Reinfeldt government in Sweden. The results show that such tactical use did occur, which corroborates the theoretical framework for competing parties. Similar results were obtained for the periods 2011-2014 (conservative government) and 2015-2018 (socialist government). Data from the Swedish election studies are used to identify voters' ideological preferences for the incumbent central government at the time of the 2006 general election to the Swedish parliament. The theory behind the model presented by Johansson (2003) and Dahlberg and Johansson (2002) respectively served as the guideline for estimating factor scores against the latent factor "ideological bias". A Gaussian kernel density function is used to estimate the distribution of ideological bias in each constituency. The actual election results are used to approximate the indifference cutpoint where voters are indifferent between the two political alternatives. The probability density functions were evaluated at these cutpoints, after which the importance of these densities as a determinant of the amount of intergovernmental grants a municipality receives was tested using linear regression.
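The last step, evaluating each constituency's density at its indifference cutpoint and regressing grants on those densities, can be sketched on synthetic data. Everything here (constituency count, factor-score distributions, effect sizes) is hypothetical, chosen only to show the mechanics:

```python
import numpy as np
from scipy.stats import gaussian_kde, linregress

rng = np.random.default_rng(11)
n_const = 29  # hypothetical number of constituencies

cutpoint_density, grants = [], []
for _ in range(n_const):
    # Hypothetical factor scores for "ideological bias" in one constituency.
    bias = rng.normal(loc=rng.normal(0, 0.5), scale=1.0, size=400)
    cut = rng.normal(0, 0.3)               # indifference cutpoint
    f_cut = gaussian_kde(bias)(cut)[0]     # density of swing voters at the cut
    cutpoint_density.append(f_cut)
    # Simulated grants: larger where swing-voter density is higher, plus noise.
    grants.append(2.0 + 5.0 * f_cut + rng.normal(0, 0.2))

res = linregress(cutpoint_density, grants)
print(res.slope, res.pvalue)  # slope recovers the simulated swing-voter effect
```

In the tactical-grants theory, a significant positive slope on the cutpoint density is the signature of grants being steered toward constituencies rich in swing voters.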
79

Vizualizace biomedicinských dat v prostředí Matlab / Biomedical data visualization using Matlab

Zvončák, Vojtěch January 2016 (has links)
The thesis deals with the visualization of biomedical data in the MATLAB environment. It covers the following statistical methods and their descriptions: P-P plot, Q-Q plot, histogram, box plot, kernel density estimation, scatter plot and several time series metrics. Some functions are programmed from built-in MATLAB functions and others use external functions, modified to fit the purpose of this thesis. The first part of the thesis concerns the theoretical background, whereas the second part concerns the practical programmed realizations of the mentioned functions. The program contains a graphical user interface (GUI), which the thesis describes in detail. The purpose of the GUI is to ensure ease of use as well as data processing. The output graphs of the GUI are shown in chapter 5. The last part deals with possible extensions of the program.
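As an illustration of what such plotting routines compute internally, here is a sketch (in Python rather than the thesis's MATLAB) of the five-number summary behind a box plot, using Tukey's 1.5-IQR whisker rule:

```python
import numpy as np

def boxplot_stats(x, whis=1.5):
    """Five-number summary behind a box plot (Tukey's rule, a sketch).

    Returns the median, quartiles, whisker ends and outliers -- the values
    a plotting routine computes before drawing the box.
    """
    x = np.sort(np.asarray(x, dtype=float))
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    iqr = q3 - q1
    lo_fence, hi_fence = q1 - whis * iqr, q3 + whis * iqr
    lo_whisk = x[x >= lo_fence].min()   # nearest data point inside each fence
    hi_whisk = x[x <= hi_fence].max()
    outliers = x[(x < lo_fence) | (x > hi_fence)]
    return {"median": med, "q1": q1, "q3": q3,
            "whiskers": (lo_whisk, hi_whisk), "outliers": outliers}

stats = boxplot_stats([1, 2, 2, 3, 3, 3, 4, 4, 5, 30])
print(stats["median"], stats["outliers"])  # 3.0, and 30 flagged as an outlier
```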
80

Using an Inventory of Unstable Slopes to Prioritize Probabilistic Rockfall Modeling and Acid Base Accounting in Great Smoky Mountains National Park

O'Shea, Thomas A 01 August 2021 (has links)
An inventory of unstable slopes along transportation corridors and performance modeling are important components of geotechnical asset management in Great Smoky Mountains National Park (GRSM). Hazards and risk were assessed for 285 unstable slopes along 151 miles of roadway. A multi-criteria model was created to select fourteen sites for two-dimensional probabilistic rockfall simulations and Acid Base Accounting (ABA) tests. Simulations indicate that rock material would likely enter the roadway at all fourteen sites. ABA test results indicate that influence of significant acid-producing potential is generally confined to slaty rocks of the Anakeesta Formation and graphitic schist of the Wehutty Formation. The research illustrates an approach for prioritizing areas for site-specific investigations towards the goal of improving safety in GRSM. These results can help park officials develop mitigation strategies for rockfall, using strategies such as widening ditches and encapsulating acidic rockfall material.
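The ABA arithmetic referred to above is commonly the Sobek convention, in which maximum acid potential is taken as 31.25 times total sulfur in weight percent, expressed in tonnes CaCO3 per 1000 tonnes of rock. A sketch with hypothetical sample values (not the thesis's measured data):

```python
def acid_base_account(total_sulfur_pct, neutralization_potential):
    """Sketch of standard Acid Base Accounting arithmetic (Sobek method).

    Acid potential (AP) = 31.25 * total sulfur (wt%); AP and neutralization
    potential (NP) in tonnes CaCO3 per 1000 tonnes of rock. A negative net
    neutralization potential (NNP) flags likely acid-producing material.
    """
    ap = 31.25 * total_sulfur_pct
    nnp = neutralization_potential - ap
    return ap, nnp

# Hypothetical samples: a sulfidic slaty rock vs. a low-sulfur sandstone.
for name, s_pct, np_val in [("slaty rock", 1.6, 5.0), ("sandstone", 0.05, 20.0)]:
    ap, nnp = acid_base_account(s_pct, np_val)
    print(name, round(ap, 2), round(nnp, 2))
```

Under this arithmetic, the sulfide-rich sample has a strongly negative NNP, matching the pattern the thesis reports for slaty Anakeesta and graphitic Wehutty lithologies.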
