  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
281

"Análise de um modelo de regressão com erros nas variáveis multivariado com intercepto nulo" / "Analysis on a multivariate null-intercept errors-in-variables regression model"

Russo, Cibele Maria 19 June 2006 (has links)
To analyze some characteristics of interest in a real odontological data set presented in Hadgu & Koch (1999), we propose the use of a multivariate null-intercept errors-in-variables regression model. The data set consists of measurements of a dental plaque index (subject to measurement error), taken in three groups of volunteers who were randomized to two experimental mouth rinses (A and B) or a control mouth rinse. The measurements were taken for each individual before and after use of the respective mouth rinse: at the beginning of the study, three months from baseline and six months from baseline. A possible dependency structure between the measurements taken on the same individual must therefore be incorporated into the model; moreover, there are two response variables per individual. After presenting the statistical model, we obtain maximum likelihood estimates of the parameters using the iterative EM algorithm, and we test the hypotheses of interest with asymptotic tests (Wald, likelihood ratio and score). Since no optimal test exists in this setting, a simulation study is presented to verify the behavior of the three test statistics under different sample sizes and different parameter values. Finally, we carry out a diagnostic study to identify possibly influential observations in the model, considering the local influence approach proposed by Cook (1986) and the conformal normal curvature developed by Poon & Poon (1999).
282

Analysis and Design of Conformal Array Antennas

Persson, Patrik January 2002 (has links)
Today there is a great need for communication between people and even between things, using systems onboard e.g. aircraft, cars, ships and satellites. As a consequence, these communication needs require antennas mounted on or integrated in the surfaces of different vehicles or platforms, i.e. conformal antennas. In order to ensure proper operation of the communication systems it is important to be able to determine the characteristics of these antennas. This thesis is about the analysis and design of conformal antennas using high frequency asymptotic methods. Commonly used eigenfunction solutions or numerical methods such as FDTD, FEM or MoM are difficult to use for arbitrarily shaped electrically large surfaces. However, the high frequency approximation approach together with an accurate ray tracing procedure offers a convenient solution for these surfaces. The geodesics (ray paths) on the surfaces are key parameters in the analysis and they are discussed in detail. In the first part of the thesis singly and doubly curved perfectly electrically conducting (PEC) surfaces are studied, with respect to the mutual coupling among aperture-type elements. A synthesis problem is also considered where the effect of the mutual coupling is taken into account. As expected, the mutual coupling must be included in the synthesis procedure to be able to realize the prescribed pattern, especially in the shaped main lobe. Furthermore, the polarization of the antenna elements is very important when considering antennas on generally shaped surfaces. For such antennas the polarization most likely has to be controlled in some way for proper function. For verification of the results two experimental antennas were built at Ericsson Microwave Systems AB, Mölndal, Sweden. The first antenna is a circular cylinder with an array of rectangular waveguide fed apertures and the second antenna is a doubly curved surface (paraboloid) with circular waveguide fed apertures.
It is found that it is possible to obtain very accurate results with the asymptotic method when it is combined with the Method of Moments, i.e. when a hybrid method is used. The agreement with measurements is good at levels as low as -80 dB in many cases. The second part of the thesis is about the development of a high frequency approximation for surface field calculations on a dielectric covered PEC circular cylinder. When conformal antennas are used in practice they have to be covered with a radome for protection, and with the method developed here this cover can be included in the analysis. The method is a combination of two different solutions, one valid in the non-paraxial region of the cylinder and the other valid in the paraxial region. The method is verified against measurements and against reference results obtained from a spectral domain moment method code.
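As a small illustration of the geodesic (ray-path) ingredient above: on a circular cylinder the surface rays are helices, so the geodesic length between two surface points unwraps to a straight line in the developed plane. A minimal sketch (the function name and interface are ours, not from the thesis):

```python
import math

def cylinder_geodesic_length(R, phi1, z1, phi2, z2):
    # Geodesic (helix) length between two points on a circular cylinder of
    # radius R: unwrap the surface to the (R*phi, z) plane and measure the
    # straight-line distance, taking the shorter way around the cylinder.
    dphi = (phi2 - phi1) % (2 * math.pi)
    dphi = min(dphi, 2 * math.pi - dphi)
    return math.hypot(R * dphi, z2 - z1)

# quarter turn at constant height on a unit cylinder: length pi/2
arc = cylinder_geodesic_length(1.0, 0.0, 0.0, math.pi / 2, 0.0)
```

On doubly curved surfaces like the paraboloid studied above, geodesics no longer have a closed form and must be traced numerically, which is why the ray tracing procedure is discussed in such detail in the thesis.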
283

On Visualizing Branched Surface: an Angle/Area Preserving Approach

Zhu, Lei 12 September 2004 (has links)
The techniques of surface deformation and mapping are useful tools for the visualization of medical surfaces, especially for highly undulated or branched surfaces. In this thesis, two algorithms are presented for flattened visualizations of multi-branched medical surfaces, such as vessels. The first algorithm is an angle-preserving approach based on conformal analysis. The mapping function is obtained by minimizing two Dirichlet functionals. On a triangulated representation of vessel surfaces, this algorithm can be implemented efficiently using a finite element method. The second algorithm adjusts the result of the conformal mapping to produce a flattened representation of the original surface while preserving areas. It employs the theory of optimal mass transport via a gradient descent approach. A new class of image morphing algorithms is also considered, based on the theory of optimal mass transport. The mass moving energy functional is revised by adding an intensity penalizing term, in order to reduce undesired "fading" effects. It is a parameter-free approach. This technique has been applied to several natural and medical images to generate in-between image sequences.
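A sketch of the Dirichlet-energy ingredient of the angle-preserving algorithm above, assuming the standard cotangent-weight discretization on a triangle mesh (the helper names are ours, not from the thesis):

```python
import math

def cot(a, b, c):
    # cotangent of the angle at vertex a in triangle (a, b, c), 3D points
    ux, uy, uz = (b[i] - a[i] for i in range(3))
    vx, vy, vz = (c[i] - a[i] for i in range(3))
    dot = ux * vx + uy * vy + uz * vz
    cx = uy * vz - uz * vy
    cy = uz * vx - ux * vz
    cz = ux * vy - uy * vx
    return dot / math.sqrt(cx * cx + cy * cy + cz * cz)

def dirichlet_energy(verts, tris, u):
    # E(u) = 1/4 * sum over edges (cot alpha + cot beta) * (u_i - u_j)^2,
    # assembled triangle by triangle from the angle opposite each edge.
    energy = 0.0
    for (i, j, k) in tris:
        for (p, q, r) in ((i, j, k), (j, k, i), (k, i, j)):
            w = cot(verts[r], verts[p], verts[q])  # angle opposite edge (p, q)
            energy += 0.25 * w * (u[p] - u[q]) ** 2
    return energy

# unit right triangle with the piecewise-linear map u(x, y) = x
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
tris = [(0, 1, 2)]
u = [0.0, 1.0, 0.0]
E = dirichlet_energy(verts, tris, u)
```

For a linear map with unit gradient the energy is half the triangle area, here 0.25; the conformal flattening minimizes two such functionals over the planar vertex positions.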
284

Équations différentielles issues des vecteurs singuliers des représentations de l'algèbre de Virasoro / Differential equations arising from singular vectors of representations of the Virasoro algebra

Eon, Sylvain January 2008 (has links)
Thesis digitized by the Division de la gestion de documents et des archives, Université de Montréal
285

Sobre a Geometria de Gráficos Killing Conformes Inteiros em ambientes Riemannianos Folheados. / On the Geometry of Entire Conformal Killing Graphs in Foliated Riemannian Ambient Spaces.

ARAÚJO, Jogli Gidel da Silva. 09 August 2018 (has links)
We study the geometry of entire conformal Killing graphs, that is, graphs constructed through the flow generated by a complete conformal Killing vector field V and defined over an integral leaf of the foliation V⊥ orthogonal to V. In this setting, under a suitable restriction on the norm of the gradient of the function z which determines such a graph Σ(z), we establish sufficient conditions to ensure that Σ(z) is a totally umbilical hypersurface and, in particular, an integral leaf of V⊥.
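For orientation, the standard definition behind the entry above (our paraphrase, not text from the thesis): a vector field V on a semi-Riemannian manifold (M, g) is conformal Killing when its flow rescales the metric pointwise,

```latex
\mathcal{L}_V\, g \;=\; 2\psi\, g
\quad\Longleftrightarrow\quad
\nabla_i V_j + \nabla_j V_i \;=\; 2\psi\, g_{ij},
```

where ψ is the conformal factor; Killing fields are the special case ψ ≡ 0, and "complete" means the flow of V is defined for all time.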
286

Aspects of Higher Spin Theories, Conformal Field Theories and Holography

Raju, Avinash January 2017 (has links) (PDF)
This dissertation consists of three parts. The first part of the thesis is devoted to the study of gravity and higher spin gauge theories in 2+1 dimensions. We construct cosmological solutions of higher spin gravity in 2+1 dimensional de Sitter space. We show that a consistent thermodynamics can be obtained for their horizons by demanding appropriate holonomy conditions. This is equivalent to demanding the integrability of the Euclidean boundary CFT partition function, and reduces to Gibbons-Hawking thermodynamics in the spin-2 case. By using a prescription of Maldacena, we relate the thermodynamics of these solutions to those of higher spin black holes in AdS3. For the case of negative cosmological constant we show that interpreting the inverse AdS3 radius 1/l as a Grassmann variable results in a formal map from gravity in AdS3 to gravity in flat space. The underlying reason for this is the fact that ISO(2,1) is the Inonu-Wigner contraction of SO(2,2). We show how this works for the Chern-Simons actions, demonstrate how the general (Banados) solution in AdS3 maps to the general flat space solution, and how the Killing vectors, charges and the Virasoro algebra in the Brown-Henneaux case map to the corresponding quantities in the BMS3 case. Our results straightforwardly generalize to the higher spin case: the flat space higher spin theories emerge automatically in this approach from their AdS counterparts. We also demonstrate the power of our approach by doing singularity resolution in the BMS gauge as an application. Finally, we construct a candidate for the most general chiral higher spin theory with AdS3 boundary conditions. In the Chern-Simons language, the left-moving solution has Drinfeld-Sokolov reduced form, but on the right-moving solution all charges and chemical potentials are turned on. Altogether (for the spin-3 case) these are 19 functions.
Despite this, we show that the resulting metric has the form of the “most general” AdS3 boundary conditions discussed by Grumiller and Riegler. The asymptotic symmetry algebra is a product of a W3 algebra on the left and an affine sl(3)k current algebra on the right, as desired. The metric and higher spin fields depend on all the 19 functions. The second part is devoted to the problem of Neumann boundary condition in Einstein’s gravity. The Gibbons-Hawking-York (GHY) boundary term makes the Dirichlet problem for gravity well defined, but no such general term seems to be known for Neumann boundary conditions. In our work, we view Neumann boundary condition not as fixing the normal derivative of the metric (“velocity”) at the boundary, but as fixing the functional derivative of the action with respect to the boundary metric (“momentum”). This leads directly to a new boundary term for gravity: the trace of the extrinsic curvature with a specific dimension-dependent coefficient. In three dimensions this boundary term reduces to a “one-half” GHY term noted in the literature previously, and we observe that our action translates precisely to the Chern-Simons action with no extra boundary terms. In four dimensions the boundary term vanishes, giving a natural Neumann interpretation to the standard Einstein-Hilbert action without boundary terms. We also argue that a natural boundary condition for gravity in asymptotically AdS spaces is to hold the renormalized boundary stress tensor density fixed, instead of the boundary metric. This leads to a well-defined variational problem, as well as new counter-terms and a finite on-shell action. We elaborate this in various (even and odd) dimensions in the language of holographic renormalization. Even though the form of the new renormalized action is distinct from the standard one, once the cut-off is taken to infinity, their values on classical solutions coincide when the trace anomaly vanishes. 
For AdS4, we compute the ADM form of this renormalized action and show in detail how the correct thermodynamics of Kerr-AdS black holes emerge. We comment on the possibility of a consistent quantization with our boundary conditions when the boundary is dynamical, and make a connection to the results of Compere and Marolf. The difference between our approach and microcanonical-like ensembles in standard AdS/CFT is emphasized. In the third part of the dissertation, we use the recently developed CFT techniques of Rychkov and Tan to compute anomalous dimensions in the O(N) Gross-Neveu model in d = 2 + ε dimensions. To do this, we extend the “cow-pie contraction” algorithm of Basu and Krishnan to theories with fermions. Our results match perfectly with Feynman diagram computations.
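The Inonu-Wigner contraction invoked above can be sketched in a standard presentation (conventions vary; this is an assumed form, not copied from the thesis): with Lorentz generators J_a and translations P_a of so(2,2),

```latex
[J_a, J_b] = \epsilon_{abc}\, J^{c}, \qquad
[J_a, P_b] = \epsilon_{abc}\, P^{c}, \qquad
[P_a, P_b] = \frac{1}{l^{2}}\, \epsilon_{abc}\, J^{c},
```

so the translations commute as l → ∞ and so(2,2) contracts to iso(2,1). Treating 1/l as a Grassmann variable kills (1/l)² identically while keeping the first order in 1/l, which is what makes the formal map to flat space work at the level of the action.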
287

Difeomorfismos conformes que preservam o tensor de Ricci em variedades semi-riemannianas / Conformal diffeomorphisms preserving the Ricci tensor in semi-Riemannian manifolds

CARVALHO, Fernando Soares de 28 January 2011 (has links)
NOTE: Because some programs do not copy symbols, formulas, etc., the abstract and the full contents are available in the PDF of the dissertation, linked at the bottom of the page.
288

Comparação de algoritmos computacionais de cálculo de dose em radioterapia aplicada aos tumores de pulmão / Comparison of dose calculation algorithms in radiotherapy applied to lung tumors

Gabriela Reis dos Santos 16 September 2015 (has links)
INTRODUCTION: In radiotherapy, the accuracy of the dose distribution in calculations with heterogeneity correction is directly related to the choice of the calculation algorithm. Many calculation algorithms are commercially available, varying in accuracy and processing time. This study aimed to quantify the accuracy of ten different calculation algorithms in lung-equivalent material and to analyze the impact of the algorithm choice on the dose distribution in radiotherapy applied to lung tumors. METHODS: Plates of water-equivalent (solid water RW3) and lung-equivalent (cork) materials were used to determine the percentage depth dose (PDD) and the transverse profile inside the heterogeneity (cork). The measurements were performed on a Varian Clinac 6EX, with 6 MV photon beams and two field sizes (5 x 5 cm2 and 10 x 10 cm2), irradiating Gafchromic EBT3 radiochromic films and a Scanditronix Wellhofer CC13 ionization chamber. Treatment plans of 25 patients - 11 with the three-dimensional (3D) technique and 14 with the Stereotactic Body Radiation Therapy (SBRT) technique - were first calculated without heterogeneity correction; keeping the monitor units (MU) fixed, the calculations were then repeated with the different algorithms/correction methods and compared with the initial plan. Doses to the target volume and to the organs at risk were analyzed. RESULTS: The phantom measurements revealed that the algorithms based on the convolution principle (Eclipse® Pencil Beam Convolution with the Batho, Modified Batho and Equivalent TAR correction methods; XiO® Clarkson and Convolution; and iPlan® Pencil Beam) presented significant dose differences in the cork region, always overestimating the measurement, with an overdose above 8%. More advanced algorithms, such as Eclipse® AAA and Acuros XB, XiO® Superposition and iPlan® XVMC, presented deviations below 3% in the heterogeneity region. The profile analysis likewise showed that the second class of algorithms performs better in low-density media such as cork. The penumbra width presented deviations below 1 mm for the more advanced algorithms, against differences of up to 4.5 mm for the convolution-based algorithms. The dose distribution analysis in lung treatment plans showed that all calculations with heterogeneity correction yielded doses higher than the calculation without it. The dose-volume histogram (DVH) of the target volume was more strongly affected than those of the organs at risk. The convolution-based algorithms produced dose distributions similar to one another, yet different from the calculation without heterogeneity correction. Eclipse® AAA, Acuros XB, XiO® Superposition and iPlan® XVMC also produced similar dose distributions, with Eclipse® Acuros XB and iPlan® XVMC being the most alike. The SBRT plans deviated more from the calculation without heterogeneity correction than the 3D plans. CONCLUSIONS: The available calculation algorithms have different accuracies in media of low electron density. These differences impact the dose distributions in lung treatment planning, with a larger impact for the SBRT technique. Among the evaluated algorithms, at least one from each manufacturer performed well in lung-equivalent phantoms; these should be prioritized for dose calculation in lung cancer treatment planning.
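The per-depth comparisons above reduce to a percentage deviation of the calculated dose from the measured one; a trivial sketch with made-up numbers (not data from the study):

```python
# Percentage deviation of a calculated depth-dose curve from a measured one,
# normalized to the measured dose at each depth.
def percent_deviation(calculated, measured):
    return [100.0 * (c - m) / m for c, m in zip(calculated, measured)]

measured   = [100.0, 86.0, 71.0, 58.0]   # illustrative PDD values (%), made up
calculated = [100.0, 93.0, 77.0, 60.0]   # convolution-type algorithm, made up
dev = percent_deviation(calculated, measured)
# deviations above a tolerance (e.g. 3%) inside the heterogeneity would flag
# the algorithm as inadequate for low-density media
```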
290

Training a Multilayer Perceptron to predict the final selling price of an apartment in co-operative housing society sold in Stockholm city with features stemming from open data / Träning av en “Multilayer Perceptron” att förutsäga försäljningspriset för en bostadsrättslägenhet till försäljning i Stockholm city med egenskaper från öppna datakällor

Tibell, Rasmus January 2014 (has links)
The need for a robust model for predicting the value of condominiums and houses is becoming more apparent as further evidence of systematic errors in existing models is presented. Traditional valuation methods fail to produce good predictions of condominium sales prices, and systematic patterns in the errors, linked for example to the repeat sales methodology and the hedonic pricing model, have been pointed out in papers referenced in this thesis. This inability can lead to monetary problems for individuals and, in the worst case, economic crises for whole societies. In this master thesis we present how a predictive model constructed from a Multilayer Perceptron can predict the price of a condominium in the centre of Stockholm using objective data from publicly available sources. The value produced by the model is enriched with a predictive interval computed with the Inductive Conformal Prediction algorithm, to give a clear view of the quality of the prediction. In addition, the Multilayer Perceptron is compared with the commonly used Support Vector Regression algorithm to underline the hallmark of neural networks: the handling of a broad spectrum of features. The features used to construct the Multilayer Perceptron model are gathered from multiple “Open Data” sources and include data such as: 5,990 apartment sales prices from 2011-2013, interest rates for condominium loans from two major banks, national election results from 2010, geographic information and nineteen local features. Several well-known techniques for improving the performance of Multilayer Perceptrons are applied and evaluated. A Genetic Algorithm is deployed to facilitate the process of determining appropriate parameters for the backpropagation algorithm. Finally, we conclude that the Multilayer Perceptron model trained with backpropagation produces good predictions and outperforms the results from the Support Vector Regression models and the studies in the referenced papers.
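The predictive interval used above comes from Inductive Conformal Prediction. A minimal sketch of the split-conformal recipe for regression (model, data and names are illustrative, not the thesis's):

```python
import math

def icp_interval(predict, calib_x, calib_y, x_new, alpha=0.1):
    # Inductive Conformal Prediction for regression: nonconformity scores are
    # absolute residuals on a held-out calibration set; the interval half-width
    # is the score at rank ceil((n + 1) * (1 - alpha)) for 1 - alpha coverage.
    scores = sorted(abs(y - predict(x)) for x, y in zip(calib_x, calib_y))
    n = len(scores)
    k = math.ceil((n + 1) * (1 - alpha)) - 1   # 0-based rank
    q = scores[min(k, n - 1)]
    y_hat = predict(x_new)
    return y_hat - q, y_hat + q

# toy underlying model: price proportional to floor area (made-up data)
predict = lambda area: 100.0 * area
calib_x = [30, 40, 50, 60, 70, 80, 90, 100, 110]
calib_y = [3050, 3960, 5020, 5990, 7010, 7950, 9060, 10040, 10910]
lo, hi = icp_interval(predict, calib_x, calib_y, x_new=75)
```

The guarantee is distribution-free: as long as the calibration points and the new point are exchangeable, the interval covers the true value with probability at least 1 - alpha, regardless of how good the underlying point predictor (here an MLP in the thesis) actually is.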
