  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
291

Learning with Limited Supervision by Input and Output Coding

Zhang, Yi 01 May 2012 (has links)
In many real-world applications of supervised learning, only a limited number of labeled examples are available because the cost of obtaining high-quality examples is high. Even with a relatively large number of labeled examples, the learning problem may still suffer from limited supervision as the complexity of the prediction function increases. Therefore, learning with limited supervision presents a major challenge to machine learning. With the goal of supervision reduction, this thesis studies the representation, discovery and incorporation of extra input and output information in learning. Information about the input space can be encoded by regularization. We first design a semi-supervised learning method for text classification that encodes the correlation of words inferred from seemingly irrelevant unlabeled text. We then propose a multi-task learning framework with a matrix-normal penalty, which compactly encodes the covariance structure of the joint input space of multiple tasks. To capture structure information that is more general than covariance and correlation, we study a class of regularization penalties on model compressibility. Then we design the projection penalty, which encodes the structure information from a dimension reduction while controlling the risk of information loss. Information about the output space can be exploited by error correcting output codes. Using the composite likelihood view, we propose an improved pairwise coding for multi-label classification, which encodes pairwise label density (as opposed to label comparisons) and decodes using variational methods. We then investigate problem-dependent codes, where the encoding is learned from data instead of being predefined. We first propose a multi-label output code using canonical correlation analysis, where predictability of the code is optimized.
We then argue that both discriminability and predictability are critical for output coding, and propose a max-margin formulation that promotes both discriminative and predictable codes. We empirically study our methods in a wide spectrum of applications, including document categorization, landmine detection, face recognition, brain signal classification, handwritten digit recognition, house price forecasting, music emotion prediction, medical decision, email analysis, gene function classification, outdoor scene recognition, and so forth. In all these applications, our proposed methods for encoding input and output information lead to significantly improved prediction performance.
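The error-correcting output coding idea can be sketched in a few lines. This is a generic illustration, not the thesis's learned codes: the code matrix below is fixed by hand rather than optimized, and the per-bit learners are plain ridge regressions on a toy three-class problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-class problem: three well-separated 2-D Gaussian blobs.
means = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
X = np.vstack([rng.normal(m, 0.5, size=(50, 2)) for m in means])
y = np.repeat(np.arange(3), 50)

# Fixed binary code matrix: one 8-bit codeword (+1/-1) per class.
code = np.array([
    [ 1,  1,  1,  1, -1, -1, -1, -1],
    [ 1, -1,  1, -1,  1, -1,  1, -1],
    [-1,  1, -1,  1,  1, -1, -1,  1],
], dtype=float)
n_bits = code.shape[1]

# One ridge-regularized least-squares predictor per code bit.
Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias column
lam = 1e-2
A = Xb.T @ Xb + lam * np.eye(Xb.shape[1])
W = np.linalg.solve(A, Xb.T @ code[y])          # targets: codeword of each sample

# Decode: predict a soft codeword, pick the class at smallest distance.
def predict(Xnew):
    Xnb = np.hstack([Xnew, np.ones((Xnew.shape[0], 1))])
    soft = Xnb @ W                               # (n, n_bits) soft codewords
    d = ((soft[:, None, :] - code[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

acc = (predict(X) == y).mean()
```

With well-separated blobs, decoding to the nearest codeword recovers essentially all training labels; the redundancy of the 8-bit code is what gives the scheme its error-correcting character.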
292

Flavor Changing Neutral Current Processes In The Framework Of The Two Higgs Doublet Model

Turan, Ismail 01 January 2003 (has links) (PDF)
It is widely believed that the Standard Model (SM) cannot be a fundamental theory of the basic interactions. Motivated by this fact, many new-physics models have been proposed. Among them, the two Higgs doublet model (2HDM), the SM enlarged by adding one extra scalar doublet, is considered the simplest extension of the SM. In this work, within the framework of the model III version of the 2HDM, the branching ratio of the exclusive decay is calculated and discussed in various physical regions determined by the model parameters. It is observed that it is possible to reach the present experimental upper limits in model III. Finally, the flavor-changing top quark decay is also discussed.
293

Regularized equivariant Euler classes and gamma functions.

Lu, Rongmin January 2008 (has links)
We consider the regularization of some equivariant Euler classes of certain infinite-dimensional vector bundles over a finite-dimensional manifold M using the framework of zeta-regularized products [35, 53, 59]. An example of such a regularization is the Atiyah–Witten regularization of the T-equivariant Euler class of the normal bundle v(TM) of M in the free loop space LM [2]. In this thesis, we propose a new regularization procedure — W-regularization — which can be shown to reduce to the Atiyah–Witten regularization when applied to the case of v(TM). This new regularization yields a new multiplicative genus (in the sense of Hirzebruch [26]) — the Γ̂-genus — when applied to the more general case of a complex spin vector bundle of complex rank ≥ 2 over M, as opposed to the case of the complexification of TM for the Atiyah–Witten regularization. Some of its properties are investigated and some tantalizing connections to other areas of mathematics are also discussed. We also consider the application of W-regularization to the regularization of T²-equivariant Euler classes associated to the case of the double free loop space LLM. We find that the theory of zeta-regularized products, as set out by Jorgenson–Lang [35], Quine et al. [53] and Voros [59], amongst others, provides a good framework for comparing the regularizations that have been considered so far. In particular, it reveals relations between some of the genera that appeared in elliptic cohomology, allowing us to clarify and prove an assertion of Liu [44] on the Θ̂-genus, as well as to recover the Witten genus. The Γ̂₂-genus, a new genus generated by a function based on Barnes’ double gamma function [5, 6], is also derived in a similar way to the Γ̂-genus. / Thesis (Ph.D.) - University of Adelaide, School of Mathematical Sciences, 2008
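The zeta-regularized product underlying these constructions can be stated briefly; this is the standard definition from the cited framework, not a result specific to this thesis.

```latex
% For a sequence \lambda_1, \lambda_2, \dots, form the zeta function
%   \zeta_\lambda(s) = \sum_k \lambda_k^{-s},
% continued analytically to a neighbourhood of s = 0, and define
\prod_k^{\mathrm{reg}} \lambda_k \;:=\; \exp\bigl(-\zeta_\lambda'(0)\bigr).
% Example: for \lambda_k = k one has \zeta_\lambda = \zeta (Riemann), and
% \zeta'(0) = -\tfrac{1}{2}\log 2\pi, so the regularized value of the
% divergent product 1 \cdot 2 \cdot 3 \cdots ("\infty!") is \sqrt{2\pi}.
```

Regularized equivariant Euler classes arise by applying such products to the formally infinite collections of weights of the bundles in question.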
295

To Formalize the Land? Analysis of the Impacts of Land Regularization Programs in the Rio de Janeiro Favelas and Their Interface with Urban Informality

LORENA HELENA DOS SANTOS S ANILE 23 August 2018 (has links)
The population living in favelas creates strategies to remain in these places. Far from claiming that informality is a solution to all of the favela's problems, this dissertation understands it as a functional practice. In contrast, land regularization is understood as the legalization of property in informal areas. However, what is seen in most of the Rio de Janeiro favelas that received these programs is the delivery of a fragile title that does not guarantee the population's permanence, when a title is actually delivered at all. The theme of this dissertation is land regularization in contrast with the urban informality found in the Rio de Janeiro favelas. Urban informality is analyzed as a distinct form of ordering in the urban environment and therefore should not be treated as a problem to be solved by land regularization. To understand the insertion of land regularization as a government program in the Rio de Janeiro favelas, three favelas were chosen as the field of research: Rocinha, Cantagalo and Acari (Vila Rica and Vila Esperança), all with government intervention and distinct outcomes. To reach the central objective of this dissertation, the main themes are explored through bibliographic research, a documentary survey of the land regularization programs, and interviews with community leaders, managers of local regularization programs and government agents. The aim is to deepen the theme of land regularization and its various strands, observing the challenges faced in guaranteeing the favela population the right to the city.
296

Finite element based stereo digital image correlation: application to the validation of composite structure models under complex loading

Pierré, Jean-Emmanuel 25 November 2016 (has links)
The VERTEX project, to which this thesis belongs, aims to validate composite models through multi-axial static tests on specimens at the scale of structural details. Working at this scale requires the development of new instrumentation techniques and tools for the test/simulation dialogue, which are at the heart of this thesis. Given the complexity of this type of experiment, we focus on Stereo Digital Image Correlation (SDIC), since it yields the 3D displacement field on the surface of the specimen. However, with conventional approaches it is difficult to make a quantitative comparison between a measured field and a Finite Element (FE) simulated field. In this work, a framework is therefore developed to perform the SDIC measurement in the physical coordinate system based on an FE model (calibration of non-linear camera models, FE shape measurement, FE field measurement). This method yields an experimental displacement field expressed directly in the coordinate system and on the support of the FE simulation, which considerably simplifies validation. To validate a model at the scale of structural details, the question of boundary conditions is fundamental, and all the more complex since the kinematic measurement is limited to the visible surface. In addition to this surface measurement, it is envisioned here to estimate the boundary conditions through the thickness of the specimen by relying on a plate/shell-type model (classical or solid-shell), with more or less confidence placed in it (regularized or integrated approaches over all or part of the specimen). The methodology is implemented in an academic software package and applied to synthetic test cases. A dedicated instrumentation of the VERTEX bench also allowed the analysis of the tests performed in the project.
297

Robust modeling and numerical simulation of ductile damage

Zhang, Yi 26 January 2016 (has links)
The major goal of this dissertation is to develop a robust model for the simulation of ductile damage. Because of the softening behavior and the large strains involved, two main practical difficulties must be handled carefully: mesh dependency and volumetric locking. In this thesis, we first adopt a logarithmic finite-strain framework, in which the Gurson-Tvergaard-Needleman constitutive law is reformulated. A non-local formulation with a gradient regularization of an internal hardening variable is then applied to control damage localization and thus address mesh dependency. In addition, the non-local model is combined with mixed displacement-pressure-volume-variation finite elements to avoid the volumetric locking associated with plastic incompressibility. We thereby establish a mesh-independent, locking-free finite-strain framework for ductile damage modeling. Attention is paid to the mathematical properties and numerical performance of the model. Finally, after identifying the parameters on a nuclear steel, we carry out simulations on notched specimens (NT, CT, SENT) as well as on a full-scale industrial pipe, and compare them with experimental results.
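The core of a gradient-type non-local regularization can be sketched in one dimension. This is a minimal illustration of the implicit-gradient idea (solving (I − c∇²)ē = e for a smoothed field ē with an internal length √c), not the thesis's finite-strain GTN implementation; the grid, boundary treatment and parameter values are arbitrary choices for the sketch.

```python
import numpy as np

n, h, c = 101, 0.01, 4e-4   # grid points, spacing, gradient parameter c ~ length^2

# Local field: a sharp spike, mimicking strain localized in a single element.
e = np.zeros(n)
e[n // 2] = 1.0

# Second-difference operator with zero-flux (Neumann) boundaries.
D2 = (np.diag(np.full(n - 1, 1.0), -1) - 2.0 * np.eye(n)
      + np.diag(np.full(n - 1, 1.0), 1))
D2[0, 1] = 2.0           # ghost-node reflection at the left boundary
D2[-1, -2] = 2.0         # ghost-node reflection at the right boundary
D2 /= h * h

# Implicit-gradient regularization: solve (I - c * D2) e_bar = e.
e_bar = np.linalg.solve(np.eye(n) - c * D2, e)
```

The regularized field `e_bar` spreads the spike over a width set by √c rather than by the mesh size, which is precisely what removes the pathological mesh dependency of a local softening law.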
298

Parallel implementation of an iterative search algorithm for the optimal Tikhonov regularization parameter in the image restoration problem

Claudir Oliveira 27 April 2012 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / Techniques based on the Tikhonov functional have been widely used in image processing in recent years. The basic idea is to model the degraded image through a convolution equation and to find a parameter that minimizes the functional, so as to obtain an approximation of the original image. A typical difficulty of this method is the selection of a regularization parameter giving an adequate compromise between accuracy and stability of the solution. A method developed by researchers of IPRJ and UFRJ, working in the area of inverse problems, consists in minimizing a residual functional with respect to the Tikhonov regularization parameter. A strategy employing an iterative search for the parameter value that minimizes the functional at the following iteration was recently adopted in a serial restoration algorithm. However, the computational cost of this iterative search is a problem. With this in mind, this work presents a C++ implementation that applies parallel computing techniques based on MPI (Message Passing Interface) to the functional minimization strategy with the iterative search method, thus reducing the execution time required by the algorithm. A modified version of the Jacobi method is considered in two versions of the algorithm, one serial and one parallel. This algorithm is well suited to parallel implementation because it has no data dependencies, unlike the Gauss-Seidel method, which is also shown to converge. As performance indicators for the evaluation of the restoration algorithm, in addition to the traditional measures, a new metric based on subjective criteria, called IWMSE (Information Weighted Mean Square Error), is employed. These metrics were introduced in the serial image processing program and allow the restoration to be analyzed at each iteration step. The results obtained with the two versions made it possible to verify the speed-up and efficiency of the parallel implementation, which achieved satisfactory results in a shorter processing time and with acceptable performance.
299

Dilemmas of land regularization in São Joaquim National Park: the valuation of natural areas

Faxina, Thábata Cristina 14 July 2014 (has links)
A conservation unit in the National Park category aims mainly at the preservation of natural ecosystems; since it is of public possession and domain, private areas included within its boundaries must be duly expropriated. Law No. 9,985 of 2000 provides that units in the Integral Protection category are considered rural areas. To determine expropriation values, environmental agencies use the methodology proposed by the Instituto Nacional de Colonização e Reforma Agrária (INCRA), which prioritizes valuation factors related to the productivity of the area, often treating diversity as a limiting factor for agricultural activities. Since the objective of a National Park is ecosystem preservation, the question arises: does the valuation methodology not underestimate the value of natural areas duly preserved by their owners? Based on forest survey data collected in a 957.1 ha area within São Joaquim National Park, the Valuation of Ecological Factors methodology, combined with the Contingent Valuation Method proposed in this work, was used to complement the data already obtained in the valuation performed by the responsible agency. The proposed methodology resulted in an increase of 39.18% in the final value of the property under study. It is therefore concluded that the values obtained with the valuation methodology used by INCRA do not consider the ecological factors and preserved areas that are the goal of a National Park, so that owners of land within Integral Protection conservation units may be compensated for these areas at values below their importance. This work aims to support new tools for land policies, so that owners are encouraged not to modify preserved areas, contributing to the greater goal of an Integral Protection conservation unit: biodiversity conservation.
300

Informal settlements and land tenure regularization: the case of the Casa Branca community in Bayeux-PB

Falcão, Lívia dos Santos 28 March 2014 (has links)
Conselho Nacional de Pesquisa e Desenvolvimento Científico e Tecnológico - CNPq / This research investigates the application of the land tenure regularization instrument in informal settlements in Brazil, through a theoretical, legal and practical review of debates and experiences. Based on a technical, political and social reading, the case study of the regularization process of the Casa Branca community, located in Bayeux/PB, is used to understand the possible obstacles and to identify the actors responsible for its non-completion. Within the time frame from 2006, the year the process began, to 2013, the research examines two hypotheses for the non-completion of this particular process: the lack of integration between the actors involved, namely the municipal government of Bayeux (PMB), the Superintendence of Union Assets/PB (SPU/PB) and the Residents' Association of the Casa Branca community (AMCB), and the disorganization of public documents, both as causes of the slowness. Based on the empirical object of analysis, the scenario method was adopted as the form of analysis, applying qualitative and quantitative procedures, including questionnaires given to the stakeholders and a detailed examination of the process, in order to establish possible future scenarios, favorable or not, for the completion of the process under analysis. The results confirmed that technical impediments, under the responsibility of the SPU/PB, and political impediments, involving the PMB, are the main obstacles to the process, and that the SPU/PB is the main actor holding back its progress. The hypothesis of a lack of integration between the actors as a cause of the slowness was, however, rejected; it was confirmed that the disorganization of public documents, together with the technical and political impediments found, are the main reasons for the non-completion of the regularization process of the Casa Branca community.
