331

Avaliação de susceptibilidade de movimento de massa em linhas de transmissão: estudo de caso na região da Serra da Mantiqueira / Evaluation of mass movement susceptibility over transmission lines: case study at Serra da Mantiqueira region

Junqueira, Adriano Martins [UNESP] 18 January 2017 (has links)
Most enterprises in the Brazilian electric sector lack a systematic spatio-temporal assessment of their structures to identify environmental vulnerabilities. This study therefore aimed to analyze mass-movement susceptibility along transmission lines in the Serra da Mantiqueira region, which is exposed to the effects of tropical rainfall, and to put into operation a dynamic analysis and alert platform for the most critical areas. Static and dynamic data for the region were collected from public and private sources, and the multiple criteria were then weighted through the Analytic Hierarchy Process (AHP). Using Geographic Information Systems (GIS) and map algebra, a susceptibility map with five classes was produced, and a platform for monitoring, analysis, and alerting of environmental risks was put into operation based on the identified areas and on dynamic meteorological and hydrological data. The result is a geographic database with regional coverage (14,000 km²) built from seven criteria: slope, distance from transmission lines, drainage density, land use, soil type, density of fractures and faults, and precipitation. The distribution of susceptibility classes in the study region was: very low (1.5%), low (12%), medium (34.9%), high (45.3%), and very high (6.2%). It was also possible to identify distinct mass movements both in areas close to the transmission lines and near other elements at risk, such as dwellings, roads, reservoir margins, and telecommunication towers. The operational monitoring platform enabled dynamic analyses of the occurrence of extreme natural events by sending notifications and providing web access to a map of the affected areas. This study can thus serve as an instrument of assessment, follow-up, and management for public administrators and regulatory agencies of the electric sector.
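The core workflow of this abstract, deriving criterion weights with AHP and combining criterion rasters through map algebra into susceptibility classes, can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the pairwise-comparison matrix, the weights, the stand-in rasters, and the class breaks are all hypothetical.

```python
import numpy as np

# Hypothetical AHP pairwise-comparison matrix for 3 of the 7 criteria
# (slope, distance from transmission lines, precipitation); values are illustrative.
A = np.array([[1.0, 3.0, 2.0],
              [1/3, 1.0, 0.5],
              [1/2, 2.0, 1.0]])

# AHP weights: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()

# Consistency check (random index RI = 0.58 for a 3x3 matrix).
lambda_max = np.max(np.real(eigvals))
ci = (lambda_max - len(A)) / (len(A) - 1)
print("weights:", w, "consistency ratio:", ci / 0.58)

# Map algebra: weighted overlay of normalized criterion rasters (random stand-ins here),
# then reclassification into five susceptibility classes.
rng = np.random.default_rng(0)
rasters = rng.random((3, 200, 200))                      # each layer scaled to [0, 1]
susceptibility = np.tensordot(w, rasters, axes=1)        # weighted sum per cell
classes = np.digitize(susceptibility, [0.2, 0.4, 0.6, 0.8]) + 1  # 1 = very low ... 5 = very high
```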
332

Educação para viver: contra a reprodução da sobrevivência nas escolas. Elementos dos fundamentos sociais da escola e da educação / Education for life: against survival at schools. Elements of social foundations of the school and education

Daniela Dias Marinho 24 October 2014 (has links)
This research establishes elements of the social foundations of the school and of education in order to understand the re-production of the social relations of production, which spreads into everyday life and both produces and is produced by a separated space and time. We speak of social foundations because it would be impossible to think of any qualitative change in education without thinking of social change. The study was developed through the Theory of Implication, used by Remi Hess in his analyses of institutions, and its theoretical and methodological basis lies in the works of Henri Lefebvre, Karl Marx, and the Situationists, especially Guy Debord and Raoul Vaneigem, with dialectics as the essential means of keeping the conflicting movement of a totality in view. The language of this research is defined by Lefebvrian and Situationist concepts, so notions such as the re-production of the social relations of production, the possible-impossible, the space-time of lived experience, and survival are of central importance in the dissertation. The school, as an objectivity that re-produces social relations within the logic of the economy, takes education as its subject; education becomes synonymous with the school and with its contradiction, and is thereby deteriorated. In the current urban reality, to fight for a good school is to fight for more inequality disguised as formal equality, for the division of labor, for class society; in short, it is to fight for reification. The urban school increasingly belongs to everyday life, which is a level of the urban phenomenon, and it is in this everyday repetition that school and education have their moments of rupture. This is a moment in which it becomes necessary to discuss the limits of the school and of education in a conjuncture where social alienation allows the re-production of a society whose means and end is the economy rather than people's happiness. Mere survival, which in the concept presented by Raoul Vaneigem means the negation of life, is not acceptable, and that is why we want an education that is for life.
333

Sistema automático para obtenção de parâmetros do tráfego veicular a partir de imagens de vídeo usando OpenCV / Automatic system to obtain traffic parameters from video images based on OpenCV

Cunha, André Luiz Barbosa Nunes da 08 November 2013 (has links)
This research presents an automatic system that extracts vehicular traffic data by post-processing video. Macroscopic and microscopic traffic parameters are derived from the space-time diagram obtained by processing the traffic images. The system was built on Computer Vision concepts, C++ programming, and the OpenCV library. Vehicle detection was divided into two steps: background modeling and vehicle segmentation. A background image, free of moving objects, can be estimated from the video sequence using several statistical models available in the literature. An evaluation of six statistical models indicated the Scoreboard model (a combination of mean and mode) as the best method for keeping the background up to date, with a processing time of 18 ms/frame and an accuracy of 95.7%. The second step compared six segmentation methods, ranging from background subtraction to texture-based segmentation. Among the texture descriptors, the LFP descriptor, which generalizes the others, is presented. From the performance of these methods on videos collected in the field, the traditional background subtraction method proved the most suitable, with the best processing time (34.4 ms/frame) and the best overall accuracy (95.1% on average). Once the segmentation method was chosen, a procedure was developed to extract vehicle trajectories from the space-time diagram. Comparing the traffic parameters produced by the proposed system with field measurements, the speed estimates reached 92.7% accuracy against radar measurements; flow-rate estimates, on the other hand, were degraded by failures in identifying vehicle trajectories, yielding values sometimes above and sometimes below those obtained by manual counts.
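As a rough illustration of the detection pipeline described above (background modeling followed by subtraction and segmentation), the sketch below uses OpenCV's Python bindings rather than the C++ implementation of the thesis, and a stock MOG2 background model rather than the Scoreboard model; the video path, thresholds, and minimum blob area are placeholders.

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")          # placeholder path
backsub = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg = backsub.apply(frame)                  # foreground mask (vehicles + noise)
    fg = cv2.medianBlur(fg, 5)                 # suppress salt-and-pepper noise
    _, fg = cv2.threshold(fg, 200, 255, cv2.THRESH_BINARY)  # drop shadow pixels (value 127)
    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x signature
    boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 400]
    # each box is a candidate vehicle; stacking detections per frame along a chosen
    # road axis yields the space-time diagram from which trajectories are extracted
cap.release()
```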
334

Esthétique romanesque de René Philombe. Essai d’analyse littéraire

Atangana Essomba, Sylvain 30 May 2012 (has links)
The novelistic aesthetics of René Philombe, the subject of our research, is examined through his three post-colonial works, Un sorcier blanc à Zangali, Sola ma chérie, and Bedi-Ngula, l'ancien maquisard, by way of the environmental, morphosyntactic, lexicological, and stylistic aspects that make up his Cameroonian and sub-Saharan African cultural context. This constitutes the first part of our work, which led us to an analysis of the foundations of that aesthetics, in which the poetics of genre, the poetics of the utterance, and the novelist's style hold an important place. In the third part we turned to the corpus in context, that is, the representativeness of René Philombe's novelistic aesthetics, in which politics and ethics emerge as the main message, not only for Africa but for the whole world.
335

Mixed Velocity-Displacement Formulation for Modeling of Complex Behavior of Polymer

Pham, Vu Thu 17 February 2012 (has links) (PDF)
This work concerns the simulation of the viscoelastic behavior of polymers in different states. Viscoelastic modeling was carried out from the solid state to the liquid state via a multiphase approach, which is widely used to handle fluid-structure interaction (FSI). To capture the FSI, viscoelasticity is split into two parts, an elastic one and a viscous one, and the central idea is to use a mixed formulation in three fields (u, v, p) (displacement, velocity, pressure), with u and v the primary variables of a strain formulation and a strain-rate formulation, respectively. This leads to a compressible Navier-Stokes problem with an extra-stress term, which is solved with mixed finite elements. The work also contributes stabilization elements to the numerical simulation of multiphase problems within the monolithic approach. Comparisons with the literature and with experiments were carried out by validating an elastic case and the viscoelastic Kelvin-Voigt model in both Lagrangian and Eulerian frameworks. The extension of the methodology to a visco-hyperelastic model is demonstrated through modeling and validation at the material-point level in the finite element library CimLib®. Finally, an EVSS-type stabilization scheme is adopted for the viscoelastic Kelvin-Voigt model, the hyperelastic neo-Hookean model, and the visco-hyperelastic model, opening the door to computational modeling not only of viscoelasticity but also of complex dynamic applications.
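For orientation, a minimal three-field Kelvin-Voigt system of the kind described above might be written as below. This is a generic sketch under assumed constitutive choices, not the exact formulation of the thesis.

```latex
\[
\begin{aligned}
\rho\,\partial_t v \;-\; \nabla\!\cdot\!\bigl(2\eta\,\varepsilon(v) \;+\; 2\mu\,\varepsilon(u) \;-\; p\,I\bigr) &= f,\\
\partial_t u \;-\; v &= 0,\\
\tfrac{1}{\kappa}\,\partial_t p \;+\; \nabla\!\cdot v &= 0,
\end{aligned}
\]
```

Here ε(·) is the symmetric gradient, η a viscosity, μ an elastic shear modulus, and κ a bulk modulus; the elastic contribution 2μ ε(u) plays the role of the extra-stress added to the compressible Navier-Stokes operator in (v, p).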
336

A New Space-Time Model for Interacting Agents in the Financial Market

Boguta, Maria January 2009 (has links)
In this thesis we present a new space-time model of interacting agents in the financial market. It is a combination of the Curie-Weiss model and a model introduced by Järpe. We investigate properties such as the critical temperature and magnetization of the system. The distribution of the Hamiltonian function is obtained and a hypothesis test of independence is derived. The results are illustrated in an example based on real data.
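Although the abstract gives no formulas, the Curie-Weiss ingredient it mentions has a standard mean-field form; the sketch below solves the usual self-consistency equation m = tanh((J m + h)/T) numerically and exhibits the critical temperature T_c = J at h = 0. How this component is combined with Järpe's spatial model in the thesis is not reproduced here.

```python
import numpy as np

def curie_weiss_magnetization(T, J=1.0, h=0.0, iters=2000):
    """Fixed-point iteration for the mean-field equation m = tanh((J*m + h)/T)."""
    m = 0.5  # non-zero start so the iteration can find the ordered branch
    for _ in range(iters):
        m = np.tanh((J * m + h) / T)
    return m

# Magnetization vanishes above the critical temperature T_c = J (here 1.0)
for T in (0.5, 0.9, 1.1, 2.0):
    print(f"T = {T:.1f}  ->  m = {curie_weiss_magnetization(T):+.4f}")
```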
337

Novel aspects of the dynamics of binary black-hole mergers

Mösta, Philipp January 2011 (has links)
The inspiral and merger of two black holes is among the most exciting and extreme events in our universe. Being one of the loudest sources of gravitational waves, they provide a unique dynamical probe of strong-field general relativity and a fertile ground for the observation of fundamental physics. While the detection of gravitational waves alone will allow us to observe our universe through an entirely new window, combining the information obtained from both gravitational-wave and electromagnetic observations will allow us to gain even greater insight into some of the most exciting astrophysical phenomena. In addition, binary black-hole mergers serve as an intriguing tool to study the geometry of space-time itself. In this dissertation we study the merger process of binary black holes under a variety of conditions. Our results show that asymmetries in the curvature distribution on the common apparent horizon are correlated with the linear momentum acquired by the merger remnant. We propose useful tools for the analysis of black holes in the dynamical and isolated horizon frameworks and shed light on how the final merger of apparent horizons proceeds after a common horizon has already formed. We connect mathematical theorems with data obtained from numerical simulations and provide a first glimpse of the behavior of these surfaces in situations not accessible to analytical tools. We study electromagnetic counterparts of super-massive binary black-hole mergers with fully 3D general-relativistic simulations of binary black holes immersed both in a uniform magnetic field in vacuum and in a tenuous plasma. We find that while a direct detection of merger signatures with current electromagnetic telescopes is unlikely, secondary emission, either by altering the accretion rate of the circumbinary disk or by synchrotron radiation from accelerated charges, may be detectable. We propose a novel approach to measure the electromagnetic radiation in these simulations and find a non-collimated emission that dominates over the collimated one appearing in the form of dual jets associated with each of the black holes. Finally, we provide an optimized gravitational-wave detection pipeline using phenomenological waveforms for signals from compact binary coalescence and show that by including spin effects in the waveform templates, the detection efficiency is drastically improved and the bias on recovered source parameters is reduced. On the whole, this dissertation provides evidence that a multi-messenger approach to binary black-hole merger observations is an exciting prospect for understanding these sources and, ultimately, our universe. / Black holes are among the most extreme and fascinating objects in our universe. Electromagnetic radiation cannot escape from their interior, and they are the most compact objects we know. We know today that very massive black holes reside at the centers of most galaxies. In the case of our own galaxy, the Milky Way, this black hole is roughly four million times as heavy as our Sun. When two galaxies collide, their two black holes also collide and merge into a single black hole. Simulating such a collision of two black holes, and predicting and analyzing the energy they radiate in the form of gravitational waves and electromagnetic radiation, is the subject of this dissertation.
In the first part of this work we investigate the merger of two black holes from several points of view. We show that irregularities in the geometry of the black hole produced by a collision cause it first to accelerate and then to decelerate until these irregularities have been radiated away as gravitational waves. We further examine how the merger process unfolds from a geometric point of view and propose new methods for analyzing the space-time geometry of systems containing black holes. In the second part of this work we deal with the gravitational waves and electromagnetic radiation released when two black holes collide. Gravitational waves are waves that stretch and compress space and time. When a gravitational wave passes through us, we are minutely stretched in one direction while being minutely compressed in another. These effects are so small, however, that we can neither feel them nor measure them easily. In a collision of two black holes, a large amount of energy is radiated in the form of gravitational waves and electromagnetic waves. We show that the two signals are very similar in structure, but that the energy radiated in gravitational waves is many times larger than that in electromagnetic radiation. We introduce a new method for measuring the electromagnetic radiation in our simulations and show that it leads to a change in the spatial structure of the radiation. We conclude that combining the gravitational-wave and electromagnetic signals offers a great opportunity to detect a binary black-hole system and, in a further step, to analyze it. In the third and final part of this dissertation we develop an improved search procedure for gravitational waves that can be used in modern laser-interferometer experiments. We show how this procedure significantly increases the chances of detecting a gravitational-wave signal and, in the event of a successful detection, allows its parameters to be determined more accurately. We close with the conclusion that the collision of two black holes is a highly interesting phenomenon that offers new opportunities to better understand gravity and a multitude of other fundamental processes in our universe.
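Template-based searches of the kind mentioned at the end of the abstract rest on matched filtering of the data against waveform templates; a minimal frequency-domain version of that core step is sketched below. The PSD, sampling choices, and normalization conventions here are generic placeholders, and the dissertation's actual pipeline, with phenomenological spin-dependent templates over a template bank, is far more elaborate.

```python
import numpy as np

def matched_filter_snr(data, template, psd, fs):
    """Peak matched-filter SNR of a real template in real data.

    data, template : time series sampled at fs (same length)
    psd            : one-sided noise PSD sampled at the np.fft.rfftfreq bins
    """
    n = len(data)
    df = fs / n
    d_f = np.fft.rfft(data) / fs          # approximate continuous Fourier transform
    h_f = np.fft.rfft(template) / fs
    # z(t) = 4 Re \int_0^inf d(f) h*(f) / Sn(f) e^{2 pi i f t} df,
    # evaluated with a one-sided inverse FFT (hence the factor 2*fs rather than 4)
    z = 2 * fs * np.fft.irfft(d_f * np.conj(h_f) / psd, n)
    # template norm: sigma^2 = 4 \int_0^inf |h(f)|^2 / Sn(f) df
    sigma = np.sqrt(4 * np.sum(np.abs(h_f) ** 2 / psd) * df)
    return np.max(np.abs(z)) / sigma
```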
339

Space-time-frequency processing from the analysis of bistatic scattering for simple underwater targets

Anderson, Shaun David 14 August 2012 (has links)
The development of low-frequency SONAR systems, using a network of autonomous systems on unmanned vehicles, provides a practical means for bistatic measurements (i.e., when the source and receiver are widely separated, allowing multiple viewpoints of a target). Furthermore, time-frequency analysis, in particular Wigner-Ville analysis, takes advantage of the evolution of the time-dependent echo spectrum to differentiate a man-made target (e.g., an elastic spherical shell or cylinder) from a natural object of similar shape (e.g., a rock). Indeed, key energetic features of man-made objects can aid identification and classification in the presence of clutter and noise. For example, in a fluid-loaded thin spherical shell, one such feature is the mid-frequency enhancement (MFE), echoes that result from antisymmetric Lamb waves propagating around the circumference of the shell and that have been shown to be a useful acoustic feature in this pursuit. This research investigates the enhancement and benefits of bistatic measurements using Wigner-Ville analysis together with acoustic imaging methods. Additionally, the advantage of joint space-time-frequency coherent processing is investigated for optimal array processing to enhance the detection of non-stationary signals across an array. The proposed methodology is tested using both numerical simulations and experimental data for spherical shells and solid cylinders. This research was conducted as part of the Shallow Water Autonomous Mine Sensing Initiative (SWAMSI) sponsored by ONR.
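The Wigner-Ville analysis referred to above has a compact discrete form; the sketch below is a bare-bones implementation for a single analytic signal, without the smoothing windows or the joint space-time-frequency array processing that the thesis investigates. The chirp example is only a stand-in for a target echo.

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x):
    """Discrete Wigner-Ville distribution (frequency x time), real-valued."""
    z = hilbert(np.asarray(x, dtype=float))   # analytic signal suppresses some cross-terms
    n = len(z)
    acf = np.zeros((n, n), dtype=complex)     # instantaneous autocorrelation, lag x time
    for t in range(n):
        lag_max = min(t, n - 1 - t)
        lags = np.arange(-lag_max, lag_max + 1)
        acf[lags % n, t] = z[t + lags] * np.conj(z[t - lags])
    # FFT over the lag axis gives frequency; rows map to 0 ... fs/2 in steps of fs/(2n)
    return np.real(np.fft.fft(acf, axis=0))

# Example: a linear chirp appears as a tilted ridge in the time-frequency plane
fs = 1000
t = np.arange(0, 1, 1 / fs)
chirp = np.cos(2 * np.pi * (50 * t + 100 * t ** 2))
W = wigner_ville(chirp)
```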
340

Precoding and Resource Allocation for Multi-user Multi-antenna Broadband Wireless Systems

Khanafer, Ali 06 January 2011 (has links)
This thesis is targeted at precoding methods and resource allocation for the downlink of fixed multi-user multi-antenna broadband wireless systems. We explore different uses of precoders in transmission over frequency-selective channels. We first consider the weighted sum-rate (WSR) maximization problem for multi-carrier systems using linear precoding and propose a low-complexity algorithm that exhibits near-optimal performance. Moreover, we offer a novel rate allocation method that uses the signal-to-noise ratio (SNR) gap-to-capacity concept to choose the rate allocated to each data stream. We then study a single-carrier transmission scheme that overcomes known impairments of multi-carrier systems. The proposed scheme uses time-reversal space-time block coding (TR-STBC) to orthogonalize the downlink receivers and performs the required pre-equalization using Tomlinson-Harashima precoding (THP). We finally discuss the strengths and weaknesses of the proposed method.
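As background for the two ingredients named above: the SNR-gap concept assigns roughly b_k = log2(1 + SNR_k / Γ) bits to stream k, and Tomlinson-Harashima precoding pre-subtracts known interference at the transmitter with a modulo operation. The sketch below shows a generic zero-forcing THP loop, not the combined TR-STBC/THP scheme of the thesis; the feedback matrix B and the modulo base are placeholders.

```python
import numpy as np

def thp_precode(symbols, B, m):
    """Generic zero-forcing Tomlinson-Harashima precoding (sketch).

    symbols : data symbols for the K streams (complex)
    B       : strictly lower-triangular feedback matrix (e.g. from a QR/LQ
              factorization of the channel), unit diagonal assumed
    m       : modulo base, typically sqrt(M) for a square M-QAM constellation
    """
    K = len(symbols)
    x = np.zeros(K, dtype=complex)
    for k in range(K):
        v = symbols[k] - B[k, :k] @ x[:k]      # successively pre-cancel interference
        x[k] = _mod(v.real, m) + 1j * _mod(v.imag, m)
    return x

def _mod(v, m):
    """Symmetric modulo-2m reduction keeping v in [-m, m)."""
    return v - 2 * m * np.floor((v + m) / (2 * m))
```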
