About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
431

A study of the work and methods of Henry Briggs, with special reference to the early history of interpolation

Waterson, Andrew January 1941 (has links)
No description available.
432

[en] KERNEL-BASED SHEPARD'S INTERPOLATION METHOD / [pt] MÉTODOS DE INTERPOLAÇÃO DE SHEPARD BASEADO EM NÚCLEOS

JOANA BECKER PAULO 01 June 2010 (has links)
Many real problems in computational modeling require function approximation. In some cases the function to be evaluated in the computer is very complex, so it is desirable to replace it with a simpler function that is more efficient to compute. To do so, the scalar function f is sampled at a set of N points {x1, x2, ..., xN}, where xi ∈ R^n, and the value of f at any other point is estimated by an interpolation method. An interpolation method is any procedure that takes a set of constraints and determines a suitable function satisfying those conditions. Shepard's interpolation method originally estimates f(x) for any x ∈ R^n as a weighted mean of the N sampled values of f, where the weight of each sample xi is a function of negative powers of the Euclidean distance between x and xi. Kernels K : R^n × R^n → R are functions that correspond to an inner product on a Hilbert space F containing the images of points x and z under a map φ : R^n → F, i.e. K(x, z) = ⟨φ(x), φ(z)⟩. In practice, kernels represent the mapping φ implicitly: one chooses which kernel to use rather than which φ to use. This work proposes a simple modification of Shepard's interpolation method: instead of the Euclidean distance between the points x and xi, it uses the distance between the images of these two points under φ in the Hilbert space F, which can be computed directly from the kernel K. Several tests show that this small modification yields better results than the original method.
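The kernel trick the abstract relies on makes the feature-space distance computable without ever forming φ: ||φ(x) − φ(xi)||² = K(x, x) − 2 K(x, xi) + K(xi, xi). Below is a minimal sketch of the modified Shepard estimator built on that identity, assuming a Gaussian kernel and a weighting exponent p; the thesis may use other kernels and parameters.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    """K(x, z) = exp(-||x - z||^2 / (2 sigma^2))."""
    diff = np.asarray(x, float) - np.asarray(z, float)
    return np.exp(-np.sum(diff ** 2, axis=-1) / (2.0 * sigma ** 2))

def kernel_shepard(x, samples, values, p=2.0, sigma=1.0, eps=1e-12):
    """Shepard estimate of f(x) with the Euclidean distance replaced by the
    feature-space distance induced by the kernel:
    ||phi(x) - phi(xi)||^2 = K(x, x) - 2 K(x, xi) + K(xi, xi)."""
    samples = np.asarray(samples, float)
    values = np.asarray(values, float)
    d2 = (gaussian_kernel(x, x, sigma)
          - 2.0 * gaussian_kernel(x, samples, sigma)
          + gaussian_kernel(samples, samples, sigma))
    w = (d2 + eps) ** (-p / 2.0)   # weights: negative powers of the distance
    return float(np.sum(w * values) / np.sum(w))

# Usage: estimate sin(1.0) from 10 samples of sin on [0, 2*pi]
xs = np.linspace(0.0, 2.0 * np.pi, 10)[:, None]
print(kernel_shepard(np.array([1.0]), xs, np.sin(xs[:, 0])))  # smoothed estimate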
433

Quantifying the Effect of Topographic Slope on Lava Flow Thickness: A First Step to Improve Lava Flow Volume Estimation Methods

Rizo, Steven R. 21 March 2018 (has links)
The volumes of lava flows provide important information on the magnitude of volcanic eruptions, and accurate volumes are necessary to produce reliable models of lava flow emplacement or to constrain the internal structure of volcanoes. The most accurate lava flow volumes are obtained when the topography both before and after an eruption is known, but pre-emplacement topography is unavailable for non-historic lava flows and must be reconstructed to calculate their volumes. Common reconstruction methods include inverse distance-weighted averaging and global polynomial interpolation, but these can underestimate the volume of the flow, and the surface of the flow itself is not considered in the interpolation. Including the lava flow surface in the volume calculation, given that it is generally excluded during interpolation of pre-emplacement topography, may therefore improve volume estimates for flows whose base surface is unknown. The 2012-2013 Tolbachik lava flow is used as a case study because elevation data are available from both before and after the eruption. A quantitative analysis is performed on the relationships between the slope of the topography before and after lava flow emplacement, and between slope and lava flow thickness. In addition, the slope of the topography calculated over local and regional scales is used in a new interpolation method, and the thickness calculated from the interpolated surface is compared to the known thickness of the lava flow.
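As a point of reference for the slope analysis described above, the sketch below computes topographic slope from a regular-grid DEM at a local and an approximately regional scale; the function name, cell size and smoothing window are illustrative assumptions, not the thesis' actual processing chain.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def slope_degrees(dem, cell_size, window=1):
    """Topographic slope (degrees) from a regular-grid DEM (2D array).
    window=1 gives the local slope; window > 1 first smooths the DEM over
    that many cells, approximating slope measured at a regional scale."""
    dem = np.asarray(dem, float)
    if window > 1:
        dem = uniform_filter(dem, size=window)
    dz_dy, dz_dx = np.gradient(dem, cell_size)  # elevation change per map unit
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Where pre- and post-eruption DEMs are both available (hypothetical arrays),
# flow thickness is simply their difference over the flow footprint:
# thickness = dem_post - dem_pre
```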
434

Applications of DINEOF to satellite-derived chlorophyll-a from a productive coastal region

Hilborn, Andrea 10 October 2018 (has links)
A major limitation for remote sensing analyses of oceanographic variables is the loss of spatial data. The Data INterpolating Empirical Orthogonal Functions (DINEOF) method has proven effective for filling spatial gaps in remote sensing datasets, making them easier to use in further applications. However, dataset reconstructions with this method are sensitive to the characteristics of the input data: the spatial and temporal coverage of the input imagery can heavily impact the reconstruction outcome and, thus, further metrics derived from these datasets, such as phytoplankton bloom phenology. In this study, the DINEOF method was applied to a three-year time series of MODIS-Aqua chlorophyll-a (chla) for the Salish Sea, Canada. Spatial reconstructions were performed on an annual and multi-year basis at daily and week-composite time resolutions, and assessed relative to the original, cloud-gapped chla datasets and a set of extracted in situ chla measurements. A sensitivity test assessed the stability of the results under variation of the cross-validation data and simulated scenarios of lower temporal data coverage. Daily input time series reconstructed chla more accurately (95.08-97.08% explained variance, RMSExval 1.49-1.65 mg m-3) than their week-composite counterparts (68.99-76.88% explained variance, RMSExval 1.87-2.07 mg m-3), with longer time series of both types producing a better relationship to the original chla pixel concentrations (R 0.95 over 0.94, RMSE 1.29 over 1.35 mg m-3, slope 0.88 over 0.84). Original daily chla matched in situ measurements better than DINEOF gap-filled chla, with annually DINEOF-processed data performing better than multi-year data. The results of this study are of interest to those who require spatially continuous satellite-derived products, particularly from short time series, and encourage processing consistency in future DINEOF studies to allow unification for global purposes such as climate change studies (Mélin et al., 2017).
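DINEOF's core is an iterative truncated-EOF (SVD) reconstruction of the gappy data matrix. Below is a minimal sketch of that loop, assuming a space × time matrix with NaN gaps; the published algorithm additionally selects the optimal number of modes by cross-validation, which is what the RMSExval figures above measure.

```python
import numpy as np

def eof_gap_fill(data, n_modes=5, n_iter=100, tol=1e-6):
    """Fill NaN gaps in a (space x time) matrix by iterative truncated-SVD
    reconstruction -- the core loop of DINEOF."""
    filled = np.array(data, float)
    gaps = np.isnan(filled)
    filled[gaps] = np.nanmean(filled)          # first guess: global mean
    prev = filled[gaps].copy()
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(filled, full_matrices=False)
        recon = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes]
        filled[gaps] = recon[gaps]             # update only the gap pixels
        if np.sqrt(np.mean((filled[gaps] - prev) ** 2)) < tol:
            break
        prev = filled[gaps].copy()
    return filled
```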
435

On deeply learning features for automatic person image re-identification

Franco, Alexandre da Costa e Silva 13 May 2016 (has links)
The automatic person re-identification (re-id) problem consists of matching an unknown person image against a database of previously labeled images of people. Among the several challenges in this research field, person re-id has to deal with variations in person appearance and environment, so discriminative features representing a person's identity must be robust to those variations. Comparison between two image features is commonly accomplished by distance metrics. Although features and distance metrics can be handcrafted or trainable, the latter type has shown more potential for achieving state-of-the-art performance on public data sets. A recent paradigm that allows working with trainable features is deep learning, which aims at learning features directly from raw image data. Although deep learning has recently brought significant improvements to person re-identification in a few recent works, there is still room for learning strategies that can push the current state-of-the-art performance further. This work proposes a novel deep learning strategy, called coarse-to-fine learning (CFL), as well as a novel type of feature, called convolutional covariance features (CCF), for person re-identification. CFL is inspired by the human learning process. Its core is a framework conceived to perform cascade network training, learning person image features from generic to specific concepts about a person. Each network comprises a convolutional neural network (CNN) and a deep belief network denoising autoencoder (DBN-DAE). The CNN learns local features, while the DBN-DAE learns global features that are robust to illumination changes, certain image deformations, horizontal mirroring and image blurring. After the convolutional features are extracted via CFL, they are wrapped into covariance matrices, composing the CCF. CCF and flat features were combined to improve person re-identification performance in comparison with the component features. The performance of the proposed framework was assessed against 18 state-of-the-art methods on public data sets (VIPeR, i-LIDS, CUHK01 and CUHK03), using cumulative matching characteristic curves and top-rank references. After a thorough analysis, the proposed framework demonstrated superior performance.
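To illustrate the covariance-wrapping step, the sketch below turns a stack of convolutional feature maps into a channel-by-channel covariance matrix. This is a generic construction under assumed array shapes, not necessarily the thesis' exact CCF definition.

```python
import numpy as np

def covariance_features(feature_maps):
    """Wrap convolutional feature maps into a covariance matrix.
    feature_maps: (C, H, W) array -- C channels over an H x W spatial grid.
    Each spatial location is treated as one C-dimensional observation."""
    c, h, w = feature_maps.shape
    x = feature_maps.reshape(c, h * w).astype(float)
    x -= x.mean(axis=1, keepdims=True)          # center each channel
    return (x @ x.T) / (h * w - 1)              # C x C covariance matrix
```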
436

Video view interpolation using temporally adaptive 3D meshes / Interpolação de vistas em video utilizando malhas 3D adaptativas

Fickel, Guilherme Pinto January 2015 (has links)
This thesis presents a new method for video view interpolation using multiview linear camera arrays, based on a 2D domain triangulation. The domain of the reference image is initially partitioned into triangular regions using edge and scale information, aiming to place vertices along image edges and to increase the number of triangles in textured regions. A region-based matching algorithm is then used to find an initial disparity for each triangle, and a refinement stage changes the disparity at the triangle vertices, generating a piecewise-linear disparity map. A simple post-processing step connects triangles with similar disparities, generating a full 3D mesh for each camera (view); these meshes are used to synthesize new views along the camera baseline. To generate views with fewer temporal flickering artifacts, a scheme is proposed to update the initial 3D mesh dynamically, moving, deleting and inserting vertices at each frame based on optical flow. This approach makes it possible to relate mesh triangles across time, and a combination of Hidden Markov Models (HMMs), applied to time-persistent triangles, with Kalman filters, applied to vertices, provides temporal consistency.
With the proposed framework, view interpolation reduces to the trivial task of rendering polygonal meshes, which can be done very fast, particularly when GPUs are employed. Furthermore, the generated views are hole-free, unlike most point-based view interpolation schemes, which require some kind of post-processing to fill holes. Experimental results indicate that the approach generates visually coherent in-between views for challenging real-world videos with natural lighting and camera movement, and a quantitative evaluation using video quality metrics shows that the interpolated sequences are better than those of competing approaches.
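A piecewise-linear disparity map means the disparity inside each triangle is the barycentric blend of its vertex disparities. Below is a minimal sketch of that evaluation; function and variable names are illustrative, not taken from the thesis.

```python
import numpy as np

def disparity_at(p, tri_xy, tri_disp):
    """Disparity at point p inside a triangle, as the barycentric blend of
    the per-vertex disparities (piecewise-linear over the triangulation).
    p: (x, y); tri_xy: 3x2 vertex positions; tri_disp: 3 vertex disparities."""
    a, b, c = np.asarray(tri_xy, float)
    m = np.column_stack((b - a, c - a))         # edge vectors as columns
    u, v = np.linalg.solve(m, np.asarray(p, float) - a)
    weights = np.array([1.0 - u - v, u, v])     # barycentric coordinates
    return float(weights @ np.asarray(tri_disp, float))
```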
437

VisualMet: um sistema para visualização e exploração de dados meteorológicos / VisualMet: a system for visualizing and exploring meteorological data

Manssour, Isabel Harb January 1996 (has links)
Operational and research centers in numerical weather prediction usually work with a large volume of complex multivariate data, which must be interpreted within a short time. Scientific visualization techniques can be used to support both daily forecasting and meteorological research. This work reports the architecture and facilities of the VisualMet system, which was implemented based on a case study of the tasks accomplished by the meteorologists responsible for the 8th Meteorological District, in Porto Alegre, in the South of Brazil. This center collects meteorological data three times a day from 32 local stations and receives similar data from both the National Institute of Meteorology, located in Brasilia, and the National Meteorological Center, located in the United States. Such data result from observations of variables such as temperature, pressure, wind velocity and cloud type. The meteorologists' tasks and the classes of application data were observed and analyzed to define the system requirements. The architecture and implementation of VisualMet follow a tool-oriented approach and the object-oriented paradigm, respectively. Data taken from meteorological stations are instances of a class named Entity.
Three other classes model tools that support the meteorologists' tasks. Objects in the system are presented to the user through two windows, "Entities Base" and "Tools Base". The current implementation of the "Tools Base" contains mapping tools (to produce contour maps, icon maps and graphs), recording tools (to save and load images generated by the system) and a query tool (to read variable values from selected stations). Special attention is given to the contour map tool, which uses the multiquadric method for data interpolation. Before describing the results obtained with the multiquadric method, the work also presents a study of interpolation methods for scattered data. The results (images) obtained with the contour map tool are discussed and compared with maps drawn manually by the meteorologists of the 8th Meteorological District. Possible extensions to this work are also presented.
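Hardy's multiquadric method mentioned above interpolates scattered station data with radial basis functions sqrt(||x − xj||² + c²). A minimal sketch follows, where c is the shape parameter; the 2D station coordinates and the choice of c are assumptions for illustration.

```python
import numpy as np

def multiquadric_fit(points, values, c=1.0):
    """Solve for the coefficients of Hardy's multiquadric interpolant
    s(x) = sum_j lambda_j * sqrt(||x - x_j||^2 + c^2).
    points: (N, 2) station coordinates; values: (N,) observations."""
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    return np.linalg.solve(np.sqrt(d2 + c * c), values)

def multiquadric_eval(points, coeffs, queries, c=1.0):
    """Evaluate the interpolant at query points, e.g. a contour-map grid."""
    d2 = np.sum((queries[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    return np.sqrt(d2 + c * c) @ coeffs
```

If a library implementation is preferred, scipy.interpolate.RBFInterpolator offers the same basis via kernel='multiquadric' (with an explicit epsilon shape parameter).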
438

Análise de funções de interpolação no método de volumes finitos / Interpolation functions analysis in finite volume method

Vanuza da Silva Vogas 15 September 2006 (has links)
In this work, mathematical and computational modeling was developed for the analysis of interpolation functions in the finite volume method. The equations were integrated over control volumes, at whose faces the advective and diffusive fluxes must be evaluated. To compute these fluxes, an interpolation function is used whose main purpose is to connect the nodal points. Three interpolation schemes were used: CDS, UDS and QUICK, chosen because they present the smallest possible truncation errors. In summary, the computational tests showed the limitations of the CDS and UDS schemes and the excellent results of the QUICK scheme.
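To make the comparison concrete, the sketch below evaluates the three schemes' face values on a uniform 1D grid. These are the standard uniform-grid forms; the thesis' test problems and grids are not reproduced here.

```python
def face_value(phi_w, phi_p, phi_e, u_face, scheme="CDS", phi_ee=None):
    """Value of the transported variable at the east face of a 1D control
    volume on a uniform grid, for the three schemes compared above.
    phi_w, phi_p, phi_e: west, central (P) and east nodal values;
    u_face: face velocity (its sign selects the upwind direction);
    phi_ee: node beyond E, needed by QUICK when the flow is reversed."""
    if scheme == "CDS":    # central differencing: linear average, 2nd order
        return 0.5 * (phi_p + phi_e)
    if scheme == "UDS":    # upwind differencing: upstream nodal value
        return phi_p if u_face >= 0 else phi_e
    if scheme == "QUICK":  # quadratic upstream-weighted interpolation
        if u_face >= 0:
            return (6.0 * phi_p + 3.0 * phi_e - phi_w) / 8.0
        return (6.0 * phi_e + 3.0 * phi_p - phi_ee) / 8.0
    raise ValueError(f"unknown scheme: {scheme}")
```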
439

Insecta e Arachnida associados ao solo: plantas herbáceas como área de refúgio visando ao controle biológico conservativo / Soil-associated Insecta and Arachnida: herbaceous plants as a refuge area for conservation biological control

Martins, Ivan Carlos Fernandes [UNESP] 25 April 2011 (has links) (PDF)
The main objective of this study was to evaluate conservation biological control through the creation of a refuge area (beetle bank) in an agroecosystem. The study was conducted in one hectare with an 80 m long refuge area divided into four 20 m blocks, each planted with one species of perennial herbaceous plant: Panicum maximum cv. Massai and Cynodon spp. cv. Tifton 85 (grasses), and Stylosanthes spp. cv. BRS Campo Grande and Calopogonium mucunoides cv. Comum (legumes). Arthropods were sampled with pitfall traps. All analyses were performed with the species considered predominant, classified according to abundance, frequency, constancy and dominance. Multiple regression analysis with stepwise variable selection was used to assess the influence of meteorological factors on population variation. The phenological stages of soybean and corn were determined and related to population fluctuation. To determine the spatial distribution, data were analyzed using dispersion indices and probabilistic models based on the frequency distribution of the arthropods. The distribution and influence of the refuge area were illustrated with linear interpolation maps. A total of 79,633 specimens and 514 species of arthropods were collected. Hymenoptera and Coleoptera were the most diverse and abundant groups, especially ants and ground beetles.
The refuges with Stylosanthes spp. and Panicum maximum showed the greatest diversity and abundance of arthropods. Most of the predominant soil-associated arthropods showed an aggregated distribution, and many of them, mainly predatory arthropods, clustered in or near the refuge area.
440

Interpolação de imagens baseada em clustering / Clustering-based image interpolation

Akyama, Marcio Teruo 29 November 2010 (has links)
Image zooming is a task that applies to many areas, from entertainment to scientific applications. A major challenge is preserving the edges of image objects without creating artifacts such as blurring or jagged (blocking) edges. Several edge-preserving methods have been proposed in the literature. This work proposes a new clustering-based image interpolation technique whose goal is to increase the resolution of grayscale images while preserving the edges of the objects they contain, using a simpler method that is easy to implement. The proposed technique was tested on several images of different natures, and the results were compared with the classical image interpolation methods found in the literature. PSNR and cross-correlation measurements against each compared method were used to evaluate effectiveness. The results show that the technique is promising and meets the project goals.
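The PSNR figure of merit used in the comparison above is straightforward to compute; a minimal sketch for 8-bit grayscale images follows (the thesis' exact evaluation protocol is not reproduced here).

```python
import numpy as np

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between a reference image and an
    interpolated one; higher means closer to the reference."""
    err = np.asarray(reference, float) - np.asarray(test, float)
    mse = np.mean(err ** 2)
    if mse == 0.0:
        return float("inf")                     # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)
```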
