About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Studies in Interpolation and Approximation of Multivariate Bandlimited Functions

Bailey, Benjamin Aaron, August 2011
The focus of this dissertation is the interpolation and approximation of multivariate bandlimited functions via sampled (function) values. The first set of results investigates polynomial interpolation in connection with multivariate bandlimited functions. To this end, the concept of a uniformly invertible Riesz basis is developed (with examples), and is used to construct Lagrangian polynomial interpolants for particular classes of sampled square-summable data. These interpolants are used to derive two asymptotic recovery and approximation formulas. The first recovery formula is theoretically straightforward, with global convergence in the appropriate metrics; however, it becomes computationally complicated in the limit. This complexity is sidestepped in the second recovery formula, at the cost of requiring a more local form of convergence. The second set of results uses oversampling of data to establish a multivariate recovery formula. Under additional restrictions on the sampling sites and the frequency band, this formula demonstrates a certain stability with respect to sampling errors. Computational simplifications of this formula are also given.
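The recovery formulas above concern multivariate bandlimited functions and uniformly invertible Riesz bases. Purely as a one-dimensional point of reference, and not the dissertation's construction, the sketch below illustrates the classical Whittaker-Shannon recovery of a bandlimited function from uniformly sampled values, truncated to finitely many samples; the target function, sampling period `T`, and index range are illustrative choices.

```python
import numpy as np

def sinc_reconstruct(samples, n, T, t):
    # Truncated Whittaker-Shannon series: f(t) ~ sum_n f(nT) * sinc((t - nT)/T),
    # with np.sinc(x) = sin(pi*x)/(pi*x). Truncating the series introduces the
    # kind of approximation error that asymptotic recovery formulas must control.
    return np.array([np.sum(samples * np.sinc((ti - n * T) / T)) for ti in t])

# Illustrative target, bandlimited to |omega| <= 0.8*pi, sampled with period T = 1
f = lambda t: np.sinc(0.8 * t)
T = 1.0
n = np.arange(-60, 61)          # finite (truncated) set of sampling sites
samples = f(n * T)

t = np.linspace(-5.0, 5.0, 201)
print("max reconstruction error on [-5, 5]:",
      np.max(np.abs(sinc_reconstruct(samples, n, T, t) - f(t))))
```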
2

Estimation of Unmeasured Radon Concentrations in Ohio Using Quantile Regression Forest

Bandreddy, Neel Kamal, January 2014
No description available.
3

Data analysis for fault diagnosis on high-power-rated Diesel engines (Analyse des données en vue du diagnostic des moteurs Diesel de grande puissance)

Khelil, Yassine, 4 October 2013
This thesis was carried out within an industrial project (BMCI) whose objective is to increase the availability of equipment on board ships. In this work, a data-based method for fault detection is combined with a knowledge-based method for fault isolation. The approach is generic: it can be applied to all the Diesel engine subsystems and to different kinds of Diesel engines, and it can be extended to other equipment. It is also tolerant of changes in the available instrumentation. The approach was tested on the detection and isolation of the most frequent and most hazardous faults to which Diesel engines are subject. All the Diesel engine subsystems are covered, and the diagnosis takes into account the interactions between subsystems. The approach was evaluated on an engine test bench and on the Diesel engines of the DCNS military vessel "Adroit". Most of the faults introduced on the test bench, and most of the faults that appeared in operation on the Adroit engines, were successfully detected and isolated. In addition, to deal with the uncertainty and fuzziness of the causal relationships supplied by maintenance experts, an analytical fault-simulation model was developed to validate the causal relationships used in the isolation stage of the diagnosis approach.
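The abstract combines a data-based detection stage with expert-knowledge isolation. Purely as a hypothetical illustration of that combination, and not the thesis's models, the sketch below pairs a per-sensor residual-thresholding detector with a small expert fault-signature table; the sensor names, thresholds, and signatures are invented for the example.

```python
# Hypothetical illustration only: residual-based detection per sensor,
# followed by isolation against an expert-defined fault-signature table.

def detect(measured, predicted, thresholds):
    """Flag each sensor whose residual |measured - predicted| exceeds its threshold."""
    residuals = {k: abs(measured[k] - predicted[k]) for k in measured}
    return {k for k, r in residuals.items() if r > thresholds[k]}

# Expert knowledge encoded as fault signatures: which sensors each fault affects.
SIGNATURES = {
    "clogged_air_filter": {"boost_pressure", "exhaust_temperature"},
    "coolant_leak":       {"coolant_temperature", "coolant_pressure"},
    "injector_drift":     {"exhaust_temperature", "fuel_rail_pressure"},
}

def isolate(flags):
    """Rank candidate faults by overlap between flagged sensors and each signature."""
    scores = {f: len(flags & sig) / len(sig) for f, sig in SIGNATURES.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

measured  = {"boost_pressure": 2.1, "exhaust_temperature": 720.0,
             "coolant_temperature": 84.0, "coolant_pressure": 1.4,
             "fuel_rail_pressure": 1600.0}
predicted = {"boost_pressure": 2.4, "exhaust_temperature": 650.0,
             "coolant_temperature": 85.0, "coolant_pressure": 1.5,
             "fuel_rail_pressure": 1605.0}
thresholds = {"boost_pressure": 0.2, "exhaust_temperature": 40.0,
              "coolant_temperature": 5.0, "coolant_pressure": 0.3,
              "fuel_rail_pressure": 50.0}

flags = detect(measured, predicted, thresholds)
print(flags)                    # flags the boost-pressure and exhaust-temperature sensors
print(isolate(flags)[0][0])     # 'clogged_air_filter' ranks first
```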
4

Accurate and Efficient Evaluation of the Second Virial Coefficient Using Practical Intermolecular Potentials for Gases

Hryniewicki, Maciej Konrad, 24 August 2011
The virial equation of state p = ρRT[1 + B(T)ρ + C(T)ρ² + ···] for high-pressure, high-density gases is used for computing chemical equilibrium properties and mixture compositions of strong shock and detonation waves. The second and third temperature-dependent virial coefficients B(T) and C(T) are included in tabular form in computer codes, and they are evaluated by polynomial interpolation. A very accurate numerical integration method is presented for computing B(T) and its derivatives for tables, and a sophisticated method is introduced for interpolating B(T) more accurately and efficiently than previously possible. Tabulated B(T) values are non-uniformly distributed using an adaptive grid, to minimize the size and storage of the tables and to control the maximum relative error of interpolated values. The methods introduced for evaluating B(T) apply equally well to the intermolecular potentials of Lennard-Jones (1924), Buckingham and Corner (1947), and Rice and Hirschfelder (1954).
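For the Lennard-Jones 12-6 potential, the second virial coefficient has the reduced form B*(T*) = -3 ∫₀^∞ (e^{-u*(x)/T*} - 1) x² dx, with x = r/σ, u*(x) = 4(x⁻¹² - x⁻⁶) and T* = k_BT/ε. The sketch below evaluates this integral with a straightforward adaptive quadrature; it is an illustrative baseline, not the accurate integration and adaptive-grid tabulation scheme developed in the thesis, and the integration cutoffs are pragmatic choices.

```python
import numpy as np
from scipy.integrate import quad

def b2_reduced(T_star):
    """Reduced second virial coefficient B*(T*) = B(T) / (2*pi*N_A*sigma^3/3)
    for the Lennard-Jones 12-6 potential."""
    u = lambda x: 4.0 * (x**-12 - x**-6)                     # reduced potential u*(x)
    integrand = lambda x: (np.exp(-u(x) / T_star) - 1.0) * x**2
    # integrand tends to -x^2 near x = 0 and decays rapidly past x ~ 5,
    # so a finite upper cutoff is adequate for a sketch
    val, _ = quad(integrand, 1e-8, 20.0, limit=200)
    return -3.0 * val

for T_star in (0.8, 1.0, 2.0, 5.0, 10.0):
    print(f"T* = {T_star:5.1f}   B* = {b2_reduced(T_star):+.4f}")
```

Tabulating such values on a temperature grid and interpolating between the entries is the step the interpolation scheme in the abstract is designed to make accurate and compact.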
5

On deeply learning features for automatic person image re-identification

Franco, Alexandre da Costa e Silva, 13 May 2016
The automatic person re-identification (re-id) problem consists in matching an unknown person image against a database of previously labeled images of people. Among the several issues in this research field, person re-id has to deal with variations in person appearance and in the environment, so discriminative features representing a person's identity must be robust to those variations. Comparison between two image features is commonly accomplished by distance metrics. Although features and distance metrics can be handcrafted or trainable, the latter have demonstrated more potential for breakthroughs in achieving state-of-the-art performance on public data sets. A recent paradigm that allows working with trainable features is deep learning, which aims at learning features directly from raw image data. Although deep learning has recently achieved significant improvements in person re-identification in a few recent works, there is still room for learning strategies that can push the current state-of-the-art performance further. In this work a novel deep learning strategy, called here coarse-to-fine learning (CFL), is proposed, along with a novel type of feature, called convolutional covariance features (CCF), for person re-identification. CFL is based on the human learning process. Its core is a framework conceived to perform cascade network training, learning person image features from generic to specific concepts about a person. Each network is composed of a convolutional neural network (CNN) and a deep belief network denoising autoencoder (DBN-DAE). The CNN learns local features, while the DBN-DAE learns global features that are robust to illumination changes, certain image deformations, horizontal mirroring and image blurring. After the convolutional features are extracted via CFL, they are wrapped in covariance matrices, composing the CCF. CCF and flat features were combined to improve person re-identification performance in comparison with the component features alone. The performance of the proposed framework was assessed against 18 state-of-the-art methods on public data sets (VIPeR, i-LIDS, CUHK01 and CUHK03), using cumulative matching characteristic curves and top-rank references. After a thorough analysis, the proposed framework demonstrated superior performance.
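The convolutional covariance features (CCF) described above wrap learned convolutional responses in covariance matrices. The sketch below shows only that wrapping step on placeholder feature maps (random arrays standing in for the trained CNN/DBN-DAE responses); `covariance_descriptor` and the shapes are illustrative names and choices, not the paper's implementation.

```python
import numpy as np

def covariance_descriptor(feature_maps, eps=1e-6):
    """Wrap a stack of per-pixel features into a covariance matrix.

    feature_maps: array of shape (C, H, W), e.g. responses of C convolutional
    filters over an image region. Each pixel contributes a C-dimensional feature
    vector; the descriptor is the C x C covariance of those vectors.
    """
    C, H, W = feature_maps.shape
    X = feature_maps.reshape(C, H * W)              # columns are per-pixel feature vectors
    X = X - X.mean(axis=1, keepdims=True)
    cov = (X @ X.T) / (H * W - 1)
    return cov + eps * np.eye(C)                    # regularize so the matrix stays positive definite

# Placeholder "convolutional" responses for one image region.
rng = np.random.default_rng(0)
maps = rng.standard_normal((8, 32, 16))
desc = covariance_descriptor(maps)
print(desc.shape)        # (8, 8): a compact, fixed-size region descriptor
```

Comparing two such descriptors then calls for a distance suited to symmetric positive-definite matrices (for example a log-Euclidean distance), which is why covariance-based features are usually paired with dedicated metrics.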
6

Modeling of functions via Lagrange polynomial interpolation (Modelagem de funções via interpolação polinomial de Lagrange)

Martins, Flávio Inácio da Silveira, 19 December 2017
This work aims to discuss, conceptually, the modeling of functions through polynomial interpolation using the Lagrange method. For a solid theoretical foundation, the research follows a qualitative approach with a literature review of the key subjects related to this proposition: polynomial and non-polynomial functions, polynomial interpolation, and the Lagrange formula. The work presents graphs, examples and relevant problem situations, allowing the reader to enrich and better understand the subject. The most important conclusion of this research, however, is the connection it draws between mathematical theory and high-school practice, not through a formal or mechanical model, but through a familiar approach that lets students identify the mathematical concepts in their daily lives.
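As a minimal illustration of the Lagrange method discussed above, the sketch below evaluates the Lagrange interpolating polynomial p(x) = Σ_k y_k L_k(x), with L_k(x) = Π_{j≠k} (x - x_j)/(x_k - x_j), through a few sample points; the data are an arbitrary example, not taken from the dissertation.

```python
import numpy as np

def lagrange_interpolate(x_nodes, y_nodes, x):
    """Evaluate the Lagrange interpolating polynomial through (x_nodes, y_nodes) at x."""
    x_nodes = np.asarray(x_nodes, dtype=float)
    total = np.zeros_like(np.asarray(x, dtype=float))
    for k, (xk, yk) in enumerate(zip(x_nodes, y_nodes)):
        others = np.delete(x_nodes, k)
        # basis polynomial L_k(x) = prod_{j != k} (x - x_j) / (x_k - x_j)
        Lk = np.prod([(x - xj) / (xk - xj) for xj in others], axis=0)
        total = total + yk * Lk
    return total

# Example: recover the parabola through (0, 1), (1, 3), (2, 7), i.e. f(x) = x^2 + x + 1
xs, ys = [0.0, 1.0, 2.0], [1.0, 3.0, 7.0]
print(lagrange_interpolate(xs, ys, np.array([0.5, 1.5])))   # [1.75 4.75]
```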
7

Curve fitting with polynomials, with a focus on the high-school curriculum (Ajuste de curvas por polinômios com foco no currículo do ensino médio)

Santos, Alessandro Silva (b. 1973), 27 August 2018
Advisor: Lúcio Tunes dos Santos / Professional master's dissertation (PROFMAT), Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica, 2015 / This work grew out of a proposal to develop curve fitting with an approach that seeks to enrich the study of functions in the primary- and secondary-school curriculum. The student is taken from the historical background of curve fitting, through curve interpolation, with a focus on polynomial interpolation and the least-squares method, under which linear regression and models such as the exponential fit are presented. The dissertation also describes a tool of great importance in numerical computation: the concept of fitting error, its definition and how it is estimated. In polynomial interpolation, the student, while developing the Lagrange form, is encouraged to work through the operations, the factored form and the interpretation of the method's advantages, such as the number and difficulty of the operations it requires. Inverse interpolation and curve interpolation complete that chapter, which uses problem situations whenever possible. The least-squares method leads the student to choose the fitting function and to determine it from the concept of error minimization. Polynomials of degree one (linear regression) and degree two are treated because of their importance in the high-school curriculum. Also exploring concepts such as logarithms and exponentials, the linear fitting of exponential models is proposed, using problem situations from a currently prominent area, Biomathematics, to model population-growth dynamics. In this way the student comes into contact with forecasting tools that are useful in important areas such as public health (modeling epidemics and the development of pathogens), public-policy planning (modeling population growth and distribution) and the behavior of the economy (for example, forecasting future interest rates). So that this work can serve as a practical and engaging aid to teachers, the final chapter offers suggested problems in worksheet form, making them easy to reproduce and apply in the classroom.
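The abstract discusses degree-one least-squares fitting (linear regression) and the linear fitting of exponential models via logarithms. A minimal sketch of both on made-up population figures follows; the data and variable names are invented for illustration and are not from the dissertation.

```python
import numpy as np

# Least-squares fit of a degree-1 polynomial, and an exponential model
# y = a * exp(b * t) fitted by linearizing with a logarithm:
# ln(y) = ln(a) + b * t, which is again a straight-line least-squares problem.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
population = np.array([10.2, 12.1, 14.8, 17.9, 21.6, 26.4])   # e.g. thousands of inhabitants

# Straight-line (degree-1) least-squares fit
slope, intercept = np.polyfit(t, population, 1)

# Exponential fit via the log transform
b, log_a = np.polyfit(t, np.log(population), 1)
a = np.exp(log_a)

line_pred = slope * t + intercept
expo_pred = a * np.exp(b * t)
print("linear      RMS error:", np.sqrt(np.mean((line_pred - population) ** 2)))
print("exponential RMS error:", np.sqrt(np.mean((expo_pred - population) ** 2)))
```

On data that grow roughly geometrically, as here, the linearized exponential model fits with a noticeably smaller residual than the straight line, which is the point the abstract makes with its population-growth examples.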
8

Optimising the Choice of Interpolation Nodes with a Forbidden Region

Bengtsson, Felix; Hamben, Alex, January 2022
We consider the problem of optimizing the choice of interpolation nodes such that the interpolation error is minimized, given the constraint that none of the nodes may be placed inside a forbidden region. Restricting the problem to using one-dimensional polynomial interpolants, we explore different ways of quantifying the interpolation error; such as the integral of the absolute/squared difference between the interpolated function and the interpolant, or the Lebesgue constant, which compares the interpolant with the best possible approximating polynomial of a given degree. The interpolation error then serves as a cost function that we intend to minimize using gradient-based optimization algorithms. The results are compared with existing theory about the optimal choice of interpolation nodes in the absence of a forbidden region (mainly due to Chebyshev) and indicate that the Chebyshev points of the second kind are near-optimal as interpolation nodes for optimizing the Lebesgue constant, whereas placing the points as close as possible to the forbidden region seems optimal for minimizing the integral of the difference between the interpolated function and the interpolant. We conclude that the Chebyshev points of the second kind serve as a great choice of interpolation nodes, even with the constraint on the placement of the nodes explored in this paper, and that the interpolation nodes should be placed as close as possible to the forbidden region in order to minimize the interpolation error.
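As a small numerical companion to the discussion above, the sketch below estimates the Lebesgue constant Λ = max_x Σ_k |L_k(x)| on [-1, 1] for equispaced nodes and for the Chebyshev points of the second kind x_k = cos(kπ/n); the degree and grid resolution are arbitrary choices, and no forbidden region is imposed here.

```python
import numpy as np

def lebesgue_constant(nodes, n_eval=2000):
    """Numerically estimate the Lebesgue constant max_x sum_k |L_k(x)| on [-1, 1]."""
    nodes = np.asarray(nodes, dtype=float)
    x = np.linspace(-1.0, 1.0, n_eval)
    lam = np.zeros_like(x)
    for k, xk in enumerate(nodes):
        others = np.delete(nodes, k)
        Lk = np.prod([(x - xj) / (xk - xj) for xj in others], axis=0)
        lam += np.abs(Lk)
    return lam.max()

n = 12                                             # polynomial degree
equispaced = np.linspace(-1.0, 1.0, n + 1)
chebyshev2 = np.cos(np.pi * np.arange(n + 1) / n)  # Chebyshev points of the second kind

print("equispaced :", lebesgue_constant(equispaced))   # grows exponentially with n
print("chebyshev-2:", lebesgue_constant(chebyshev2))   # grows only logarithmically with n
```

In the setting of the thesis, such an estimate of the Lebesgue constant is one of the candidate cost functions to be minimized subject to the forbidden-region constraint on the nodes.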
9

Polynomials and function approximations (Polinômios e aproximações de função)

Vanessa Priscila Nicolussi Marques, 18 November 2016
Polynomials have characteristics and properties that make them very important, whether for modeling problems from nature and everyday life, serving as a problem-solving tool, or leading to more advanced mathematical results. The curriculum of the state of São Paulo suggests a sequence of contents to be covered, leading the student to learn about polynomials from both the theoretical and the applied points of view. Polynomials are taught in a spiral fashion from the 7th year of elementary school to the 3rd year of high school; that is, the content is worked on gradually over the school years, always being revisited and deepened at the appropriate stage. This work aims to contribute to the training of basic-education Mathematics teachers by presenting a solid theory of polynomials covering their definition, properties, algebraic operations, polynomial functions, the graphing of polynomials and, at a more advanced level, derivatives and integrals of polynomials. In addition, we review the concepts of vector spaces, linear independence, bases, projections and orthogonality. The theory presented is then used in the study of the approximation of functions by polynomials. Among the forms of approximation, we present the Taylor polynomial, polynomial interpolation and polynomial fitting by the least-squares method. Throughout the text we present everyday applications, such as computing the polynomial that describes a taxi fare, the "95 formula" for retirement, and the profit curve of an ice cream shop as a function of the price of its ice cream.
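Among the approximation tools listed above, the Taylor polynomial is the simplest to demonstrate. The sketch below compares truncated Taylor polynomials of e^x about 0, p_n(x) = Σ_{k=0}^{n} x^k / k!, with the true function on [-1, 1]; the target function and degrees are illustrative choices, not examples from the dissertation.

```python
import numpy as np
from math import factorial

def taylor_exp(x, degree):
    """Taylor polynomial of e^x about 0: sum_{k=0}^{degree} x^k / k!."""
    return sum(x**k / factorial(k) for k in range(degree + 1))

x = np.linspace(-1.0, 1.0, 401)
for degree in (1, 2, 4, 8):
    err = np.max(np.abs(taylor_exp(x, degree) - np.exp(x)))
    print(f"degree {degree}:  max error on [-1, 1] = {err:.2e}")
```

The printed errors shrink rapidly with the degree, reflecting the factorial decay of the Taylor remainder for e^x on a bounded interval.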
