41

Arbitrary Degree T-Splines

Finnigan, Gordon Thomas 07 July 2008 (has links) (PDF)
T-splines are a freeform surface type, similar to NURBS, that allows partial rows of control points. Until now, T-splines have been formally defined only for the degree-three case. This paper extends the definition to support all odd-, even-, and mixed-degree T-spline surfaces, making T-splines a proper superset of all standard NURBS surfaces.
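For context, the NURBS surfaces that T-splines generalize are built from B-spline basis functions of arbitrary degree, defined for any odd or even degree by the Cox-de Boor recursion. A minimal sketch of that recursion (not the thesis's T-spline algorithm; the knot vector and evaluation point are illustrative):

```python
import numpy as np

def bspline_basis(i, p, t, x):
    """Evaluate the i-th B-spline basis function of degree p on knot
    vector t at x, via the Cox-de Boor recursion."""
    if p == 0:
        return 1.0 if t[i] <= x < t[i + 1] else 0.0
    left = 0.0
    if t[i + p] != t[i]:
        left = (x - t[i]) / (t[i + p] - t[i]) * bspline_basis(i, p - 1, t, x)
    right = 0.0
    if t[i + p + 1] != t[i + 1]:
        right = (t[i + p + 1] - x) / (t[i + p + 1] - t[i + 1]) * bspline_basis(i + 1, p - 1, t, x)
    return left + right

# The same knot vector supports even (degree 2) and odd (degree 3) bases:
t = [0, 0, 0, 0, 1, 2, 3, 3, 3, 3]
print(bspline_basis(2, 2, t, 1.5), bspline_basis(2, 3, t, 1.5))
```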
42

U-Splines: Splines Over Unstructured Meshes

Schmidt, Steven K. 30 March 2022 (has links)
U-splines are a novel approach to the construction of a spline basis for representing smooth objects in Computer-Aided Design (CAD) and Computer-Aided Engineering (CAE). A spline is a piecewise-defined function that satisfies continuity constraints between adjacent cells in a mesh. U-splines differ from existing spline constructions, such as Non-Uniform Rational B-splines (NURBS), subdivision surfaces, T-splines, and hierarchical B-splines, in that they can accommodate local variation in cell size, polynomial degree, and smoothness simultaneously over more varied mesh configurations. Mixed cell types (e.g., triangular and quadrilateral or tetrahedral and hexahedral cells in the same mesh) and T-junctions are also supported. The U-spline construction is presented for curves, surfaces, and volumes, with higher-dimensional generalizations possible. A set of requirements is given to ensure that the U-spline basis is positive, forms a partition of unity, is complete, and is locally linearly independent.
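The continuity constraints mentioned here require adjacent polynomial pieces to agree in value and derivatives up to the desired smoothness at their shared interface. A hedged one-dimensional sketch of that constraint for two cubic cells (the coefficients are arbitrary illustrations, not the U-spline construction itself):

```python
import numpy as np
from numpy.polynomial import polynomial as P

x0 = 1.0
left = np.array([1.0, -2.0, 0.5, 3.0])     # left cell: 1 - 2x + 0.5x^2 + 3x^3 on [0, 1]

# C^2 continuity at x0 pins the right piece's value and first two
# derivatives; only its cubic coefficient is free (here set to -4).
derivs = [P.polyval(x0, P.polyder(left, m)) for m in range(3)]
right_about_x0 = np.array([derivs[0], derivs[1], derivs[2] / 2.0, -4.0])

# Verify: derivatives of both pieces agree at the shared cell boundary
# (the right piece is written in powers of (x - x0)).
for m in range(3):
    assert np.isclose(P.polyval(x0, P.polyder(left, m)),
                      P.polyval(0.0, P.polyder(right_about_x0, m)))
```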
43

Skinning engineering models with non-uniform, hierarchical B-spline surfaces

Coe, David H. 05 September 2009 (has links)
This thesis presents the algorithms and methods to represent the skin of an engineering model with non-uniform, hierarchical B-spline surfaces. Non-uniform, hierarchical B-splines offer the mechanical designer many advantages: parametrically defined components may be added to a surface while maintaining C² surface continuity; detailed features may be added to a surface without globally affecting the B-spline control net; and an appropriate geometric basis for finite element meshing and analysis in the conceptual design phase can be established. These algorithms are applied to ACSYNT, a conceptual aircraft design code, to verify and validate them. A single-surface definition of an aircraft skin, appropriate for computational fluid dynamics and radar and infrared cross-section analysis, is designed using non-uniform, hierarchical B-spline surfaces. / Master of Science
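The local-detail property described above comes from representing the geometry as a coarse base spline plus finer-level overlays with compact support, so an edit never propagates through the global control net. A minimal one-dimensional sketch under assumed knot vectors and coefficients (illustrative only, not the thesis's skinning algorithm):

```python
import numpy as np
from scipy.interpolate import BSpline

k = 3
# Coarse base curve on [0, 4].
t_base = np.array([0, 0, 0, 0, 1, 2, 3, 4, 4, 4, 4], float)
c_base = np.array([0.0, 1.0, 0.5, 1.5, 0.2, 0.8, 0.0])
base = BSpline(t_base, c_base, k)

# Detail level: a fine-grid overlay supported only on [1, 2], added as
# an offset -- the base control net is never touched. The single
# nonzero coefficient excites one C^2 bump that vanishes (with two
# derivatives) at the edges of its support.
t_det = np.array([1, 1, 1, 1, 1.25, 1.5, 1.75, 2, 2, 2, 2], float)
c_det = np.array([0.0, 0.0, 0.0, 0.3, 0.0, 0.0, 0.0])
detail = BSpline(t_det, c_det, k, extrapolate=False)

def skin(x):
    d = detail(x)                       # NaN outside the detail's domain
    return base(x) + np.where(np.isnan(d), 0.0, d)

x = np.linspace(0, 4, 17)
print(skin(x) - base(x))                # nonzero only inside [1, 2]
```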
44

Development of a parametric function basis for interpolation of medical images

Soares, Isaias José Amaral 03 July 2013 (has links)
The use of images is crucial in modern medicine, and diagnostic imaging is one of the main clinical tools in the detection, monitoring, and treatment of disease. However, images frequently require post-processing before they are displayed to health professionals or analyzed automatically for signs of abnormality. Although classical tools are used for this purpose, they give no specific treatment to certain characteristics of fractal images, such as those arising from biological systems; these characteristics are produced, in general, by complex dynamic systems as a result of the internal interactions of their components, which give the system a fractal character. In this context, the main objective of this work was to propose interpolation bases built on Q-statistics, creating Q-interpolation bases, and to verify whether such bases are better suited than classical ones to representing objects with fractal characteristics, on the premise that this theory models such phenomena better than the classical one. Two types of splines were created, one-dimensional and two-dimensional Q-splines, which allow a different type of interpolation and can capture super-additive or sub-additive behavior among the constituents of a spline. Tests demonstrated the potential of these tools for medical signals and images, with a marked reduction of the interpolation error in the one-dimensional case (by up to 351.876%) and a subtle reduction in the two-dimensional case (0.3%). As a secondary result, two families of image filters were defined, anisotropic Q-filters and isotropic Q-filters, and their results were evaluated on real medical images; in virtually all analyses they outperformed conventional approaches, in some filters with effective gains of up to 1,340% in removing noise of a fractal (brown) nature. Although the results were more modest for two-dimensional interpolation, overall they were encouraging, showing that these new approaches are not only viable but can outperform classical ones. The results suggest that the Q-bases developed represent medical signals and images (1D and 2D) better, although their use can be improved by adopting approaches adapted to the vector representation of information, which favor the use of splines, and that the Q-filters are well suited to removing several types of 1/f^b noise from medical signals.
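The Q-statistics referred to here is Tsallis statistics, whose q-exponential reduces to the ordinary exponential as q approaches 1. A hedged sketch of an interpolation basis built from q-Gaussian kernels, as one illustration of the idea rather than the thesis's exact Q-spline construction (the q and beta values are assumptions):

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential, exp_q(x) = [1 + (1-q)x]^(1/(1-q));
    reduces to exp(x) as q -> 1. For the q > 1 kernels used below the
    bracket stays positive, so no cutoff is needed."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

def q_gaussian_interpolate(x_nodes, y_nodes, x_eval, q=1.5, beta=4.0):
    """Interpolate with q-Gaussian radial kernels exp_q(-beta r^2)."""
    K = q_exp(-beta * (x_nodes[:, None] - x_nodes[None, :]) ** 2, q)
    w = np.linalg.solve(K, y_nodes)       # kernel weights at the nodes
    Ke = q_exp(-beta * (x_eval[:, None] - x_nodes[None, :]) ** 2, q)
    return Ke @ w

x_nodes = np.linspace(0, 1, 8)
y_nodes = np.sin(2 * np.pi * x_nodes)
print(q_gaussian_interpolate(x_nodes, y_nodes, np.array([0.5])))
```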
45

Contribution to precision livestock farming for beef cattle: analysis methodologies for real-time weighing

Silva, Danieli Perez da 15 February 2019 (has links)
The introduction of automatic weighing systems has made it possible to record cattle body weight in real time, several times per day, without removing the animals from pens or paddocks and handling them in squeeze chutes equipped with static scales. Although this equipment is available, its contribution to decision making and management in beef cattle production is still small, mainly because of its cost and because methodologies for processing and analyzing the volume of data it produces remain undeveloped. In this context, the central objective of this study was to test and propose methodologies for analyzing body weight data from automatic weighing systems that capture the nonlinear relationship between animal performance and time. The study was divided into two steps. First, an automatic weighing platform system (model VW 1000, Intergado®, Brazil) was evaluated on a commercial farm of purebred Nelore cattle: the biological coherence of the data was verified and the degree of agreement between this system and conventional weighing was analyzed. Second, computational tools were sought that could measure daily weight gain and track its variation in real time. Two smoothing methods were analyzed: a) the Locally-Weighted Scatterplot Smoother (LOWESS) and b) penalized B-splines (PB-splines). Both methods fitted the body weight data well, but LOWESS estimated less smooth curves, which in turn produced weight gain trajectories with greater variability. The PB-spline method, by contrast, estimated curves more rigid to the fluctuations that occur in body weight, without distorting the dependence between the two variables. The study thus showed that automatic weighing systems, integrated with the smoothing methods explored here, allow the relationship between body weight and time to be estimated without specifying a function in advance, and hence support the construction of daily weight gain trajectories that can be used to identify problems in both pens and individual animals. The tools explored here may help future studies identify variations in weight gain inherent to the animals or the environment, improving problem detection in precision livestock farming systems.
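Both smoothers compared in the thesis are standard. A minimal sketch of applying each to synthetic daily body-weight data (the simulated data, knot count, and penalty weight are assumptions, not the thesis's settings):

```python
import numpy as np
from scipy.interpolate import BSpline
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)
days = np.arange(0.0, 120.0)                         # ~4 months of records
bw = 300 + 0.9 * days + 5 * np.sin(days / 7) + rng.normal(0, 4, days.size)

# (a) LOWESS: locally weighted linear fits.
lo = lowess(bw, days, frac=0.3, return_sorted=False)

# (b) Penalized B-spline (Eilers-Marx P-spline): a rich cubic basis
# plus a second-difference penalty on the coefficients.
k, n_knots = 3, 20
t = np.r_[[days[0]] * k, np.linspace(days[0], days[-1], n_knots), [days[-1]] * k]
B = BSpline.design_matrix(days, t, k).toarray()
D = np.diff(np.eye(B.shape[1]), 2, axis=0)           # second differences
lam = 100.0
coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ bw)
pb = B @ coef

# Daily gain trajectories: first differences of the smoothed weights.
print(np.diff(lo)[:5], np.diff(pb)[:5])
```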
46

Probabilistic Integrated Exploration for Mobile Robots in Complex Environments

Toriz Palacios, Alfredo 20 March 2012 (has links)
One of the fundamental challenges of today's robotics is to obtain robust maps using efficient mechanisms for exploring and modeling increasingly complex environments. This is known as the simultaneous planning, localization and mapping (SPLAM) problem. In this thesis we developed tools for a SPLAM strategy. First, exploration is performed by the Random Exploration Graph (REG) approach, based on the creation of a graph structure and on frontier control. Next, the simultaneous localization and mapping (SLAM) problem is solved with a B-spline-based topological strategy. To validate our strategy, we built another SPLAM approach from well-known tools, the Extended Kalman Filter for SLAM and the Sensor-based Random Tree (SRT) for exploration, and compared its results with those of our strategy.
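Frontier control in exploration strategies generally means steering the robot toward free cells that border unexplored space. A generic sketch on an occupancy grid (a common formulation of frontier detection, not the thesis's REG algorithm):

```python
import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def frontier_cells(grid):
    """Frontier cells of an occupancy grid: free cells with at least
    one unknown 4-neighbour."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and grid[rr, cc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers

grid = np.full((5, 5), UNKNOWN)
grid[2, :3] = FREE
grid[1, 1] = OCCUPIED
print(frontier_cells(grid))   # free cells bordering unexplored space
```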
47

Boundary-constrained inverse consistent image registration and its applications

Kumar, Dinesh 01 May 2011 (has links)
This dissertation presents a new inverse consistent image registration (ICIR) method called boundary-constrained inverse consistent image registration (BICIR). ICIR algorithms jointly estimate the forward and reverse transformations between two images while minimizing the inverse consistency error (ICE). The ICE at a point is defined as the distance between the starting and ending location of a point mapped through the forward transformation and then the reverse transformation. The novelty of the BICIR method is that a region of interest (ROI) in one image is registered with its corresponding ROI. This is accomplished by first registering the boundaries of the ROIs and then matching the interiors of the ROIs using intensity registration. The advantages of this approach include providing better registration at the boundary of the ROI, eliminating registration errors caused by registering regions outside the ROI, and theoretically minimizing computation time since only the ROIs are registered. The first step of the BICIR algorithm is to inverse consistently register the boundaries of the ROIs. The resulting forward and reverse boundary transformations are extended to the entire ROI domains using the Element Free Galerkin Method (EFGM). The transformations produced by the EFGM are then made inverse consistent by iteratively minimizing the ICE. These transformations are used as initial conditions for inverse-consistent intensity-based registration of the ROI interiors. Weighted extended B-splines (WEB-splines) are used to parameterize the transformations. WEB-splines are used instead of B-splines since WEB-splines can be defined over an arbitrarily shaped ROI. Results are presented showing that the BICIR method provides better registration of 2D and 3D anatomical images than the small-deformation, inverse-consistent, linear-elastic (SICLE) image registration algorithm which registers entire images. Specifically, the BICIR method produced registration results with lower similarity cost, reduced boundary matching error, increased ROI relative overlap, and lower inverse consistency error than the SICLE algorithm.
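In the notation commonly used for inverse consistent registration (assumed here, not quoted from the dissertation), the ICE defined above and the joint cost being minimized take roughly the form:

```latex
% g: forward transformation, h: reverse transformation,
% I_1, I_2: the two images, \Omega: the image (or ROI) domain.
\[
\mathrm{ICE}(x) \;=\; \bigl\| h\bigl(g(x)\bigr) - x \bigr\|,
\]
\[
E(g,h) \;=\; \mathrm{Sim}\bigl(I_1 \circ g,\, I_2\bigr)
       \;+\; \mathrm{Sim}\bigl(I_2 \circ h,\, I_1\bigr)
       \;+\; \rho \int_{\Omega} \Bigl( \bigl\| h(g(x)) - x \bigr\|^2
       + \bigl\| g(h(x)) - x \bigr\|^2 \Bigr)\, dx .
\]
```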
48

An Additive Bivariate Hierarchical Model for Functional Data and Related Computations

Redd, Andrew Middleton August 2010 (has links)
The work presented in this dissertation centers on the theme of regression and computation methodology. Functional data is an important class of longitudinal data, and principal component analysis is an important approach to regression with this type of data. Here we present an additive hierarchical bivariate functional data model employing principal components to identify random effects. This additive model extends the univariate functional principal component model. These models are implemented in the pfda package for R. To fit the curves from this class of models, orthogonalized spline bases are used to reduce the dimensionality of the fit while retaining flexibility. Methods for handling spline basis functions in a purely analytical manner, including the orthogonalizing process and the computation of the penalty matrices used to fit the principal component models, are presented. These methods are implemented in the R package orthogonalsplinebasis. The projects discussed involve complicated coding for the implementations in R; to facilitate this, I created the NppToR utility to add R functionality to the popular Windows code editor Notepad++. A brief overview of the use of the utility is also included.
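The orthogonalization the abstract refers to can be sketched by computing the Gram matrix of a B-spline basis exactly with per-interval Gauss quadrature and whitening it with a Cholesky factor; this is a generic illustration, not the orthogonalsplinebasis package's API:

```python
import numpy as np
from numpy.polynomial.legendre import leggauss
from scipy.interpolate import BSpline

# Cubic B-spline basis on [0, 1] with a clamped knot vector.
k = 3
t = np.r_[[0.0] * k, np.linspace(0.0, 1.0, 8), [1.0] * k]
n_basis = len(t) - k - 1

# Per-interval Gauss-Legendre rule: 4 points integrate the degree-6
# products of two cubic basis functions exactly.
xg_ref, wg_ref = leggauss(4)
knots = np.unique(t)
xs = np.concatenate([0.5 * (b - a) * xg_ref + 0.5 * (a + b)
                     for a, b in zip(knots[:-1], knots[1:])])
ws = np.concatenate([0.5 * (b - a) * wg_ref
                     for a, b in zip(knots[:-1], knots[1:])])

B = BSpline.design_matrix(xs, t, k).toarray()
G = B.T @ (ws[:, None] * B)              # exact Gram matrix <B_i, B_j>

# Whitening with the Cholesky factor yields an L2-orthonormal basis.
L = np.linalg.cholesky(G)
Q = B @ np.linalg.inv(L).T
print(np.allclose(Q.T @ (ws[:, None] * Q), np.eye(n_basis)))
```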
49

A unified framework for spline estimators

Schwarz, Katsiaryna 24 January 2013 (has links)
No description available.
50

Vector refinable splines and subdivision

Andriamaro, Miangaly Gaelle December 2008 (has links)
Thesis (MSc (Mathematics))--Stellenbosch University, 2008. / In this thesis we study a standard example of refinable functions, that is, functions which can be reproduced by the integer shifts of their own dilations. Using the cardinal B-spline as an introductory example, we prove some of its properties, thereby building a basis for a later extension to the vector setting. Defining a subdivision scheme associated to the B-spline refinement mask, we then present the proof of a well-known convergence result. Subdivision is a powerful tool used in computer-aided geometric design (CAGD) for the generation of curves and surfaces. The basic step of a subdivision algorithm consists of starting with a given set of points, called the initial control points, and creating new points as a linear combination of the previous ones, thereby generating new control points. Under certain conditions, repeated applications of this procedure yields a continuous limit curve. One important goal of this thesis is to study a particular extension of scalar subdivision to matrix subdivision ...
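The basic subdivision step described above is illustrated by Chaikin's corner-cutting scheme, whose limit curve is the quadratic B-spline; the cardinal B-spline of degree n is itself refinable, with mask coefficients 2^(-n) * C(n+1, k). A minimal scalar sketch (the thesis's subject, matrix subdivision, generalizes such scalar weights to matrices):

```python
import numpy as np

def chaikin(points, iterations=4):
    """Chaikin corner cutting: each step replaces every edge (p, q) by
    the two points 3/4 p + 1/4 q and 1/4 p + 3/4 q -- new control
    points as linear combinations of the old ones. Repeated application
    converges to the quadratic B-spline curve of the control polygon."""
    pts = np.asarray(points, float)
    for _ in range(iterations):
        p, q = pts[:-1], pts[1:]
        new = np.empty((2 * len(p), pts.shape[1]))
        new[0::2] = 0.75 * p + 0.25 * q
        new[1::2] = 0.25 * p + 0.75 * q
        pts = new
    return pts

control = [(0, 0), (1, 2), (3, 3), (4, 0)]
print(chaikin(control, iterations=2).round(3))
```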
