  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
111

High Order Volumetric Directional Pattern for Robust Face Recognition

Essa, Almabrok Essa 28 August 2017 (has links)
No description available.
112

Posouzení metody stanovení průtoku jímáním kapaliny do odměrné nebo vážicí nádoby / Method analysis for flow measurement by collecting fluid into the volumetric or weighing vessel

Valdová, Klára January 2016 (has links)
This diploma thesis is concerned with the assessment of two methods of gauging flow rate used in the sphere of official measurements on profiles with an unrestricted water level: the method of collecting liquid into a volumetric vessel and the method of collecting liquid into a weighing vessel (pouch). The main purpose of this work was to specify the uncertainties, determined using methods A and B, for these two methods of gauging flow rate within the terms of the Metrology Development Plan of the Czech Office for Standards, Metrology and Testing, because these uncertainties had previously been determined using older methodology and less accurate flow rate benchmarks. The entire work is based on extensive experimental measurement of the flow rate using the assessed methods, executed at the Laboratory of Water Management Research in Brno. The method of collecting liquid into a volumetric vessel was assessed using four different vessel volumes: 9 l, 15 l, 30 l and 50 l. Relative uncertainties determined using methods A and B, in relation to flow rate, are given for each vessel in the experimental section of this work. Within the scope of this thesis, these uncertainties were also determined for the method of collecting liquid into a weighing vessel (pouch), which was assessed for flow rates from 0.5 l/s to 10.0 l/s.
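The method A ("Type A") evaluation mentioned in this abstract is, per the GUM, the statistical scatter of repeated measurements. A minimal sketch of how a relative Type A uncertainty comes out of repeated timed collections into a vessel of known volume (the vessel size and timing values here are illustrative, not from the thesis):

```python
import statistics

def type_a_relative_uncertainty(volumes_l, times_s):
    """Mean flow rate and Type A relative uncertainty (in %) of a flow
    measured by repeatedly collecting a known volume over a timed interval."""
    flows = [v / t for v, t in zip(volumes_l, times_s)]  # l/s
    mean_q = statistics.mean(flows)
    # Standard uncertainty of the mean (Type A evaluation, per the GUM):
    u_a = statistics.stdev(flows) / len(flows) ** 0.5
    return mean_q, 100.0 * u_a / mean_q

# Five repeated collections into a 9 l vessel (illustrative numbers)
q, rel_u = type_a_relative_uncertainty([9.0] * 5,
                                       [17.8, 18.1, 17.9, 18.2, 18.0])
```

The Type B component, by contrast, would be assembled from the instrument specifications (vessel volume tolerance, timer resolution) rather than from scatter.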
113

Estimating Thermal Conductivity and Volumetric Specific Heat of a Functionally Graded Material using Photothermal Radiometry

Koppanooru, Sampat Kumar Reddy 12 1900 (has links)
Functionally graded materials (FGMs) are inhomogeneous materials in which the material properties vary with respect to space. The scientific community has developed techniques such as photothermal radiometry (PTR) to measure the thermal conductivity and volumetric heat capacity of FGMs. One of the problems involved in the technique is solving the inverse problem, i.e., estimating the thermal properties after the frequency scan has been obtained. The present work involves finding the unknown thermal conductivity and volumetric heat capacity of FGMs using the finite volume method. By taking the flux entering the sample as periodic and solving the discretized 1-D thermal wave field equation in the frequency domain, one can obtain the complex temperature at the surface of the sample for each frequency. These complex temperatures, solved over a range of frequencies, give the phase-versus-frequency scan, which can then be compared to the original frequency scan obtained from the PTR experiment by means of a residual function. Brute-force and gradient descent optimization methods have been implemented to estimate the unknown thermal conductivity and volumetric specific heat of the FGMs through minimization of the residual function. In general, the spatial composition profile of FGMs can be approximated by a smooth curve. Three functional forms, namely the arctangent curve, the Hermite curve, and the Bézier curve, are used to approximate the thermal conductivity and volumetric heat capacity distributions in the FGMs. The use of Hermite and Bézier curves gives the flexibility to control the slope of the curve, i.e. the thermal property distribution along the thickness of the sample. Two-layered samples with constant thermal properties, and three-layered samples in which one of the layers has thermal properties varying with thickness, are considered. The program is written in Fortran and several test runs are performed.
Results obtained are close to the original thermal property values, with some deviation depending on the stopping criteria used in the gradient descent algorithm. Calculating the gradients at each iteration takes a considerable amount of time; if these gradient values were already available, the problem could be solved at a faster rate. One possible method is extending automatic differentiation to complex numbers and calculating the gradient values ahead of time; this is left for future work.
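The inverse-problem loop described above can be sketched compactly: a forward model predicts a phase scan for a candidate conductivity, and gradient descent on the sum-of-squares residual recovers the parameter. The forward model here is a toy analytic phase law standing in for the thesis's finite-volume thermal-wave solver, and all names and values are illustrative:

```python
import math

def model_phase(freqs, k):
    """Toy stand-in forward model: the thesis solves the discretized 1-D
    thermal wave field per frequency; a simple analytic phase law plays
    that role here (illustrative only)."""
    return [-math.atan(math.sqrt(f / k)) for f in freqs]

def residual(freqs, measured, k):
    # Sum-of-squares misfit between measured and modelled phase scans.
    return sum((m - p) ** 2 for m, p in zip(measured, model_phase(freqs, k)))

def fit_conductivity(freqs, measured, k0=1.0, lr=0.5, steps=1000, h=1e-6):
    k = k0
    for _ in range(steps):
        # Central-difference gradient; the abstract notes these repeated
        # residual evaluations dominate runtime, motivating automatic
        # differentiation as future work.
        g = (residual(freqs, measured, k + h)
             - residual(freqs, measured, k - h)) / (2 * h)
        k -= lr * g
    return k

freqs = [1.0, 2.0, 5.0, 10.0, 20.0]
measured = model_phase(freqs, 3.0)   # synthetic "experiment" with k = 3
k_est = fit_conductivity(freqs, measured)
```

With the gradient supplied analytically (e.g. via automatic differentiation, as the abstract proposes), each iteration would cost one model evaluation instead of two.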
114

Volumetric T-spline Construction for Isogeometric Analysis – Feature Preservation, Weighted Basis and Arbitrary Degree

Liu, Lei 01 September 2015 (has links)
Constructing spline models for isogeometric analysis is important in integrating design and analysis. Converting designed CAD (Computer Aided Design) models with B-reps to analysis-suitable volumetric T-splines is fundamental for the integration. In this thesis, we work in two directions to achieve this: (a) using Boolean operations and skeletons to build polycubes for feature-preserving high-genus volumetric T-spline construction; and (b) developing weighted T-splines with arbitrary degree for T-spline surface and volume modeling which can be used for analysis. In this thesis, we first develop novel algorithms to build feature-preserving polycubes for volumetric T-spline construction. Then a new type of T-spline named the weighted T-spline with arbitrary degree is defined. It is further used in converting CAD models to analysis-suitable volumetric T-splines. An algorithm is first developed to use Boolean operations in CSG (Constructive Solid Geometry) to generate polycubes robustly, then the polycubes are used to generate volumetric rational solid T-splines. By solving a harmonic field with proper boundary conditions, the input surface is automatically decomposed into regions that are classified as topologically either a cube or a torus. Two Boolean operations, union and difference, are performed with the primitives, and polycubes are generated by parametric mapping. With polycubes, octree subdivision is carried out to obtain a volumetric T-mesh. The obtained T-spline surface is C2-continuous everywhere except the local region surrounding irregular nodes, where the surface continuity is elevated from C0 to G1. Bézier elements are extracted from the constructed solid T-spline models, which are further used in isogeometric analysis. The Boolean operations preserve the topology of the models inherited from design and can generate volumetric T-spline models with better quality.
Furthermore, another algorithm is developed which uses the skeleton as guidance for polycube construction. From the skeleton of the input model, initial cubes in the interior are first constructed. By projecting corners of interior cubes onto the surface and generating a new layer of boundary cubes, the entire interior domain is split into different cubic regions. With the splitting result, octree subdivision is performed to obtain a T-spline control mesh, or T-mesh. Surface features are classified into three groups: open curves, closed curves and singularity features. For features that do not introduce new singularities, such as open or closed curves, we preserve them by aligning to the parametric lines during subdivision, performing volumetric parameterization from a frame field, or modifying the skeleton. For features introducing new singularities, we design templates to handle them. With a valid T-mesh, we calculate rational trivariate T-splines and extract Bézier elements for isogeometric analysis. Weighted T-spline basis functions are designed to satisfy partition of unity and linear independence. The weighted T-spline is proved to be analysis-suitable. Compared to standard T-splines, weighted T-splines have fewer geometric constraints and can decrease the number of control points significantly. Trimmed NURBS surfaces of CAD models are reparameterized with weighted T-splines by a new edge interval extension algorithm, with bounded surface error introduced. With knot interval duplication, weighted T-splines are used to deal with extraordinary nodes. With Bézier coefficient optimization, the surface continuity is elevated from C0 to G1 for the one-ring neighborhood elements. Parametric mapping and sweeping methods are developed to construct volumetric weighted T-splines for isogeometric analysis. Finally, we develop an algorithm to construct arbitrary degree T-splines. The difference between odd degree and even degree T-splines is studied in detail.
The methods to extract knot intervals, calculate new weights to handle extraordinary nodes, and extract Bézier elements for analysis are investigated for arbitrary degrees. A hybrid degree weighted T-spline is generated at designated regions with basis functions of different degrees, for the purpose of performing local p-refinement. We also study the convergence rate for T-spline models of different degrees, showing that hybrid degree weighted T-splines have better performance after p-refinement. In summary, we develop novel methods to construct volumetric T-splines based on polycube and sweeping methods. Arbitrary degree weighted T-splines are proposed, with proved analysis-suitable properties. Weighted T-spline basis functions are used to reparameterize trimmed NURBS surfaces and handle extraordinary nodes, based on which surface and volumetric weighted T-spline models are constructed for isogeometric analysis.
115

VOLUMETRIC 3D VISUALIZATION OF TEST AND EVALUATION OPERATIONS

Briggs, James R., Deis, Michael R., Geng, Jason 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Time-Space-Position-Information (TSPI) visualization systems used today at the Air Force Flight Test Center (AFFTC) and simulation visualization tools used at the Air Armament Center (AAC) utilize two-dimensional (2D) display systems for both real-time and post-mission data analysis. Examples are monitors and large screen projection systems. Some TSPI visualization systems generate three-dimensional (3D) data as output, but the 3D data is translated so that it is compatible with 2D display systems. Currently, 3D volumetric display systems are being utilized by the Federal Aviation Administration (FAA) for monitoring air traffic in 3D without 3D goggles. The aircraft’s position information is derived from radar and fed to a volumetric display. The AFFTC and AAC need a similar system for Open Air Range testing utilizing the Global Positioning System (GPS) as the source of position information and Installed Systems Testing utilizing 6 Degree of Freedom (DOF) flight simulation data as the source of position information. This system should be capable of displaying realistic terrain structures, vehicle models and physical test configurations along with text data overlays. The ability to display the mission in real-time on a volumetric 3D display makes it possible for test engineers to observe resource utilization continuously as the mission develops. Quicker turn-around times in the decision process will lead to more efficient use of limited test resources and will increase the information content of the data being collected.
116

Recognising three-dimensional objects using parameterized volumetric models

Borges, Dibio Leandro January 1996 (has links)
This thesis addresses the problem of recognizing 3-D objects using shape information extracted from range images and parameterized volumetric models. The domain of geometric shapes explored is that of complex curved objects with articulated parts and a great deal of similarity between some of the parts. These objects are exemplified by animal shapes; however, the general characteristics and complexity of these shapes are present in a wide range of other natural and man-made objects. In model-based object recognition, three main issues constrain the design of a complete solution: representation, feature extraction, and interpretation. This thesis develops an integrated approach that addresses these three issues in the context of the above-mentioned domain of objects. For representation, I propose a composite description using globally deformable superquadrics and a set of volumetric primitives called geons; this description is shown to have representational and discriminative properties suitable for recognition. Feature extraction comprises a segmentation process which develops a method to extract a parts-based description of the objects as assemblies of deformable superquadrics. Discontinuity points detected from the images are linked using an 'active contour' minimization technique, and deformable superquadric models are fitted to the resulting regions afterwards. Interpretation is split into three components: classification of parts, matching, and pose estimation. A Radial Basis Function (RBF) classifier algorithm is presented in order to classify the superquadric shapes derived from the segmentation into one of twelve geon classes. The matching component is decomposed into two stages: first, an indexing scheme which makes effective use of the output of the RBF classifier in order to direct the search to the models which contain the parts identified.
This makes the search more efficient and, with a model library that is organised in a meaningful and robust way, permits growth without compromising performance. Second, a method is proposed where the hypotheses picked from the index are searched using an Interpretation Tree algorithm combined with a quality measure, based on Possibility Theory (the theory of fuzzy sets), to evaluate the bindings and the final valid hypotheses. The valid hypotheses ranked by the matching process are then passed to the pose estimation module. This module uses a Kalman Filter technique that includes the constraints on the articulations as perfect measurements, and as such provides a robust and generic way to estimate pose in object domains such as the one approached here. These techniques are then combined to produce an integrated approach to the object recognition task. The thesis develops such an integrated approach and evaluates its performance in the sample domain. Future extensions of each technique and the overall integration strategy are discussed.
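The RBF classification step described in this abstract can be sketched minimally: Gaussian radial basis functions score a feature vector against class prototypes, and the highest score wins. Everything concrete here (the 2-D parameter space, the prototypes, the width `gamma`) is illustrative, not taken from the thesis, where the classifier maps fitted superquadric parameters to one of twelve geon classes:

```python
import math

def rbf_scores(x, prototypes, gamma=1.0):
    """Gaussian RBF score of feature vector x against each class
    prototype; a minimal stand-in for the geon classifier above."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return [math.exp(-gamma * dist2(x, p)) for p in prototypes]

# Two toy "geon" prototypes in a 2-D shape-parameter space
protos = [(0.1, 0.1), (1.0, 1.0)]
scores = rbf_scores((0.2, 0.0), protos)
best = scores.index(max(scores))  # nearest prototype wins
```

A trained RBF network would additionally weight and sum these basis responses per class, but the distance-based scoring shown is the core of the idea.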
117

Calibration of water content reflectometer in Rocky Mountain arsenal soil

Tang, Yucao 2009 August 1900 (has links)
This paper describes how water content reflectometers (WCRs) were analyzed to develop a calibration equation. The time domain reflectometry (TDR) technique is the most prevalent method for in-situ moisture monitoring, and the WCR is a type of low-frequency TDR sensor that is sensitive to soil type. Developing soil-specific calibrations and investigating different environmental effects on WCR calibration is therefore important. This study focused on investigating the effects of soil dry density and temperature on WCR calibration in RMA soil. Two series of tests were conducted to develop a soil-specific calibration with dry density and temperature offsets. Results from the testing program showed that the WCR response was positively related to volumetric water content, dry density, and temperature. Equations were developed to describe the response-density-temperature-moisture relation. An application to a field site is also presented to illustrate the difference in volumetric water contents obtained using the manufacturer's method and the calibration procedure developed in this paper.
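A first-pass soil-specific calibration of this kind typically fits the sensor output against volumetric water contents measured gravimetrically in the lab; density and temperature corrections are then layered on top. A minimal least-squares sketch (sensor readings and water contents are illustrative numbers, not data from the study):

```python
def linear_calibration(periods_us, vwc):
    """Least-squares line theta = a + b * period: the usual first-pass
    soil-specific WCR calibration. Real calibrations add dry-density
    and temperature terms, as the study above does."""
    n = len(periods_us)
    mx = sum(periods_us) / n
    my = sum(vwc) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(periods_us, vwc))
         / sum((x - mx) ** 2 for x in periods_us))
    return my - b * mx, b  # intercept a, slope b

# Illustrative lab points: WCR output period (us) vs volumetric water content
a, b = linear_calibration([14.0, 16.0, 18.0, 20.0, 22.0],
                          [0.05, 0.12, 0.19, 0.26, 0.33])
```

The "temperature offset" the abstract mentions would then appear as an additional term correcting the period reading before this equation is applied.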
118

Toward functional imaging of the placenta

Stevenson, Gordon N. January 2014 (has links)
In obstetrics, the application of computer-based image analysis to provide deeper insight into pathology in early pregnancy is highly desirable but underdeveloped. One such pathology, fetal growth restriction (FGR), is a leading cause of mortality and morbidity in pregnancy. FGR affects approximately 3-10% of pregnancies in the western world, leading to increased risk of stillbirth and health problems in later life. Morphometric or functional measurement of the placenta in utero may aid diagnosis of this pathology, as the interface between placenta and mother is the site where the pathology manifests itself. Detection of growth restriction is yet to be resolved, as poor, unreliable biochemical and image-based biomarkers have made it hard to detect and manage these pregnancies effectively. By providing and developing tools for quantification of the placenta by three-dimensional (3D) ultrasound (US) using image segmentation and mesh processing, this thesis aims to better facilitate clinical investigation of this major problem in obstetric healthcare. In a first contribution, 3D placental volume measurement using 3D US is used to classify the difference between normal and FGR pregnancies. Volume was estimated using the semi-automated random walker (RW) algorithm. The repeatability and reliability of the method were tested between three observers, and the new method was shown to be equivalent to manual segmentation. In a study of 143 women performed by our clinical partners, significantly smaller placental volume was found in pregnancies defined as small-for-gestational-age (SGA). Expanding on volumetry, the utero-placental interface (UPI) is the location where the pathology that leads to FGR occurs. Manual manipulation of the volume is required to visualise the interface, so we investigated applying a “mesh flattening” process to convert the contorted UPI into a disc, providing a standardised way to view the interface between acquisitions and subjects.
Finally, an existing two-dimensional (2D) Doppler standardisation technique was extended into 3D to provide standardisation of Doppler vascularity values. This technique was then applied to measure the vascularity of volumes of interest relative to the interface between placenta and mother. This test was then applied clinically in 143 women and found that the vascularity of the small-for-gestational-age (SGA) pregnancies was significantly smaller than that of the population who produced appropriately sized babies. These three tools each augment our understanding of placental health and function in pregnancy. From measuring the gross volume to estimating the blood flow, we show the potential clinical application of image analysis performed on 3D power Doppler (PD) ultrasound volumes.
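Once a segmentation algorithm such as the random walker has labelled the placenta, the volume estimate itself is simply voxel count times voxel volume. A minimal sketch (a nested list stands in for a 3D label array, and the voxel dimensions are illustrative):

```python
def volume_from_labels(labels, voxel_dims_mm):
    """Volume (ml) of the structure labelled 1 in a segmentation mask:
    voxel count times voxel volume. This is the step that follows a
    segmentation such as the random walker described above."""
    dx, dy, dz = voxel_dims_mm
    voxels = sum(v == 1 for plane in labels for row in plane for v in row)
    return voxels * dx * dy * dz / 1000.0  # mm^3 -> ml

# Tiny 2x2x3 mask with 5 voxels labelled as placenta
mask = [[[1, 1, 0], [1, 0, 0]],
        [[1, 1, 0], [0, 0, 0]]]
vol_ml = volume_from_labels(mask, (1.0, 1.0, 1.0))
```

The repeatability testing described in the abstract then amounts to comparing such volumes across observers' segmentations of the same acquisition.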
119

Assessment of changes in the size of periapical radiolucencies 3-12 months post non-surgical root canal treatment using CBCT imaging: A pilot study

Fike, Jeremy W, DDS 01 January 2016 (has links)
The purpose of this study was to assess the changes in size of periapical lesions 3-12 months following root canal treatment using CBCT. Patients who had non-surgical root canal therapy (NSRCT) or non-surgical retreatment (ReTx) from July 30, 2014 to August 19, 2015 for a periapical lesion of endodontic origin, and who had a pre-treatment or intra-treatment CBCT, were invited to participate. Volumetric and linear measurements of periapical lesions on initial and post-treatment CBCT images were performed. A total of 20 patients with 23 treated teeth and 30 separate periapical radiolucent lesions returned for follow-up 91-390 days after the initiation of endodontic treatment. Lesions showed an overall reduction in volume (p=0.0096), maximum coronal diameter (p=0.0117), maximum sagittal diameter (p=0.0071), and maximum axial diameter (p=0.0006). On CBCT, lesions showed a significant reduction in size 3-12 months following non-surgical endodontic treatment.
120

Représentations efficaces de l'apparence sous-pixel / Efficient models for representing sub-pixel appearances

Loubet, Guillaume 25 June 2018 (has links)
We address the problem of rendering extremely complex virtual scenes with large amounts of detail. We focus on high-end off-line rendering algorithms based on path tracing that are intensively used in the special effects and 3D animation industries. The large amounts of detail required for creating believable virtual worlds raise important efficiency problems that paralyze production rendering pipelines and greatly complicate the work of 3D artists. We introduce new algorithms for prefiltering complex 3D assets while preserving their appearance at all scales, in order to reduce loading times, ray intersection costs, shading costs and Monte Carlo noise, without lowering the quality of rendered frames. Our main contributions are a new hybrid LOD approach that combines the benefits of meshes and volumetric representations for prefiltering complex 3D assets, as well as a new approach for prefiltering high-resolution heterogeneous participating media.
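Prefiltered level-of-detail schemes of this kind are commonly driven by the projected pixel footprint: pick the coarsest representation whose finest preserved feature still projects to about one pixel. A minimal sketch of that standard heuristic (this is not the thesis's algorithm, and all parameter names and values are illustrative):

```python
import math

def lod_level(distance, fov_y_rad, image_height_px, feature_size):
    """Choose an LOD so the finest preserved feature projects to roughly
    one pixel. Each level is assumed to halve the resolved detail."""
    # World-space size of one pixel at this distance from the camera:
    pixel_size = 2.0 * distance * math.tan(fov_y_rad / 2.0) / image_height_px
    # Coarser levels for larger footprints; clamp to the base level 0.
    return max(0, math.ceil(math.log2(max(pixel_size / feature_size, 1.0))))

level = lod_level(distance=100.0, fov_y_rad=math.radians(60),
                  image_height_px=1080, feature_size=0.01)
```

Choosing the level per ray rather than per object is what makes such schemes compatible with path tracing, where a single frame mixes camera rays and low-importance indirect rays.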
