401

Metodologia de otimização de lentes para lâmpadas de LED / Lenses optimization methodology for LED lamps

Barbosa, José Luiz Ferraz 14 June 2013 (has links)
The purpose of this work is to present a methodology for optimizing the geometry of the Light Emitting Diode (LED) secondary lens in non-imaging applications, focusing on the distribution of illuminance on a target plane. The ray tracing simulation is produced by a stochastic method, and an optimization process based on heuristic search interacts with the ray tracer to find the optimized parameters of the LED secondary lens geometry.
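The abstract couples a stochastic (Monte Carlo) ray tracer with a heuristic search over the lens geometry. The sketch below, a toy 2D paraxial model that is not the thesis's actual lens or ray tracer, illustrates that coupling: random rays from a point LED are traced through a thin lens to a target plane, an illuminance-uniformity cost is computed, and a random local search adjusts the single free parameter (the focal length). All numerical values are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def trace_to_target(focal_length, n_rays=20000, lens_dist=10.0, target_dist=100.0):
    """Toy 2D paraxial ray trace: point LED -> thin lens -> target plane.

    Returns hit positions on the target plane. Illustrative only; the thesis
    uses a full stochastic 3D ray tracer, not paraxial optics."""
    theta = rng.uniform(-0.3, 0.3, n_rays)        # emission angles (rad)
    y_lens = lens_dist * np.tan(theta)            # propagate source rays to the lens
    theta_out = theta - y_lens / focal_length     # thin-lens deflection
    y_target = y_lens + (target_dist - lens_dist) * np.tan(theta_out)
    return y_target

def uniformity_cost(focal_length, half_width=20.0, n_bins=40):
    """Cost = coefficient of variation of illuminance inside the target window,
    plus a penalty for rays that miss the window entirely."""
    hits = trace_to_target(focal_length)
    inside = np.abs(hits) <= half_width
    counts, _ = np.histogram(hits[inside], bins=n_bins, range=(-half_width, half_width))
    miss_penalty = 1.0 - inside.mean()
    if counts.mean() == 0:
        return np.inf
    return counts.std() / counts.mean() + miss_penalty

# Heuristic (random local) search over the single free parameter.
best_f, best_cost = 30.0, uniformity_cost(30.0)
for _ in range(200):
    cand = best_f + rng.normal(scale=3.0)
    cost = uniformity_cost(cand)
    if cost < best_cost:
        best_f, best_cost = cand, cost

print(f"best focal length ~ {best_f:.1f}, cost {best_cost:.3f}")
```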
402

Legislating for Gender Equality in Korea: The Role of Women and Political Parties in Shaping the Timing of Legislation

January 2019 (has links)
abstract: This study examines the factors that shape the timing of passage of controversial gender equality legislation through a case study of the abolition of the family-head system in South Korea. It draws on process tracing, with data collected from archives and interviews. The case study compares the legislative processes for the bills on the abolition of the family-head system in the 16th and 17th National Assemblies, in which the bills had opposite outcomes. The study argues that the institutions of the legislative process mediate the impact of relevant actors on gender equality policymaking. In the bill initiation stage, only a small number of elected officials are required to introduce a bill, and women representatives play a vital role because they are more likely than their male colleagues to introduce feminist bills. The study argues that 1) the backgrounds of the women representatives, which shape their commitment to feminist agendas, 2) strong women's movements, which raise issue saliency and thereby the policy priority of the issue, and 3) the resources and constraints for feminist policymaking inside the party, shaped by party ideology, determine how actively women representatives advocate controversial gender equality agendas. In the later stages of policymaking, the efforts of a small number of women members are offset by those of political parties. Emphasizing the positive agenda control of the majority party and the negative agenda control of the minority parties, the study suggests that party issue positions are critical to the outcome of a bill. To explain how party issue positions are (re)shaped, the study highlights 1) public opinion, 2) the emergence of new voter groups and the resulting decline of cleavage politics, 3) new party entry, and 4) women within the party and the party leadership. The findings show that the major parties' shifts in issue position in the 17th National Assembly greatly amplified the bargaining power of the key allies and weakened the institutional leverage of the opponents, leading to the successful passage of the bill. / Dissertation/Thesis / Doctoral Dissertation Political Science 2019
403

New algorithms for in vivo characterization of human trabecular bone: development, validation, and applications

Liu, Yinxiao 01 January 2013 (has links)
Osteoporosis is a common bone disease that increases the risk of low-trauma fractures, which carry substantial morbidity, mortality, and financial costs. Clinically, osteoporosis is defined by low bone mineral density (BMD). BMD explains approximately 60-70% of the variance in bone strength; the remainder is due to the cumulative and synergistic effects of other factors, including trabecular and cortical bone micro-architecture. In vivo quantitative characterization of trabecular bone (TB) micro-architecture with high accuracy, reproducibility, and sensitivity to bone strength will improve our understanding of bone loss mechanisms and etiologies, benefitting osteoporosis diagnosis and treatment monitoring. The overall aim of this Ph.D. research is to design, develop, and evaluate new 3-D image processing algorithms that characterize TB micro-architecture in terms of topology, orientation, thickness, and spacing, and to move the new technology from investigational research into the clinical arena.

Two algorithms were developed and validated in detail for this purpose: (1) a star-line-based TB thickness and marrow spacing computation algorithm, and (2) a tensor scale (t-scale) based TB topology and orientation computation algorithm. The thickness and marrow spacing algorithm uses a star-line tracing technique that effectively accounts for the partial volume effects of in vivo imaging, where voxel size is comparable to TB thickness, and avoids the digitization problems of conventional algorithms. The accuracy of the method was examined on computer-generated phantom images, while its robustness was evaluated on human ankle specimens in terms of stability across a wide range of resolutions, repeat-scan reproducibility under in vivo conditions, and the correlation between thickness values computed at ex vivo and in vivo resolutions. The sensitivity of the method was examined through its ability to predict the bone strength of cadaveric specimens. Finally, the method was evaluated in an in vivo human study involving forty healthy young-adult volunteers and ten athletes.

The t-scale based TB topology and orientation algorithm provides measures that characterize individual trabeculae on the continuum between perfect plate and perfect rod, as well as individual trabecular orientation. As with the thickness and marrow spacing algorithm, accuracy was examined on computer-generated phantoms, while robustness across ex vivo and in vivo resolutions, repeat-scan reproducibility, and sensitivity to experimentally measured mechanical bone strength were evaluated in a cadaveric ankle study. The application of the algorithm was evaluated in a human study involving forty healthy young-adult volunteers and ten patients undergoing SSRI treatment.

Besides these two algorithms, an image thresholding algorithm based on class uncertainty theory was developed to segment the TB structure in CT images. Although it was developed for this specific application, it also works effectively for general 2-D and 3-D images. Moreover, class uncertainty can be used as adaptive information in more sophisticated image processing algorithms such as snakes, active shape models (ASMs), and graph search.
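As an illustration of the star-line idea described above, the sketch below estimates local trabecular thickness on a binary 2D mask by casting a star of lines through a bone pixel and taking the shortest chord. It is purely schematic: the 3D processing, sub-voxel sampling, and partial-volume compensation of the published algorithm are omitted, and all names and values are hypothetical.

```python
import numpy as np

def star_line_thickness(mask, point, n_dirs=36):
    """Schematic 2D star-line thickness estimate at one bone pixel.

    For each sampled direction, walk outwards in both senses until the binary
    bone mask is left, record the chord length, and take the shortest chord as
    the local thickness. Digital-grid illustration only."""
    h, w = mask.shape
    y0, x0 = point
    if not mask[y0, x0]:
        return 0.0
    best = np.inf
    for k in range(n_dirs):
        ang = np.pi * k / n_dirs                  # star of undirected lines
        dy, dx = np.sin(ang), np.cos(ang)
        chord = 1.0                               # the centre pixel itself
        for sgn in (+1, -1):                      # walk both ways along the line
            t = 1.0
            while True:
                y = int(round(y0 + sgn * t * dy))
                x = int(round(x0 + sgn * t * dx))
                if not (0 <= y < h and 0 <= x < w) or not mask[y, x]:
                    break
                chord += 1.0
                t += 1.0
        best = min(best, chord)
    return best

# Tiny example: a 5-pixel-thick horizontal plate.
plate = np.zeros((40, 40), dtype=bool)
plate[18:23, :] = True
print(star_line_thickness(plate, (20, 20)))   # ~5, the plate thickness
```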
404

Développement de modèles asymptotiques en contrôle non destructif (CND) par ultrasons : interaction des ondes élastiques avec des irrégularités géométriques et prise en compte des ondes de tête. / Development of asymptotic models in ultrasonic non destructive techniques (NDT) : elastic waves interaction with geometrical irregularities and head waves modeling.

Ferrand, Adrien 13 May 2014 (has links)
The head wave is the first-arrival wave received during a TOFD (Time Of Flight Diffraction) inspection. The TOFD technique is a widely used ultrasonic NDT (Non-Destructive Testing) inspection method that employs two piezoelectric transducers placed symmetrically facing each other, with a constant spacing, above the surface of the inspected specimen. A numerical study shows that head wave propagation along an irregular entry surface is not only a surface propagation phenomenon, as in the plane-surface case, but also involves a bulk propagation phenomenon caused by diffraction of the ultrasonic field on the surface irregularities. To model these phenomena, a generic ray tracing method based on the generalized Fermat's principle has been developed; it establishes the effective path, in a specimen with an irregular surface, of every propagating ultrasonic wave, notably including the head wave. Evaluating the diffraction phenomena with amplitude models in a ray approach then provides a complete simulation (time of flight, wavefront, and amplitude) of the head wave for several kinds of surface irregularity. Theoretical and experimental validations of the developed simulation tool have been carried out and have proven conclusive.
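The generalized Fermat's principle mentioned above reduces, in its simplest form, to finding the path of stationary travel time. The sketch below shows that idea for the classic case of a single flat interface between two media, by brute-force minimization of the two-segment travel time; the thesis's method generalizes this to irregular surfaces and multi-point diffraction paths, and the wave speeds and geometry used here are illustrative assumptions only.

```python
import numpy as np

def fermat_refraction_point(src, rcv, c1, c2, y_interface=0.0, n_grid=2001):
    """Minimal illustration of Fermat's principle: find the point on a flat
    horizontal interface (y = y_interface) where a ray from src (in medium 1,
    speed c1) to rcv (in medium 2, speed c2) crosses, by numerically
    minimising the two-segment travel time over a grid of candidate points."""
    sx, sy = src
    rx, ry = rcv
    xs = np.linspace(min(sx, rx) - 50.0, max(sx, rx) + 50.0, n_grid)
    t = (np.hypot(xs - sx, y_interface - sy) / c1      # leg in medium 1
         + np.hypot(rx - xs, ry - y_interface) / c2)   # leg in medium 2
    i = np.argmin(t)
    return xs[i], t[i]

# Example: source 30 mm above the interface, receiver 20 mm below, with wave
# speeds roughly those of water (1.5 mm/us) and steel (5.9 mm/us).
x_cross, t_min = fermat_refraction_point(src=(0.0, 30.0), rcv=(60.0, -20.0),
                                         c1=1.5, c2=5.9)
print(f"crossing point x = {x_cross:.2f} mm, travel time = {t_min:.2f} us")
```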
405

Simulation of a long-focal-length Wolter-I telescope for hard X-ray astronomy. Application to the Simbol-X and PheniX space projects.

Chauvin, Maxime 28 February 2011 (has links) (PDF)
The future of hard X-ray astronomy rests on the development of new instruments able to focus photons up to around a hundred keV. Focusing yields a considerable gain in sensitivity and angular resolution. It is achieved by grazing-incidence reflections on Wolter-I mirrors; its use, so far limited to about ten keV, can be extended to higher energies thanks to a specific mirror coating and a long focal length. Since X-ray observations can only be made above the atmosphere, the size of the observatories, and hence their focal length, has been limited by launcher capabilities. In recent years, new technologies such as deployable masts and formation flying have been studied to overcome this limit. To better understand how these telescopes work, I detail the geometry of Wolter-I mirrors, the reflectivity of their coating, detection in a semiconductor, and the dynamics associated with deployable masts and formation flying.

These telescopes are complex optical systems, subject to deformation during an observation, and they require precise metrology to measure these deformations in order to correct the image. To study their performance, I developed a code that reproduces the actual operation of the telescope. Each photon is treated individually; its path and interactions depend on the evolution of the telescope structure over time. Every element of the telescope is modelled, as well as the metrology needed to reconstruct its dynamics. The photon path is computed in a three-dimensional vector space, using Monte Carlo methods to reproduce the mirror defects and reflectivity as well as the interactions in the detector. The simulation provides images and energy spectra, from which the angular resolution, field of view, effective area, and detection efficiency can be extracted.

In 2006, the Simbol-X astronomy mission was selected as part of the study of formation flying. This concept achieves a long focal length by distributing the telescope over two satellites. However, the particular dynamics of formation flying affect the performance of the telescope and must be controlled. In the framework of this mission, my simulation was used to study the consequences of each satellite motion on the telescope performance, as well as the consequences of metrology errors on the image correction. This study yielded constraints on the attitude control of each satellite and on the required metrology precision. In view of the results obtained, I demonstrate the feasibility of such a telescope.

Beyond the Simbol-X mission, I investigated the optimization of the performance of a hard X-ray telescope. Using my simulation, I studied the influence of each parameter on the telescope performance. These studies led to the design of the PheniX project, a telescope operating in the 1-200 keV range, proposed by the Centre d'Etude Spatial des Rayonnements in response to the M3 call of the European Space Agency. Equipped with a new type of coating and a 40-metre focal length obtained with a deployable mast, this telescope offers a performance level at 100 keV more than 100 times better than current missions. I present this project and its expected performance in the last part of my thesis.
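To give a flavour of the photon-by-photon Monte Carlo approach described in the abstract, the sketch below estimates the effective area of a single Wolter-I shell as a function of energy. The two grazing reflections are modelled with an ad hoc exponential reflectivity roll-off, and the shell radii, focal length, and coating behaviour are placeholders rather than Simbol-X or PheniX parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def effective_area_mc(energy_kev, r_in=0.30, r_out=0.35, n_photons=200_000,
                      focal_length=20.0):
    """Toy Monte Carlo estimate of the effective area of one Wolter-I shell.

    Photons arrive on-axis over the annular aperture; each makes two grazing
    reflections whose reflectivity is modelled by an ad hoc exponential
    roll-off in (graze angle x energy). All parameters are placeholders; the
    thesis simulation uses tabulated coating reflectivities, mirror figure
    errors, structural dynamics, and a full detector model."""
    # Uniform sampling over the annulus.
    r = np.sqrt(rng.uniform(r_in**2, r_out**2, n_photons))
    graze = r / (4.0 * focal_length)                  # small-angle Wolter-I graze angle (rad)
    refl = np.exp(-(graze * energy_kev) / 0.5)        # ad hoc single-reflection reflectivity
    survived = rng.random(n_photons) < refl**2        # two reflections
    geometric_area = np.pi * (r_out**2 - r_in**2) * 1e4   # m^2 -> cm^2
    return geometric_area * survived.mean()

for e in (10.0, 30.0, 80.0):
    print(f"{e:5.1f} keV: effective area ~ {effective_area_mc(e):.1f} cm^2")
```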
406

Supernova Cosmology in an Inhomogeneous Universe

Gupta, Rahul January 2010 (has links)
The propagation of light beams originating from synthetic 'Type Ia' supernovae, through an inhomogeneous universe with simplified dynamics, is simulated using a Monte Carlo ray tracing method. The accumulated statistical (redshift-magnitude) distribution for these synthetic supernova observations, illustrated in the form of a Hubble diagram, produces a luminosity profile similar to the form predicted for a Dark-Energy dominated universe. Further, the amount of mimicked Dark Energy is found to increase with the variance in the matter distribution in the universe, converging at a value of Ω_X ≈ 0.7.

It can thus be postulated that, at least under the assumption of simplified dynamics, it is possible to replicate the observed supernova data in a universe with an inhomogeneous matter distribution. This also implies that it is demonstrably not possible to make a direct correspondence between the observed luminosity and redshift and, respectively, the distance of a cosmological source and the expansion rate of the universe at a particular epoch in an inhomogeneous universe. Such a correspondence feigns an apparent variation in dynamics, which creates the illusion of Dark Energy.
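The mechanism the abstract describes, inhomogeneity along the beam biasing the Hubble diagram toward a dark-energy-like signal, can be caricatured without a full ray tracer. In the sketch below, the effect of the matter distribution on each synthetic supernova is compressed into a single lognormal magnification with unit mean flux, and the mean shift of the distance modulus relative to the homogeneous Einstein-de Sitter prediction is shown to grow with the assumed lensing scatter. This only illustrates the trend; it does not reproduce the thesis's method or its Ω_X ≈ 0.7 result.

```python
import numpy as np

rng = np.random.default_rng(2)
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s/Mpc (assumed)

def dl_eds(z):
    """Luminosity distance (Mpc) in a homogeneous Einstein-de Sitter universe."""
    return 2.0 * C_KM_S / H0 * (1.0 + z) * (1.0 - 1.0 / np.sqrt(1.0 + z))

def mock_hubble_offset(z, sigma_lens, n_sn=5000):
    """Toy stand-in for Monte Carlo beam propagation through inhomogeneities.

    Each synthetic SN Ia sits at redshift z in an Einstein-de Sitter
    background; the inhomogeneity along its beam is condensed into one
    lognormal magnification with unit mean flux (flux conservation), whose
    width sigma_lens stands in for the variance of the matter distribution.
    Returns the mean shift of the distance modulus relative to the homogeneous
    prediction: positive means the SNe look fainter, i.e. dark-energy-like."""
    mu_homog = 5.0 * np.log10(dl_eds(z)) + 25.0
    # Most beams cross underdense regions and are slightly demagnified,
    # while a few are strongly magnified.
    magnification = rng.lognormal(mean=-0.5 * sigma_lens**2, sigma=sigma_lens, size=n_sn)
    mu_obs = mu_homog - 2.5 * np.log10(magnification)
    return np.mean(mu_obs) - mu_homog

for sigma in (0.1, 0.3, 0.6):
    print(f"sigma = {sigma:.1f}: mean distance-modulus shift "
          f"= {mock_hubble_offset(z=1.0, sigma_lens=sigma):+.3f} mag")
```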
407

Production workflow for rendering images in catalogue production

Forsman, Maria, Stråle, Emma January 2005 (has links)
At IKEA, the production flow for a product image currently runs from the engineering drawing to photography, via manufacturing and assembly. A product that is ready for manufacturing is defined by its mechanical construction and its surface properties. The lead time and cost of image production could be reduced considerably if the product could be depicted already at this stage, without having been manufactured. For this purpose, IKEA saw an opportunity to introduce a new production flow in which advanced computer graphics, graphic arts technology, and image processing convert the engineering drawing into a product image. The ambition is to extract a 3D model from the drawing, place it in an environment, light it and apply materials to it, and from this create a printable product image that meets the company's high quality requirements.

The purpose of this thesis was to propose such a flow in concrete terms by investigating different software packages that could be used, and to identify problem areas and suggest solutions for them. The flow was divided into five problem areas: conversion, modelling, lighting, materials, and rendering, which were worked on in parallel. Throughout the project, the focus was on colour management and perceived image quality, with correct colour and detail reproduction in the images as the guiding principle.

The engineering drawings, which are made in SolidWorks, were converted to 3D models with the software PolyTrans. In 3ds Max, the product models were placed in an environment resembling the real photo studio, lit, and given materials. After a study of different rendering software packages, it was decided to use mental ray, and all images were created with it. mental ray is a complex renderer that computes lighting in a physically correct way, which contributes to photorealistic images.

The thesis work largely consisted of running different products through the flow and testing different settings in the various steps. The flow was evaluated continuously by visually assessing the resulting images. In addition, a few tests were carried out to investigate possible shortcomings of the software involved with respect to colour management and perceived image quality. The final result of the thesis is a flow that to a large extent resembles the workflow of the current image production. With minimal effort from the retouching department, the images meet the requirements for printing in the IKEA catalogue.
408

Ray Tracing Bézier Surfaces on GPU

Löw, Joakim January 2006 (has links)
In this report, we show how to implement direct ray tracing of Bézier surfaces on graphics processing units (GPUs), in particular bicubic rectangular Bézier surfaces and nonparametric cubic Bézier triangles. We use Newton's method for the rectangular case and show how to use this method to find the ray-surface intersection. For Newton's method to work we must build a spatial partitioning hierarchy around each surface patch, and in general, hierarchies are essential to speed up the process of ray tracing. We have chosen to use bounding box hierarchies and show how to implement stackless traversal of such a structure on a GPU. For the nonparametric triangular case, we show how to find the wanted intersection by simply solving a cubic polynomial. Because of the limited precision of current GPUs, we also propose a numerical approach to solve the problem, using a one-dimensional Newton search.
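A minimal CPU sketch of the rectangular-patch case described above is given below: the bicubic patch and its partial derivatives are evaluated with the Bernstein basis, and a three-variable Newton iteration solves S(u, v) = o + t d for the ray parameters. The GPU implementation, the bounding-box hierarchy, and the stackless traversal of the report are not reproduced; good starting values are simply passed in as arguments.

```python
import numpy as np

def bernstein3(t):
    """Cubic Bernstein basis values and derivatives at parameter t."""
    b = np.array([(1-t)**3, 3*t*(1-t)**2, 3*t**2*(1-t), t**3])
    db = np.array([-3*(1-t)**2, 3*(1-t)**2 - 6*t*(1-t), 6*t*(1-t) - 3*t**2, 3*t**2])
    return b, db

def patch_eval(P, u, v):
    """Point and partial derivatives of a bicubic Bézier patch P[4, 4, 3]."""
    bu, dbu = bernstein3(u)
    bv, dbv = bernstein3(v)
    S  = np.einsum('i,j,ijk->k', bu,  bv,  P)
    Su = np.einsum('i,j,ijk->k', dbu, bv,  P)
    Sv = np.einsum('i,j,ijk->k', bu,  dbv, P)
    return S, Su, Sv

def ray_patch_newton(P, origin, direction, u0=0.5, v0=0.5, t0=1.0,
                     max_iter=20, tol=1e-10):
    """Newton iteration for the ray/patch intersection S(u, v) = o + t*d."""
    x = np.array([u0, v0, t0], dtype=float)
    for _ in range(max_iter):
        S, Su, Sv = patch_eval(P, x[0], x[1])
        F = S - (origin + x[2] * direction)          # residual of the system
        if np.linalg.norm(F) < tol:
            return x if 0 <= x[0] <= 1 and 0 <= x[1] <= 1 and x[2] > 0 else None
        J = np.column_stack((Su, Sv, -direction))    # Jacobian wrt (u, v, t)
        x = x + np.linalg.solve(J, -F)
    return None

# Example: a gently curved patch over [0,3]x[0,3], hit by a vertical ray.
xs, ys = np.meshgrid(np.arange(4.0), np.arange(4.0), indexing='ij')
ctrl = np.dstack([xs, ys, np.sin(xs) * 0.3])
hit = ray_patch_newton(ctrl, origin=np.array([1.5, 1.5, 5.0]),
                       direction=np.array([0.0, 0.0, -1.0]))
print(hit)   # (u, v, t) of the intersection, or None if Newton fails
```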
