21

Design of 3D Graphic Tile-based Rendering Engine for Embedded Systems

Tsai, Chung-hua 03 September 2007 (has links)
Due to the increasing demand for three-dimensional (3D) graphics applications in various consumer electronics, developing a low-cost 3D graphics hardware accelerator suitable for embedded systems has become an important issue. A typical 3D graphics accelerator includes a geometry sub-system and a rendering sub-system. This thesis proposes a highly efficient 3D graphics rendering intellectual property (IP) based on the tile-based approach. The rendering IP consists of several modules; the main contributions of this thesis are the development of the setup engine and the rasterization module, and the integration of all modules into the complete rendering IP. For the setup engine, the thesis develops a folded arithmetic unit architecture, consisting mainly of one iterative divider, three multipliers, and several adders, which can finish the overall computation of the setup equations in fewer than 50 cycles. For the rasterization module, the thesis develops several scan-conversion algorithms suitable for the tile-based rendering process, including hierarchical, fast-skip, and boundary-edge test methods. The ordinary line-drawing algorithm for the scan-line boundary search, and the direct in-out test approach, are inefficient for the tile-based approach, since the shape of triangle primitives may become irregular after tiling. Our experimental results show that the boundary-edge test leads to the most compact design, since it transforms the normal single-pixel in-out test circuit so that it detects the two end-points of a scan-line simultaneously. In addition, the rasterization module can be divided into a scan-line part and a fragment-generation part, which allows each part to be optimized and sped up individually to achieve the desired overall fill-rate goal. Our simulation shows a fill-rate improvement of around 60% with this approach. Finally, the thesis integrates all the sub-modules into the complete rendering IP core. 
This IP has been realized in a 0.18 µm technology. The total gate count is 504k. It runs at up to 166 MHz and delivers a peak fill rate of 333M pixels/sec and 1.3G texels/sec. The IP has been thoroughly verified, achieving more than 95% code coverage. It has also been integrated with an OpenGL ES software module, a Linux operating system, and a geometry module, and has been successfully prototyped on the ARM Versatile platform.
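The in-out test that underlies the scan-conversion methods above can be illustrated with a short sketch (an assumption-laden illustration, not the thesis's circuit: the function names, the half-pixel sample points, and the 4x4 tile size are all choices made here). A pixel is inside a triangle when it lies on the same side of all three edges, and scanning a tile row by row yields the two end-points of each scan-line span that the boundary-edge test detects in hardware.

```python
def edge(p, a, b):
    # Signed cross product: positive when p lies to the left of edge a -> b.
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def inside(p, tri):
    # For a counter-clockwise triangle, a point is inside when every
    # edge test is non-negative (the basic in-out test).
    a, b, c = tri
    return edge(p, a, b) >= 0 and edge(p, b, c) >= 0 and edge(p, c, a) >= 0

def tile_spans(tri, tx, ty, size=4):
    # Scan a size x size tile at pixel centers and report, per scan-line,
    # the first and last covered pixel (the span's two end-points).
    spans = {}
    for y in range(ty, ty + size):
        xs = [x for x in range(tx, tx + size)
              if inside((x + 0.5, y + 0.5), tri)]
        if xs:
            spans[y] = (xs[0], xs[-1])
    return spans
```

The irregular tile-clipped shapes the abstract mentions fall out naturally here: a tile may cover any fragment of the triangle, so each row's span must be found independently.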
22

Embedded boundary scan for test & debug

Baig, Aijaz January 2009 (has links)
The boundary scan standard, which has been in existence since the early nineties, is widely used to test printed circuit boards (PCBs). It is primarily aimed at providing increased physical test access to surface-mounted devices on PCBs. Boundary scan avoids functional testing and in-circuit techniques like the 'bed of nails' for structurally testing PCBs, since increasing densities and complexities have made those approaches a herculean task. Though the standard has had a revolutionary effect on board testing during the development and production phases, there is no standardized mechanism that allows IEEE 1149.1 to be used in a system after installation. This has left problems typically encountered during field test runs, such as the high number of No-Fault-Found (NFF) results, unaddressed. The solution lies in conducting a structural test after a given module has already been installed in the system. This can be done by embedding the programmability features of the boundary scan test mechanism into the unit under test (UUT), enabling the UUT to conduct boundary-scan-based self-tests without the need for external stimuli. In this thesis, a test and debug framework that aims to use boundary scan after system installation is studied and subsequently enhanced. The framework embeds much of the test vector deployment and debug mechanism in the UUT to enable its remote testing and debugging. It mainly consists of a prototype board which, together with the UUT, comprises the 'embedded system'. The following document describes the phased development of this framework and its intended usage in the field.
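The shift-based access that the standard provides can be pictured with a toy model (a deliberately simplified sketch: real IEEE 1149.1 cells also include mode multiplexers and a 16-state TAP controller, and the class and method names here are inventions for illustration). Test data shifts serially through chained cells between TDI and TDO, with capture and update steps on either side.

```python
class BoundaryScanRegister:
    """Toy model of an IEEE 1149.1 boundary-scan register (shift path only)."""

    def __init__(self, n):
        self.shift = [0] * n    # shift-register stages, TDI end first
        self.update = [0] * n   # update (output) latches

    def capture(self, pin_values):
        # CAPTURE-DR: sample the pin states into the shift stages.
        self.shift = list(pin_values)

    def shift_dr(self, tdi):
        # SHIFT-DR: one TCK tick. TDI enters the first cell; the last
        # cell's old value appears on TDO.
        tdo = self.shift[-1]
        self.shift = [tdi] + self.shift[:-1]
        return tdo

    def update_dr(self):
        # UPDATE-DR: transfer the shifted-in pattern to the output latches.
        self.update = list(self.shift)
```

Capturing pin states and clocking them out is exactly the observability that an embedded test controller on the UUT would exploit in the field, without any external tester.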
23

Design of Low-cost Rendering Engine for 3D Stereoscopic Graphics

Lin, Shih-ming 14 February 2011 (has links)
In order to realize advanced graphics rendering algorithms, which tend to become more complex and flexible, more and more graphics processing units (GPUs) include a microprocessor-like core to support programmable shading. However, since the number of cycles spent in the fragment shader of a programmable GPU varies across applications, the hardware implementation of the remaining fixed functions of the graphics rendering flow is not trivial, because a suitable target throughput is hard to set. In addition, the data transfer between the shader processor and the other fixed-function hardware modules represents a large overhead. This thesis therefore focuses on realizing rasterization, a very important fixed rendering function, and proposes a pure-software solution that can be executed by the shader processor. The pure-software rasterization requires 98 cycles in the setup stage and an average of 13 cycles per pixel in the interpolation stage. To further accelerate rasterization, the thesis also proposes a hardware-software co-design in which an embedded scan-conversion unit cooperates with the shader processor. This unit costs about 8.5K gates, only 1.7% of the entire GPU, but reduces cycle counts by more than 30% compared with the pure-software approach on the test benches used in this thesis. The other contribution of this thesis is the implementation of stereoscopic graphics rendering. To provide a stereoscopic effect, a graphics rendering system has to run the entire rendering flow in additional passes to generate the results from different views. Instead, this thesis embeds additional code in the fragment shader to adjust the x-coordinate generated by the vertex shader, avoiding the additional vertex-shader pass.
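The per-view x-coordinate adjustment can be sketched with a toy disparity model (the formula, the parameter names, and the default values below are illustrative assumptions, not the thesis's actual shader code): points at a chosen convergence depth get zero shift, and nearer points are displaced in opposite directions for the two eyes.

```python
def stereo_x(x, z, eye, separation=0.06, convergence=2.0):
    # Toy horizontal-disparity model: shift grows as the point moves
    # nearer than the convergence depth, and the two eyes shift in
    # opposite directions. All constants here are placeholders.
    d = separation * (1.0 - convergence / z)
    return x - d / 2 if eye == "left" else x + d / 2
```

Because only the x-coordinate changes between views, both images can be produced from one geometry pass, which is the saving the abstract describes.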
24

Principles in Searching for, Detecting, and Identifying Underwater Stationary Targets

Tsai, Ying-guan 26 July 2006 (has links)
Recovery of unattached offshore facilities or missing equipment is a challenging activity. Generally speaking, it involves a comprehensive procedure that includes target characterization, searching, detection, verification, locating, reacquisition, and salvage. Among these, target searching and detection are the most critical components. The purpose of this investigation was to assess the efficiency of applying side-scan sonar, a magnetometer, and a sub-bottom profiler simultaneously in searching for, detecting, identifying, and locating underwater stationary targets. The research procedure comprised: 1. discussing the capabilities of the instruments and the verification cruises on the target; 2. discussing the salvage activity we conducted off Kaohsiung harbor on an abandoned anchor; 3. estimating the practicability of the methodology. According to the characteristics of these apparatus, the water depth collected by an echo sounder can express the relief of the seabed. Seabed sonographs recorded by side-scan sonar show that it is feasible to detect, verify, and locate targets on the seabed. A sub-bottom profiler provides sub-surface sedimentary information that can be used to detect buried targets. A magnetometer detects environmental magnetic intensities, which can locate and determine the size of ferrous targets. Two abandoned anchors were recorded off Kaohsiung harbor on the navigation chart. A recovery plan was then arranged that included four phases: collection of anchor characteristics; initial field survey and target detection; target verification and locating; and target recovery. The underwater searching equipment employed in this activity included side-scan sonar, a sub-bottom profiler, a magnetometer, an echo sounder, an underwater positioning system (including GPS), a remotely operated vehicle (ROV), and professional divers. The offshore working platform was R/V Ocean Research #3. 
Results of the initial search phase by side-scan sonar indicated only one potential target in the search area. Follow-up verification cruises confirmed acoustically that the target was an anchor with a length of chain snagged on a block. The results of this investigation included information such as the dimensions and location of the anchor; furthermore, the reason the anchor had been abandoned on the seafloor was deduced. For underwater ferrous targets such as anchors and chain cable, all of the aforementioned apparatus have good potential for detection and verification. It can be concluded that applying these apparatus simultaneously makes searching for, detecting, identifying, and locating underwater stationary targets more effective than using a single instrument such as a side-scan sonar system. Optical verification of the target by ROV was attempted but was not successful, owing to difficulties in maneuvering OR#3 into the proper position. An attempt to recover the target by divers was arranged, but due to bad weather and a rough sea state the divers were not even allowed to dive. However, based on the experience gained, a target reacquisition and recovery facility was built to guide divers to the target and lift it.
25

MARS Spectral CT: Image quality performance parameters using the Medipix3.0 detector

Tang, Dikai Nate January 2013 (has links)
The research in this thesis was undertaken because information on the relationship between scan parameters and image quality for the MARS spectral CT was lacking. Since the MARS spectral CT is expected to extend into clinical use in the future, it is crucial to know how the quality of the images it produces is affected by different scan parameters. This will allow further improvements to the machine and ultimately help clinicians to visualise important information in patients that is not revealed by other imaging modalities. This thesis provides information on how image quality is affected by different scan parameters on the MARS spectral CT using a Medipix3 silicon quad detector. In particular, it explores how different numbers of projections, exposure time products (mAs), and peak tube voltages (kVp) with different threshold energies (keV) affect the image noise, image resolution, and image uniformity, respectively. This provides a set of guidelines for future work using the MARS scanner to obtain images of optimal quality. This thesis also determines that the new image reconstruction software mART, developed by Niels de Ruiter, is a suitable replacement for OctopusCT, the reconstruction software currently used by the MARS team. Using mART reduces the scan times and the dose delivered by the MARS spectral CT.
26

Construction of Bone Anisotropic Finite Element Model from Computed Tomography (CT) Scans

Kazembakhshi, Siamak 17 September 2014 (has links)
This thesis proposes a new procedure for describing bone anisotropy in finite element models using computed tomography (CT) images. First, bone density was correlated to CT numbers using an empirical function established in previous studies; the pointwise bone density gradient was then calculated from interpolation functions of the bone densities. Second, principal anisotropic directions were defined using the bone density gradient. Third, the magnitude of the bone density gradient was incorporated into an existing bone elasticity-density correlation established by experiments. A method was also introduced to assign the anisotropic material properties to finite element models in Abaqus. The effect of adopting the anisotropic versus the isotropic material model on the predicted von Mises stresses and principal strains in the bone was investigated by finite element simulations using Abaqus.
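The first two steps of the procedure can be sketched in a few lines (a hedged illustration: the linear coefficients below are placeholders, not the calibrated relation the thesis adopts from earlier studies, and the thesis works with 3D interpolation functions rather than a 2D grid). Density comes from a CT-number map, and central differences on the density field give the gradient that defines the principal anisotropy direction.

```python
def density_from_hu(hu, a=0.0008772, b=0.0226):
    # Empirical linear map rho = a*HU + b. The coefficients here are
    # illustrative placeholders, not the thesis's calibrated values.
    return a * hu + b

def density_gradient(grid, i, j, spacing=1.0):
    # Central differences on a 2D density grid; the gradient direction
    # serves as the local principal anisotropy direction.
    gx = (grid[i + 1][j] - grid[i - 1][j]) / (2 * spacing)
    gy = (grid[i][j + 1] - grid[i][j - 1]) / (2 * spacing)
    return gx, gy
```

In the thesis's third step, the magnitude of this gradient (not just its direction) feeds the elasticity-density correlation, so both pieces of the returned vector matter.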
27

Scan statistics and systems' reliability

Πήττα, Θεοδώρα 22 December 2009 (has links)
The aim of this dissertation is to connect the scan statistic S_(n,m), which represents the maximum number of successes contained in a moving window of length m over n consecutive Bernoulli trials, with the reliability of a consecutive k-within-m-out-of-n failure system (k-within-m-out-of-n:F system). First, we evaluate the probability mass function and the cumulative distribution function of the random variable S_(n,m). We obtain these by relating S_(n,m) to the random variable T_k^((m)), which denotes the waiting time until, for the first time, k successes are contained in a moving window of length m (a scan of type k/m) over a sequence of Bernoulli trials, with 1 marking a success and 0 a failure. The probability mass function and the cumulative distribution function of T_k^((m)) are evaluated using two methods: (i) the Markov chain embedding method and (ii) recursive schemes. Through T_k^((m)) we then obtain the corresponding functions for S_(n,m) [Glaz and Balakrishnan (1999), Balakrishnan and Koutras (2002)]. Next, we evaluate the reliability R of the consecutive k-within-m-out-of-n:F system (Griffith, 1986). Such a system fails if and only if there exist m consecutive components that include among them at least k failed ones (1≤k≤m≤n). Exact formulae for the reliability are presented for k=2 as well as for m=n, n-1, n-2, n-3 (Sfakianakis, Kounias and Hillaris, 1992), and a recursive algorithm for its evaluation is given (Malinowski and Preuss, 1994). Using a dual relation between the cumulative distribution function of T_k^((m)) (and therefore of S_(n,m)) and the reliability R, we connect the reliability of this system with the scan statistic S_(n,m). Finally, we briefly outline applications of scan statistics in molecular biology [Karlin and Ghandour (1985), Glaz and Naus (1991)], quality control [Roberts, 1958], and other areas.
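The two central random variables are easy to compute for any realized Bernoulli sequence (a minimal sketch; the dissertation derives their exact distributions analytically rather than evaluating realizations):

```python
def scan_statistic(trials, m):
    # S_{n,m}: maximum number of successes in any window of length m
    # over the n Bernoulli trials (1 = success, 0 = failure).
    n = len(trials)
    return max(sum(trials[i:i + m]) for i in range(n - m + 1))

def waiting_time(trials, k, m):
    # T_k^{(m)}: number of trials until some window of length m first
    # contains at least k successes; None if this never happens.
    for t in range(1, len(trials) + 1):
        if sum(trials[max(0, t - m):t]) >= k:
            return t
    return None
```

The duality the dissertation exploits, that S_(n,m) >= k exactly when T_k^((m)) <= n, holds realization by realization, which is what makes the distribution of one recoverable from the other.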
28

Direct calculation of the nonlinear refractive index via the Z-scan technique for elliptical Gaussian beams

Σωτηρίου, Χριστίνα 01 October 2012 (has links)
This master's thesis deals with the direct calculation of the nonlinear refractive index by means of a simple mathematical relation and a table that we produced, applying the Z-scan technique with direct measurement of the beam dimensions and the use of an elliptical Gaussian beam. Before presenting the proposed calculation method, an introduction to nonlinear optics is given; the basic forms of the laser beams used are then analyzed, followed by the phenomena of self-focusing and self-defocusing, which are essential for understanding the operation of the Z-scan technique and the interpretation of Z-scan curves. The Z-scan technique and its various variants are then presented, with their advantages and disadvantages. Next, the form of the Z-scan curves obtained with direct measurement of the beam dimensions is studied for circular and elliptical Gaussian beams. Finally, we explain how, through many Z-scan simulations, we produced the mathematical relation and the corresponding table that yield the desired result, the nonlinear refractive index of a material, while avoiding the laborious and time-consuming mathematical calculations of the past.
29

The Impact of Liquefaction on the Microstructure of Cohesionless Soils

January 2013 (has links)
The effect of earthquake-induced liquefaction on the local void ratio distribution of cohesionless soil is evaluated using x-ray computed tomography (CT) and an advanced image processing software package. Intact, relatively undisturbed specimens of cohesionless soil were recovered before and after liquefaction by freezing and coring soil deposits created by pluviation and by sedimentation through water. Pluviated soil deposits were liquefied in the small geotechnical centrifuge at the University of California at Davis, a shared-use National Science Foundation (NSF)-supported Network for Earthquake Engineering Simulation (NEES) facility. A soil deposit created by sedimentation through water was liquefied on a small shake table in the Arizona State University geotechnical laboratory. Initial centrifuge tests employed Ottawa 20-30 sand, but this material proved too coarse to liquefy in the centrifuge; subsequent centrifuge tests therefore employed Ottawa F60 sand. The shake table test employed Ottawa 20-30 sand. Recovered cores were stabilized by impregnation with optical-grade epoxy and sent to the NSF-supported facility for high-resolution CT scanning of geologic media at the University of Texas at Austin. The local void ratio distribution of a CT-scanned core of Ottawa 20-30 sand, evaluated using Avizo® Fire, a commercially available advanced image analysis program, was compared to the local void ratio distribution established on the same core by analysis of optical images, demonstrating that analysis of the CT scans gave results similar to optical methods. CT scans were subsequently conducted on liquefied and non-liquefied specimens of Ottawa 20-30 sand and Ottawa F60 sand. The resolution of the F60 specimens was inadequate to establish the local void ratio distribution. 
Results of the analysis of the Ottawa 20-30 specimens recovered from the model built for the shake table test showed that liquefaction can substantially influence the variability in local void ratio, increasing the degree of non-homogeneity in the specimen. / Dissertation/Thesis / M.S. Civil and Environmental Engineering 2013
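The basic quantity extracted from each segmented subvolume can be sketched directly (a hedged sketch: it assumes the CT image has already been binarized into solid and void voxels, which is the hard part that Avizo® Fire performs, and the thesis's local measurement slides a subvolume over the core rather than using the whole array at once).

```python
def void_ratio(binary_voxels):
    # Void ratio e = V_voids / V_solids from a segmented CT volume,
    # where 1 marks a solid (grain) voxel and 0 marks a void voxel.
    flat = [v for plane in binary_voxels for row in plane for v in row]
    solids = sum(flat)
    voids = len(flat) - solids
    return voids / solids
```

Computing this on many overlapping subvolumes yields the local void ratio distribution whose variability the study compares before and after liquefaction.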
30

A study of microcredit in the city of Goiânia: is space relevant?

Oliveira, Felipe Resende 12 March 2014 (has links)
This study analyzes the possible influence of the spatial environment on loans made by the Banco do Povo de Goiânia, and seeks to detect any environmental influence on the clustering of delinquent borrowers. The database was obtained from the Banco do Povo de Goiânia. The study is based on information-diffusion models. The methodology used for detecting spatial clusters is the scan statistics model, in which the probability distributions associated with the data under spatial randomness are the Poisson and the Bernoulli distributions. The results indicate the existence of a cluster of entrepreneurs. When clients delinquent for 30 days or more are analyzed, the method indicates that these clients are randomly distributed across the municipality of Goiânia. (Funded by CNPq.)
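For the Bernoulli case, the spatial scan evaluates a likelihood ratio for each candidate zone, comparing the case rate inside and outside (a Kulldorff-style sketch under stated assumptions: the zone construction, the case/control definitions for delinquent clients, and the Monte Carlo significance test used in studies like this one are not reproduced here).

```python
from math import log

def bernoulli_llr(c, n, C, N):
    # Log-likelihood ratio for a candidate zone under the Bernoulli model:
    # c cases among n points inside the zone, C cases among N points overall.
    def xlogx(x):
        # Convention: x*log(x) -> 0 as x -> 0.
        return x * log(x) if x > 0 else 0.0
    inside = xlogx(c) + xlogx(n - c) - xlogx(n)
    outside = xlogx(C - c) + xlogx(N - n - (C - c)) - xlogx(N - n)
    null = xlogx(C) + xlogx(N - C) - xlogx(N)
    return inside + outside - null
```

A zone whose case rate matches the overall rate scores zero; the scan statistic is the maximum score over all candidate zones, and its significance is assessed by re-randomizing the case labels.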
