181

Wideband Reconfigurable Vector Antenna for 3-D Direction Finding Application

Duplouy, Johan 14 January 2019 (has links) (PDF)
Direction finding plays a crucial role in various civilian and military applications, related to either radionavigation or radiolocation. Most direction finding antennas operate over a wide frequency band, but only a minority of them enable the direction of arrival estimation of an incoming electromagnetic field over a 3-D angular coverage (i.e., estimation of both azimuth and elevation angles). An original approach to obtain a 3-D angular coverage consists of measuring the six components of the incident electromagnetic field through a so-called vector antenna. The aim of this Ph.D. thesis is to design a passive, compact and wideband vector antenna in order to cover as many applications as possible. Two vector antennas have been designed, manufactured and experimentally characterized. Unlike conventional topologies, they enable the measurement of the components of an incoming electromagnetic field thanks to the radiation pattern reconfigurability of an original arrangement of Vivaldi antennas. The first prototype is mounted over a finite metallic support and enables the direction of arrival estimation of vertically-polarized electromagnetic fields over a 1.69:1 bandwidth, while the second one can be used regardless of the polarization of the incoming electromagnetic fields over an 8:1 bandwidth. Moreover, the direction finding performance of these vector antennas has been improved in terms of estimation accuracy, sensitivity, robustness to angular ambiguity and polarization mismatch by synthesizing new radiation patterns in the estimation process. A method based on the Cramér-Rao lower bound has been proposed to select the additional radiation patterns efficiently and rapidly.
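The abstract does not detail the selection criterion; as a rough sketch of the kind of Cramér-Rao bound computation such a method can rest on (the single-source signal model and notation below are illustrative assumptions, not taken from the thesis), the bound on the direction-of-arrival estimates follows from the Fisher information of the measured port voltages:

```latex
% Assumed model: K selected radiation patterns yield voltages
%   v = a(\theta,\phi)\,s + n,   n \sim \mathcal{CN}(0,\sigma^2 I_K),
% where a(\theta,\phi) stacks the complex pattern responses and s is the source amplitude.
\begin{align}
  [\mathbf{F}(\theta,\phi)]_{ij} &= \frac{2}{\sigma^2}\,
      \operatorname{Re}\!\left\{
      \frac{\partial \boldsymbol{\mu}^{H}}{\partial \psi_i}\,
      \frac{\partial \boldsymbol{\mu}}{\partial \psi_j}\right\},
      \qquad \boldsymbol{\mu} = a(\theta,\phi)\,s,\quad
      \boldsymbol{\psi} = (\theta,\phi),\\
  \mathrm{CRB}(\theta,\phi) &= \mathbf{F}^{-1}(\theta,\phi),
  \qquad \text{choose the pattern subset minimizing }
  \max_{(\theta,\phi)} \operatorname{tr}\,\mathrm{CRB}(\theta,\phi).
\end{align}
```

Ranking candidate pattern subsets by such a bound avoids running a full direction of arrival estimator for every configuration, which is presumably what makes the selection efficient and rapid.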
182

Automated and interactive approaches for optimal surface finding based segmentation of medical image data

Sun, Shanhui 01 December 2012 (has links)
Optimal surface finding (OSF), a graph-based optimization approach to image segmentation, represents a powerful framework for medical image segmentation and analysis. In many applications, a pre-segmentation is required to enable OSF graph construction. Also, the cost function design is critical for the success of OSF. In this thesis, two issues in the context of OSF segmentation are addressed. First, a robust model-based segmentation method suitable for OSF initialization is introduced. Second, an OSF-based segmentation refinement approach is presented. For segmenting complex anatomical structures (e.g., lungs), a rough initial segmentation is required to apply an OSF-based approach. For this purpose, a novel robust active shape model (RASM) is presented. The RASM matching in combination with OSF is investigated in the context of segmenting lungs with large lung cancer masses in 3D CT scans. The robustness and effectiveness of this approach is demonstrated on 30 lung scans containing 20 normal lungs and 40 diseased lungs where conventional segmentation methods frequently fail to deliver usable results. The developed RASM approach is generally applicable and suitable for large organs/structures. While providing high levels of performance in most cases, OSF-based approaches may fail in a local region in the presence of pathology or other local challenges. A new (generic) interactive refinement approach for correcting local segmentation errors based on the OSF segmentation framework is proposed. Following the automated segmentation, the user can inspect the result and correct local or regional segmentation inaccuracies by (iteratively) providing clues regarding the location of the correct surface. This expert information is utilized to modify the previously calculated cost function, locally re-optimizing the underlying modified graph without a need to start the new optimization from scratch. For refinement, a hybrid desktop/virtual reality user interface based on stereoscopic visualization technology and advanced interaction techniques is utilized for efficient interaction with the segmentations (surfaces). The proposed generic interactive refinement method is adapted to three applications. First, two refinement tools for 3D lung segmentation are proposed, and the performance is assessed on 30 test cases from 18 CT lung scans. Second, in a feasibility study, the approach is expanded to 4D OSF-based lung segmentation refinement and an assessment of performance is provided. Finally, a dual-surface OSF-based intravascular ultrasound (IVUS) image segmentation framework is introduced, application specific segmentation refinement methods are developed, and an evaluation on 41 test cases is presented. As demonstrated by experiments, OSF-based segmentation refinement is a promising approach to address challenges in medical image segmentation.
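A minimal 2-D illustration (not the thesis implementation) of the cost-modification idea behind the interactive refinement: a user clue lowers the cost of the desired boundary position in nearby columns, and the surface is re-extracted over the modified costs. The dynamic-programming surface model, smoothness constraint and clue format below are placeholder assumptions.

```python
import numpy as np

def extract_surface(cost, max_jump=1):
    """Minimum-cost 'surface' (one row index per column) with a hard
    smoothness constraint |z[x+1] - z[x]| <= max_jump, found by DP."""
    n_cols, n_rows = cost.shape
    acc = cost.copy()                      # accumulated cost per (column, row)
    back = np.zeros((n_cols, n_rows), dtype=int)
    for x in range(1, n_cols):
        for z in range(n_rows):
            lo, hi = max(0, z - max_jump), min(n_rows, z + max_jump + 1)
            prev = int(np.argmin(acc[x - 1, lo:hi])) + lo
            back[x, z] = prev
            acc[x, z] = cost[x, z] + acc[x - 1, prev]
    z = int(np.argmin(acc[-1]))
    surface = [z]
    for x in range(n_cols - 1, 0, -1):     # backtrack the optimal path
        z = back[x, z]
        surface.append(z)
    return np.array(surface[::-1])

def apply_clue(cost, x0, z0, radius=3, bonus=-10.0):
    """User clue: make the indicated row cheap in columns near x0,
    so the re-run favours a surface passing through the clue."""
    modified = cost.copy()
    for x in range(max(0, x0 - radius), min(cost.shape[0], x0 + radius + 1)):
        modified[x, z0] += bonus
    return modified

# toy usage: automatic result, then one refinement pass after a user clue
rng = np.random.default_rng(0)
cost = rng.random((50, 40))
auto = extract_surface(cost)
refined = extract_surface(apply_clue(cost, x0=25, z0=10))
```

In the thesis the surfaces are 3-D/4-D and extracted by graph optimization rather than this 2-D dynamic program; the sketch only illustrates how a local cost modification steers the re-optimized result without restarting from scratch.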
183

Integer programming-based decomposition approaches for solving machine scheduling problems

Sadykov, Ruslan 26 June 2006 (has links)
The aim in this thesis is to develop efficient enumeration algorithms to solve certain strongly NP-hard scheduling problems. These algorithms were developed using a combination of ideas from Integer Programming (IP), Constraint Programming and Scheduling Theory. In order to combine different techniques in one algorithm, decomposition methods are applied. The main idea on which the first part of our results is based is to separate the optimality and feasibility components of the problem and let different methods tackle these components. IP is then responsible for optimization, whereas specific combinatorial algorithms tackle the feasibility aspect. Branch-and-cut and branch-and-price algorithms based on this idea are proposed to solve the single-machine and multi-machine variants of the scheduling problem to minimize the sum of the weights of late jobs. Experimental research shows that the algorithms proposed outperform other algorithms available in the literature. Also, it is shown that these algorithms can be used, after some modification, to solve the problem of minimizing the maximum tardiness on unrelated machines. The second part of the thesis deals with the one-machine scheduling problem to minimize the weighted total tardiness. To tackle this problem, the idea of a partition of the time horizon into intervals is used. A particularity of this approach is that we exploit the structure of the problem to partition the time horizon. This particularity allowed us to propose two new Mixed Integer Programming (MIP) formulations for the problem. The first one is a compact formulation and can be used to solve the problem using a standard MIP solver. The second formulation can be used to derive lower bounds on the value of the optimal solution of the problem. These lower bounds are of good quality, and they can be obtained relatively fast.
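For orientation only, a standard disjunctive MIP for the single-machine weighted number of late jobs problem, 1||Σ w_j U_j, is sketched below; this is textbook material, not the branch-and-cut or branch-and-price formulations developed in the thesis.

```latex
\begin{align}
  \min\ & \sum_{j} w_j U_j \\
  \text{s.t.}\ & C_j \ge p_j, && \forall j,\\
  & C_j \ge C_i + p_j - M(1 - x_{ij}), && \forall i < j,\\
  & C_i \ge C_j + p_i - M x_{ij},      && \forall i < j,\\
  & C_j \le d_j + M U_j,               && \forall j,\\
  & x_{ij} \in \{0,1\},\ U_j \in \{0,1\}.
\end{align}
```

Here p_j, d_j and w_j are the processing time, due date and weight of job j, C_j its completion time, x_{ij} = 1 if job i precedes job j, U_j = 1 if job j is late, and M is a sufficiently large constant (e.g., the sum of all processing times).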
184

Stiffness and vibration properties of slender tensegrity structures

Dalil Safaei, Seif January 2012 (has links)
The stiffness and frequency properties of tensegrity structures are functions of the pre-stress, topology, configuration, and axial stiffness of the elements. The tensegrity structures considered are tensegrity booms, tensegrity grids, and tensegrity power lines. A study has been carried out on the pre-stress design. It includes (i) finding the most flexible directions for different pre-stress levels, and (ii) finding the pre-stress pattern which maximizes the first natural frequency. To find the optimum cross-section areas of the elements for triangular prism and Snelson tensegrity booms, an optimization approach is utilized. A constant mass criterion is considered and the genetic algorithm (GA) is used as the optimization method. The stiffness of the triangular prism and Snelson tensegrity booms is modified by introducing actuators. An optimization approach by means of a GA is employed to find the placement of the actuators and their minimum length variations. The results show that the bending stiffness improves significantly, but an active tensegrity boom is still less stiff than a passive truss boom. The GA shows high accuracy in searching the non-structural space. The tensegrity concept is employed to design a novel transmission power line. A tensegrity prism module is selected as the building block. A complete parametric study is performed to investigate the influence of several parameters, such as the number of modules and their dimensions, on the stiffness and frequency of the structure. A general approach is suggested to design the structure considering wind and ice loads. The designed structure reduces the electromagnetic field by more than a factor of 50 and shows acceptable deflections under several loading combinations. A study on the first natural frequencies of Snelson, prism, Micheletti, Marcus and X-frame based tensegrity booms has been carried out. The result shows that the differences in the first natural frequencies of the truss and tensegrity booms are significant and not due to the number of mechanisms or pre-stress levels. The tensegrity booms of the Snelson type with 2 bars and the prism type with 3 bars have the highest frequencies among the tensegrity booms.
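A schematic genetic algorithm of the kind described, searching element cross-section areas under a constant-mass constraint; the stiffness evaluation is a placeholder (in the thesis it would come from a finite-element model of the boom), and the element data, population size, operators and fitness surrogate are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)

N_ELEMENTS = 12                                # bars + cables (illustrative)
TOTAL_MASS = 1.0                               # constant-mass criterion
LENGTHS = rng.uniform(0.5, 1.5, N_ELEMENTS)    # placeholder element lengths
DENSITY = 1.0

def normalize_mass(areas):
    """Scale areas so that sum(rho * L_e * A_e) equals the mass budget."""
    mass = DENSITY * np.sum(LENGTHS * areas)
    return areas * (TOTAL_MASS / mass)

def stiffness(areas):
    """Placeholder for the FE-computed bending stiffness of the boom;
    a series-spring surrogate is used here just to have a fitness value."""
    return 1.0 / np.sum(1.0 / (LENGTHS * areas))

def evolve(pop_size=40, generations=200, mutation=0.1):
    pop = np.array([normalize_mass(rng.uniform(0.1, 1.0, N_ELEMENTS))
                    for _ in range(pop_size)])
    for _ in range(generations):
        fitness = np.array([stiffness(ind) for ind in pop])
        # binary tournament selection
        idx_a = rng.integers(pop_size, size=pop_size)
        idx_b = rng.integers(pop_size, size=pop_size)
        parents = pop[np.where(fitness[idx_a] > fitness[idx_b], idx_a, idx_b)]
        # uniform crossover + Gaussian mutation, then re-impose the mass budget
        mates = parents[rng.permutation(pop_size)]
        mask = rng.random(parents.shape) < 0.5
        children = np.where(mask, parents, mates)
        children *= 1.0 + mutation * rng.standard_normal(children.shape)
        children = np.clip(children, 1e-3, None)
        pop = np.array([normalize_mass(c) for c in children])
    best = max(pop, key=stiffness)
    return best, stiffness(best)

best_areas, best_value = evolve()
```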
185

Deployable Tensegrity Structures for Space Applications

Tibert, Gunnar January 2002 (has links)
186

EXPRESSIVE WRITING AND ADOLESCENCE: SHORT AND LONG TERM EFFECTS AND LINGUISTIC PATTERNS OF FUNCTIONING

FACCHIN, FEDERICA 16 March 2010 (has links)
The current research program aimed to evaluate the use of a writing intervention on an adolescent population transitioning from middle school to high school in terms of outcomes (short and long term effects), mediating mechanisms, and moderators. These issues have been investigated within two different projects. The first project involved a longitudinal, randomized, controlled trial on the efficacy of a standard Expressive Writing [EW] intervention relative to a writing assignment focused on positive aspects of an event (Benefit Finding [BF]) and a control group writing on neutral topics (Trivial [TR]). Quantitative analyses were performed within two studies aimed to test the effects of altering writing assignments and the mediating effects of language on writing outcomes (Study 1), and the moderating effects of optimism, family support, and interpersonal relationships (Study 2). These studies provided support to the research literature showing positive effects of writing among adolescents, revealing greater outcomes for the BF group. The second project involved a small, naturalistic study conducted in a high school of Northern Italy two weeks after a student died during classes (Study 3). The process of writing was examined, considering linguistic changes in writing as a function of treatment variables (the three writing sessions) and participant variables (optimism and individuals’ adjustment trajectories).
187

Efficient algorithms for the identification of miRNA motifs in DNA sequences

Mendes, Nuno D 06 June 2011 (has links) (PDF)
Unravelling biological processes is dependent on the adequate modelling of regulatory mechanisms that determine the timing and spatial patterns of gene expression. In the last decade, a novel regulatory mechanism has been discovered and its biological importance has been increasingly recognised. This mechanism is mediated by RNA molecules named miRNAs that are the product of the maturation of non-coding gene transcripts and act post-transcriptionally, usually to dampen or abolish the expression of protein-coding genes. Despite having eluded detection for such a long time, it is now clear that the elucidation of the expression pattern of many genes cannot be achieved without incorporating the effects of miRNA-mediated regulation. The technical difficulties that the experimental detection of these regulators entailed prompted the development of increasingly sophisticated computational approaches. Gene finding strategies originally developed for coding genes cannot be applied since these non-coding molecules are subject to very different sequence restraints and are too short to exhibit statistical properties that can be easily distinguished from the background. As a result, computational tools came to rely heavily on the identification of conserved sequences, distant homologs and machine learning techniques. Recent developments in sequencing technology have overcome some of the limitations of earlier experimental approaches, but pose new computational challenges. At present, the identification of new miRNA genes is therefore the result of the use of several approaches, both computational and experimental. In spite of the advancement that this research field has known in the last several years, we are still not able to formally and rigorously characterise miRNA genes in order to identify whichever sequence, structure or contextual requirements are needed to turn a DNA sequence into a functional miRNA. Efforts using computational algorithms towards the enumeration of the full set of miRNAs of an organism have been limited by strong reliance on arguments of precursor conservation and feature similarity. However, miRNA precursors may arise anew or be lost across the evolutionary history of a species, and a newly-sequenced genome may be evolutionarily too distant from other genomes for an adequate comparative analysis. In addition, the learning of intricate classification rules based purely on features shared by miRNA precursors that are currently known may reflect a perpetuating identification bias rather than a sound means to tell true miRNAs from other genomic stem-loops. In this thesis, we present a strategy to sieve through the vast amount of stem-loops found in metazoan genomes in search of pre-miRNAs, significantly reducing the set of candidates while retaining most known miRNA precursors. Our approach relies on precursor properties derived from the current knowledge of miRNA biogenesis, analysis of the precursor structure and incorporation of information about the transcription potential of each candidate. Our approach has been applied to the genomes of Drosophila melanogaster and Anopheles gambiae, which has allowed us to show that there is a strong bias amongst annotated pre-miRNAs towards robust stem-loops in these genomes and to propose a scoring scheme for precursor candidates which combines four robustness measures. Additionally, we have identified several known pre-miRNA homologs in the newly-sequenced Anopheles darlingi and shown that most are found amongst the top-scoring precursor candidates for that organism, with respect to the combined score. The structural analysis of our candidates and the identification of the region of the structural space where known precursors are usually found allowed us to eliminate several candidates, but also showed that there is a staggering number of genomic stem-loops which seem to fulfil the stability, robustness and structural requirements, indicating that additional evidence is needed to identify functional precursors. To this effect, we have introduced different strategies to evaluate the transcription potential of the remaining candidates, which vary according to the information which is available for the dataset under study.
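The abstract mentions a combined score over four robustness measures without specifying the combination. A very simple way to merge heterogeneous measures, shown purely as an illustration (the example measure names and the z-score sum are assumptions, not the thesis' scheme), is:

```python
import numpy as np

def combined_score(measures):
    """measures: array of shape (n_candidates, 4), one column per
    robustness measure (e.g. folding-energy z-score, robustness of the
    pairing to point mutations, ensemble diversity, stability against
    sequence shuffling -- placeholder names).  Columns are standardized
    and summed so no single measure dominates because of its scale."""
    m = np.asarray(measures, dtype=float)
    z = (m - m.mean(axis=0)) / (m.std(axis=0) + 1e-12)
    return z.sum(axis=1)

# rank stem-loop candidates by the combined score, best first
scores = combined_score(np.random.default_rng(0).random((1000, 4)))
ranking = np.argsort(scores)[::-1]
```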
188

Truth Commissions: Did the South African Truth and Reconciliation Commission serve the purpose for which it was established?

Abduroaf, Muneer January 2010 (has links)
<p>Since the 1980&rsquo / s, many dictatorships around the world have been replaced by new democracies. These old dictatorships were notorious for their human rights abuses. Many people were killed and tortured / and many others were disappeared. When the new governments came into power, they had to confront these injustices that were perpetrated under the predecessor regime. This was necessary to create a culture of human rights / promote a respect for the law and access to justice. Many confronted these injustices in different ways, some granted amnesty, some prosecuted and others instituted truth commissions. This research paper focuses on truth commissions. The research focuses particularly on the study of the South African Truth Commission. The mandate of the South African Truth Commission is analysed and the investigation into whether the commission served the purpose for which it had been established is discussed.</p>
189

The Need for Post-conflict Investigatory Mechanisms in the R2P Doctrine

Navaratnam, Kubes 12 January 2011 (has links)
In the wake of atrocities arising from internal armed conflicts in the 1990s, the International Commission on Intervention and State Sovereignty introduced the Responsibility to Protect doctrine (“R2P”) as a solution to reconcile the notion of state sovereignty with the need to protect citizens. The lack of available protection for internal armed conflicts and the subsequent evolution of the humanitarian intervention debate facilitated the unanimous acceptance of R2P’s fundamental principles by all UN member states. This paper examines the development of the R2P doctrine and its current status as customary law. By identifying its inadequacies, the paper raises questions of the doctrine’s viability in fulfilling the emerging norm of the collective responsibility to protect. In order to remedy these shortfalls and ensure the doctrine’s effectiveness, the paper argues the need to incorporate post-conflict investigatory mechanisms into the R2P.
