  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
161

Etude de l'influence de la porosité sur les propriétés électriques de dépôts réalisés par projection plasma

Beauvais, Sébastien 08 July 2003 (has links) (PDF)
Alumina is widely used for its electrical insulating properties and its high chemical stability. Its use as an insulating coating is envisaged for improving geological probes. The flexibility of the plasma spray process made it possible to obtain coatings exhibiting a wide range of porosities with varied proportions and morphologies. The coating is built up by the stacking of molten droplets; as these spread and solidify into lamellae, they generate an interconnected, anisotropic, three-dimensional porosity network. This network is difficult to observe and characterize. It comprises globular pores, inter-lamellar cracks, and intra-lamellar cracks. Their microstructural characterization was carried out by image analysis of coating cross-sections. Six coatings were selected for electrical measurements. These very particular, defect-rich microstructures cause a more or less pronounced degradation of the electrical properties compared with bulk alumina. The measurements showed that the porosity, mainly through the intra-lamellar cracks, forms "through-thickness" channels connecting the substrate to the coating surface. Impedance spectroscopy measurements revealed that, for all coatings, an immersed liquid managed to reach the substrate and initiate a corrosion reaction at the bottom of the pores. Finally, the Scanning Electron Microscopy Mirror Effect method, which consists in irradiating a material with an electron gun, demonstrated that, depending on their orientation, the cracks act either as preferred paths or as obstacles for charge carriers within the material. Since this "through-thickness" porosity is due to its high degree of interconnection, a three-dimensional simulation of the microstructure and porosity was developed. It proceeds by successive stacking of lamellae, incorporating pores and cracks at random. To this end, lamellae spread onto polished, preheated alumina substrates were observed and characterized; acquiring their volumes by confocal microscopy made it possible to model them. The probabilities of defect occurrence were determined from observations of coating cross-sections. This approach led to the creation of 3D images of the real coating. From these images, after appropriate meshing, finite element calculations revealed an anisotropy of the electrical properties directly related to that of the microstructure. This simulation, coupled with finite element calculation, appears very promising for understanding the microstructure/property relationships of plasma-sprayed coatings.
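The stacking simulation described in the abstract can be caricatured in two dimensions. The sketch below is purely illustrative (the grid size, layer count, and defect probabilities are invented, not the thesis's fitted values): lamellae are stacked as flat layers, globular pores and inter-lamellar cracks are inserted at random, and the overall porosity fraction is then computed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D analogue of the 3-D splat-stacking simulation: lamellae are flat
# layers on a grid; pores and inter-lamellar cracks are inserted at random.
# All sizes and probabilities below are illustrative assumptions.
WIDTH, N_LAMELLAE, LAMELLA_THICKNESS = 200, 40, 3
P_PORE, P_CRACK = 0.02, 0.15

height = N_LAMELLAE * LAMELLA_THICKNESS
solid = np.ones((height, WIDTH), dtype=bool)  # True = solid alumina

for i in range(N_LAMELLAE):
    z = i * LAMELLA_THICKNESS
    # globular pores punched through the full lamella thickness
    pores = rng.random(WIDTH) < P_PORE
    solid[z:z + LAMELLA_THICKNESS, pores] = False
    # inter-lamellar cracks along the bottom interface of the lamella
    cracks = rng.random(WIDTH) < P_CRACK
    solid[z, cracks] = False

porosity = 1.0 - solid.mean()
print(f"simulated porosity fraction: {porosity:.3f}")
```

A real version would replace the flat layers with the confocal-microscopy splat models and feed the resulting 3D image to a finite element mesh.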
162

Schemas for safe and efficient XML processing

Colazzo, Dario 08 September 2011 (has links) (PDF)
This Habilitation à Diriger des Recherches manuscript presents results I have obtained in research carried out since 2005 as Maître de Conférences at Université Paris-Sud XI. At the start of this period, XML (eXtensible Markup Language) was already recognized as the standard for representing semi-structured data; at the same time, XML had also established itself as the representation format for data integration and exchange. During this period my research interests have lain at the confluence of database languages and programming languages, focusing on the use of type systems to ensure the safety and optimization of programs that manipulate XML data. More specifically, I have mainly worked on three research directions: i) optimization of XML queries and updates via data projection, ii) verification of the correctness of mappings between two XML schemas, and iii) efficient algorithms for checking inclusion between XML schemas (a property underlying type systems for XML queries and updates). This manuscript is devoted to these three research directions, presenting the context, motivations, and results obtained for each.
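As a rough illustration of the first research direction (query and update optimization via data projection), the sketch below prunes an XML tree down to the tags a query actually touches before evaluation, so only a small projection of the document needs to be kept in memory. The document, tag names, and `needed` set are invented for the example and do not reproduce the author's type-based algorithm.

```python
import xml.etree.ElementTree as ET

def project(elem, needed):
    """Keep elem's children only if their tag is in `needed`; recurse."""
    for child in list(elem):          # list() so removal is safe mid-loop
        if child.tag in needed:
            project(child, needed)
        else:
            elem.remove(child)
    return elem

doc = ET.fromstring(
    "<library><book><title>T1</title><price>10</price></book>"
    "<cd><title>C1</title></cd></library>"
)
# Suppose static analysis shows the query only touches /library/book/title:
projected = project(doc, needed={"book", "title"})
print(ET.tostring(projected, encoding="unicode"))
```

A real projector would derive `needed` automatically from the query and the schema rather than hard-coding it.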
163

Traffic Concurrency Management Through Delay and Safety Mitigations

Chimba, Deo 18 April 2008 (has links)
Travelers experience a range of transportation-related problems on roadways, including congestion, delay, and crashes, which are partially due to growing background traffic and to traffic generated by new developments. To mitigate congestion, metropolitan planning organizations (MPOs) pursue a variety of plans, which include, among other measures, imposing impact fees. The current research evaluates how delay and safety can be incorporated into the mitigation process as special impact fees. This study also evaluates the traffic projection methodologies used in traffic impact studies. Traffic volume is a critical factor in determining both current and future highway operations, desired and undesired. Highway crashes are also influenced by traffic volume, as a higher frequency of crashes is expected at more congested locations and vice versa. Accurately forecasted traffic data are required for future planning, traffic operations, safety evaluation, and countermeasures. Given the importance of accurate traffic projection, this study introduces a simple methodology for small-scale projection that uses a three-parameter logistic function as the forecasting tool. The three-parameter logistic function produced more accurate future traffic predictions than the other functions considered. In validation studies, the coefficient of correlation was above 90 percent at every location, and the t-values for the three parameters were highly significant. Confidence intervals were calculated at the 95 percent confidence level using the delta method to address uncertainty and reliability in the logistic projection. A delay mitigation fee resulting from increases in travel time is also analyzed in this research. In regular traffic flow, the posted speed limit is the basis for measuring travel time within a road segment.
The economic concept of congestion pricing is used to evaluate the impact of this travel time delay per unit trip. If the relationship between the increase in travel time and the number of trips is known, the developer can be charged for the cost of the time delays imposed on travelers. The congestion pricing approach determines the average and marginal effects of travel time. With known values of time, vehicle occupancy, and number of travel days per year, the extra cost per trip caused by additional trips is estimated. This cost becomes part of the mitigation fee that the developer incurs for the travel time delays caused by the development project. Using the Bureau of Public Roads (BPR) travel time function and parameters found in the 2000 HCM (Highway Capacity Manual), the average and marginal travel times were determined. The value of time was taken as $7.50 per hour after reviewing publications that relate it to the minimum wage. Vehicle occupancy is assumed to be 1.2 persons per vehicle; other assumptions include 261 working days per year and a 4 percent rate of return. The total delay impact fee depends on the number of years needed for the development to take effect. Since the developer is already charged a road impact fee covering construction costs for the road improvement, the delay mitigation fee should be credited against the road impact fee to avoid double charging the developer. To incorporate safety into mitigation fees, the study developed a crash prediction model in which all factors significantly influencing crash occurrence are considered and modeled. The negative binomial (NB) model was selected as the best crash modeling distribution among the generalized linear models considered. The safety component of the mitigation fee equation covers scenarios in which the proposed new development is expected to increase crash frequency.
The mitigation fee equation is designed to incorporate the roadway features and traffic characteristics generated by the new development that influence crash occurrence. Crash reduction factors are introduced and incorporated into the safety mitigation fee equation. The difference between crash frequency before and after the development is multiplied by the crash cost and then divided by the number of trips to obtain the crash cost per trip. The crash cost is taken as $28,000 per crash based on the literature review. To avoid double charging the developer, either the road impact fee is applied as a credit to the delay mitigation fee or vice versa. In summary, this study contributed the following to researchers and practitioners:
  • Developed the logistic function as a simplified approach for traffic projection
  • Developed a crash model for crash prediction
  • Developed a safety mitigation fee equation utilizing the crash model
  • Developed a delay mitigation fee equation using the congestion pricing approach
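The congestion-pricing arithmetic in this abstract can be sketched directly from the quantities it gives ($7.50/hour value of time, 1.2 persons per vehicle). The BPR coefficients (0.15, 4) are the commonly used defaults, and the link demand figures are invented, so this is an illustrative sketch rather than the study's actual calculation.

```python
# Delay-fee sketch using the BPR travel time function and congestion pricing.
VALUE_OF_TIME = 7.50   # $/hour (from the abstract)
OCCUPANCY = 1.2        # persons per vehicle (from the abstract)
ALPHA, BETA = 0.15, 4  # common BPR defaults (assumed, not from the study)

def bpr_time(volume, capacity, t0):
    """Average travel time per vehicle (hours) on a link."""
    return t0 * (1.0 + ALPHA * (volume / capacity) ** BETA)

def marginal_time(volume, capacity, t0):
    """d(total time)/d(volume): time one extra vehicle imposes on the link."""
    avg = bpr_time(volume, capacity, t0)
    return avg + t0 * ALPHA * BETA * (volume / capacity) ** BETA

# Hypothetical link: free-flow time 0.1 h, capacity 1800 veh/h, flow 1500 veh/h
v, c, t0 = 1500.0, 1800.0, 0.1
externality_h = marginal_time(v, c, t0) - bpr_time(v, c, t0)  # hours/extra veh
fee_per_trip = externality_h * VALUE_OF_TIME * OCCUPANCY
print(f"delay externality: {externality_h:.4f} h, fee per trip: ${fee_per_trip:.2f}")
```

The marginal-minus-average difference is the uninternalized delay an extra trip imposes on everyone else; multiplying by value of time and occupancy converts it into a per-trip dollar amount.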
164

Focus presuppositions

Abusch, Dorit January 2007 (has links)
This paper reviews notions related to focus and presupposition and addresses the hypothesis that focus triggers an existential presupposition. Presupposition projection behavior in certain examples appears to favor a presuppositional analysis of focus. It is argued that these examples are open to a different analysis using givenness theory. Overall, the analysis favors a weak semantics for focus not including an existential presupposition.
165

The Multilingual Forest : Investigating High-quality Parallel Corpus Development

Adesam, Yvonne January 2012 (has links)
This thesis explores the development of parallel treebanks: collections of language data consisting of texts and their translations, with syntactic annotation and alignment linking words, phrases, and sentences to show translation equivalence. We describe the semi-manual annotation of the SMULTRON parallel treebank, consisting of 1,000 sentences in English, German, and Swedish. This description is the starting point for answering the first of two questions in this thesis: what issues need to be considered to achieve a high-quality, consistent, parallel treebank? The units of annotation and the choice of annotation schemes are crucial for quality, and some automated processing is necessary to increase the size. Automatic quality checks and evaluation are essential, but manual quality control is still needed to achieve high quality. Additionally, we explore improving the automatically created annotation for one language using information available from the annotation of the other languages. This leads us to the second of the two questions in this thesis: can we improve automatic annotation by projecting information available in the other languages? Experiments with automatic alignment, projected from two language pairs, L1–L2 and L1–L3, onto the third pair, L2–L3, show an improvement in precision, in particular if the projected alignment is intersected with the system alignment. We also construct a test collection for experiments on annotation projection to resolve prepositional phrase attachment ambiguities. While majority vote projection improves the annotation compared to the basic automatic annotation, using linguistic clues to correct the annotation before majority vote projection is even better, although more laborious. However, some structural errors cannot be corrected by projection at all, as different languages use different wording, and thus different structures.
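The projection-and-intersection step described above can be sketched with toy alignments represented as sets of token-index pairs. The alignments below are invented for illustration; real ones would come from an automatic word aligner.

```python
# Project word alignments for L1-L2 and L1-L3 through shared L1 tokens to
# obtain an L2-L3 alignment, then intersect with a system-produced L2-L3
# alignment to trade recall for precision.

def project_alignment(a12, a13):
    """Compose L1-L2 and L1-L3 links through shared L1 token indices."""
    return {(j, k) for (i1, j) in a12 for (i2, k) in a13 if i1 == i2}

a12 = {(0, 0), (1, 1), (2, 3)}          # L1 token -> L2 token
a13 = {(0, 0), (1, 2), (2, 3)}          # L1 token -> L3 token
system_a23 = {(0, 0), (1, 2), (3, 1)}   # hypothetical system alignment

projected = project_alignment(a12, a13)   # {(0, 0), (1, 2), (3, 3)}
high_precision = projected & system_a23   # {(0, 0), (1, 2)}
print(projected, high_precision)
```

Intersecting keeps only links supported by both the projection and the system aligner, which is why precision improves at the cost of recall.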
166

Performance Projections of HPC Applications on Chip Multiprocessor (CMP) Based Systems

Shawky Sharkawi, Sameh Sh May 2011 (has links)
Performance projections of High Performance Computing (HPC) applications onto various hardware platforms are important for hardware vendors and HPC users. The projections aid hardware vendors in the design of future systems and help HPC users with system procurement and application refinement. In this dissertation, we present an efficient method to project the performance of HPC applications onto Chip Multiprocessor (CMP) based systems using widely available standard benchmark data. The main advantage of this method is its use of published data about the target machine; the target machine need not be available. With the current trend in HPC platforms shifting toward cluster systems with chip multiprocessors (CMPs), efficient and accurate performance projection becomes a challenging task. Typically, CMP-based systems are configured hierarchically, which significantly impacts the performance of HPC applications. The goal of this research is to develop an efficient method to project the performance of HPC applications onto systems that utilize CMPs. For efficiency, our projection methodology is automated (projections are done using a tool) and fast (with small overhead). Our method, called the surrogate-based workload application projection method, utilizes surrogate benchmarks to project an HPC application's performance on target systems, with the computation component of the application projected separately from the communication component. Our methodology was validated with high accuracy and efficiency on a variety of systems utilizing different processor and interconnect architectures. The average projection error on three target systems was 11.22 percent, with a standard deviation of 1.18 percent, for twelve HPC workloads.
167

3-D Reconstruction from Single Projections, with Applications to Astronomical Images

Cormier, Michael January 2013 (has links)
A variety of techniques exist for three-dimensional reconstruction when multiple views are available, but less attention has been given to reconstruction when only a single view is available. Such a situation is normal in astronomy, where a galaxy (for example) is so distant that it is impossible to obtain views from significantly different angles. In this thesis I examine the problem of reconstructing the three-dimensional structure of a galaxy from this single viewpoint. I accomplish this by taking advantage of the image formation process, symmetry relationships, and other structural assumptions that may be made about galaxies. Most galaxies are approximately symmetric in some way. Frequently, this symmetry corresponds to symmetry about an axis of rotation, which allows strong statements to be made about the relationships between the luminosities at different points in the galaxy. It is through these relationships that the number of unknown values needed to describe the structure of the galaxy can be reduced to the number of constraints provided by the image, so that the optimal reconstruction is well-defined. Other structural properties can also be described under this framework. I provide a mathematical framework and analyses that prove the uniqueness of solutions under certain conditions and show how uncertainty may be precisely and explicitly expressed. Empirical results are shown using real and synthetic data. I also compare against a state-of-the-art two-dimensional modelling technique to demonstrate the contrasts between the two frameworks and the important advantages of the three-dimensional approach. In combination, the theoretical and experimental aspects of this thesis demonstrate that the proposed framework is versatile, practical, and novel: a contribution to both computer science and astronomy.
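The counting argument above (symmetry reduces the unknowns to match the image constraints) can be illustrated with a deliberately simplified linear model. The sketch below uses spherical rather than axial symmetry and a crude shell/chord projection matrix, so it is a toy instance of the idea, not the thesis's framework: emissivity on N concentric shells gives N unknowns, the single image gives N radial brightness samples, and the reconstruction is a well-posed linear system.

```python
import numpy as np

N = 20
r = np.arange(1, N + 1, dtype=float)           # shell outer radii

# P[i, j] = path length of the line of sight at impact parameter b_i
# through shell j (between radii r_{j-1} and r_j); brightness = P @ emissivity.
P = np.zeros((N, N))
for i in range(N):
    b = r[i] - 0.5                              # impact parameter of sample i
    for j in range(i, N):
        outer = 2.0 * np.sqrt(r[j] ** 2 - b ** 2)
        inner = 2.0 * np.sqrt(r[j - 1] ** 2 - b ** 2) if j > i else 0.0
        P[i, j] = outer - inner

true_emissivity = np.exp(-r / 5.0)              # assumed radial profile
image = P @ true_emissivity                     # the single observed view
recovered, *_ = np.linalg.lstsq(P, image, rcond=None)
print("max reconstruction error:", np.abs(recovered - true_emissivity).max())
```

Because P is square and invertible here, the single view determines the profile exactly; with noise or weaker symmetry one would instead quantify the uncertainty of the least-squares solution.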
168

Reconstruction and Visualization of Polyhedra Using Projections

Hasan, Masud January 2005 (has links)
Two types of problems are studied in this thesis: reconstruction and visualization of polygons and polyhedra.

Three problems are considered in the reconstruction of polygons and polyhedra from a set of projection characteristics. The first problem is to reconstruct a closed convex polygon (polyhedron) given the number of visible edges (faces) from each direction in a set S. The main results for this problem include necessary and sufficient conditions for the existence of a polygon that realizes the projections. This characterization yields an algorithm to construct a feasible polygon when one exists. The other main result is an algorithm to find the maximum and minimum size of a feasible polygon for the given set S. Some special cases for non-convex polygons and for perspective projections are also studied.

For reconstruction of polyhedra, it is shown that when the projection directions are co-planar, a feasible polyhedron (i.e., a polyhedron satisfying the projection properties) can be constructed from a feasible polygon, and vice versa. When the directions are covered by two planes, and the number of visible faces from each direction is at least four, an algorithm is presented to decide the existence of a feasible polyhedron and to construct one when it exists. When the directions see an arbitrary number of faces, the same algorithm works, except for one particular sub-case.

A polyhedron is, in general, called equiprojective if from any direction the size of the projection or of the projection boundary is fixed, where "size" means the number of vertices, edges, or faces. A special reconstruction problem is to find all equiprojective polyhedra. For the case where the size is the number of vertices on the projection boundary, the main results include a characterization of all equiprojective polyhedra, an algorithm to recognize them, and the identification of the minimum equiprojective polyhedra. Other measures of equiprojectivity are also studied.

Finally, the problem of efficiently visualizing polyhedra under given constraints is considered. A user might wish to find a projection that highlights certain properties of a polyhedron. In particular, the problem considered is: given a set of vertices, edges, and/or faces of a convex polyhedron, determine all projections of the polyhedron such that the elements of the given set are on the projection boundary. The results include efficient algorithms for both perspective and orthogonal projections, and an improved adaptive algorithm for the case where only edges are given and they form disjoint paths. The related problem of finding all projections where the given edges, faces, and/or vertices are not on the projection boundary is also studied.
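The projection characteristic these reconstruction problems build on can be sketched for orthogonal projection of a convex polyhedron: a face is visible from viewing direction d exactly when its outward normal has a positive dot product with d. The unit cube below is just a convenient example polyhedron.

```python
import numpy as np

# Outward unit normals of the six faces of an axis-aligned cube.
FACE_NORMALS = np.array([
    [1, 0, 0], [-1, 0, 0],
    [0, 1, 0], [0, -1, 0],
    [0, 0, 1], [0, 0, -1],
], dtype=float)

def visible_faces(direction):
    """Number of faces visible under orthogonal projection along -direction."""
    d = np.asarray(direction, dtype=float)
    return int(np.sum(FACE_NORMALS @ d > 1e-9))  # strict positivity: front-facing

print(visible_faces([1, 0, 0]))   # along +x: 1 face
print(visible_faces([1, 1, 0]))   # along a face diagonal: 2 faces
print(visible_faces([1, 1, 1]))   # along a body diagonal: 3 faces
```

The reconstruction problems run this in reverse: given such counts for each direction in S, decide whether some polyhedron realizes them.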
169

The Analysis of Long-run Real Exchange Rate in Japan

Liu, Ya-chun 26 July 2010 (has links)
Purchasing Power Parity (PPP) has been regarded as the most important theory for explaining exchange rate movements based on the relative price levels of two countries. After 1973, more and more countries adopted floating exchange rate systems, and the real exchange rate tests as a non-stationary time series, suggesting that real factors affect the real exchange rate. In this article, we study how these possible factors change the real exchange rate and use the local projection methods of Wu et al. (2008) and Lee (2010) to estimate impulse responses for non-stationary time series with cointegration vectors. We then compare the impulse responses from a conventional VAR with those from local projection. The empirical model is the same as in Zhou (1995) and Wang and Dunne (2003), and the data follow Wang and Dunne (2003). Finally, our conclusions are consistent with Wu et al. (2008), Zhou (1995), and Wang and Dunne (2003).
170

The Impulse Response Analysis of General Inference on Cointegration Vector for Non-Stationary Process by Local Projection

Lin, Meng-wei 26 July 2010 (has links)
Jorda (2005) proposed a new method for estimating impulse response functions by local projection. Local projection avoids the misspecification problem; that is, local projections are robust to misspecification of the data generating process (DGP). Wu, Lee, and Wang (2008) extended Jorda's local projection from stationary I(0) time series to non-stationary I(1) time series, making local projection a more generally applicable method for macroeconomics. In this article, I relax the assumption, made in Wu, Lee, and Wang (2008) and Lee (2010), that the cointegration vector is known. From the inference of Johansen (1995), the estimator β̂ of the cointegration vector β is super-consistent. Using this property and OLS to estimate the impulse response functions, I show that, asymptotically, the impulse response coefficients are consistent whether the cointegration vector is assumed known or estimated by Johansen's MLE.
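For the stationary I(0) case, the local projection estimator is simple enough to sketch: for each horizon h, regress y_{t+h} on the period-t shock and read the impulse response off the slope. The simulation below is illustrative only; it does not implement the cointegrated I(1) extension that is the subject of this thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
T, RHO = 5000, 0.8
e = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = RHO * y[t - 1] + e[t]      # AR(1): true impulse response at h is RHO**h

def irf_local_projection(y, e, h):
    """Jorda-style local projection: OLS slope of y_{t+h} on e_t (with intercept)."""
    n = len(y)
    X = np.column_stack([np.ones(n - h), e[: n - h]])
    beta, *_ = np.linalg.lstsq(X, y[h:], rcond=None)
    return beta[1]

for h in range(4):
    print(f"h={h}: LP estimate {irf_local_projection(y, e, h):+.3f}, "
          f"true {RHO ** h:+.3f}")
```

Because the shock e_t is independent of the past, the horizon-h slope converges to the true response RHO**h; no single dynamic model has to be specified across horizons, which is the robustness property cited above.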
