231

Simulating urban growth for Baltimore-Washington metropolitan area by coupling SLEUTH model and population projection

Zhao, Suwen 18 June 2015 (has links)
This study used two modelling approaches to predict the future urban landscape of the Baltimore-Washington metropolitan area. In the first approach, we implemented the traditional SLEUTH urban simulation model using publicly available and locally developed land cover and transportation data. Historical land cover data from 1996, 2001, 2006, and 2011 were used to calibrate the SLEUTH model and predict urban growth from 2011 to 2070. The SLEUTH model achieved an overall accuracy of 94.9% for the validation year 2014. For the second modelling approach, we predicted future county-level population (e.g., for 2050) using historical population data and time-series forecasting. We then used the 2050 population projection, aided by a strong population-imperviousness statistical relationship (R² of 0.78-0.86), to predict the total impervious surface area for each county. These population-predicted total impervious surface areas were compared to SLEUTH model output at the county-aggregated spatial scale. For most counties, SLEUTH generated a substantially higher number of impervious pixels. The annual urban growth rate of 6.24% for the SLEUTH model was much higher than that of the population-based approach (1.33%), indicating a large discrepancy between the two modelling approaches. The SLEUTH simulation model, although it achieved high accuracy in the 2014 validation, may therefore have over-predicted urban growth for our study area. For the population-predicted impervious surface area, we further developed a lookup table approach to integrate SLEUTH output and generate a spatially explicit urban map for 2050. This lookup table approach has high potential to integrate population-predicted and SLEUTH-predicted urban landscapes, especially when future population can be predicted with reasonable accuracy. / Master of Science
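The population-based approach pairs a county-level time-series forecast with a population-imperviousness regression. A minimal sketch of that idea, with made-up numbers and an assumed linear model for both steps (the abstract does not state the functional forms or data used), might look like:

```python
import numpy as np

# Hypothetical historical data for one county (illustrative values only).
years = np.array([1996, 2001, 2006, 2011])
population = np.array([710_000, 745_000, 782_000, 815_000])
impervious_km2 = np.array([155.0, 166.0, 178.0, 189.0])

# 1) Time-series forecast of population: a simple linear trend stands in
#    for whatever forecasting model the thesis actually used.
pop_trend = np.polyfit(years, population, deg=1)
pop_2050 = np.polyval(pop_trend, 2050)

# 2) Population-imperviousness relationship (assumed linear here; the
#    thesis reports R² of 0.78-0.86 for its fitted relationship).
imp_model = np.polyfit(population, impervious_km2, deg=1)
imp_2050 = np.polyval(imp_model, pop_2050)

print(f"Projected 2050 population: {pop_2050:,.0f}")
print(f"Population-predicted impervious area in 2050: {imp_2050:.1f} km^2")
```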
232

Architecture as model and projection

Bohlmeyer, John M. 21 October 2014 (has links)
This thesis begins as a study for an observatory and ends as a place to simply look upward. Architecture is explored through questions of model and projection. These questions become more and more discernable as the model is built and photographed. The model is an analogue to a potential building. The drawings reveal the projective nature of architectural form. The photographs of the model capture surfaces as elements of architecture. At this point, the work is free to pose its own questions. / Master of Architecture
233

Design and Fabrication of a Mask Projection Microstereolithography System for the Characterization and Processing of Novel Photopolymer Resins

Lambert, Philip Michael 17 September 2014 (has links)
The goal of this work was to design and build a mask projection microstereolithography (MPμSL) 3D printing system to characterize, process, and quantify the performance of novel photopolymers. MPμSL is an Additive Manufacturing process that uses digital light processing (DLP) technology to digitally pattern UV light, selectively cure entire layers of photopolymer resin, and fabricate a three-dimensional part. For the MPμSL system designed in this body of work, a process was defined to introduce novel photopolymers and characterize their performance. The characterization process first determines the curing characteristics of the photopolymer, namely the critical exposure (Ec) and depth of penetration (Dp). Performance of the photopolymer is then quantified via the fabrication of a benchmark test part, designed to determine the minimum feature size, XY-plane accuracy, Z-axis minimum feature size, and Z-axis accuracy of each photopolymer with the system. The first characterized photopolymer was poly(propylene glycol) diacrylate, which was used to benchmark the designed MPμSL system. This included the achievable XY resolution (212 micrometers), minimum layer thickness (20 micrometers), vertical build rate (360 layers/hr), and maximum build volume (6 × 8 × 36 mm³). This system benchmarking revealed two areas of underperformance compared with systems of similar design, which led to the first two research questions: (i) 'How does minimum feature size vary with exposure energy?' and (ii) 'How does Z-axis accuracy vary with increasing Tinuvin 400 concentration in the prepolymer?' The experiment for research question (i) revealed that achievable feature size decreases by 67% with a 420% increase in exposure energy. Introducing 0.25 wt% of the photo-inhibitor Tinuvin 400 reduced the depth of penetration from 398.5 micrometers to 119.7 micrometers, corresponding to a decrease in Z-axis error from 119% (no Tinuvin 400) to 9% (0.25 wt% Tinuvin 400). Two novel photopolymers were then introduced to the system and characterized. Research question (iii) asks 'What are the curing characteristics of Pluronic L-31 and how does it perform in the MPμSL system?' while research question (iv) similarly asks 'What are the curing characteristics of phosphonium ionic liquid (PIL) and how does it perform in the MPμSL system?' The Pluronic L-31 with 2 wt% photo-initiator had an Ec of 17.2 mJ/cm² and a Dp of 288.8 micrometers, with a minimum feature size of 57.3 ± 5.7 micrometers, an XY-plane error of 6%, and a Z-axis error of 83%. Phosphonium ionic liquid was mixed in various concentrations into two base polymers, butyl diacrylate (BDA; 0% PIL and 10% PIL) and poly(ethylene glycol) dimethacrylate (PEGDMA; 5% PIL, 15% PIL, 25% PIL). Introducing PIL into either base polymer caused the Ec to increase in all samples, while there was no significant trend between increasing PIL concentration in either PEGDMA or BDA and depth of penetration. Trends previously identified between penetration depth and Z-axis accuracy do not appear to carry over from one resin to another, meaning that, across resins, depth of penetration is not an accurate predictor of the Z-axis accuracy of a part. Furthermore, increasing concentrations of PIL caused increasing percent error in both XY-plane and Z-axis accuracy. / Master of Science
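The critical exposure (Ec) and depth of penetration (Dp) mentioned above are the two parameters of the standard working-curve characterization used in stereolithography. The abstract does not spell the relationship out, but it is conventionally written as the Jacobs working-curve equation, sketched here as background rather than quoted from the thesis:

```latex
% Jacobs working-curve equation (standard stereolithography relation;
% not quoted from the thesis itself): the cured depth C_d produced by an
% exposure dose E follows
\[
  C_d = D_p \, \ln\!\left(\frac{E}{E_c}\right), \qquad E \ge E_c .
\]
% A smaller D_p (e.g., after adding a photo-absorber such as Tinuvin 400)
% gives a thinner cured depth for the same exposure E, which is consistent
% with the reported improvement in Z-axis accuracy.
```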
234

Incompressible SPH method for simulating Newtonian and non-Newtonian flows with a free surface.

Shao, Songdong, Lo, E.Y.M. January 2003 (has links)
An incompressible smoothed particle hydrodynamics (SPH) method is presented to simulate Newtonian and non-Newtonian flows with free surfaces. The basic equations solved are the incompressible mass conservation and Navier–Stokes equations. The method uses prediction–correction fractional steps with the temporal velocity field integrated forward in time without enforcing incompressibility in the prediction step. The resulting deviation of particle density is then implicitly projected onto a divergence-free space to satisfy incompressibility through a pressure Poisson equation derived from an approximate pressure projection. Various SPH formulations are employed in the discretization of the relevant gradient, divergence and Laplacian terms. Free surfaces are identified by the particles whose density is below a set point. Wall boundaries are represented by particles whose positions are fixed. The SPH formulation is also extended to non-Newtonian flows and demonstrated using the Cross rheological model. The incompressible SPH method is tested by typical 2-D dam-break problems in which both water and fluid mud are considered. The computations are in good agreement with available experimental data. The different flow features between Newtonian and non-Newtonian flows after the dam-break are discussed.
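The projection step summarized above, in which the density deviation drives a pressure Poisson equation, can be sketched as follows. This is the generic two-step density-invariant ISPH form, given for illustration and consistent with, but not quoted verbatim from, the abstract:

```latex
% Prediction: advance the velocity with gravity and the (possibly
% shear-rate-dependent) viscous stress tau, without the pressure term:
\[
  \mathbf{u}^{*} = \mathbf{u}^{n}
    + \Delta t \left( \mathbf{g}
    + \tfrac{1}{\rho}\,\nabla\!\cdot\boldsymbol{\tau}^{\,n} \right).
\]
% Correction: the deviation of the intermediate particle density rho^*
% from the reference density rho_0 drives a pressure Poisson equation,
% whose solution projects the velocity back onto a divergence-free field:
\[
  \nabla\!\cdot\!\left( \frac{1}{\rho^{*}} \nabla p^{\,n+1} \right)
    = \frac{\rho_{0} - \rho^{*}}{\rho_{0}\,\Delta t^{2}},
  \qquad
  \mathbf{u}^{n+1} = \mathbf{u}^{*}
    - \frac{\Delta t}{\rho^{*}} \nabla p^{\,n+1}.
\]
```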
235

Vers un système de projection icosaédral hiérarchique global sans distorsions pour cartographie Web / Towards a global hierarchical distortion-free icosahedral projection system for Web mapping

Benamrani, Noureddine 20 April 2018 (has links)
Les systèmes de projection cartographique adaptés aux services de cartographie Web suscitent encore de nombreuses questions de recherche. La majorité des services de cartographie Web (ex. Google Maps, Bing Maps) utilise la Projection Web Mercator (WMP), mais cette dernière présente de grandes distorsions systématiques, notamment dans les régions nordiques. Il nous est alors paru nécessaire de développer une autre méthode permettant de projeter la surface du globe avec un minimum de distorsion. Notre approche s'inspire de la projection myriahedrale, qui suppose que chaque face du myriahedron est suffisamment petite de telle sorte que les distorsions soient négligeables. La méthode proposée consiste à explorer une nouvelle approche de tessellation de la surface du globe, qui permet de projeter celle-ci sur les faces de la tessellation et à plusieurs niveaux de détails. Cela permet de compenser la faiblesse des méthodes de tessellation existantes utilisées dans la cartographie Web. Cette tessellation utilise un icosaèdre comme modèle géométrique de base, avec une densité de partitionnement des faces de l'icosaèdre entre les niveaux de récursivité égale à 4. La méthodologie proposée consiste en quatre étapes successives : a) la construction d'une structure hiérarchique qui résulte de la subdivision récursive des faces de l'icosaèdre ; b) la définition d'un système de projection approprié à la tessellation icosaédrique ; c) la projection des données géospatiales de la sphère terrestre sur chaque face de l'icosaèdre ; d) le dépliage de la tessellation icosaédrique résultante sur un plan en utilisant des algorithmes de calcul du plus court chemin afin de maintenir le voisinage autour du point d'intérêt. Nous présenterons les étapes de développement et d'implémentation du système proposé et les résultats obtenus dans le cadre de ce projet de recherche. L'étude comparative avec d'autres systèmes de projection montre que notre approche minimise mieux les déformations partout sur le globe et surtout dans les régions nordiques. / Map projection systems adapted to web mapping services still raise many research questions. The majority of web mapping services (e.g., Google Maps, Bing Maps) use the Web Mercator Projection (WMP), which introduces large systematic distortions in spatial data, especially in polar regions. It is therefore necessary to develop an alternative method for projecting these regions with minimal distortion. Our approach is inspired by myriahedral projections, which assume that each face of a myriahedron is small enough that the distortions are negligible. In this research work we propose a new approach for the tessellation of the surface of the globe and for the projection of spatial information from the globe onto the faces of the tessellation at different levels of detail. This compensates for the weakness of the tessellation methods used in existing web mapping systems. The proposed tessellation is created from an icosahedron with a partitioning density of the faces equal to 4.
The proposed methodology consists of four stages: a) constructing a hierarchical structure resulting from the recursive subdivision of the faces of the icosahedron, while maintaining topological relationships between the triangles at each level of detail; b) defining a projection system appropriate to the icosahedral tessellation; c) projecting geospatial data of the terrestrial sphere onto each face of the icosahedron; d) unfolding the resulting icosahedral tessellation onto a plane around a point of interest, using shortest-path algorithms to maintain the neighbourhood around that point. We present the different stages of development and implementation of the proposed system and the results obtained in the framework of this research project. The comparative study with other projection systems shows that our approach better minimizes distortions everywhere on the globe, and especially in polar regions.
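The recursive subdivision in stage (a), with a partitioning density of 4 between levels of detail, corresponds to the classic midpoint (geodesic) subdivision of a spherical triangle into four children per level. A minimal sketch of that operation, illustrative only and not the thesis implementation, is:

```python
import numpy as np

def normalize(v):
    """Project a 3-D point back onto the unit sphere."""
    return v / np.linalg.norm(v)

def subdivide(tri, depth):
    """Recursively split a spherical triangle into 4 children per level.

    tri   -- tuple of three unit vectors (the triangle's vertices)
    depth -- number of recursion levels (partitioning density of 4 per level)
    Returns the list of triangles at the finest level.
    """
    if depth == 0:
        return [tri]
    a, b, c = tri
    ab = normalize((a + b) / 2.0)   # edge midpoints, pushed to the sphere
    bc = normalize((b + c) / 2.0)
    ca = normalize((c + a) / 2.0)
    children = [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    out = []
    for child in children:
        out.extend(subdivide(child, depth - 1))
    return out

# Example: refine one icosahedron face three times -> 4**3 = 64 triangles.
face = tuple(normalize(np.array(v, dtype=float))
             for v in [(0, 0, 1), (0.894, 0, 0.447), (0.276, 0.851, 0.447)])
print(len(subdivide(face, 3)))  # 64
```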
236

Development of a high-pressure xenon gas time projection chamber and evaluation of its performance at around the Q value of 136Xe double-beta decay / 高圧キセノンガスタイムプロジェクションチェンバーの開発および136Xeの二重ベータ崩壊Q値付近におけるその性能評価

Yoshida, Masashi 25 March 2024 (has links)
Kyoto University / New-system doctoral programme / Doctor of Science (博士(理学)) / 甲第25115号 / 理博第5022号 / 新制||理||1716 (University Library) / Kyoto University, Graduate School of Science, Division of Physics and Astronomy / (Chief examiner) Professor Tsuyoshi Nakaya, Professor Osamu Tajima, Associate Professor Roger Wendell / Qualifies under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Science / Kyoto University / DFAM
237

Integral Projection Models and analysis of patch dynamics of the reef-building coral Montastraea annularis

Burgess, Heather Rachel January 2011 (has links)
Over the past 40 years, coral cover has declined by as much as 80%. At the same time, coral reefs are coming under increasing threat from hurricanes, as climate change is expected to increase hurricane intensity. It has therefore become increasingly important to understand the effect of hurricanes on a coral population. This thesis focuses on the reef-building coral Montastraea annularis. This species once dominated Caribbean coral reefs but is fast being replaced by faster-growing, more opportunistic species. The underlying dynamics of the decline must be understood if managers are to stand any chance of reversing it. The aim of this thesis is to investigate the effect of hurricane activity on the dynamics of the reef-building coral Montastraea annularis. To achieve this, the Integral Projection Model (IPM) method was adopted and the results compared with those produced using the more traditional Population Projection Matrix (PPM) method. The models were fitted using census data from June 1998 to January 2003, which described the area of individual coral patches on a sample of ramets on Glovers Reef, Belize. Glovers Reef is a marine reserve that lies 30 km off the coast of Belize and 15 km east of the main barrier reef. Three hurricanes struck Glovers Reef during the study: Hurricane Mitch (October 1998), Hurricane Keith (September 2000) and Hurricane Iris (October 2001). The data were divided in two different ways in order to test two research questions: first, whether the initial trauma following a hurricane affects the long-term dynamics of a population and, second, whether the dynamics exhibited during a hurricane vary with hurricane strength. Five main results are shown in this thesis: 1. All models for all divisions of the data are in long-term decline. 2. As initial trauma increased, the long-term growth rates decreased; conversely, the short-term extremes increased. 3. Fragmentation is more likely as patch size increases and more likely under stronger hurricanes. 4. Integral Projection Modelling painted a similar picture to Population Projection Matrix models and should be a preferred method of analysis. 5. Interaction of the IPMs can be used to model the changing occurrence of hurricanes under climate change; it is shown that with increased intensity, the population could become extinct 6.3 years sooner. This research is the first step in modelling coral patch populations by the IPM method. It suggests possible functional forms and compares the results with the PPM method. Further research is required into the biological functions which drive fragmentation, the process by which large patches divide into groups of smaller patches. The conclusions from this thesis add to the growing body of knowledge concerning the response of coral species to hurricanes, focusing on the importance of understanding patch dynamics in order to understand colonial dynamics.
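An Integral Projection Model of the kind adopted here tracks a continuous state variable (patch area, in this case) rather than discrete stage classes. Its generic form, given as standard background rather than taken from the thesis, is:

```latex
% Generic Integral Projection Model (standard formulation, stated as
% background; the thesis fits its own kernel to the Glovers Reef data).
% n(x, t) is the density of patches of size x at census t, and the kernel
% K(y, x) combines survival, growth/shrinkage, fragmentation and recruitment:
\[
  n(y, t+1) = \int_{\Omega} K(y, x) \, n(x, t) \, \mathrm{d}x .
\]
% The long-term growth rate of each model is the dominant eigenvalue
% lambda of the integral operator defined by K; lambda < 1 corresponds to
% the long-term decline reported for all divisions of the data:
\[
  \lambda \, \phi(y) = \int_{\Omega} K(y, x) \, \phi(x) \, \mathrm{d}x .
\]
```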
238

Contribution des informations expérimentales et expertes à l'amélioration des modèles linéaires d'étalonnage multivarié en spectrométrie / Cooperation between experimental and expert information for improving spectrometer calibrations

Boulet, Jean-Claude 13 December 2010 (has links)
Les spectres contiennent de l'information sur la composition d'échantillons. Cette information est extraite au moyen d'une première famille d'outils chimiométriques, les étalonnages. Une deuxième famille d'outils, les prétraitements, est destinée à enlever une information spectrale nuisible. Étalonnages et prétraitements sont construits à partir de deux types d'informations : (1) les informations expérimentales, basées sur l'expérience ; (2) les informations expertes, basées sur la connaissance a priori. L'objectif de la thèse est d'étudier les complémentarités et synergies entre ces deux types d'informations. Après une étude bibliographique, un modèle général commun aux étalonnages et prétraitements est proposé. L'information utile ou nuisible contenue dans un spectre est obtenue par projection orthogonale de ce spectre (selon une métrique Sigma) sur une matrice P dont les colonnes constituent une base de l'espace vectoriel associé à l'information utile ou nuisible. Selon les cas, l'information utile est conservée alors que l'information nuisible est éliminée. Le modèle général est ensuite implémenté par deux nouvelles méthodes. L'IDC (Improved Direct Calibration) est une méthode d'étalonnage direct utilisant conjointement des informations expérimentales et expertes. Ensuite, VODKA-PLSR est une généralisation de la PLSR ; un vecteur r est mis en évidence, qui permet d'inclure de l'information experte dans le modèle. En conclusion, ce travail permet une vision plus synthétique des modèles existants, propose deux nouveaux modèles d'étalonnage et ouvre de nombreuses possibilités pour créer de nouveaux modèles d'étalonnage et de prétraitement. / Spectra contain information about the composition of samples. This information is extracted by a first family of chemometric tools, calibrations. A second family of tools, pretreatments, is intended to remove harmful spectral information. Both calibration and pretreatment models are built from two types of information: (1) experimental information, based on measurements made on samples; (2) expert information, based on prior knowledge. The aim of this thesis is to study the links between these two types of information. After a bibliographic review, a general model covering both calibrations and pretreatments is proposed. The useful or harmful spectral information is obtained after the spectra have been orthogonally projected (with a Sigma metric) onto a P matrix whose columns define a basis of the vector subspace spanned by the useful or harmful information. Useful information is thus kept whereas harmful information is removed. Two new methods are then proposed. First, IDC (Improved Direct Calibration) is a direct calibration method using both experimental and expert information. Then, VODKA-PLSR is a generalization of PLSR, in which a vector r allows expert information to be used by the regression model. To conclude, this work gives a global view of existing tools, proposes two new models and opens many possibilities for building new calibration and pretreatment models.
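The orthogonal projection described above (projecting a spectrum onto a matrix P under a metric Sigma, then keeping or removing the projected part) can be written compactly. The following is a standard expression for such a metric projector, given for illustration rather than quoted from the thesis:

```latex
% Sigma-orthogonal projection of a spectrum x onto the subspace spanned by
% the columns of P (standard linear-algebra form, shown as an illustration):
\[
  \mathbf{P}_{\Sigma} = P \left( P^{\mathsf{T}} \Sigma P \right)^{-1}
                         P^{\mathsf{T}} \Sigma ,
  \qquad
  x_{\text{kept}} = \mathbf{P}_{\Sigma}\, x ,
  \qquad
  x_{\text{cleaned}} = \left( I - \mathbf{P}_{\Sigma} \right) x .
\]
% When P spans useful information, x_kept is retained for calibration; when
% P spans harmful information, the pretreatment keeps x_cleaned instead.
```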
239

Propriétés symplectiques et hamiltoniennes des orbites coadjointes holomorphes / Symplectic and Hamiltonian properties of holomorphic coadjoint orbits

Deltour, Guillaume 10 December 2010 (has links)
L'objet de cette thèse est l'étude de la structure symplectique des orbites coadjointes holomorphes et de leurs projections. Une orbite coadjointe holomorphe O est une orbite coadjointe elliptique d'un groupe de Lie G réel semi-simple connexe non compact à centre fini provenant d'un espace symétrique hermitien G/K, telle que O puisse être naturellement munie d'une structure kählérienne G-invariante. Ces orbites coadjointes sont une généralisation de l'espace symétrique hermitien G/K. Dans cette thèse, nous prouvons que le symplectomorphisme de McDuff se généralise aux orbites coadjointes holomorphes, décrivant la structure symplectique de l'orbite O par le produit direct d'une orbite coadjointe compacte et d'un espace vectoriel symplectique. Ce symplectomorphisme est ensuite utilisé pour déterminer les équations de la projection de l'orbite O relative au sous-groupe compact maximal K de G, en faisant intervenir des résultats récents de Ressayre en Théorie Géométrique des Invariants. / This thesis studies the symplectic structure of holomorphic coadjoint orbits and the projection of such orbits. A holomorphic coadjoint orbit O is an elliptic coadjoint orbit which is endowed with a natural invariant Kählerian structure. These coadjoint orbits are defined for a real semi-simple connected noncompact Lie group G with finite center such that G/K is a Hermitian symmetric space, where K is a maximal compact subgroup of G. Holomorphic coadjoint orbits are a generalization of the Hermitian symmetric space G/K. In this thesis, we prove that McDuff's symplectomorphism, available for Hermitian symmetric spaces, has an analogue for holomorphic coadjoint orbits. Then, using this symplectomorphism and recent GIT arguments from Ressayre, we compute the equations of the projection of the orbit O relative to the maximal compact subgroup K.
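The generalized McDuff symplectomorphism referred to above can be stated schematically as follows; the abstract describes the decomposition only in words, so the notation here is chosen for illustration:

```latex
% Schematic statement of the decomposition described in the abstract:
% the holomorphic coadjoint orbit O, with its natural symplectic form,
% splits as a product of a compact coadjoint orbit and a symplectic vector
% space (the names O_c and (V, omega_V) are ours, for illustration only):
\[
  \bigl( \mathcal{O}, \, \omega_{\mathcal{O}} \bigr)
  \;\cong\;
  \bigl( \mathcal{O}_{c} \times V, \;
         \omega_{\mathcal{O}_{c}} \oplus \omega_{V} \bigr).
\]
```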
240

Partitioning XML data, towards distributed and parallel management / Méthode de Partitionnement pour le traitement distribué et parallèle de données XML.

Malla, Noor 21 September 2012 (has links)
Durant cette dernière décennie, la diffusion du format XML pour représenter les données générées par et échangées sur le Web a été accompagnée par la mise en œuvre de nombreux moteurs d'évaluation de requêtes et de mises à jour XQuery. Parmi ces moteurs, les systèmes « mémoire centrale » (main-memory systems) jouent un rôle très important dans de nombreuses applications. La gestion et l'intégration de ces systèmes dans des environnements de programmation sont très faciles. Cependant, ces systèmes ont des problèmes de passage à l'échelle puisqu'ils requièrent le chargement complet des documents en mémoire centrale avant traitement. Cette thèse présente une technique de partitionnement des documents XML qui permet aux moteurs « mémoire principale » d'évaluer des expressions XQuery (requêtes et mises à jour) pour des documents de très grande taille. Cette méthode de partitionnement s'applique à une classe de requêtes et mises à jour pertinentes et fréquentes, dites requêtes et mises à jour itératives. Cette thèse propose une technique d'analyse statique pour reconnaître les expressions « itératives ». Cette analyse statique est basée sur l'extraction de chemins à partir de l'expression XQuery, sans utilisation d'information supplémentaire sur le schéma. Des algorithmes sont spécifiés, utilisant les chemins extraits à l'étape précédente, pour partitionner les documents en entrée en plusieurs parties, de sorte que la requête ou la mise à jour puisse être évaluée sur chaque partie séparément afin de calculer le résultat final par simple concaténation des résultats obtenus pour chaque partie. Ces algorithmes sont mis en œuvre en « streaming » et leur efficacité est validée expérimentalement. De plus, cette méthode de partitionnement est également caractérisée par le fait qu'elle peut être facilement implémentée en utilisant le paradigme MapReduce, permettant ainsi d'évaluer une requête ou une mise à jour en parallèle sur les données partitionnées. / With the widespread diffusion of XML as a format for representing data generated and exchanged over the Web, many query and update engines have been designed and implemented in the last decade. One kind of engine that plays a crucial role in many applications is the « main-memory » system, which stands out for being easy to manage and to integrate into a programming environment. On the other hand, main-memory systems have scalability issues, as they load the entire document into main memory before processing. This thesis presents an XML partitioning technique that allows main-memory engines to process a class of XQuery expressions (queries and updates), which we dub « iterative », on arbitrarily large input documents. We provide a static analysis technique to recognize these expressions. The static analysis is based on paths extracted from the expression and does not need additional schema information. We provide algorithms that use this path information to partition the input documents, so that the query or update can be evaluated separately on each part and the final result computed by simple concatenation of the partial results. These algorithms admit a streaming implementation, whose effectiveness is experimentally validated. Besides enabling scalability, our approach is easily implementable in a MapReduce framework, thus enabling parallel query/update evaluation on the partitioned data.
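For an « iterative » expression, the partitioning scheme described above relies on the result over the whole document being the concatenation of the results over its parts, which is also what makes a MapReduce deployment straightforward. A toy sketch of that evaluation pattern, using hypothetical helper functions rather than the thesis implementation, is:

```python
from concurrent.futures import ProcessPoolExecutor

def partition(records, max_part_size):
    """Split the input into parts small enough for a main-memory engine.

    'records' is a hypothetical stand-in for the forest of subtrees selected
    by the paths extracted from the iterative XQuery expression.
    """
    for i in range(0, len(records), max_part_size):
        yield records[i:i + max_part_size]

def evaluate_on_part(part):
    """Evaluate the query on one part; a trivial selection stands in here
    for the real main-memory XQuery evaluation."""
    return [r.upper() for r in part if "match" in r]

if __name__ == "__main__":
    records = [f"record-{i}-match" if i % 3 == 0 else f"record-{i}"
               for i in range(10)]

    # Iterative expressions let each part be processed independently
    # (sequentially, in parallel, or as MapReduce map tasks), with the final
    # answer obtained by simple concatenation of the partial results.
    with ProcessPoolExecutor() as pool:
        partial_results = pool.map(evaluate_on_part, partition(records, 4))
    final_result = [item for part in partial_results for item in part]
    print(final_result)
```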
