171

Evaluation of the operational effects of u-turn movement

Liu, Pan 01 June 2006 (has links)
In Florida, the increased installation of non-traversable medians and directional median openings has produced an increased number of U-turns on multilane highways. Opponents of median modification projects have argued that the increased numbers of U-turns may cause safety and operational problems on multilane highways. The primary objective of this study is to evaluate the operational effects of U-turn movements on multilane roadways. To achieve this objective, extensive data were collected: field measurements of traffic operations were conducted at 40 sites in the Tampa Bay area of Florida, and in addition the crash histories of 179 selected roadway segments in central Florida were investigated. Statistical analysis of the collected traffic operations and crash data was conducted to quantitatively evaluate the operational performance of U-turn movements. Delay and travel time were compared for the driveway left-turn alternatives widely used in Florida and nationally. Crash rate models were developed to evaluate how the separation distance between a driveway exit and the downstream U-turn bay affects the safety of vehicles making right-turns followed by U-turns (RTUT). From the crash data analysis results, minimum separation distances under different roadway conditions were determined to facilitate driver use of RTUTs. The capacity of U-turn movements was analyzed in two situations: (1) U-turns provided at a signalized intersection; and (2) U-turns provided at an unsignalized intersection. Adjustment factors were developed to quantify the impact of U-turning vehicles on the capacity of a signalized intersection, and the critical gaps and follow-up times for U-turn movements at unsignalized intersections were estimated.
With the estimated critical gaps and follow-up times, the Harders model was used to determine the capacity of U-turn movements at an unsignalized intersection. This study also looks extensively at the minimum roadway width and median width required for vehicles to perform U-turn maneuvers on 4-lane divided roadways. It was found that a roadway width of 46 ft is generally sufficient for most types of design vehicles (except heavy vehicles) to perform a continuous U-turn maneuver without impedance.
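The Harders gap-acceptance model named above can be sketched as follows. This is a minimal illustration of the standard formula, assuming a single conflicting stream; the function name and the numeric values are invented for the example and are not the thesis's field estimates.

```python
import math

def harders_capacity(q_c: float, t_c: float, t_f: float) -> float:
    """Potential capacity (veh/h) of a minor movement under the Harders model.

    q_c -- conflicting major-stream flow (veh/h)
    t_c -- critical gap (s)
    t_f -- follow-up time (s)
    """
    lam = q_c / 3600.0  # conflicting flow rate in veh/s
    return q_c * math.exp(-lam * t_c) / (1.0 - math.exp(-lam * t_f))

# Illustrative values only: 600 veh/h conflicting flow,
# 6.4 s critical gap, 3.5 s follow-up time.
cap = harders_capacity(600.0, 6.4, 3.5)
```

Capacity falls off exponentially with the critical gap, which is why U-turning vehicles, with their longer critical gaps, tend to reduce capacity relative to ordinary turning movements.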
172

Locating median lines and hyperplanes with a restriction on the slope / Platzierung von Mediangeraden und Medianhyperebenen mit einer Beschränkung der Steigung

Krempasky, Thorsten 17 May 2012 (has links)
No description available.
173

Modélisation Volumes-Finis en maillages non-structurés de décharges électriques à la pression atmosphérique / Finite-volume modelling of atmospheric-pressure electrical discharges on unstructured meshes

Zakari, Mustapha 10 December 2013 (has links) (PDF)
Numerical modelling of plasma discharges plays an important role in understanding the physical and chemical mechanisms at work in plasma-assisted devices. A large share of these mechanisms is already accounted for in current codes. However, many of those codes cannot handle complex geometries. This limitation stems mainly from the use of structured, Cartesian meshes, which are poorly suited to curved geometries: structured-mesh computations quickly become complicated and specific to a given geometry. Our work concerns discharge modelling for an atmospheric-pressure treatment reactor developed by Dow Corning. Its complex configuration and large dimensions led us to write a new code operating on unstructured meshes. The code must adapt to the presence of a tip, rounded edges and multiple dielectrics, while also allowing a quick switch to new geometries. Moreover, the reactor's large dimensions require meshes refined only where necessary (the tip, dielectric surfaces, and so on). The mathematical model is based on the Poisson equation coupled to drift-diffusion transport equations. Several numerical discretisations were tested in different physical configurations. We present and validate the chosen numerical methods, and then report and discuss the results obtained for the Dow Corning reactor.
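The coupled Poisson/drift-diffusion model can be illustrated on the simplest possible case. Below is a hedged 1D finite-volume sketch of the Poisson step on a non-uniform mesh, not the thesis's actual code: the function name, Dirichlet boundary treatment and mesh layout are assumptions made for the example (the thesis's solver handles multi-dimensional unstructured meshes with dielectrics).

```python
import numpy as np

def fv_poisson_1d(x_faces, rho_over_eps, phi_left=0.0, phi_right=0.0):
    """Finite-volume solve of -phi'' = rho/eps on a non-uniform 1D mesh.

    x_faces      -- face coordinates, shape (n+1,), strictly increasing
    rho_over_eps -- source term at the n cell centres
    Dirichlet potentials are imposed at the two boundary faces using
    half-cell one-sided gradients.
    """
    x_faces = np.asarray(x_faces, float)
    n = len(x_faces) - 1
    xc = 0.5 * (x_faces[:-1] + x_faces[1:])   # cell centres
    dx = np.diff(x_faces)                     # cell widths
    A = np.zeros((n, n))
    b = np.asarray(rho_over_eps, float) * dx  # source integrated over each cell
    for i in range(n):
        if i == 0:                            # west boundary face
            dW = xc[0] - x_faces[0]
            A[0, 0] += 1.0 / dW
            b[0] += phi_left / dW
        else:                                 # interior west face
            dW = xc[i] - xc[i - 1]
            A[i, i] += 1.0 / dW
            A[i, i - 1] -= 1.0 / dW
        if i == n - 1:                        # east boundary face
            dE = x_faces[-1] - xc[-1]
            A[i, i] += 1.0 / dE
            b[i] += phi_right / dE
        else:                                 # interior east face
            dE = xc[i + 1] - xc[i]
            A[i, i] += 1.0 / dE
            A[i, i + 1] -= 1.0 / dE
    return xc, np.linalg.solve(A, b)
```

Each row of the linear system is a flux balance over one control volume, which is the property that carries over unchanged to unstructured 2D/3D meshes.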
174

Hypothyroidism and Pregnancy

Granfors, Michaela January 2015 (has links)
Hypothyroidism is a common endocrine disorder affecting women of reproductive age. Globally, iodine deficiency is still the most common cause of hypothyroidism. Genetic variation, in particular SNP rs4704397 in the PDE8B gene, also accounts for a significant proportion of the variation in TSH levels. Untreated hypothyroidism has significant adverse effects on pregnancy and fetal outcome. Most international guidelines suggest targeted thyroid testing in pregnant women with risk factors for thyroid disturbances. In a case-control study, an association was found between homozygous A/A as well as homozygous G/G carriers of SNP rs4704397 in PDE8B and recurrent miscarriage; the explanation for this association is unknown. In a nationwide survey, all Swedish guidelines for thyroid testing and management of hypothyroidism during pregnancy were collected and compared with international guidelines. The local guidelines were variable and poorly compliant with the international ones. In a follow-up in one district, 5,254 pregnant women were included for subsequent review of their medical records. We found a targeted thyroid testing rate of 20.1% in clinical practice, with an overall frequency of trimester-specific elevated TSH of 18.5%. More disturbingly, half of the women on levothyroxine treatment at the time of conception had an elevated TSH level at thyroid testing. In a subsequent cohort study of the 5,254 women, the prevalence of trimester-specific elevated TSH and overt hypothyroidism was equal in targeted thyroid-tested and untested women. In a cross-sectional study, a median urinary iodine concentration (UIC) of 98 μg/l was found in the study population; according to WHO/UNICEF/IGN criteria, the population-based median UIC during pregnancy should be 150-249 μg/l. In conclusion, genetic variations may contribute to adverse pregnancy outcomes.
In clinical practice, thyroid testing and the management of hypothyroidism during pregnancy are unsatisfactory across the whole chain, from the development of local guidelines to their implementation and to targeted thyroid testing. Moreover, our results indicate insufficient iodine status in the pregnant population of Sweden.
175

Enhance the understanding of whole-genome evolution by designing, accelerating and parallelizing phylogenetic algorithms

Yin, Zhaoming 22 May 2014 (has links)
New sequencing technologies have increased the speed and reduced the cost of sequencing biological data. Making biological sense of these genomic data is a major challenge for algorithm design as well as for the high-performance computing community. Bioinformatics poses many questions: how new functional genes arise, why genes are organized into chromosomes, how species are connected through the evolutionary tree of life, and why gene arrangements are subject to change. Phylogenetic analyses have become essential to research on the evolutionary tree of life; they help us track the history of species and the relationships between genes or genomes across millions of years. One fundamental step in phylogeny construction is computing distances between genomes. Because rearrangement events exhibit complicated combinatorial patterns, distance computation remains an active topic that belongs as much to mathematics as to biology. The problem is especially hard when the two input genomes have unequal gene content (insertions/deletions and duplications). In this thesis, we discuss our contributions to distance estimation for unequal-content gene-order data. Finding the median of three genomes is the key step in building most-parsimonious phylogenetic trees from genome rearrangement data. For genomes with unequal content, to the best of our knowledge, no existing algorithm computes this median. We contribute to median computation in two ways. 1) On the algorithm-engineering side, we harness streaming graph-analytics methods to implement an exact DCJ median algorithm that runs as fast as the heuristic algorithms and helps construct better phylogenetic trees.
2) On the algorithmic side, we formulate the problem of finding the median of genomes with unequal gene content, which leads to the design and implementation of an efficient median algorithm based on the Lin-Kernighan heuristic. Once the distance and median models are chosen, the ultimate goal is to infer the phylogeny (evolutionary history) of a given set of species. For more than a decade, biologists and computer scientists have studied how to infer phylogenies from genome rearrangement events measured on gene-order data. Although evolution is not an inherently parsimonious process, maximum parsimony (MP) phylogenetic analysis has been widely applied to study the evolutionary patterns of genome rearrangements. MP phylogeny inference from genome rearrangements raises two problems: given a set of modern genomes, computing the topology of the corresponding phylogenetic tree; and given the topology of a model tree, inferring the gene orders of the ancestral species. Assembling an MP phylogenetic tree constructor involves multiple NP-hard problems, unfortunately stacked one on top of another: solving one NP-hard problem requires solving multiple NP-hard subproblems. For tree construction with unequal-content genomes there are three layers of NP-hard problems. In this thesis, we mainly discuss our contributions to the design and implementation of the software package DCJUC (Phylogeny Inference using the DCJ model to cope with Unequal Content Genomes), which addresses both goals. Beyond the biological problems, we are also concerned with harnessing parallel computing to accelerate these algorithms on huge data sets, such as high-resolution gene-order data.
For one thing, all of our methods for these phylogenetic problems are based on branch-and-bound algorithms, which are quite irregular and unfriendly to parallel computing. To parallelize them, we must improve localized memory access and load balancing so that each thread can reach its full potential. For another, a revolution is taking place in computing with the availability of commodity graphics processors such as Nvidia GPUs and of many-core processors such as the Cray XMT or the 60-core Intel Xeon Phi coprocessor. These architectures offer a path to high performance at much lower cost, but they are not easy to program, and scientific codes are hard to tune well on them. We explore the potential of these architectures to accelerate our branch-and-bound based phylogenetic algorithms.
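To make the notion of genome distance concrete, here is a sketch of the breakpoint distance, the simplest rearrangement measure, for two equal-content signed gene orders. This is an illustration only; the thesis works with the DCJ distance and with unequal gene content, both substantially harder problems.

```python
def breakpoint_distance(g1, g2):
    """Breakpoint distance between two signed gene orders with equal content.

    An adjacency (a, b) of g1 is a breakpoint if it appears in g2
    neither as (a, b) nor reversed-and-negated as (-b, -a).
    """
    adj = set()
    for a, b in zip(g2, g2[1:]):
        adj.add((a, b))
        adj.add((-b, -a))  # the same adjacency read in the other direction
    return sum(1 for a, b in zip(g1, g1[1:]) if (a, b) not in adj)

# Example: inverting the block (2, 3) creates two breakpoints.
d = breakpoint_distance([1, 2, 3, 4, 5], [1, -3, -2, 4, 5])
```

Distances like this one feed directly into median finding: the median genome minimizes the sum of its distances to the three input genomes.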
176

Utvärdering av mötesfria vägar : Analys av olyckor på mötesfria vägar i Karlstadsregionen / Evaluation of median barriers : Analysis of accidents on roads with median barriers in the Karlstad region

Kylén, Linda January 2014 (has links)
Sedan år 1998 har det i Nollvisionens fotspår startats ett utvecklingsprogram i Sverige som syftar till att omvandla gamla 13 meters landsvägar och motortrafikleder till mötesfria. Implementeringen var tänkt att påtagligt reducera antalet mötes- och omkörningsolyckor samt singelolyckor med svåra konsekvenser i form av svårt skadade och dödade utan att försämra trafiksäkerheten i övrigt. Syftet med denna studie är att göra en effektmätning av de mötesfria vägarnas införande i Karlstadsregionen samt att göra en sammanställning av de olycksrisker mötesfria vägar omfattas av. Frågeställningarna som används i studien är: - Har det blivit säkrare på vägarna sedan implementeringen av mötesfria vägar i Karlstadsregionen? - Hur sker olyckor på mötesfria vägar inom Karlstadsregionen?  För att beskriva hur olyckor sker på mötesfria vägar inom Karlstadsregionen har en deskriptiv analys tillämpats som grundats på de beskrivningar av händelseförlopp som dokumenterats i STRADA och CORE, mellan åren 2010-2013. För att avgöra huruvida vägarna blivit säkrare sedan implementering tillämpades en segmenterad linjär regressionsanalys där antalet personskadeolyckor studerats, tre år innan och tre år efter ombyggnad för respektive vägavsnitt. Singel- och upphinnandeolyckor var de dominerande olyckstyperna på mötesfria vägar i Karlstadsregionen mellan åren 2010-2013 då de sammanlagt stod för 72,3% av samtliga olyckor som medfört skada. Vid kategoriseringen av huvudorsak till olycka framgick det att 42% av alla olyckor kan spåras till brister i samspel mellan trafikanter och väderförhållanden bedömdes i 24,1% av fallen vara huvudorsak till olycka. Den statistiska analysen var inte signifikant, men gav indikation på att vägarna blivit säkrare sedan implementering då trenden för samtliga skadade minskat. / In the footsteps of Vision Zero, a development program in Sweden was initiated in 1998. 
The program aimed to increase road safety on existing 13-meter roads and express roads by installing median barriers. The purpose of this study is to measure the impact of the converted roadways in the Karlstad region and to examine the types of accident risk these roadways are exposed to. The research questions are: - Has the implementation of median barriers in the Karlstad region contributed to safer roads? - How do accidents occur on roads with median barriers? To describe how accidents occur on roads with median barriers in the Karlstad region, a descriptive analysis was applied, based on the event descriptions documented in STRADA and CORE between 2010 and 2013. To determine whether the roads became safer after implementation, a segmented linear regression analysis was applied: accidents resulting in injury were examined three years before and three years after reconstruction of each road section. Single-vehicle accidents and rear-end collisions were the dominant accident types on roadways with median barriers in the Karlstad region between 2010 and 2013, together accounting for 72.3% of all accidents that resulted in injury. When the main cause of accident was examined, 42% of all accidents could be traced to deficiencies in the interaction between road users, and weather conditions were judged to be the main cause in 24.1% of the studied cases. The statistical analysis was not significant, but indicated that the roads became safer after implementation, since the observed trend for all injuries decreased.
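The before/after design described above can be sketched as an interrupted time-series regression. This is a generic illustration with invented counts, not the study's STRADA data; the four-parameter form (baseline level and slope, plus a level shift and slope change at the rebuild) is one common convention and may differ from the thesis's exact specification.

```python
import numpy as np

def segmented_trend(counts, change_idx):
    """OLS fit of y = b0 + b1*t + b2*step + b3*(t - change_idx)*step.

    counts     -- injury-accident counts per period
    change_idx -- first period after the road was rebuilt
    Returns (b0, b1, b2, b3): b2 is the level shift and b3 the slope
    change after the rebuild.
    """
    y = np.asarray(counts, float)
    t = np.arange(len(y), dtype=float)
    step = (t >= change_idx).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - change_idx) * step])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Toy data: stable counts before, a drop and a downward trend after period 3.
beta = segmented_trend([10, 11, 10, 6, 5, 4], change_idx=3)
```

A negative `b2` indicates an immediate safety gain at conversion; a negative `b3` indicates the gain keeps growing afterwards.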
177

Multi-Item Integrated Location/Inventory Problem

Balcik, Burcu 01 January 2003 (has links) (PDF)
In this study, the design of a three-level distribution system is considered in which a single supplier ships a number of items to retailers via a set of distribution centers (DCs), and stochastic demand is observed at the retailers. The problem is to specify the number and location of the DCs and the assignment of retailers to DCs such that total facility, transportation, safety stock, and joint ordering and average inventory costs are minimized while customer service requirements are satisfied. Single-source constraints are imposed on the assignment of retailers to DCs. The integrated location/inventory model incorporates inventory management decisions into the strategic location/allocation decisions by considering the benefits of risk pooling and the savings that result from the joint replenishment of a group of items. We develop two heuristic methods to solve the non-linear integer-programming model in an integrated way: (1) an improvement-type heuristic and (2) a constructive-type heuristic. The heuristic algorithms are tested on a number of problem instances with 81 demand points (retailers) and 4 different item types; both heuristics generate solutions in very reasonable times. Comparing the results against those of the p-median problem shows that both the total cost and the number of DCs can be lowered by using the integrated model instead of the p-median problem. Finally, sensitivity analysis is performed with respect to changes in the inventory, transportation, and ordering cost parameters and in the variability of demand.
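For contrast with the integrated model, the p-median benchmark can be sketched with a simple greedy heuristic on a toy distance matrix. The instance below is invented for illustration; the thesis's benchmark instances (81 retailers, 4 item types) and solution methods are different.

```python
import numpy as np

def greedy_p_median(dist, p):
    """Greedy heuristic for the p-median problem.

    dist -- (m, n) matrix of distances from candidate DC sites (rows)
            to retailers (columns)
    p    -- number of DCs to open
    Repeatedly opens the site that most reduces each retailer's distance
    to its nearest open DC. Returns (open_sites, total_cost).
    """
    dist = np.asarray(dist, float)
    m, n = dist.shape
    best = np.full(n, np.inf)  # current nearest-open-DC distance per retailer
    opened = []
    for _ in range(p):
        costs = [np.minimum(best, dist[j]).sum() if j not in opened else np.inf
                 for j in range(m)]
        j_star = int(np.argmin(costs))
        opened.append(j_star)
        best = np.minimum(best, dist[j_star])
    return opened, best.sum()

# Toy instance: 3 candidate sites, 4 retailers, open 2 DCs.
opened, cost = greedy_p_median([[0, 1, 9, 9],
                                [9, 9, 0, 1],
                                [5, 5, 5, 5]], 2)
```

The p-median objective counts transportation distance only; the integrated model adds facility, safety-stock and joint-replenishment costs on top, which is why it can justify fewer DCs.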
178

Long-term trends in fine particle number concentrations in the urban atmosphere of Brisbane : the relevance of traffic emissions and new particle formation

Mejia, Jaime F. January 2008 (has links)
Measurements of submicrometre (< 1.0 μm) and ultrafine (diameter < 0.1 μm) particle number concentrations have attracted attention over the last decade because the potential health impacts of exposure to these particles can be more significant than those of exposure to larger particles. At present, ultrafine particles are not regularly monitored and have yet to be incorporated into air quality monitoring programs. As a result, very few studies have analysed long-term and spatial variations in ultrafine particle concentrations, and none have done so in Australia. To address this gap, the aim of this research was to investigate long-term trends and seasonal variations in particle number concentrations in Brisbane, Australia. Data collected over a five-year period were analysed using weighted regression models: monthly mean concentrations in the morning (6:00-10:00) and the afternoon (16:00-19:00) were regressed against time in months, using the monthly variances as the weights. Over the five-year period, submicrometre and ultrafine particle concentrations increased in the morning by 105.7% and 81.5% respectively, whereas in the afternoon there was no significant trend. The morning concentrations were associated with fresh traffic emissions and the afternoon concentrations with the background. The statistical tests applied to the seasonal models, on the other hand, indicated no seasonal component. The spatial variation in size distribution across a large urban area was investigated using particle number size distribution data collected at nine locations during different campaigns. The size distributions were characterised by their modal structures and cumulative size distributions. Particle number peaked at around 30 nm, except at an isolated site dominated by diesel trucks, where it peaked at around 60 nm.
It was found that ultrafine particles contributed 82%-90% of the total particle number. At the sites dominated by petrol vehicles, nanoparticles (< 50 nm) contributed 60%-70% of the total particle number; at the site dominated by diesel trucks they contributed 50%. Although the sampling campaigns took place during different seasons and were of varying duration, these variations did not affect the particle size distributions; the distributions were instead shaped by differences in traffic composition and distance to the road. To investigate the occurrence of nucleation events, that is, secondary particle formation from gaseous precursors, particle size distribution data collected over a 13-month period during 5 campaigns were analysed. The study area was a complex urban environment influenced by anthropogenic and natural sources. The study introduced a new application of time-series differencing for the identification of nucleation events. To evaluate the conditions favourable to nucleation, the meteorological conditions and gaseous concentrations prior to and during nucleation events were recorded. Gaseous concentrations did not exhibit a clear pattern of change. Nucleation was found to be associated with sea breeze and long-range transport. The implication of this finding is that whilst vehicles are the most important source of ultrafine particles, sea breeze and aged gaseous emissions play a more important role in secondary particle formation in the study area.
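The weighted trend fit described above can be sketched as follows, assuming inverse-variance weighting of the monthly means (a common convention; the thesis's exact weighting may differ) and invented toy data.

```python
import numpy as np

def weighted_trend(monthly_means, monthly_vars):
    """Weighted least-squares linear trend of monthly mean concentrations.

    Noisier months (larger variance) get less weight. np.polyfit applies
    its weights to the unsquared residuals, so inverse-variance weighting
    means passing w = 1/sigma.
    Returns (slope_per_month, intercept).
    """
    y = np.asarray(monthly_means, float)
    t = np.arange(len(y), dtype=float)
    w = 1.0 / np.sqrt(np.asarray(monthly_vars, float))
    slope, intercept = np.polyfit(t, y, 1, w=w)
    return slope, intercept

# Toy series: concentration roughly doubling over 12 months, equal variances.
slope, intercept = weighted_trend(
    [10, 10.8, 12, 12.9, 14.1, 15, 16.2, 17, 18.1, 19, 20.2, 21],
    [1.0] * 12)
```

With equal variances this reduces to ordinary least squares; unequal variances shift the fit toward the better-measured months.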
179

Essays on macroeconomics and international finance /

Francisco, Eva de. January 2004 (has links) (PDF)
Dissertation, University of Rochester, NY, Department of Economics, 2004. / Copy published by UMI, Ann Arbor, Mich. Contains 3 contributions.
180

Μακροοικονομική απόδοση και ανεξαρτησία Κεντρικής Τράπεζας / Macroeconomic performance and Central Bank independence

Δημακοπούλου, Νικολίτσα 20 October 2010 (has links)
Σκοπός της παρούσας διπλωματικής είναι να εξετάσει την επίδραση στην μακροοικονομική απόδοση της ανεξαρτησίας της Κεντρικής Τράπεζας. Αποτελεί ενδιαφέρον θέμα τόσο για τις νεοεισερχόμενες χώρες της Ευρωπαικής Ένωσης αλλά και για τις χώρες που εμφανίζουν ιδιαίτερα υψηλό πληθωρισμό. / The purpose of this thesis is to examine the impact of central bank independence on macroeconomic performance. It is an interesting topic both for the newcomer countries of the European Union and for countries with particularly high inflation.
