61

Investigating the Effect of Freeway Congestion Thresholds on Decision-making Inputs

Qu, Tongbin August 2010 (has links)
A congestion threshold is embedded in any definition of congestion. Two basic approaches exist in current practice for setting it. One common approach uses "free-flow" or unimpeded conditions as the threshold; another uses target or "acceptable" conditions. The limited research on the congestion threshold issue has focused on operational problems or policy debates, with relatively little investigation of its effect on decision-making for transportation investment and resource allocation. This research investigated the differences inherent in the threshold choices using detailed freeway data from seven metropolitan areas. Three congestion performance measures were evaluated: delay per mile, the Travel Time Index and the Planning Time Index. The research specifically examined: 1) the ranking of congestion measure values for different congestion thresholds under a variety of real-world travel time distributions, 2) the relationship between changes in the congestion threshold and changes in the performance measures, and 3) the appropriateness of using the speed limit as a congestion threshold, evaluated through the peak and off-peak average speed changes that followed a speed limit change in Houston, Texas. The rankings of freeway segments hold steady across congestion thresholds ranging from 60 mph to 30 mph and across the congestion measures; from an investment point of view, the threshold speed used is therefore not a concern for funding allocation. The relationship between the delay values for an alternative threshold and those for the 60 mph threshold has a quadratic form: as the alternative threshold moves further below 60 mph, the increment grows. The more congested a section is, the less the threshold affects measured congestion; for very congested sections, most of the delay is associated with speeds below 30 mph. The posted speed limit affects the travel time distribution under free-flow driving conditions but not under congested conditions. However, if the speed limit, or a percentage of it, is used as the congestion threshold, the amount of congestion may be underestimated because the free-flow speed is higher than the speed limit.
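The measures compared in the study follow standard definitions, so the effect of the threshold choice is easy to see directly. Below is a minimal Python sketch (the observed speeds are hypothetical, not the thesis data) showing how delay per mile and the Travel Time Index shrink as the threshold speed is lowered:

```python
# Minimal sketch of threshold-dependent congestion measures.
# Observed speeds are hypothetical, not the thesis data.

def delay_per_mile(speed_mph, threshold_mph):
    """Extra minutes per mile relative to travel at the threshold speed."""
    return max(60.0 / speed_mph - 60.0 / threshold_mph, 0.0)

def travel_time_index(speed_mph, threshold_mph):
    """Ratio of actual travel time to travel time at the threshold speed."""
    return max(threshold_mph / speed_mph, 1.0)

observed_speeds = [55, 42, 28, 15]   # hypothetical average speeds (mph)
for threshold in (60, 45, 30):       # thresholds spanning the range studied
    delay = sum(delay_per_mile(v, threshold) for v in observed_speeds)
    tti = sum(travel_time_index(v, threshold) for v in observed_speeds) / len(observed_speeds)
    print(f"threshold {threshold:>2} mph: delay {delay:5.2f} min/mile, mean TTI {tti:4.2f}")
```

Note that most of the delay at the 15 mph observation survives even a 30 mph threshold (2 of 3 min/mile), consistent with the finding that measured congestion on very congested sections is insensitive to the threshold choice.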
62

Dual Threshold Voltage SRAM & BIST Comparators

Lee, Po-Ming 24 September 2003 (has links)
Since the invention of SRAM (Static Random Access Memory), many improvements have been proposed, with speed, area, and power consumption as the major targets. The evolution of CMOS process technology makes it possible to implement SRAM using dual threshold voltage transistors. In this thesis we use the TSMC (Taiwan Semiconductor Manufacturing Company) 0.25 μm 1P5M CMOS process to realize a dual threshold voltage SRAM. To reduce the SRAM's internal power consumption, we also propose quenchers to suppress unwanted oscillation between bit lines. In addition, several types of BIST (Built-In Self-Test) comparators are proposed to test the SRAM. Detailed simulations show that the proposed comparators achieve impressive results in terms of high fan-in, low transistor count, and high speed. The proposed SRAM and BIST comparators were fabricated in the CMOS process provided by the National Science Council Chip Implementation Center (CIC), and measurements of the chips confirm that the design goals are met.
63

Bounds on the MAP threshold of iterative decoding systems with erasure noise

Wang, Chia-Wen 10 October 2008 (has links)
Iterative decoding and codes on graphs were first devised by Gallager in 1960 and rediscovered by Berrou, Glavieux and Thitimajshima in 1993. The technique plays an important role in modern communications, especially in coding theory and practice. In particular, low-density parity-check (LDPC) codes, introduced by Gallager in the 1960s, are the class of codes at the heart of iterative coding; since these codes are quite general and exhibit good performance under message-passing decoding, they remain central to communications research today. A thorough analysis of iterative decoding systems and of the relationship between maximum a posteriori (MAP) and belief propagation (BP) decoding was initiated by Measson, Montanari, and Urbanke. Their analysis is based on density evolution (DE) and on the extrinsic information transfer (EXIT) functions introduced by ten Brink. Following their work, this thesis considers the MAP decoding thresholds of three iterative decoding systems. First, irregular repeat-accumulate (IRA) and accumulate-repeat-accumulate (ARA) code ensembles are analyzed on the binary erasure channel (BEC). Next, the joint iterative decoding of LDPC codes is studied on the dicode erasure channel (DEC), a two-state intersymbol-interference (ISI) channel with erasure noise and the simplest example of such a channel. We then introduce a slight generalization of the EXIT area theorem and apply the resulting MAP threshold bound to the joint decoder. Both the MAP and BP erasure thresholds are computed and compared, and the result quantifies the loss due to iterative decoding. Open questions include the tightness of these bounds and extensions to non-erasure channels.
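On the BEC, density evolution takes a particularly clean closed form: the fraction of erased edge messages evolves as x ← ε·λ(1 − ρ(1 − x)), and the BP threshold is the largest channel erasure probability ε for which the recursion converges to zero. A minimal sketch, using the standard (3,6)-regular LDPC ensemble as an example rather than one of the ensembles studied in the thesis:

```python
# Minimal density evolution sketch for LDPC codes on the BEC (not thesis code).
# Example ensemble: (3,6)-regular, with edge-perspective degree polynomials
# lambda(x) = x^2 and rho(x) = x^5.

def lam(x):
    return x ** 2

def rho(x):
    return x ** 5

def converges(eps, iters=2000, tol=1e-12):
    """True if the DE recursion x <- eps * lam(1 - rho(1 - x)) dies out."""
    x = eps
    for _ in range(iters):
        x = eps * lam(1.0 - rho(1.0 - x))
        if x < tol:
            return True
    return False

# Bisect on the channel erasure probability to locate the BP threshold.
lo, hi = 0.0, 1.0
for _ in range(40):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if converges(mid) else (lo, mid)
print(f"BP threshold of the (3,6) ensemble ~ {lo:.4f}")  # ~ 0.4294
```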
64

Inflation, growth and welfare in a small open economy

Wang, Xing-bin 11 August 2009 (has links)
none
65

Development of a Novel Method for Deriving Thresholds of Toxicological Concern (TTCs) for Vaccine Constituents

White, Jennifer Jessica 01 January 2013 (has links)
Safety assessment relating to the presence of impurities, residual materials and contaminants in vaccines is a focus area of research at the United States Food and Drug Administration (FDA). Sponsors who submit Investigational New Drug (IND) applications for new vaccine products must report the results of safety assessments to the Division of Vaccines and Related Products Applications (DVRPA). Scientifically defined thresholds of toxicological concern (TTCs), as applied to vaccine constituents, would aid both sponsors and the public in the safety assessment of compounds for which there is little or no toxicity data. TTCs are mathematically modeled and extrapolated exposure levels below which adverse human health effects are not expected to occur (Kroes, 2004). In this project, we accessed DVRPA's submission databases and open-source data to yield an initial chemical test set. Using INCHEM, RepDose, RTECS and TOXNET, we gathered LD50 and TDLo data. Using a structure-based decision tree provided in the ToxTree software package, three different algorithms (the Cramer extended, the in vivo rodent micronucleus assay, and the Benigni-Bossa rule base for carcinogenicity by ISS) were applied to assign the initial test set (n = 197) of chemicals into structural families based on structural alerts (SAs). This resulted in six potential methods for elucidating TTCs: in vivo rodent micronucleus assay/LD50, Benigni-Bossa/LD50, Cramer extended/LD50, in vivo rodent micronucleus assay/TDLo, Benigni-Bossa/TDLo, and Cramer extended/TDLo. After each algorithm designated two structural families, the distribution of TDLo and LD50 values for each structural family was subjected to a preliminary analysis using JMP statistical software version 9. Based on an analysis of quantiles, skew, and kurtosis, the TDLo dataset was judged to be of poor quality and dropped from further analysis, and the in vivo rodent micronucleus assay algorithm failed to partition the initial test set in a meaningful way, so it too was culled from further consideration. This left two TTC methods: Benigni-Bossa/LD50 and Cramer extended/LD50. These were subjected to internal validation based on Gene-Tox, CCRIS, CPDB, IARC, and EPA classifications for genotoxic mutagenicity and carcinogenicity. Validation parameters were calculated for both methods, and the Benigni-Bossa/LD50 method outperformed the Cramer extended/LD50 method in terms of specificity (87.2 vs. 48.1%), accuracy (65.2 vs. 52.94%), positive predictivity (66.6 vs. 50%), negative predictivity (64.8 vs. 56.5%), ROC+ (2 vs. 1) and ROC- (1.84 vs. 1.3). These results indicated that Benigni-Bossa/LD50 was the most appropriate method for calculating TTCs for vaccine constituents. For each class, the lower 2.5th percentile LD50 was extrapolated to a TTC value using safety estimates derived from uncertainty factors (UFs) and adjusted for adult human weight. Final TTCs were designated as 18.06 μg/person and 20.616 μg/person for the Benigni-Bossa positive and negative structural families, respectively.
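The final extrapolation step reduces to simple arithmetic. The sketch below is illustrative only; the composite uncertainty factor, body weight, and input LD50 are assumptions, not the values used in the thesis:

```python
# Illustrative TTC extrapolation (assumed numbers, not the thesis values).

def ttc_ug_per_person(ld50_p2_5_mg_per_kg, composite_uf, body_weight_kg=70.0):
    """Extrapolate a 2.5th-percentile LD50 (mg/kg) to a TTC in micrograms/person."""
    safe_dose_mg_per_kg = ld50_p2_5_mg_per_kg / composite_uf  # apply safety margin
    return safe_dose_mg_per_kg * body_weight_kg * 1000.0      # mg -> ug, scale to a person

# Hypothetical inputs: a 2.5th-percentile LD50 of 25 mg/kg and a composite
# uncertainty factor of 100,000 (e.g., interspecies x intraspecies x
# acute-to-chronic extrapolation).
print(f"TTC ~ {ttc_ug_per_person(25.0, 1e5):.2f} ug/person")  # 17.50 ug/person
```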
66

Resource allocation of drones flown in a simulated environment / Resursfördelning av drönare i en simulerad miljö

Wikström, Anders January 2014 (has links)
In this report we compare three different assignment algorithms and how they can be used to assign a set of drones to a set of goal locations in as resource-efficient a way as possible. An experiment is set up to compare how these algorithms perform in a somewhat realistic simulated environment, built with the Robot Operating System (ROS). We found that by introducing a threshold for the Hungarian algorithm we could reduce the total time it takes to solve the assignment problem while only slightly increasing the total distance traversed by the drones.
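One plausible reading of the thresholding idea (an assumption for illustration, not the thesis implementation) is to greedily pre-assign any drone already within the threshold distance of a goal, shrinking the cost matrix the Hungarian algorithm must solve:

```python
# Sketch: thresholded assignment of drones to goals (assumed mechanism).
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def assign(drones, goals, threshold=None):
    cost = cdist(drones, goals)          # pairwise Euclidean distances
    pre = {}
    if threshold is not None:
        for i in range(len(drones)):
            j = int(np.argmin(cost[i]))
            if cost[i, j] <= threshold and j not in pre.values():
                pre[i] = j               # greedy pre-assignment of nearby pairs
    rest_d = [i for i in range(len(drones)) if i not in pre]
    rest_g = [j for j in range(len(goals)) if j not in pre.values()]
    rows, cols = linear_sum_assignment(cost[np.ix_(rest_d, rest_g)])
    pre.update({rest_d[r]: rest_g[c] for r, c in zip(rows, cols)})
    return pre

drones = np.random.rand(20, 2)           # hypothetical drone positions
goals = np.random.rand(20, 2)            # hypothetical goal positions
print(assign(drones, goals, threshold=0.05))
```

Pre-assignment cuts the O(n^3) Hungarian solve down to the leftover pairs, at the cost of a slightly suboptimal total distance, consistent with the trade-off reported above.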
67

Inhabiting the Threshold: Housing and Public-Private Interface at Halifax’s St. Patrick’s-Alexandra School

Christian, Michael 17 March 2014 (has links)
A public-private interface is a dynamic threshold between the private residence and the public city. It can be critically examined in terms of social scales, defensibility and ownership of space. As cities densify, they face the challenge of providing dwelling space while intensifying community integration. Current approaches to housing often rarefy the cultural and social richness of the resultant communities. A new framework is needed for residential development, one that is aware of social dynamics and builds respectfully on positive patterns in existing contexts. This thesis proposes a densifying mixed-use residential scheme on the vacant site of Halifax's St. Patrick's-Alexandra School, governed by a framework of social scales and responding to typological and physical conditions in the community. It seeks to integrate public services into existing structures and to articulate the threshold between public and private programs, making a case for a socially vibrant model of urban housing.
68

Buddhist Society of Wonderful Enlightenment Terrace: Observations on Functionalism

Lo, Kevin Kei Fung 18 March 2013 (has links)
Louis Sullivan's "form ever follows function" had a profound influence on architecture. Although often treated as synonymous with modernism, functionalism is more closely related to positivism in its bias toward science and its rejection of introspective knowledge. This dismissal of the superfluous (such as aesthetic form or ornamentation) diminished the intuitive "human" in architecture by assuming universal rationality. This thesis re-examines functionalism in a contemporary setting: a vertical Buddhist temple set between two tenement buildings on a New York City plot. Influenced by the work of Lars Lerup and the early work of Diller and Scofidio, the design explores the poetic tensions and obsessions between the profane world of the inhabitants and the sacred world of the temple through abstraction, without any attempt to resolve them.
69

Screening of EIA in the Free State Province : a comparative analysis between the 1997 and 2006 EIA Regulations / C.N.J. Welman

Welman, Coert Nicolaas Jacobus January 2008 (has links)
Thesis (M. Environmental Management)--North-West University, Potchefstroom Campus, 2009.
