31. Cold acclimation in Arabidopsis. Rekarte Cowie, Iona, January 2002.
No description available.
32. Interfaces for embedded parallel multiprocessor networks. Triger, Simon, January 2002.
No description available.
33. Interspecific gene transfer of cold tolerance into Lycopersicon esculentum. Cox, Janice Susan, January 1989.
No description available.
34. Characterisation of T cell anergy in allo-antigen specific CD4⁺ cells. Wheeler, Paul Richard, January 2002.
No description available.
35. Performance optimizations for compiler-based error detection. Mitropoulou, Konstantina, January 2015.
The trend towards smaller transistor technologies and lower operating voltages stresses the hardware and makes transistors more susceptible to transient errors. In future systems, performance and power gains will come at the cost of unreliable areas on the chip. For this reason, there is an increased need for low-overhead, highly reliable error detection methodologies. In recent years, several techniques have been proposed, the majority based on redundancy, which can be implemented at several levels (e.g., hardware, instruction, thread, or process). In instruction-level error detection approaches, the compiler replicates the instructions of the program and inserts checks wherever they are needed. The checks evaluate code correctness and decide whether or not an error has occurred. This type of error detection is more flexible than the hardware alternatives: it allows the programmer to choose which area of the program to protect, and it can be applied without any hardware modifications. On the other hand, the replicated instructions and the checks cause a large slowdown, making software techniques less appealing. In this thesis, we propose two techniques that aim to reduce the error detection overhead of compiler-based approaches and improve the system's performance without sacrificing fault coverage.

The first technique, DRIFT, achieves this by decoupling the execution of the code (original and replicated) from the checks. The checks are compare-and-jump instructions, which tend to make the code sequential and prohibit the compiler from performing aggressive instruction scheduling optimizations. We call this phenomenon basic-block fragmentation. DRIFT reduces the impact of basic-block fragmentation by breaking the synchronized execute-check-confirm-execute cycle. In this way, DRIFT generates scheduler-friendly code with more instruction-level parallelism (ILP). As a result, it reduces the performance overhead to 1.29× (on average) and outperforms the state-of-the-art by up to 29.7% while retaining the same fault coverage.

Next, CASTED focuses on reducing the impact of error detection overhead on single-chip scalable architectures composed of tightly-coupled cores. The proposed compiler methodology adaptively distributes the error detection overhead to the available resources across multiple cores, fully exploiting the abundant ILP of these architectures. CASTED adapts to a wide range of architecture configurations (issue-width, inter-core communication). The results show that CASTED matches the performance of, and often outperforms, sometimes by as much as 21.2%, the best fixed state-of-the-art approach while maintaining the same fault coverage.
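The duplicate-and-check mechanism the abstract describes is easy to sketch. The following is a minimal illustration of instruction replication with decoupled (batched) checks, written in Python for readability rather than in the compiler IR the thesis targets; it is not the DRIFT implementation, and the toy computation is invented:

```python
# Minimal sketch of instruction-level error detection via duplication.
# Illustrates the general duplicate-and-check idea, not DRIFT itself;
# real systems apply this at the compiler IR level.

def checked_pipeline(values):
    """Run a toy computation twice and compare the results lazily.

    Eager checking (a compare-and-jump after every step) serializes
    the code; decoupling batches the comparisons so the original and
    replicated instruction streams can be scheduled in parallel.
    """
    pending_checks = []          # deferred (original, replica) pairs

    acc_orig, acc_repl = 0, 0    # original and replicated accumulators
    for v in values:
        acc_orig += v * v        # original instruction stream
        acc_repl += v * v        # replicated instruction stream
        # Decoupled: record the pair instead of branching immediately.
        pending_checks.append((acc_orig, acc_repl))

    # One confirmation point instead of a check-and-jump per step.
    for orig, repl in pending_checks:
        if orig != repl:
            raise RuntimeError("transient error detected")
    return acc_orig

print(checked_pipeline([1, 2, 3]))   # -> 14
```

The deferred list captures the point the thesis's evaluation makes: per-step checks fragment basic blocks and constrain the scheduler, while batched confirmations preserve ILP.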
36. Possible associations of soluble carbohydrates with chemical desiccation and drought resistance in winter wheat. Cerono, Julio Cesar, 08 July 1997.
Drought is a major limiting abiotic stress influencing wheat production in many parts of the world. The erratic nature of water deficits makes breeding and selection for drought resistance difficult. In environments with late-season drought stress, yield losses are usually associated with kernel abortion or reduction in kernel growth. Remobilization of soluble carbohydrates from the stem has been associated with drought resistance. The objectives of this investigation were i) to assess the role of soluble carbohydrates in the determination of drought resistance, ii) to determine their association with productivity, and iii) to evaluate a rapid technique to identify genotypes with a higher capacity for soluble carbohydrate remobilization. Nine winter wheat cultivars differing in their response to drought stress were grown under irrigated and nonirrigated conditions during the grain filling period at the Sherman Branch Experiment Station, Moro. The cultivars were also grown at the Hyslop Crop Science Laboratory, where plots were chemically desiccated with sodium chlorate or left untreated. All control and treated plots were evaluated for soluble carbohydrates in two vegetative fractions, stem plus sheath and leaf blades. These values were correlated with the relative reductions in kernel weight and yield observed on the treated plots.
Differences among cultivars were observed for the concentration of soluble carbohydrates in the stem and leaf fractions. Time elapsed from anthesis was a major determinant of the variation in carbohydrate concentration observed during grain filling. Stem soluble carbohydrates accumulated to a much greater extent than leaf soluble carbohydrates. The concentration of stem carbohydrates was not related to the reductions caused by chemical desiccation or drought stress. However, potential contributions from stem reserves (the ratio between potential spike weight and stem reserves) were marginally associated with drought resistance. Stem soluble carbohydrates were positively associated with productivity, suggesting that stems are not competitive sinks but temporary storage organs for excess assimilates. Under chemical desiccation, most of the soluble carbohydrates were lost to respiration, and the reductions in kernel weight and yield were not correlated with those observed under drought. It was concluded that the technique did not reasonably simulate drought in terms of either yield reductions or carbohydrate remobilization. / Graduation date: 1998
37. Computer aided tolerance analysis and process selection for AutoCAD. Pamulapati, Sairam V., 25 February 1997.
The fundamental objective of a design engineer in tolerance technology is to transform functional requirements into tolerances on individual parts, based on existing data and algorithms for design tolerance analysis and synthesis. The transformation must also consider existing process capabilities and manufacturing costs to determine the optimal tolerances and processes.
The main objective of this research is to present an integrated but modular system for computer-aided tolerance allocation, tolerance synthesis, and process selection. The module is implemented in AutoCAD using ARX 1.1 (the AutoCAD Runtime Extension libraries), MFC 4.2, Visual C++ 4.2, Access 7.0, the AutoCAD Development System, AutoLISP, and other AutoCAD customization tools.
The integrated module has two functions:
a. Tolerance analysis and allocation: This module uses several statistical and optimization techniques to aggregate component tolerances. Random number generators are used to simulate the historical data required by most of the optimization techniques. Various component tolerance distributions are considered (Beta, Normal, and Uniform). The proposed tolerance analysis method takes into consideration the distribution of each fabrication process in the assembly. For assemblies with non-normal natural process tolerance distributions, this allows designers to assign assembly tolerances that are closer to the actual assembly tolerances than those obtained with other statistical methods. This is verified by comparing the proposed method against the results of Monte Carlo simulations: it produces assembly tolerances similar to those provided by Monte Carlo simulation yet is significantly less computationally intensive.
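As a rough illustration of the kind of Monte Carlo baseline this module is compared against, here is a generic tolerance stack-up simulation. It is a sketch, not the thesis's code: the three-component assembly, the distribution parameters, and the ±3-sigma Normal convention are all invented for the example.

```python
import random

# Generic Monte Carlo tolerance-stacking sketch (not the thesis's code).
# Each component dimension is drawn from its own process distribution;
# the assembly gap is a linear stack-up of the samples.

def sample_component(spec):
    """Draw one dimension from a Normal, Uniform, or Beta process model."""
    kind, nominal, tol = spec["kind"], spec["nominal"], spec["tol"]
    if kind == "normal":      # tolerance treated as +/-3 sigma (assumption)
        return random.gauss(nominal, tol / 3.0)
    if kind == "uniform":
        return random.uniform(nominal - tol, nominal + tol)
    if kind == "beta":        # skewed process, rescaled to the tolerance band
        return nominal - tol + 2 * tol * random.betavariate(2, 5)
    raise ValueError(kind)

# Hypothetical three-component stack (all numbers invented for illustration).
components = [
    {"kind": "normal",  "nominal": 25.0, "tol": 0.05},
    {"kind": "uniform", "nominal": 10.0, "tol": 0.02},
    {"kind": "beta",    "nominal": 15.0, "tol": 0.04},
]

N = 100_000
stacks = [sum(sample_component(c) for c in components) for _ in range(N)]
mean = sum(stacks) / N
spread = (max(stacks) - min(stacks)) / 2
print(f"assembly mean ~ {mean:.3f}, observed half-range ~ {spread:.3f}")
```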
b. Process selection: This thesis introduces a methodology for concurrent design that considers the allocation of tolerances and manufacturing processes for minimum cost, bringing manufacturing concerns into the design process. Independent, unordered manufacturing processes are assumed for each assembly. The optimization problem is solved with a simulated annealing technique that controls a Monte Carlo analysis. Tolerances are allocated using cost-tolerance curves; a cost-tolerance curve is defined for each component part in the assembly. The optimization algorithm varies the tolerance for each component and searches systematically for the combination of tolerances that minimizes cost. The proposed tolerance allocation/process selection method was found to be superior to other tolerance allocation methods based on manufacturing costs. / Graduation date: 1997
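The cost-driven search in part b. can likewise be sketched. The reciprocal-power cost model c(t) = a/t^k, the worst-case (linear) stack-up constraint, and every numeric parameter below are assumptions for illustration; the thesis's actual cost-tolerance curves and neighborhood moves may differ.

```python
import math
import random

# Sketch of simulated-annealing tolerance allocation (illustrative only).
# Cost model: reciprocal-power cost-tolerance curve c(t) = a / t^k, an
# assumption here, not necessarily the curves used in the thesis.

A = [2.0, 1.5, 3.0]          # hypothetical cost coefficients per component
K = [1.0, 0.8, 1.2]          # hypothetical curve exponents
T_ASSY = 0.12                # required assembly tolerance (worst-case stack)

def cost(tols):
    return sum(a / t**k for a, t, k in zip(A, tols, K))

def feasible(tols):
    return sum(tols) <= T_ASSY           # worst-case (linear) stack-up

def anneal(steps=20000, temp=1.0, cooling=0.9995):
    tols = [T_ASSY / len(A)] * len(A)    # start from an equal split
    best, best_cost = tols[:], cost(tols)
    for _ in range(steps):
        cand = tols[:]
        i = random.randrange(len(cand))
        cand[i] *= math.exp(random.uniform(-0.1, 0.1))  # jiggle one tolerance
        if not feasible(cand):
            continue
        delta = cost(cand) - cost(tols)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            tols = cand                  # accept downhill, or uphill by chance
            if cost(tols) < best_cost:
                best, best_cost = tols[:], cost(tols)
        temp *= cooling
    return best, best_cost

tols, c = anneal()
print("allocated tolerances:", [round(t, 4) for t in tols], "cost:", round(c, 2))
```

Looser tolerances are cheaper under this cost model, so the search drifts toward the stack-up budget while trading tolerance among components according to their curve steepness.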
38. Associative tolerance to nicotine's analgesic effects: studies on number of conditioning trials and corticosterone. Davis, Kristina, 30 September 2004.
This study examined the number of conditioning trials necessary to produce associative nicotine tolerance and the changes in corticosterone levels during the procedures. Six independent groups of rats (N = 355) were run through tolerance acquisition procedures for 1, 5, or 10 conditioning sessions. Treatment groups comprised animals that received nicotine-environment pairings, animals that received nicotine explicitly unpaired from the drug administration environment, and control groups that received either saline throughout or no treatment. Three of the groups were tested for nicotine-induced analgesia using the tail-flick and hot-plate assays, and three groups were blood-sampled after either a nicotine or saline injection. Pairing of the environment with nicotine produced greater tolerance after 5 conditioning sessions in the tail-flick assay and after 10 conditioning sessions in the hot-plate assay. Corticosterone levels were elevated in all rats given nicotine. Rats that received the nicotine-environment pairing showed a conditioned release of corticosterone in response to the environment after both 5 and 10 conditioning sessions.
39. Test and fault-tolerance for network-on-chip infrastructures. Grecu, Cristian, 05 1900.
The demands of future computing, as well as the challenges of nanometer-era VLSI design, will require new design techniques and design styles that are simultaneously high-performance, energy-efficient, and robust to noise and process variation. One of the emerging problems concerns the communication mechanisms between the increasing number of blocks, or cores, that can be integrated onto a single chip. The bus-based systems and point-to-point interconnection strategies in use today cannot be easily scaled to accommodate the large numbers of cores projected in the near future. Network-on-chip (NoC) interconnect infrastructures are one of the key technologies that will enable the emergence of many-core processors and systems-on-chip with increased computing power and energy efficiency. This dissertation focuses on testing, yield improvement, and fault tolerance of such NoC infrastructures.
A fast, efficient test method is developed for NoCs that exploits their inherent parallelism to reduce test time by transporting test data on multiple paths and testing multiple NoC components concurrently. The improvement in test time varies, depending on the NoC architecture and test transport protocol, from 2X to 34X compared to current NoC test methods. This test mechanism is then used to detect permanent faults on NoC links, which are repaired by an on-chip mechanism that replaces the faulty signal lines with fault-free ones, increasing the yield while maintaining the same wire delay characteristics. The solution described in this dissertation significantly improves the achievable yield of NoC inter-switch channels, from a 4% improvement for an 8-bit wide channel to a 71% improvement for a 128-bit wide channel. The direct benefit is improved fault tolerance and increased yield and long-term reliability of NoC-based multicore systems.
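The link-repair idea (steering logical signals away from wires the test phase flagged as faulty, onto spares) can be illustrated with a small remapping routine. This is a conceptual sketch in Python, not the dissertation's on-chip circuit; real repair would be done with multiplexers at the channel boundaries, and the widths and fault sets below are invented.

```python
# Conceptual sketch of spare-wire repair for an NoC channel (not the
# dissertation's mechanism; a real repair is wired with on-chip muxes).

def build_remap(faulty, data_width, spare_count):
    """Map each logical line to a fault-free physical line.

    faulty: set of physical line indices found faulty by the test phase.
    Returns a list remap[logical] = physical, or None if spares run out.
    """
    total = data_width + spare_count
    healthy = [w for w in range(total) if w not in faulty]
    if len(healthy) < data_width:
        return None                      # channel not repairable
    return healthy[:data_width]          # steer each logical line to a good wire

# Example: 8-bit channel with 2 spare wires, physical lines 3 and 9 faulty.
remap = build_remap(faulty={3, 9}, data_width=8, spare_count=2)
print(remap)   # -> [0, 1, 2, 4, 5, 6, 7, 8]
```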
40. The bracketing breakdown: An exploration of risk tolerance in broad and narrow choice frames. Moher, Ester, January 2009.
The field of decision making has largely focused on the influence of contextual factors on risk tolerance. Much work has focused on how the problem itself is presented, in hopes of understanding the circumstances under which individuals may be helped in areas of long-term investment and planning through encouragement of greater risk tolerance. Specifically, when making financial decisions, it has been suggested that by presenting individual decisions in groups (Gneezy & Potters, 1997), or by presenting feedback less frequently (Thaler et al., 1997), participants are able to process individual problems in a holistic manner, which encourages risk tolerance when deciding. This literature has typically claimed that these effects depend on how the problem is presented. However, evidence for the benefits of "broadly bracketed" problems often relies as much on the presentation of aggregated outcomes as on the grouping of problems. The purpose of this thesis was to further examine whether bracketing effects might be attributable to manipulations of problem framing or outcome framing.
In addition, it has been suggested that perhaps individuals who differ in processing styles might respond differentially to framing effects in general (Frederick, 2005). That is, perhaps individuals who are more intuitive decision makers might be more susceptible to context-based changes, and so might show larger framing effects. Deliberative decision makers, on the other hand, might overcome these framing effects by reflecting on, or actively “reframing”, the problem. A secondary purpose of this thesis was thus to investigate individual differences in the magnitude of the bracketing effect on risk tolerance.
In Experiment 1, problem and outcome bracketing were examined in the domain of discrete choices, while in Experiment 2, bracketing was examined with continuous investments. Results suggest that when investment opportunities are identical, problem framing encourages long-term risk tolerance. However, when choices are somewhat different from one another, as is often the case in real-world investment situations, outcome information is critical to encouraging long-term risk tolerance. Together, these results suggest a critical reevaluation of the bracketing hypothesis and its application to long-term investment.