71.
A survey on misunderstanding of dental scaling in Hong Kong. Young, Yau-yau, Cecilia (楊幽幽). January 2006.
Published or final version / Community Medicine / Master of Public Health
72.
Characterisation and monitoring of mineral deposits in down-hole petroleum pipelines. Christidis, Konstantinos. January 2000.
No description available.
73.
A divide-and-conquer implementation of the discrete variational DFT method for large molecular and solid systems. Warschkow, Oliver. January 1999.
No description available.
74.
The grand old party - a party of values? Mair, Patrick; Rusch, Thomas; Hornik, Kurt. 27 November 2014.
In this article we explore the semantic space spanned by self-reported statements of Republican voters. Our semantic structure analysis uses multidimensional scaling and social network analysis to extract, explore, and visualize word patterns and word associations in the responses to the stimulus statement "I'm a Republican, because ...", collected from the official website of the Republican Party. With psychological value theory as our backdrop, we examine the association of specific keywords within and across the statements, compute clusters of statements based on these associations, and explore common word sequences Republican voters use to characterize their political association with the Party. (authors' abstract)
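A minimal sketch of the multidimensional-scaling step described above, assuming a word co-occurrence representation; the statements, vocabulary handling, and dissimilarity transform here are illustrative stand-ins, not the authors' pipeline:

```python
# Illustrative sketch of the MDS step (not the authors' code): embed a
# small vocabulary from hypothetical stand-in statements into 2-D via
# a co-occurrence-based dissimilarity matrix.
import numpy as np
from sklearn.manifold import MDS

statements = [  # hypothetical stand-ins for the scraped responses
    "i believe in small government and low taxes",
    "i value family faith and personal responsibility",
    "low taxes help small business and family budgets",
]

vocab = sorted({w for s in statements for w in s.split()})
idx = {w: i for i, w in enumerate(vocab)}

# Count how often two words appear in the same statement.
C = np.zeros((len(vocab), len(vocab)))
for s in statements:
    words = set(s.split())
    for a in words:
        for b in words:
            C[idx[a], idx[b]] += 1.0

# Turn co-occurrence into dissimilarity and embed in two dimensions.
D = C.max() - C
np.fill_diagonal(D, 0.0)
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)

for w, (x, y) in zip(vocab, coords):
    print(f"{w:16s} {x:+.2f} {y:+.2f}")
```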
75.
On the Role of Performance Interference in Consolidated Environments. Rameshan, Navaneeth. January 2016.
With the advent of resource-shared environments such as the Cloud, virtualization has become the de facto standard for server consolidation. While consolidation improves utilization, it causes performance interference between Virtual Machines (VMs) through contention in shared resources such as CPU, Last Level Cache (LLC), and memory bandwidth. Over-provisioning resources for performance-sensitive applications can guarantee Quality of Service (QoS); however, it results in low machine utilization. Thus, assuring QoS for performance-sensitive applications while allowing co-location has been a challenging problem. In this thesis, we identify ways to mitigate performance interference without undue over-provisioning and also point out the need to model and account for performance interference to improve the reliability and accuracy of elastic scaling. The end goal of this research is to leverage these observations to provide efficient resource management that is both performance and cost aware. Our main contributions are threefold: first, we improve overall machine utilization by executing best-effort applications alongside latency-critical applications without violating their performance requirements; our solution is able to dynamically adapt to and leverage changing workload/phase behaviour to execute best-effort applications without causing excessive interference on performance. Second, we identify that certain performance metrics used for elastic-scaling decisions may become unreliable if performance interference is unaccounted for; by modelling performance interference, we show that these performance metrics become reliable in a multi-tenant environment. Third, we identify and demonstrate the impact of interference on the accuracy of elastic scaling and propose a solution to significantly minimise performance violations at a reduced cost.
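As a rough sketch of the idea behind the second and third contributions, the toy decision rule below corrects an observed latency for a modeled interference term before scaling out; the linear penalty in misses per kilo-instruction (MPKI) and its coefficient are hypothetical illustrations, not the thesis's model:

```python
# Toy sketch (not the thesis's model): subtract a modeled interference
# penalty from the observed latency before an elastic-scaling decision,
# so a noisy neighbour does not trigger a spurious scale-out. The
# linear MPKI penalty and the coefficient beta are assumptions.
def corrected_latency_ms(observed_ms: float, mpki: float,
                         beta: float = 0.4) -> float:
    """Latency attributable to the application itself."""
    return observed_ms - beta * mpki

def should_scale_out(observed_ms: float, mpki: float,
                     slo_ms: float = 100.0) -> bool:
    # Scale only when the corrected (interference-free) latency still
    # violates the SLO; otherwise the bottleneck is contention, and
    # adding replicas would raise cost without fixing the cause.
    return corrected_latency_ms(observed_ms, mpki) > slo_ms

print(should_scale_out(120.0, mpki=60.0))  # False: interference-driven
print(should_scale_out(120.0, mpki=10.0))  # True: genuine overload
```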
76.
Politiques de gestion d'énergie et de température dans les systèmes informatiques / Scheduling algorithms for energy and thermal management in computer systems. Letsios, Dimitrios. 22 October 2013.
Nowadays, the energy consumption and the heat dissipation of computing environments have emerged as crucial issues. Indeed, large data centers consume as much electricity as a city, while modern processors attain high temperatures that degrade their performance and decrease their reliability. In this thesis, we study various energy- and temperature-aware scheduling problems, focusing on their complexity and approximability. A dominant technique for saving energy is proper scheduling of the jobs through the operating system combined with appropriate scaling of the processor's speed. This technique is referred to as speed scaling in the literature, and its theoretical study was initiated by Yao, Demers and Shenker (FOCS 1995). In order to manage the thermal behavior of a computing device, we adopt the approach of Chrobak, Dürr, Hurand and Robert (AAIM 2008). The main assumption is that some jobs are more CPU-intensive than others and more heat is generated during their execution; the cooling of a computing device occurs by introducing appropriate idle periods.
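For reference, the speed-scaling model named in the abstract can be summarized as follows; the range quoted for alpha is the one typically assumed in this literature:

```latex
% Speed-scaling model of Yao, Demers and Shenker: running at speed s(t)
% dissipates power s(t)^alpha for a constant alpha > 1 (values between
% 2 and 3 are typical), so the energy of a schedule on [0, T] is
E \;=\; \int_{0}^{T} s(t)^{\alpha} \, dt ,
% subject to each job j receiving its full work w_j between its
% release time r_j and deadline d_j:
\int_{r_j}^{d_j} s_j(t) \, dt \;\ge\; w_j
\qquad \text{for every job } j .
```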
77.
Public street surveillance: a psychometric study on the perceived social risk. Brooks, David (d.brooks@ecu.edu.au). January 2003.
Public street surveillance, a domain of Closed Circuit Television (CCTV), has grown enormously and is becoming commonplace, with increasing utilization in society as an all-purpose security tool. Previous authors (Ditton, 1999; Davies, 1998; Horne, 1998; Tomkins, 1998) have raised concern over social, civil and privacy issues, but there has been limited research to quantify these concerns. There are a number of core aspects that could relocate the risk perception and, therefore, the social support of public street surveillance. This study utilized the psychometric paradigm to quantitatively measure the social risk perception of public street surveillance. The psychometric paradigm is a method that presents risk perception in a two-factor representation, the factors being dread risk and familiarity to risk. Four additional control activities and technologies were tested: radioactive waste, drinking water chlorination, coal mining disease and home swimming pools. Analysis included spatial representation, and multidimensional scaling (MDS) Euclidean and INDSCAL methods. The study utilized a seven-point Likert scale and a pre- and post-test methodology, and had a target population of N=2106, with a sample of N=135 (alpha=0.7).
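The reliability figure quoted above (alpha=0.7) is Cronbach's alpha; a minimal sketch of its computation on a hypothetical respondents-by-items matrix of seven-point Likert scores:

```python
# Hedged sketch: Cronbach's alpha, the scale-reliability statistic
# reported above, computed on hypothetical Likert data.
import numpy as np

def cronbach_alpha(X: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / total variance)."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_var_sum = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)

responses = np.array([[5, 6, 5, 7],   # hypothetical data
                      [3, 4, 3, 4],
                      [6, 6, 7, 6],
                      [2, 3, 2, 3],
                      [4, 5, 4, 5]])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```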
78.
Skalning och brusberäkning av tvåportsadaptorer / Scaling and noise calculation of two-port adaptors. Samuelsson, Daniel. January 2004.
The goal of this work is to summarize the calculations for scaling and noise of two-port adaptors. Two different methods have been described and used for the final results.
79.
Impact of Technology Scaling on Leakage Reduction Techniques. Ghafari, Payam. January 2007.
CMOS technology is scaling down to meet the performance, production cost, and power requirements of the microelectronics industry. The increase in transistor leakage current is one of the most important negative side effects of technology scaling. Leakage affects not only the standby and active power consumption, but also circuit reliability, since it is strongly correlated with process variations. Leakage current influences circuit performance differently depending on: operating conditions (e.g., standby, active, burn-in test), circuit family (e.g., logic or memory), and environmental conditions (e.g., temperature, supply voltage). Until the introduction of high-K gate dielectrics in the lower nanometer technology nodes, gate leakage will remain the dominant leakage component after subthreshold leakage.
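For context, the subthreshold component can be written in the standard textbook form below (our addition, not a formula from the thesis):

```latex
% Textbook subthreshold-leakage model: V_T = kT/q is the thermal
% voltage and n the subthreshold slope factor.
I_{\mathrm{sub}}
  \;=\;
  I_{0}\, e^{\left(V_{GS} - V_{th}\right)/\left(n V_T\right)}
  \left(1 - e^{-V_{DS}/V_T}\right)
```

Because the dependence on V_th is exponential, raising the threshold voltage, or stacking transistors so that the upper device sees a negative V_GS, cuts subthreshold leakage sharply; this is why the high-threshold-voltage and stacking techniques examined below are effective.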
Since the way designers control subthreshold and gate leakage can change from one technology to another, it is crucial for them to be aware of the impact of the total leakage on the operation of circuits and the techniques that mitigate it.
Consequently, techniques that reduce total leakage in circuits operating in the active mode at different temperature conditions are examined. Also, the implications of technology scaling on the choice of techniques to mitigate total leakage are investigated. This work resulted in guidelines for the design of low-leakage circuits in nanometer technologies. Logic gates in the 65nm, 45nm, and 32nm nodes are simulated and analyzed. The techniques that are adopted for comparison in this work affect both gate and subthreshold leakage, namely, stack forcing, pin reordering, reverse body biasing, and high threshold voltage transistors. Aside from leakage, our analysis also highlights the impact of these techniques on the circuit's performance and noise margins.
The reverse body biasing scheme tends to be less effective as the technology scales, since this scheme increases the band-to-band tunneling current. Employing high threshold voltage transistors seems to be one of the most effective techniques for reducing leakage with minor performance degradation. Pin reordering and natural stacks are techniques that do not affect the performance of the device, yet they reduce leakage. However, it is demonstrated that they are not as effective in all types of logic, since the input values might switch only between the highly leaky states.
Therefore, depending on the design requirements of the circuit, a combination or hybrid of techniques, which can result in better performance and leakage savings, is chosen. Power-sensitive technology mapping tools can use the guidelines resulting from this research in the low-power design flow to meet the required maximum leakage current in a circuit. These guidelines are presented in general terms so that they can be adopted for any application and process technology.
80.
Variability-Aware Design of Static Random Access Memory Bit-Cell. Gupta, Vasudha. January 2008.
The increasing integration of functional blocks in today's integrated circuit designs necessitates a large embedded memory for data manipulation and storage. The most often used embedded memory is the Static Random Access Memory (SRAM), with a six-transistor memory bit-cell. Currently, memories occupy more than 50% of the chip area, and this percentage is only expected to increase in the future. Therefore, for silicon vendors, it is critical that the memory units yield well, to enable an overall high yield of the chip. The increasing memory density is accompanied by aggressive scaling of the transistor dimensions in the SRAM. Together, these two developments make SRAMs increasingly susceptible to process-parameter variations. As a result, in the current nanometer regime, statistical methods for the design of the SRAM array are pivotal to achieve satisfactory levels of silicon predictability.
In this work, a method for the statistical design of the SRAM bit-cell is proposed. Not only does it provide a high yield, but it also meets the specifications for the design constraints of stability, successful write, performance, leakage and area. The method consists of an optimization framework, which derives the optimal design parameters, i.e., the widths and lengths of the bit-cell transistors, that provide maximum immunity to variations in the transistors' geometry and intrinsic threshold voltage fluctuations. The method is employed to obtain optimal designs in the 65nm, 45nm and 32nm technologies for different sets of specifications, and the optimality of the resultant designs is verified. The resultant optimal bit-cell designs in the 65nm, 45nm and 32nm technologies are analyzed to study the SRAM area and yield trade-offs associated with technology scaling, and two ways to achieve 50% scaling of the bit-cell area at every technology node are proposed. The resultant designs are further investigated to understand which mode of failure in the bit-cell becomes more dominant with technology scaling. In addition, the impact of voltage scaling on the bit-cell designs is also studied.
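A minimal Monte Carlo sketch of this kind of statistical bit-cell analysis, assuming Pelgrom-style threshold-voltage mismatch; the matching coefficient, margin, and failure criterion are hypothetical placeholders, not the thesis's framework:

```python
# Illustrative Monte Carlo sketch (not the thesis's framework): yield of
# a six-transistor bit-cell under random threshold-voltage mismatch.
# Pelgrom's model gives sigma(Vth) = A_VT / sqrt(W*L), so upsizing the
# transistors trades area for variation immunity. A_VT, the margin and
# the failure criterion below are hypothetical placeholders.
import numpy as np

A_VT = 3.5e-9  # V*m, assumed matching coefficient

def sigma_vth(width_m: float, length_m: float) -> float:
    return A_VT / np.sqrt(width_m * length_m)

def yield_estimate(width_m: float, length_m: float,
                   vth_margin_v: float = 0.09,
                   n_trials: int = 100_000, seed: int = 0) -> float:
    """Fraction of cells whose worst of six Vth shifts stays in margin."""
    rng = np.random.default_rng(seed)
    shifts = rng.normal(0.0, sigma_vth(width_m, length_m),
                        size=(n_trials, 6))
    return float(np.mean(np.abs(shifts).max(axis=1) < vth_margin_v))

# Doubling transistor width shrinks sigma(Vth) by sqrt(2) and lifts
# yield, at the cost of roughly twice the device area.
print(yield_estimate(90e-9, 45e-9))
print(yield_estimate(180e-9, 45e-9))
```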