Jin, Yu Ho
16 January 2010
Traditionally, microprocessor design has focused on the computational aspects of the problem at hand. However, as the number of components on a single chip continues to increase, the design of the communication architecture has become a crucial, often dominant factor in the performance of the overall system. On-chip networks, also known as Networks-on-Chip (NoC), have recently emerged as a promising architecture for coordinating chip-wide communication. Although interconnection networks have been studied extensively in inter-chip environments, intra-chip network design poses a number of substantial challenges to this well-established field. This research investigates designs and applications of on-chip interconnection networks in next-generation microprocessors, optimizing performance, power consumption, and area cost. First, we present domain-specific NoC designs targeted at large-scale, wire-delay-dominated L2 cache systems. The domain-specific interconnect improves performance by 38% while using only 12% of the area of the mesh-based interconnect. Then, we present a methodology for characterizing communication in parallel programs and apply the characterization results to long-channel reconfiguration. Long channels reconfigured to suit communication patterns reduce the latency of the mesh network by 16% and 14% in 16-core and 64-core systems, respectively. Finally, we discuss an adaptive data compression technique that builds a network-wide map of frequent value patterns and reduces packet size. In the two multi-core systems examined, cache traffic is 69% compressible and shows high value sharing among flows. The compression-enabled NoC reduces latency by up to 63% and energy consumption by up to 12%.
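The frequent-value idea can be illustrated with a small sketch (a hypothetical encoder with an assumed four-entry pattern table, not the thesis implementation): words that hit the shared table are replaced by a short index, and everything else travels verbatim behind a flag bit.

```python
# Hypothetical frequent-value packet compression sketch. The table contents,
# index width, and flag-bit encoding are all assumptions for illustration.

FREQUENT = [0x00000000, 0xFFFFFFFF, 0x00000001, 0x80000000]  # assumed shared table
INDEX = {v: i for i, v in enumerate(FREQUENT)}

def compress(words):
    """Encode 32-bit words: (True, index) for table hits, (False, word) otherwise."""
    return [(True, INDEX[w]) if w in INDEX else (False, w) for w in words]

def decompress(encoded):
    return [FREQUENT[x] if hit else x for hit, x in encoded]

def encoded_bits(encoded):
    # 1 flag bit plus either an 8-bit table index or the full 32-bit word
    return sum(1 + (8 if hit else 32) for hit, _ in encoded)

packet = [0x00000000, 0xDEADBEEF, 0xFFFFFFFF, 0x00000000]
enc = compress(packet)
assert decompress(enc) == packet
# 3 table hits (9 bits each) + 1 miss (33 bits) = 60 bits vs 128 uncompressed
```

The compression win depends entirely on how often traffic values hit the shared table, which is why the thesis measures value sharing among flows.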
21 January 2006
In recent years, as the external environment has changed rapidly, Taiwan's medical institutions have faced unprecedented challenges. Striving for survival, hospitals pursue various development strategies to improve their competitive advantage. The role of a hospital's information department has shifted from work support to strategic decision-making, helping the hospital reach its strategic goals. The National Health Insurance system is now closely linked with national welfare, and its financial pressure can be reduced if hospitals share medical resources and eliminate unnecessary waste; the information department plays an important role in this respect. The Balanced Scorecard (BSC) is a framework that provides measures for a strategic management system. It has recently attracted attention from both practitioners and academia and is now widely applied in enterprises; in the information departments of medical institutions, however, it is still new. The purpose of this research is to build a balanced scorecard for a hospital's information department, to balance the demands of multiple domains, and to develop a strategic measurement framework that considers finance, customers, internal processes, and learning and growth. Improving the overall performance of the information department is expected to help the hospital reach its strategic goals. The research adopts a case-study method to set up a balanced scorecard framework for an information department. The framework and performance measures were revised based on interviews with the administration team, users, and information department members in the case. The conclusions of this research are as follows: 1. The information department can develop its own mission, core values, vision, and development strategy to support the organization in reaching its strategic goals.
2. The information department can develop a strategy map and measurement indicators to guide staff efforts. 3. The methods used to measure achievement indicators should show the effect of execution reasonably and objectively. 4. The information department's performance management system should provide strategy management and communication functions, and must include a mechanism for feedback and revision. The main contribution of this research is to investigate the balanced scorecard for the information department and to set up a BSC framework suitable for the information department in this case. The measurement indicators of CMMI (Capability Maturity Model Integration) are used as the BSC measurement indicators. By introducing CMMI, the information department can make performance evaluation more objective and meaningful.
08 July 2002
29 August 2005
Instrument transformers are a crucial component of power system protection. They supply the protection system with scaled-down replicas of the current and voltage signals present in a power network, at levels that are safe and practical to operate with. Conventional instrument transformers are based on electromagnetic coupling between the power network on the primary side and protective devices on the secondary side. Because of this design, instrument transformers introduce distortions into the signal replicas. Protective devices may be sensitive to these distortions, and their influence may lead to disastrous misoperations of protective devices. To overcome this problem, a new instrument transformer design has been devised: optical sensing of currents and voltages. In theory, these novel instrument transformers promise distortion-free replication of the primary signals. Since this novel design has not yet been widely used in practice, its purportedly superior performance needs to be evaluated. This poses a question: how can the new technology (design) be evaluated and compared to the existing instrument transformer technology? The importance of this question lies in its consequence: is it necessary to upgrade the protection system, i.e., to replace the conventional instrument transformers with the novel ones, which would be quite expensive and time-consuming? The posed question can be answered by comparing the influences of both the novel and the conventional instrument transformers on the protection system. At present, there is no systematic approach to this evaluation. Since the evaluation could lead to an improvement of the overall protection system, this thesis proposes a comprehensive and systematic methodology for the evaluation. The thesis also proposes a complete solution for the evaluation, in the form of a simulation environment. Finally, the thesis presents the results of the evaluation, along with their interpretation.
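The core of such an evaluation is comparing a distorted replica against an ideal one. A minimal sketch of that comparison, under assumed numbers (a 100:1 ratio, a 50 Hz primary, and a flat 2% gain error; the thesis's simulation environment is far more detailed):

```python
# Illustrative sketch, not the thesis methodology: quantify how far a
# distorted secondary-side replica deviates from the ideal scaled-down signal.
import math

def primary_current(t, amp=1000.0, freq=50.0):
    """Assumed sinusoidal primary current, in amperes."""
    return amp * math.sin(2 * math.pi * freq * t)

def replica_error(ratio, distortion, n=1000, period=0.02):
    """Relative RMS error between ideal and distorted secondary signals."""
    err2 = ref2 = 0.0
    for k in range(n):
        t = k * period / n
        ideal = primary_current(t) / ratio
        distorted = ideal * (1.0 + distortion(t))
        err2 += (distorted - ideal) ** 2
        ref2 += ideal ** 2
    return math.sqrt(err2 / ref2)

# a hypothetical flat 2% gain error yields a 2% relative RMS deviation
err = replica_error(ratio=100.0, distortion=lambda t: 0.02)
```

A protection-relevant evaluation would feed both replicas into relay models and compare trip decisions rather than stop at a waveform error metric.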
New public management and governance collide: federal-level performance measurement in networked public management / DeGroff, Amy S. January 2009
Thesis (Ph.D)--Public Policy, Georgia Institute of Technology, 2009. / Committee Chair: Theodore H. Poister, Ph.D.; Committee Member: Gordon Kingsley, Ph.D.; Committee Member: John Thomas, Ph.D.; Committee Member: Judith Ottoson, Ph.D.; Committee Member: Patricia Reeves, Ph.D. Part of the SMARTech Electronic Thesis and Dissertation Collection.
賴文建, Lai, Man-kin.
Published or final version / Psychology / Master of Philosophy
28 May 2008
Performance evaluation of document recognition systems is a difficult and practically important problem. In this thesis, we contribute to the understanding of performance evaluation by studying issues that arise in evaluating systems for the recognition of mathematical expressions. The issues discussed cover reported performance evaluation experiments, code availability, the nature of mathematical notation, the extent of the coverage of mathematical recognition systems, and the quantification of performance evaluation results. For each issue, we discuss its impact on performance evaluation, give an overview of the state of the art for addressing it, and point out open problems. / Thesis (Master, Computing) -- Queen's University, 2008.
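As one concrete example of quantification, symbol-level precision and recall against a ground-truth transcription are a common way to score a recognizer (an illustrative sketch; the thesis does not prescribe this particular metric):

```python
# Symbol-level precision/recall/F1 over multisets of recognised symbols.
from collections import Counter

def prf(ground_truth, recognised):
    """Return (precision, recall, F1) for two symbol sequences."""
    gt, rec = Counter(ground_truth), Counter(recognised)
    correct = sum((gt & rec).values())          # multiset intersection
    precision = correct / sum(rec.values())
    recall = correct / sum(gt.values())
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# ground truth "x^2 + y" vs a recognition that mistook '+' for '*'
p, r, f = prf(["x", "^", "2", "+", "y"], ["x", "^", "2", "*", "y"])
# 4 of 5 symbols correct -> precision = recall = 0.8
```

A multiset metric like this ignores symbol positions and spatial structure, which is exactly the kind of limitation the thesis's discussion of mathematical notation highlights.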
30 January 2013
Video dissemination capabilities are crucial for the deployment of many services over VANETs. These services range from enhancing safety via the dissemination of video from the scene of an accident to advertising local services or businesses. This work considers the infrastructure-less VANET scenario and the dissemination of video content in this network environment, which is extremely challenging mainly because of its dynamic topology and the stringent requirements of video streaming. This study discusses issues and challenges that must be tackled to disseminate high-quality video over VANETs. Furthermore, it surveys and analyzes the suitability of different existing solutions aimed at effective and efficient techniques for video dissemination in vehicular networks. As a result, a set of the most promising techniques is selected, described in detail, and evaluated against standard quality-of-service criteria. This thesis also discusses the efficiency and suitability of these techniques for video dissemination and compares their performance under the same network conditions. In addition, a detailed study of the effect of network coding on video dissemination protocols was conducted to provide guidance on how to employ this technique properly for video streaming over VANETs. From this study, a summary of observations was obtained and used to design a new hybrid solution by deploying robust and efficient techniques from a number of existing protocols in an optimal manner. The proposed hybrid video dissemination protocol outperforms other protocols in terms of delivery ratio and complies with the other quality-of-service requirements for video broadcasting in vehicular environments.
Fiallos Rivera, Javier E.
14 January 2014
The performance of Emergency Department (ED) physicians (MDs) is multi-faceted, since it affects multiple dimensions such as patients' health outcomes, resource utilization, patient throughput, and timeliness of care. Assessing their performance therefore demands a tool that can consider multiple evaluation criteria. However, commonly used multi-criteria evaluation methods often require assigning weights to dimensions to define their relative importance in a final performance score. This introduces subjectivity into the development of the weights and can produce biased results. The purpose of this thesis research is to develop a multi-dimensional tool for evaluating the performance of ED MDs. The proposed evaluation tool relies on a mathematical programming model known as Data Envelopment Analysis (DEA). DEA does not require subjective weight assignments for the dimensions that describe ED MDs' performance. It can consider multiple heterogeneous performance measures to identify benchmark practice and the individual improvements that would lead each evaluated unit to best practice. The DEA model described here was developed from real data to assess the performance of 20 pediatric ED (PED) MDs from the Children's Hospital of Eastern Ontario (CHEO). Multiple evaluations were run on stratified data to identify benchmark practice in each of seven categories of patients' complaints and to determine the impact of accompanying MD trainees on PED MDs' performance. For each PED MD, performance scores and improvements in each category of patients' complaints (i.e., respiratory, trauma, abdominal, fever, gastroenterology, allergy, and Ear-Nose-Throat complaints) were determined. This helped identify the improvements that would lead PED MDs to benchmark performance.
Regarding the influence of MD trainees on PED MDs' performance, the results show that most PED MDs (15 out of 20) perform better when they are not accompanied by a trainee, which motivates further research into the trade-offs between teaching and clinical performance. In summary, DEA proved to be an appropriate tool for the performance evaluation of PED MDs because it helped identify benchmark performers and provided information for performance improvement within a multi-dimensional performance evaluation framework.
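To see why DEA needs no subjective weights, consider the degenerate single-input, single-output case, where the CCR efficiency score reduces to each unit's productivity ratio normalized by the best observed ratio (the numbers below are hypothetical, not CHEO data; the thesis's model handles multiple heterogeneous measures via linear programming):

```python
# Toy DEA illustration: with one input and one output per decision-making
# unit, CCR efficiency is just (output/input) scaled by the best such ratio,
# so the data itself determines the benchmark with no analyst-chosen weights.

def dea_single(inputs, outputs):
    """CCR efficiency scores for one input and one output per unit."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# hypothetical example: input = physician hours, output = patients treated
hours    = [40.0, 40.0, 35.0]
patients = [120.0, 90.0, 105.0]
scores = dea_single(hours, patients)
# ratios 3.0, 2.25, 3.0 patients/hour -> scores 1.0, 0.75, 1.0
```

Units scoring 1.0 define the efficient frontier; a score of 0.75 says that unit would need a 33% output increase (or proportional input reduction) to reach benchmark practice, which is the kind of individual improvement target the thesis reports.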
Gadgil, Kalyani Surendra
08 April 2020
In this thesis, we benchmark the Cognitive Radios Test System version 2.0 (CRTSv.2) to analyze its software performance with respect to its internal structure and design choices. With the help of system monitoring and profiling tools, CRTSv.2 is tested to quantitatively evaluate its features and understand its shortcomings. With the help of GNU Radio, a popular, easy-to-use software radio framework, we ascertain that CRTSv.2 has a low memory footprint and fewer dependencies, and is overall a lightweight framework that can potentially be used for real-time signal processing. Several open-source measurement tools, such as valgrind, perf, and top, are used to evaluate CPU utilization and memory footprint and to postulate the origins of latencies. Based on our evaluation, we observe that CRTSv.2 shows a CPU utilization of approximately 9%, whereas GNU Radio's is 59%. CRTSv.2 also has lower heap memory consumption: approximately 3 MB, compared to GNU Radio's 25 MB. This study establishes a methodology for evaluating the performance of two SDR frameworks systematically and quantitatively. / Master of Science / When picking the best person for a job, we rely on the person's performance in past projects of a similar nature. The same can be said for software. Software radios provide the capability to perform signal processing functions in software, making them prime candidates for solving modern problems such as spectrum scarcity, internet-of-things (IoT) adoption, and vehicle-to-vehicle communication. To operate and configure software radios, software frameworks are provided that let the user modify the waveform, perform signal processing, and manage data. In this thesis, we consider two such frameworks, GNU Radio and CRTSv.2. A software performance evaluation is conducted to assess the framework overheads contributing to the operation of an orthogonal frequency-division multiplexing (OFDM) digital modulation scheme. This provides a quantitative analysis of a signals-specific use case that researchers can use to choose the optimal framework for their work. The analysis can be generalized to different signal processing capabilities by isolating the total framework overhead from the signal-processing costs.
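A minimal, framework-agnostic sketch of this kind of measurement, using only the standard library (the Unix-only `resource` module; tools like perf and valgrind give far richer data): wrap a workload and report its CPU time and peak resident memory.

```python
# Measure CPU time and peak memory of a workload. This is a generic harness,
# not the thesis's tooling; perf/valgrind/top provide much finer detail.
import resource
import time

def measure(workload):
    """Run workload() and return (result, cpu_seconds, peak_rss)."""
    t0 = time.process_time()            # process-wide CPU time
    result = workload()
    cpu = time.process_time() - t0
    # ru_maxrss is kilobytes on Linux but bytes on macOS
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return result, cpu, peak

result, cpu, peak = measure(lambda: sum(i * i for i in range(100_000)))
```

Because `ru_maxrss` reports the peak over the whole process lifetime, comparing two frameworks fairly requires measuring each in a fresh process, which is how per-framework figures like 3 MB vs 25 MB are normally obtained.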