151

Performance Evaluation of the Biological Aerated Filter

Kus, John 01 1900 (has links)
The Biological Aerated Filter (BAF) is a novel biological wastewater treatment process consisting of an activated sludge zone followed by an unstratified sand filter for solids separation. Three evaluation studies of the BAF to date have yielded results indicating low solids production or possibly total oxidation. On the basis of these studies, Tymflo Process Limited, the patent-holding company, claimed up to 50% cost savings for wastewater treatment, as there would be no excess biological solids produced and therefore no sludge disposal costs. The object of this report was to evaluate the BAF's capability to treat degritted municipal sewage with respect to the above claims. Two pilot-scale BAF units were operated at the Canada Centre for Inland Waters continuously for 97 days, treating degritted Burlington Skyway sewage at various operating conditions. The conclusions of the tests are that the BAF cannot be operated as a total solids retention system treating degritted municipal sewage on a 24-hour cycle at a 12-hour hydraulic detention time. The inert fraction of the influent is retained in the system, resulting in high mixed liquor concentrations which overload the filter, thereby decreasing treatment time. The system is capable of 88% COD removal, essentially complete nitrification, and 97% SS removal. The system yields are on the order of 0.24 gm MLVSS/gm COD removed at organic loadings of approximately 0.08 gm COD removed/gm MLVSS·day. / Thesis / Master of Engineering (MEngr)
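The reported removal and yield figures can be tied together with a short worked example; the influent COD value below is a hypothetical assumption for illustration, not a number from the abstract.

```latex
% Observed yield as reported: Y_obs = 0.24 g MLVSS per g COD removed.
% Assumes a hypothetical influent COD of 300 mg/L (not stated above).
\[
\Delta S = 0.88 \times 300~\mathrm{mg/L} = 264~\mathrm{mg/L~COD~removed}
\]
\[
\Delta X_v = Y_{\mathrm{obs}}\,\Delta S = 0.24 \times 264 \approx 63~\mathrm{mg/L~MLVSS~produced}
\]
```

Even at this low observed yield, solids are continually produced, consistent with the report's finding that retained influent solids eventually overload the filter.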
152

Towards Performance Evaluation and Future Applications of eBPF

Gunturu, Manideep, Aluguri, Rohan January 2024 (has links)
Extended Berkeley Packet Filter (eBPF) is an instruction set and an execution environment inside the Linux kernel. eBPF improves flexibility for data processing and is realized via a virtual machine featuring both a Just-In-Time (JIT) compiler and an interpreter running in the kernel. It executes custom eBPF programs supplied by the user, effectively moving user-defined functionality into the kernel. eBPF has received widespread adoption by companies such as Facebook and Netflix, and by academia, for a wide range of application domains. eBPF can be used to program the eXpress Data Path (XDP), a kernel network layer that processes packets closer to the Network Interface Card (NIC) for fast packet processing. In this thesis, eBPF with XDP, and iptables, are considered as Network Functions (NFs), implemented in a Virtual Machine (VM) for packet filtering. The traffic source (source VM) and traffic sink (destination VM) are present in the same subnet. The aim of this thesis is to understand and investigate the implementation of NFs in VMs and to analyze performance metrics. In VirtualBox, VMs are created to implement the NFs. The results are obtained for the measurements that are essential for the performance evaluation of the NFs, and presented in graphs.
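For readers unfamiliar with XDP, the following is a minimal sketch of the kind of eBPF/XDP packet filter such a thesis evaluates; the program name and the IPv4-only policy are illustrative assumptions, not taken from the thesis.

```c
// Minimal XDP filter sketch: pass IPv4 frames, drop everything else.
// Requires libbpf headers; names and policy are illustrative only.
#include <linux/bpf.h>
#include <linux/if_ether.h>
#include <bpf/bpf_helpers.h>
#include <bpf/bpf_endian.h>

SEC("xdp")
int xdp_ipv4_only(struct xdp_md *ctx)
{
    void *data     = (void *)(long)ctx->data;
    void *data_end = (void *)(long)ctx->data_end;
    struct ethhdr *eth = data;

    /* Bounds check demanded by the in-kernel eBPF verifier. */
    if ((void *)(eth + 1) > data_end)
        return XDP_DROP;

    /* h_proto is in network byte order, hence bpf_htons(). */
    if (eth->h_proto == bpf_htons(ETH_P_IP))
        return XDP_PASS;

    return XDP_DROP;
}

char LICENSE[] SEC("license") = "GPL";
```

A program like this would typically be compiled with `clang -O2 -target bpf -c xdp_filter.c -o xdp_filter.o` and attached with iproute2, e.g. `ip link set dev eth0 xdp obj xdp_filter.o sec xdp`.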
153

Sampling expertise: Incorporating goal establishment and goal enactment into theories of expertise to improve measures of performance

Robinson, Frank Eric 06 June 2017 (has links)
No description available.
154

MEDIUM ACCESS CONTROL PROTOCOLS AND ROUTING STRATEGIES FOR WIRELESS LOCAL AND PERSONAL AREA NETWORKS

CORDEIRO, CARLOS DE MORAIS January 2003 (has links)
No description available.
155

USING A PERFORMANCE EVALUATION TO DETERMINE AN INDIVIDUALIZED INTERVENTION TO INCREASE STAFF TREATMENT INTEGRITY OF DISCRETE TRIAL TEACHING

Dombrowski, Nicholas January 2019 (has links)
Discrete Trial Teaching (DTT) is a teaching method that involves fast-paced trials designed to teach basic skills by breaking them into smaller components, typically conducted in a one-on-one setting. Treatment integrity has proven to be of great importance in DTT, with skill acquisition occurring at higher rates when treatment integrity is high. While research has shown that verbal and written feedback are effective in training staff to conduct DTT, there is still a need for research on the use of individualized interventions based on performance assessments. This study used a multiple-probe-across-participants design and demonstrated that a one-on-one session including interventions such as feedback, practice, treatment integrity checklists, and/or antecedent interventions is an effective method for increasing treatment integrity and implementation of DTT. The three participants who took part in the individualized interventions all displayed increases in proficiency in delivering DTT trials. / Applied Behavioral Analysis
156

INFORMATION AND INCENTIVES IN RETAIL SALES

Lee, Soojin January 2019 (has links)
I examine how managers mitigate the side effects of an overly complicated performance evaluation system in the context of the high-end retail industry. The standard performance evaluation system in the industry has evolved to include multiple performance measures. The detailed measures can incentivize employees to perform multiple performance-relevant activities, but they inevitably increase the complexity of the performance evaluation system. The complexity increases the risk of information overload for employees, decreasing judgment quality and potentially decreasing their performance. Drawing on the psychology literature, I postulate two factors moderating the relationship between information overload and performance: 1) disaggregated feedback, which provides detailed information on each category of performance measures and compensates employees for each performance measure rather than for overall performance; and 2) feedforward, which informs employees about how their actions affect their compensation. Both factors mitigate the negative performance effect of information overload by clarifying to employees the causality embedded in the complex performance evaluation system. I conduct two field experiments that implement the disaggregated feedback and feedforward policies, respectively, for sales outlets of a high-end retail firm, and examine whether the policies mitigate the information overload problem and improve performance. I find that the treatment group exhibits improvement in performance, suggesting that disaggregated feedback and feedforward reduce information overload. / Business Administration/Accounting
157

Context Aware and Adaptive Security for Wireless Networks

Hager, Creighton Tsuan-Ren 03 December 2004 (has links)
This research investigated methods to determine appropriate security protocols for specific wireless network applications. The specific problem being addressed was that there are tradeoffs between security, performance, and efficiency among current and proposed security protocols. Performance and efficiency issues are particularly important in wireless networks, which tend to have constrained network capacity and connect to resource-limited nodes. Existing security protocols address problems such as authentication, availability, confidentiality, integrity, and non-repudiation. However, these protocols consume resources and limit the efficient use of node resources. Thus, the overall objective of this research is to improve the efficiency of security mechanisms for wireless networks. A methodology was constructed to satisfy this objective and is an important contribution of this research. The methodology can be used to define the relevant operational parameters of different wireless network applications, classify wireless networks into distinct categories, match appropriate security protocols to a category, and analyze the security protocols through metrics. Three groups of operational parameters were created to classify wireless networks: equipment, network topology, and communication characteristics. The wireless network categories include, but are not limited to, fixed broadband wireless networks, wireless local area networks, mobile ad hoc networks, and small-device sensor networks. The metrics in the methodology are used to measure end-to-end data throughput and delay, efficiency and overhead, power and energy consumption, and energy consumed per packet transferred. The main advantage of this methodology is the flexibility with which constraints are considered and suitability is analyzed. This approach can identify problems for manageable categories of networks and find or create solutions for each of them. Another advantage of this methodology is that after suitable security protocols are found or created for each category, any new wireless network application that falls into an existing category may be able to use the security protocols from that category and find that they are the most suitable. Another key contribution of this research was the implementation and evaluation of a context-aware and adaptive security manager (CASM) that selects appropriate protocols in real time. CASM was developed using the methodology as a guide. Results from a resource analysis of four encryption algorithms were utilized for the design of CASM. A feasibility study of CASM was then completed. Three different experimental scenarios were used to evaluate CASM's operation. The results and analysis of the experiments indicate that the security manager functions properly and that security is provided efficiently under different user settings and environments. Three schemes were deemed the best to use for the decision module of CASM. / Ph. D.
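As an illustration of the decision step such a manager performs, here is a hypothetical sketch in C; the context fields, thresholds, and cipher choices are assumptions for illustration, not the dissertation's actual decision module.

```c
/* Hypothetical sketch of a CASM-style decision rule: map measured
 * context to a cipher choice. All names and thresholds are
 * illustrative assumptions, not taken from the dissertation. */
#include <stdio.h>

enum cipher { AES_256, AES_128 };

struct context {
    double battery_frac;   /* remaining energy, 0.0 to 1.0           */
    double link_mbps;      /* available link capacity in Mbit/s      */
    int    sensitive;      /* nonzero when data needs strong privacy */
};

static enum cipher select_cipher(const struct context *c)
{
    if (c->sensitive)
        return AES_256;                        /* security dominates */
    if (c->battery_frac < 0.2 || c->link_mbps < 1.0)
        return AES_128;                        /* conserve resources */
    return AES_256;
}

int main(void)
{
    struct context c = { .battery_frac = 0.15, .link_mbps = 5.0 };
    printf("selected: %s\n",
           select_cipher(&c) == AES_256 ? "AES-256" : "AES-128");
    return 0;
}
```

The point of the sketch is the shape of the rule, not the specific thresholds: the dissertation's contribution lies in choosing such rules per network category from measured throughput, overhead, and energy metrics.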
158

The Impact of the Choice of Performance Evaluation System on the Magnitude of the Outcome Effect

Mertins, Lasse 22 April 2009 (has links)
This dissertation examines whether the magnitude of the outcome effect is impacted by the type of performance evaluation system (subjective versus formula-based). The outcome effect is a phenomenon that occurs when an evaluator overemphasizes the outcome of a decision and ignores essential information that is available to the evaluator (e.g., market information, information about the decision-making process). This outcome focus leads to a more positive (negative) performance evaluation when the outcome exceeds (fails to meet) expectations. Prior studies have not examined whether the type of evaluation system (formula-based versus subjective) has an impact on the magnitude of the outcome effect. In a formula-based evaluation system, outcome measures are pre-weighted and an overall variance measure is easily calculated. Conversely, there are no predefined weights or overall variance measures in a subjective system. Instead, evaluators weight the importance of outcome information themselves. For this dissertation, I conducted an experiment in which 99 business professionals enrolled in an MBA program evaluated the performance of a retail store manager. Their evaluation was based on information that they received about the manager's decision, along with situational factors that may have impacted the decision outcome. The results demonstrate that although the magnitude of the outcome effect was larger when a formula-based system was employed relative to a subjective system, this difference was not statistically significant. Nonetheless, this study provides initial evidence that managers using formula-based evaluation systems should be particularly aware of the outcome effect when conducting performance appraisals. In addition, this study documents the perceived controllability of four financial and four non-financial measures that are commonly employed to evaluate performance in the retail industry. As hypothesized, the non-financial measures were perceived to be more controllable than the financial measures. This suggests that non-financial measures should be included in the mix of performance measures used in a performance appraisal system. / Ph. D.
159

Web-based Performance Benchmarking Data Collection and Preliminary Analysis for Drinking Water and Wastewater Utility

Rathor, Ankur 12 January 2013 (has links)
High-quality drinking water and wastewater systems are essential to public health, business, and quality of life in the United States. Even though the current performance of these systems is moderate, the concern is about their future performance. Planning for improvement can be done once the current performance of utilities is evaluated and areas with scope for improvement are identified. Benchmarking and performance evaluation are key components in the process of continuous improvement of a utility's performance. Benchmarking helps utilities make policy and programmatic decisions that reduce operational expenses and increase productivity by understanding areas of underperformance, understanding customer needs, developing future plans, and setting goals. This study establishes a strong case for implementing benchmarking methodologies among utilities to evaluate and improve performance. There are many initiatives on performance benchmarking of utilities, but many of them focus on only one or a few areas of performance, some use subjective indicators, and consultants typically visit the utilities in person for performance evaluation. This research focuses instead on creating a web-based benchmarking platform for performance evaluation using holistic and quantitative indicators. Practical and robust methodologies are used, and the research presents current performance comparisons among utilities for areas that impact overall utility performance. Web-based benchmarking consists of two major parts: data collection and result visualization. A major contribution from this study is the creation of an online performance benchmarking database. Over time, more data will be collected, giving utilities access to a better database for performance evaluation. Future work in this research will analyze the data and results for each participant for each set of indicators, find possible reasons for underperformance, and suggest solutions for improvement using best practices. / Master of Science
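A common quantitative step in such cross-utility comparisons is normalizing each indicator against the peer group; a minimal sketch follows, with the indicator name and data invented for illustration (not from the thesis).

```c
/* Min-max normalization of one indicator across utilities.
 * Indicator and values are hypothetical, for illustration only. */
#include <stdio.h>

static void normalize(const double *raw, double *score, int n)
{
    double lo = raw[0], hi = raw[0];
    for (int i = 1; i < n; i++) {
        if (raw[i] < lo) lo = raw[i];
        if (raw[i] > hi) hi = raw[i];
    }
    for (int i = 0; i < n; i++)              /* scale into [0, 1] */
        score[i] = (hi > lo) ? (raw[i] - lo) / (hi - lo) : 1.0;
}

int main(void)
{
    /* e.g., customer satisfaction (%) for four hypothetical utilities */
    double satisfaction[4] = { 62.0, 85.0, 71.0, 90.0 };
    double score[4];
    normalize(satisfaction, score, 4);
    for (int i = 0; i < 4; i++)
        printf("utility %d: %.2f\n", i + 1, score[i]);
    return 0;
}
```

Cost-type indicators (where lower raw values are better) would be inverted before scoring so that higher scores consistently mean better performance.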
160

Architecture-Aware Mapping and Optimization on Heterogeneous Computing Systems

Daga, Mayank 06 June 2011 (has links)
The emergence of scientific applications embedded with multiple modes of parallelism has made heterogeneous computing systems indispensable in high performance computing. The popularity of such systems is evident from the fact that three of the top five fastest supercomputers in the world employ heterogeneous computing, i.e., they use dissimilar computational units. A closer look at the performance of these supercomputers reveals that they achieve only around 50% of their theoretical peak performance. This suggests that applications tuned for erstwhile homogeneous computing may not be efficient on today's heterogeneous systems and, hence, that novel optimization strategies need to be exercised. However, optimizing an application for heterogeneous computing systems is extremely challenging, primarily due to the architectural differences between the computational units in such systems. This thesis intends to act as a cookbook for optimizing applications on heterogeneous computing systems that employ graphics processing units (GPUs) as the preferred mode of accelerator. We discuss optimization strategies for multicore CPUs as well as for the two popular GPU platforms, i.e., GPUs from AMD and NVIDIA. Optimization strategies for NVIDIA GPUs have been well studied, but when applied on AMD GPUs, they fail to measurably improve performance because of differences in the underlying architecture. To the best of our knowledge, this research is the first to propose optimization strategies for AMD GPUs. Even on NVIDIA GPUs, there exists a lesser-known but extremely severe performance pitfall called partition camping, which can degrade application performance by up to seven-fold. To facilitate the detection of this phenomenon, we have developed a performance prediction model that analyzes and characterizes the effect of partition camping in GPU applications. We have used a large-scale molecular modeling application to validate and verify all the optimization strategies. Our results illustrate that, if appropriately optimized, AMD and NVIDIA GPUs can provide 371-fold and 328-fold improvements, respectively, over a hand-tuned, SSE-optimized serial implementation. / Master of Science
