111 |
Biases in Creativity Assessment: How the Social Setting Influences Observer's Perceptions of Team and Individual Creativity / Kay, Min, January 2013
One important aspect of enhancing creativity in organizations is to measure and reward it. However, not every creative process can be immediately tied to and measured by numerical standards. In such cases, the manager's subjective impression of employee creativity may replace objective measures as the basis for decision-making. In organizations, the social setting in which the work occurs must also be thoroughly considered, because employees often work in groups on major products. As such, this paper examines two questions about how the social setting affects the observer's assessment of creativity. First, I demonstrate that observers use surface features of groups to infer the creativity of group output: they expect demographically diverse groups to be more creative than homogeneous groups, and this difference in expectation biases their evaluations. Second, I demonstrate that when observers form impressions of individual creativity from group output, they commit the fundamental attribution error in partitioning credit between the target individual and the rest of the group; this either inflates or deflates the target's perceived creativity, depending on the objective quality of the group output. Taken together, the two questions addressed in this paper emphasize the need for further research on the factors that influence observers' perceptions of creativity in organizational contexts. / Dissertation
|
112 |
Comparative Performance Study of LTE Uplink Schedulers / SALAH, Mohamed, 09 May 2011
Long Term Evolution (LTE) constitutes a significant milestone in the evolution of 3G systems towards fourth generation (4G) technologies. The performance targets promised by LTE make it an ideal solution to accommodate the ever-increasing demand for wireless broadband. These targets are made possible by improvements such as a simplified system access architecture and a fully IP-based platform. LTE also brings major enhancements to its enabling radio technologies by introducing Orthogonal Frequency Division Multiplexing (OFDM) and advanced antenna technologies. In addition, LTE's capabilities are further improved with enhanced Quality of Service (QoS) support for multiple data services, such as voice and other multimedia applications.
LTE packet scheduling plays an essential role as part of LTE's Radio Resource Management (RRM), enhancing the system's data rate and supporting the diverse QoS requirements of mobile services. The LTE packet scheduler should intelligently allocate radio resources to mobile User Equipments (UEs) so that the LTE network adheres to its performance requirements. In our work, we evaluate the performance of multiple scheduling algorithms proposed for LTE uplink transmission. The evaluation covers single and mixed traffic scenarios to expose the strengths and weaknesses of the proposed algorithms. Simulation results illustrate the importance of a scheduler's awareness of uplink channel conditions and QoS requirements in both single and mixed traffic scenarios. Accordingly, we provide recommendations for future scheduling algorithm proposals and ways to enhance the existing schedulers. / Thesis (Master, Electrical & Computer Engineering) -- Queen's University, 2011-05-07 12:43:54.983
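As a rough illustration of the channel- and QoS-aware scheduling idea discussed in this abstract, the sketch below computes a proportional-fair style uplink metric per resource block and assigns blocks greedily. It is a minimal sketch under assumed parameters (the UE fields, delay weighting, and function names are illustrative), not the thesis's algorithms or simulator, and it omits the SC-FDMA contiguity constraint a real LTE uplink scheduler must respect.

```python
# Minimal sketch (not the thesis's scheduler): channel- and QoS-aware
# uplink scheduling metric in the proportional-fair family.
from dataclasses import dataclass, field

@dataclass
class UE:
    ue_id: int
    avg_throughput: float                                  # long-term served rate (bits/s), e.g. an EWMA
    achievable_rate: dict = field(default_factory=dict)    # resource block index -> achievable bits/s this TTI
    head_of_line_delay: float = 0.0                        # seconds, for delay-sensitive bearers
    delay_budget: float = 0.1                              # seconds, QoS packet delay budget

def pf_metric(ue: UE, rb: int, qos_weighting: bool = True) -> float:
    """Proportional-fair metric, optionally scaled by delay urgency."""
    pf = ue.achievable_rate.get(rb, 0.0) / max(ue.avg_throughput, 1e-6)
    if qos_weighting:
        # Urgency grows as the head-of-line delay approaches the delay budget.
        pf *= 1.0 + ue.head_of_line_delay / ue.delay_budget
    return pf

def schedule_tti(ues: list[UE], num_rbs: int) -> dict[int, int]:
    """Greedy allocation: each resource block goes to the UE with the best metric."""
    allocation = {}
    for rb in range(num_rbs):
        best = max(ues, key=lambda ue: pf_metric(ue, rb))
        allocation[rb] = best.ue_id
    return allocation
```

The tradeoff the abstract highlights (channel awareness versus QoS awareness) appears here as the `qos_weighting` switch: with it off, the metric favours UEs with good instantaneous channels; with it on, delay-constrained traffic is prioritised as its budget runs out.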
|
113 |
Methodologies for Many-Input Feedback-Directed Optimization / Berube, Paul N. J., Unknown Date
No description available.
|
114 |
Design Evaluation and Optimization of School Buildings Using Artificial Intelligent Approaches / Alyari Tabrizi, Eilnaz, Unknown Date
No description available.
|
115 |
Intrusion and Fraud Detection using Multiple Machine Learning Algorithms / Peters, Chad, 22 August 2013
New methods of attacking networks are being invented at an alarming rate, and pure signature detection cannot keep up. The ability of intrusion detection systems to generalize to new attacks based on behavior is of increasing value. Machine learning algorithms have been successfully applied to intrusion and fraud detection; however, the time and accuracy tradeoffs between algorithms are not always considered when faced with such a broad range of choices. This thesis explores the time and accuracy metrics of a wide variety of machine learning algorithms, using a purpose-built supervised learning dataset. Topics covered include dataset dimensionality reduction through pre-processing techniques, training and testing times, classification accuracy, and performance tradeoffs. Further, ensemble learning and meta-classification are used to explore combinations of the algorithms and derived data sets, to examine the effects of homogeneous and heterogeneous aggregations. The results of this research are presented with observations and guidelines for choosing learning schemes in this domain.
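The kind of time/accuracy comparison described here can be sketched as follows with scikit-learn. This is an illustrative stand-in only: the synthetic dataset, the particular classifiers, and the PCA step are assumptions for the example, not the thesis's purpose-built dataset or exact algorithm list.

```python
# Illustrative sketch: compare training/testing time and accuracy across
# several classifiers and a heterogeneous voting ensemble, after
# dimensionality reduction, on a synthetic stand-in dataset.
import time
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import VotingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=5000, n_features=40, n_informative=15, random_state=0)
X = PCA(n_components=10, random_state=0).fit_transform(X)   # pre-processing: dimensionality reduction
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "naive_bayes": GaussianNB(),
    "knn": KNeighborsClassifier(),
    # Heterogeneous ensemble: majority vote over the three base learners.
    "voting_ensemble": VotingClassifier([("dt", DecisionTreeClassifier(random_state=0)),
                                         ("nb", GaussianNB()),
                                         ("knn", KNeighborsClassifier())]),
}

for name, model in models.items():
    t0 = time.perf_counter(); model.fit(X_tr, y_tr); train_s = time.perf_counter() - t0
    t0 = time.perf_counter(); pred = model.predict(X_te); test_s = time.perf_counter() - t0
    print(f"{name:16s} train={train_s:.3f}s test={test_s:.3f}s "
          f"accuracy={accuracy_score(y_te, pred):.3f}")
```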
|
116 |
Seismic Fragility Analysis and Loss Estimation for Concrete Structures / Bai, Jong Wha, December 2011
The main objective of this study is to develop a methodology to assess the seismic vulnerability of concrete structures and to estimate direct losses related to structural damage from future seismic events. This dissertation contains several important components, including the development of more detailed demand models to enhance the accuracy of fragility relationships and the development of a damage assessment framework that accounts for uncertainties.
This study focuses on concrete structures in the Mid-America region, where substantial seismic risk exists due to the potential for high-intensity earthquakes. The most common types of concrete structures in this area are identified from building inventory data, and reinforced concrete (RC) frame buildings and tilt-up concrete buildings are selected as case study buildings for further analysis. Using synthetic ground motion records, the structural behavior of the representative case study buildings is analyzed through nonlinear time history analyses, and their seismic performance under ground motions is evaluated. Using more detailed demand models and the corresponding capacity limits, analytical fragility curves are developed based on appropriate failure mechanisms for different structural parameters, including different RC frame building heights and different aspect ratios for tilt-up concrete structures. A probabilistic methodology is used to estimate the seismic vulnerability of the case study buildings, reflecting the uncertainties in the structural demand and capacity, in the analytical modeling, and in the information used for structural loss estimation. To estimate structural losses, a set of damage states and a corresponding probabilistic framework that maps fragility to damage states are proposed. Finally, scenario-based assessments are conducted to demonstrate the proposed methodology. Results show that the proposed methodology is effective in evaluating the seismic vulnerability of concrete structures and in quantifying the uncertainties in the loss estimation process.
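A common way to express fragility curves of the kind described above is a lognormal formulation that combines demand, capacity, and modeling dispersions. The sketch below uses that standard form with hypothetical placeholder coefficients; it is not the dissertation's calibrated demand or capacity models.

```python
# Minimal sketch of a lognormal fragility formulation:
# P[demand D exceeds capacity C | spectral acceleration Sa].
from math import erf, log, sqrt

def std_normal_cdf(x: float) -> float:
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def fragility(sa_g: float, a: float, b: float, median_capacity: float,
              beta_d: float, beta_c: float, beta_m: float) -> float:
    """P[D >= C | Sa], with a power-law median demand model S_d = a * Sa^b and
    lognormal dispersions for demand (beta_d), capacity (beta_c), and
    modeling uncertainty (beta_m)."""
    median_demand = a * sa_g ** b
    beta_total = sqrt(beta_d**2 + beta_c**2 + beta_m**2)
    return std_normal_cdf(log(median_demand / median_capacity) / beta_total)

# Example: exceedance probability of a hypothetical damage state across Sa levels.
for sa in (0.1, 0.3, 0.5, 1.0):
    p = fragility(sa, a=2.0, b=1.1, median_capacity=1.5,
                  beta_d=0.35, beta_c=0.3, beta_m=0.2)
    print(f"Sa = {sa:.1f} g -> P[exceedance] = {p:.2f}")
```

Scenario-based loss estimates then follow by weighting the damage-state probabilities obtained from such curves by the repair cost associated with each damage state.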
|
117 |
Stochastic abstraction of programs: towards performance-driven development / Smith, Michael James Andrew, January 2010
Distributed computer systems are becoming increasingly prevalent, thanks to modern technology, and this leads to significant challenges for the software developers of these systems. In particular, in order to provide a certain service level agreement with users, the performance characteristics of the system are critical. However, developers today typically consider performance only in the later stages of development, when it may be too late to make major changes to the design. In this thesis, we propose a performance-driven approach to development, based around tool support that allows developers to use performance modelling techniques while still working at the level of program code.
There are two central themes to the thesis. The first is to automatically relate performance models to program code. We define the Simple Imperative Remote Invocation Language (SIRIL), and provide a probabilistic semantics that interprets a program as a Markov chain. To make such an interpretation both computable and efficient, we develop an abstract interpretation of the semantics, from which we can derive a Performance Evaluation Process Algebra (PEPA) model of the system. This is based around abstracting the domain of variables to truncated multivariate normal measures.
The second theme of the thesis is to analyse large performance models by means of compositional abstraction. We use two abstraction techniques based on aggregation of states (abstract Markov chains and stochastic bounds) and apply both of them compositionally to PEPA models. This allows us to model check properties, expressed in the three-valued Continuous Stochastic Logic (CSL), on abstracted models. We have implemented an extension to the Eclipse plug-in for PEPA, which provides a graphical interface for specifying which states in the model to aggregate, and for performing the model checking.
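The state-aggregation idea behind abstract Markov chains can be sketched generically: given a partition of the concrete state space, record lower and upper bounds on the probability of moving from one block to another. The code below is a minimal sketch of that construction for a discrete-time chain; it is not the thesis's PEPA tooling, and the example matrix and partition are made up for illustration.

```python
# Minimal sketch: aggregate a Markov chain over a state-space partition,
# producing lower/upper bounds on block-to-block transition probabilities.
import numpy as np

def abstract_chain(P: np.ndarray, partition: list[list[int]]):
    """Return (lower, upper) where
    lower[i, j] = min over concrete states s in block i of P(s -> block j),
    upper[i, j] = max over concrete states s in block i of P(s -> block j)."""
    k = len(partition)
    lower = np.zeros((k, k))
    upper = np.zeros((k, k))
    for i, block_i in enumerate(partition):
        for j, block_j in enumerate(partition):
            mass = P[np.ix_(block_i, block_j)].sum(axis=1)  # per-state mass into block j
            lower[i, j] = mass.min()
            upper[i, j] = mass.max()
    return lower, upper

# Tiny example: a 4-state chain aggregated into blocks {0,1} and {2,3}.
P = np.array([[0.5, 0.3, 0.2, 0.0],
              [0.4, 0.4, 0.1, 0.1],
              [0.0, 0.1, 0.6, 0.3],
              [0.1, 0.0, 0.2, 0.7]])
low, up = abstract_chain(P, [[0, 1], [2, 3]])
print("lower bounds:\n", low, "\nupper bounds:\n", up)
```

When the lower and upper bounds coincide everywhere, the partition is lumpable and no precision is lost; otherwise the gap between the bounds is exactly the uncertainty that a three-valued logic such as CSL over abstract models has to account for.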
|
118 |
DEVELOPMENT OF A DESIGN BASED INTERSECTION SAFETY PERFORMANCE EVALUATION TOOL / Kirk, Adam J, 01 January 2013
The purpose of this research is to develop an intersection safety evaluation tool capable of assisting designers and planners in assessing alternative intersection designs. A conflict exposure model utilizing design hour volumes, intersection configuration, and traffic control measures is proposed to achieve this goal. This approach makes use of data typically available for preliminary intersection design. The research goes beyond existing safety performance models, which examine only non-directional average daily traffic (ADT), and beyond practices that account only for the geometric and lane configuration of an intersection, such as conflict point analysis.
Conflict prediction models are developed for left-turn angle, right-turn, rear-end, and sideswipe crashes. These models were developed through the analysis of over 1,000 simulation scenarios evaluating a full range of approach and turning volumes, lane configurations, and traffic control strategies. The quantifiable metrics provided can be used to inform and improve alternative intersection selection processes by differentiating between alternatives based on surrogate safety performance. This research may be used in screening intersection alternatives to select the most beneficial design based on objective safety performance metrics.
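To make the conflict-exposure idea concrete, the sketch below computes a simple volume-based exposure index: for each crossing or merging conflict point, exposure is taken proportional to the square root of the product of the two conflicting design-hour volumes, a common surrogate form. The movements, volumes, and conflict points are hypothetical, and this is not the thesis's calibrated conflict prediction model.

```python
# Illustrative sketch of a volume-based conflict exposure index for an
# intersection, summed over its crossing and merging conflict points.
from math import sqrt

# Design hour volumes (veh/h) for each movement -- hypothetical values.
design_hour_volumes = {"NB_through": 600, "SB_left": 150, "EB_through": 450, "WB_right": 120}

# Each conflict point pairs two conflicting movements with a conflict type.
conflict_points = [
    ("SB_left", "NB_through", "crossing"),    # left turn across opposing through
    ("WB_right", "NB_through", "merging"),    # right turn merging into through
    ("EB_through", "NB_through", "crossing"), # cross-street through movements
]

def exposure(v1: float, v2: float) -> float:
    """Square-root-of-product form, a common surrogate exposure measure."""
    return sqrt(v1 * v2)

total = 0.0
for a, b, ctype in conflict_points:
    e = exposure(design_hour_volumes[a], design_hour_volumes[b])
    total += e
    print(f"{a} x {b} ({ctype}): exposure = {e:.1f}")
print(f"total intersection exposure index = {total:.1f}")
```

Comparing this index across alternative lane configurations and control strategies is the kind of screening the tool is meant to support, before detailed crash-type models are applied.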
|
120 |
A comparative evaluation of Web server systems: taxonomy and performance / Ganeshan, Manikandaprabhu, 29 March 2006
The Internet is an essential resource to an ever-increasing number of businesses and home users. Internet access is increasing dramatically, and hence the need for efficient and effective Web server systems is on the rise. These systems are information engines that are accessed through the Internet by a rapidly growing client base. They are expected to provide good performance and high availability to the end user, and to be resilient to failures at both the hardware and software levels. These characteristics make them suitable for servicing the present and future information demands of end consumers.
In recent years, researchers have concentrated on taxonomies of scalable Web server system architectures, and on routing and dispatching algorithms for request distribution. However, they have not focused on the classification of commercial products and prototypes, which would be of use to business professionals and software architects. Such a classification would help in selecting appropriate products from the market, based on product characteristics, and in designing new products with different combinations of server architectures and dispatching algorithms.
Currently, dispatching algorithms are classified as content-blind, content-aware, and Domain Name Server (DNS) scheduling. These classifications are extended and organized under one tree structure in this thesis. With the help of this extension, the thesis develops a unified product-based taxonomy that identifies product capabilities by relating them to a classification of scalable Web server systems and to the extended taxonomy of dispatching algorithms. As part of a detailed analysis of Web server systems, generic queuing models consisting of a dispatcher unit and a Web server unit are built. Performance metrics such as throughput, server performance, mean queue size, mean waiting time, mean service time, and mean response time are measured for these generic queuing models. Finally, the correctness of the generic queuing models is evaluated through theoretical and simulation analysis.
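The performance metrics listed above have simple closed forms for the textbook M/M/1 queue, which is a common starting point for dispatcher-plus-server models. The sketch below computes them for a dispatcher that splits arrivals evenly across identical server queues; the arrival and service rates are hypothetical, and the thesis's generic models may differ in structure.

```python
# Minimal sketch: steady-state M/M/1 metrics for each Web server queue
# behind a dispatcher that splits traffic evenly.
def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    """Standard M/M/1 formulas; requires arrival_rate < service_rate."""
    rho = arrival_rate / service_rate           # utilization
    if rho >= 1.0:
        raise ValueError("unstable queue: utilization >= 1")
    L = rho / (1.0 - rho)                       # mean number in system
    Lq = rho * rho / (1.0 - rho)                # mean queue length
    W = 1.0 / (service_rate - arrival_rate)     # mean response time
    Wq = rho / (service_rate - arrival_rate)    # mean waiting time in queue
    return {"throughput": arrival_rate, "utilization": rho,
            "mean_in_system": L, "mean_queue_size": Lq,
            "mean_response_time": W, "mean_waiting_time": Wq,
            "mean_service_time": 1.0 / service_rate}

# Example: 180 req/s split evenly across 3 servers, each serving 80 req/s.
total_arrivals, servers, mu = 180.0, 3, 80.0
for name, value in mm1_metrics(total_arrivals / servers, mu).items():
    print(f"{name}: {value:.4f}")
```

Simulation results for the same configuration can then be checked against these analytical values, which is essentially the theoretical-versus-simulation correctness comparison the abstract describes.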
|