  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
501

Numerical Simulations Of Reinforced Concrete Frames Tested Using Pseudo-dynamic Method

Mutlu, Mehmet Basar 01 July 2012 (has links) (PDF)
Considering the deficiencies frequently observed in existing reinforced concrete buildings, detailed assessment and rehabilitation must be conducted to avoid significant loss of life and property in seismic zones. To this end, performance-based evaluation methods suggested in regulations and codes must be examined and revised through experimental and analytical research to provide safe and economical rehabilitation solutions. In this study, the seismic behavior of three reinforced concrete frames built and tested in the Middle East Technical University Structural Mechanics Laboratory is examined. The specimens represent a typical interior frame of a 3-story, 3-bay reinforced concrete structure. One specimen has a design compliant with the Turkish Earthquake Code (2007), while each of the other two represents a different type of deficiency in material strength and detailing. The test specimens were modeled using different modeling approaches, and nonlinear dynamic analyses were conducted on the numerical models. Results of continuous pseudo-dynamic testing under three ground motions are presented and compared with the numerical simulations. The calibrated finite element models were used to evaluate the performance assessment procedure of the Turkish Earthquake Code (2007) and to further investigate local deformation components in light of the experimental findings and observations. Deformation sources in columns and joints were studied in terms of their interaction and their contributions to the total drift. Estimated plastic hinge lengths of columns were compared with experimental observations and with expressions proposed in the literature.
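The last comparison above involves empirical plastic hinge length expressions from the literature. As a minimal sketch, one widely cited expression is the Paulay and Priestley (1992) formula; the abstract does not name which expressions the thesis actually compares against, so this particular choice is an assumption:

```python
def plastic_hinge_length_mm(L_mm, fy_mpa, db_mm):
    """Paulay & Priestley (1992) empirical plastic hinge length:
        Lp = 0.08*L + 0.022*fy*db
    with L the distance from the critical section to the point of
    contraflexure (mm), fy the longitudinal bar yield strength (MPa),
    and db the bar diameter (mm)."""
    return 0.08 * L_mm + 0.022 * fy_mpa * db_mm

# e.g. a 1500 mm shear span with 20 mm bars of 420 MPa steel
lp = plastic_hinge_length_mm(1500, 420, 20)  # 120 + 184.8 = 304.8 mm
```

Comparing such an estimate against hinge lengths observed in tests is exactly the kind of check the abstract describes.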
502

Experiments with the Pentium performance monitoring counters

Agarwal, Gunjan 06 1900 (has links)
Performance monitoring counters are implemented in most recent microprocessors. In this thesis, we describe various performance measurement experiments, for both a single program and a whole system, conducted on a Linux operating system using the Pentium performance counters; our measurements were carried out on a Pentium II microprocessor. The Pentium II performance counters can be configured to count events such as cache misses, TLB misses, and instructions executed. We used a low-overhead, minimally intrusive technique to access these counters, and used them to measure the cache miss overhead caused by context switches in a Linux system. Our methodology involves sampling the hardware counters every 50 ms, with the sampling set up using signals tied to interval timers. We describe an analytical cache performance model for multiprogrammed conditions from the literature and validate it using the performance monitoring counters. We next explore the long-term performance of a system under different workload conditions: various performance monitoring events - data cache misses, data TLB misses, data cache reads or writes, branches, etc. - are monitored over a 24-hour period, which is useful in identifying activities that degrade system performance. Timer interrupts were used to sample the performance counters. Finally, we develop a profiling methodology that characterizes the performance of the different functions of a program not only by execution time but also by data cache misses. Available tools such as prof on Unix can pinpoint the regions where a program loses performance, but they rely mainly on execution-time profiles and give no insight into the program's cache behavior. Our methodology therefore profiles each function of the program both by its execution time and by its cache behavior.
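The timer-driven sampling described above can be sketched with a POSIX interval timer and SIGALRM. Reading a real Pentium counter requires kernel support, so this sketch samples an arbitrary software counter through a callable instead (an assumption made for illustration):

```python
import signal
import time

class CounterSampler:
    """Sample a counter at a fixed period via a POSIX interval timer,
    in the spirit of the thesis's timer-driven sampling. `read_counter`
    is any callable returning the current count; a real hardware
    counter read would replace it."""

    def __init__(self, read_counter, period_s=0.05):  # 50 ms period
        self.read_counter = read_counter
        self.period_s = period_s
        self.samples = []

    def _on_alarm(self, signum, frame):
        # Record (timestamp, counter value) on every timer tick.
        self.samples.append((time.monotonic(), self.read_counter()))

    def start(self):
        signal.signal(signal.SIGALRM, self._on_alarm)
        signal.setitimer(signal.ITIMER_REAL, self.period_s, self.period_s)

    def stop(self):
        signal.setitimer(signal.ITIMER_REAL, 0)
        signal.signal(signal.SIGALRM, signal.SIG_DFL)
```

A sampler for, say, context-switch counts would pass a callable that reads the relevant /proc statistic; this requires a POSIX platform (SIGALRM is unavailable on Windows).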
503

Socially Responsible Investments: Are investors paying a price for investing ethically?

Arvidsson, Ulrica, Ljungbergh, Ebba January 2015 (has links)
The aim of this study is to evaluate the differences in performance and management fees between ethical and conventional mutual funds registered in Sweden. Our dataset consists of 49 ethical and 254 conventional funds, estimated over the 10-year period from January 2005 to January 2015. Jensen's alpha is used as the measure of risk-adjusted performance and is estimated through the CAPM single-index model as well as Carhart's four-factor model. By adding the management fees back to the net returns and then estimating Jensen's alpha with Carhart's four-factor model once again, we test whether fees affect the returns of ethical and conventional funds differently. The results show no difference in either risk-adjusted returns or management fees between ethical and conventional funds. We conclude that Swedish mutual fund investors do not pay a price, in terms of reduced returns or higher management fees, for putting social and ethical values into their financial investment decisions.
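Estimating Jensen's alpha through a factor model reduces to an OLS regression in which the intercept is the alpha. A minimal sketch for the Carhart four-factor case (the factor data themselves, e.g. MKT, SMB, HML, MOM series, are assumed to be supplied):

```python
import numpy as np

def carhart_alpha(excess_returns, factors):
    """Jensen's alpha from the Carhart four-factor model,
        r_t = alpha + b . f_t + e_t,
    estimated by OLS. `excess_returns` is a length-T array of fund
    returns in excess of the risk-free rate; `factors` is a (T, 4)
    array of MKT, SMB, HML and MOM factor returns.
    Returns (alpha, betas)."""
    T = len(excess_returns)
    X = np.column_stack([np.ones(T), factors])  # intercept column = alpha
    coef, *_ = np.linalg.lstsq(X, excess_returns, rcond=None)
    return coef[0], coef[1:]
```

Running this once on net returns and once on returns with fees added back, as the study does, isolates the fee effect in the two alphas.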
504

Improving the Energy Efficiency of IEEE 802.3az EEE and Periodically Paused Switched Ethernet

Mostowfi, Mehrgan 02 July 2010 (has links)
It is estimated that networked devices in the U.S. consumed about 150 TWh of electricity in 2006, at a cost of around $15 billion, contributing about 225 billion lbs of CO2 to greenhouse gas emissions. About 13.5% of this energy is consumed by network equipment such as switches and routers. This thesis addresses the energy consumption of Ethernet, and designs and evaluates improvements to existing methods for reducing the energy consumption of Ethernet links and switches. Energy Efficient Ethernet (EEE) is an emerging IEEE 802.3 standard that allows Ethernet links to sleep when idle. In this thesis, a performance evaluation of EEE is completed, independently replicating previous work by Reviriego et al. The evaluation shows that EEE overhead results in less energy savings than expected. A new method based on packet coalescing is developed and evaluated to improve the energy efficiency of EEE: packets are sent in bursts so that EEE overhead is minimized. The results show that EEE with packet coalescing for 10 Gb/s Ethernet can come very close to ideal (energy-proportional) performance at the expense of an insignificant added per-packet delay. Periodically Paused Switched Ethernet (PPSE) was previously proposed and prototyped by Blanquicet and Christensen in 2008. PPSE uses periodically sent notification packets to halt packet transmission into a LAN switch, allowing the switch to sleep periodically. In this thesis, a first performance evaluation of PPSE is completed. It shows that PPSE for 10 Gb/s Ethernet LAN switches achieves either significant energy savings at the expense of excessive packet delay, or less-than-expected savings with an added per-packet delay below human response time. An improvement, Adaptive PPSE, is proposed and developed based on an adaptive policy.
The adaptive policy considers past traffic load to determine whether or not to put the switch to sleep. The evaluation shows that Adaptive PPSE can come very close to ideal performance at the expense of an added average per-packet delay of less than half the human response time.
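The benefit of coalescing can be seen in a back-of-the-envelope model: each burst pays one wake and one sleep transition, so batching packets amortizes that overhead. This is only an analytic sketch, not the thesis's simulation; the defaults assume 1500-byte frames at 10 Gb/s and the IEEE 802.3az 10GBASE-T transition times:

```python
def eee_active_fraction(load, n_coalesce, t_pkt=1.2e-6,
                        t_wake=4.48e-6, t_sleep=2.88e-6):
    """Fraction of time an EEE link stays out of low-power idle when
    packets arrive at utilization `load` and are coalesced into bursts
    of n_coalesce packets. Each burst pays one wake (t_wake) and one
    sleep (t_sleep) transition; an energy-proportional link would be
    active exactly `load` of the time."""
    overhead_per_pkt = (t_wake + t_sleep) / n_coalesce
    return min(1.0, load * (1.0 + overhead_per_pkt / t_pkt))
```

Under these assumed numbers, a 10% load keeps the uncoalesced link active roughly 70% of the time, while bursts of 100 packets bring it close to the ideal 10%, mirroring the trend the evaluation reports.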
505

Some active queue management methods for controlling packet queueing delay : design and performance evaluation of some new versions of active queue management schemes for controlling packet queueing delay in a buffer to satisfy quality of service requirements for real-time multimedia applications

Mohamed, Mahmud H. Etbega January 2009 (has links)
Traditionally, the Internet was used for applications such as FTP, e-mail and Web traffic. In recent years, however, it has increasingly supported emerging applications such as IP telephony, video conferencing and online games. These new applications have different throughput and delay requirements than traditional ones: interactive multimedia applications, in particular, have stricter delay constraints and less strict loss constraints. Unfortunately, the current Internet offers only a best-effort service to all applications, without any consideration of application-specific requirements. In this thesis, three existing Active Queue Management (AQM) mechanisms are modified by incorporating a control function that conditions routers for better Quality of Service (QoS). Specifically, delay is considered the key QoS metric, as it is the most important one for real-time multimedia applications. The first modified mechanism is Drop Tail (DT), a simple mechanism in comparison with most AQM schemes. A dynamic threshold is added to DT in order to maintain packet queueing delay at a specified value; the modified mechanism is referred to as Adaptive Drop Tail (ADT). The second mechanism considered is Early Random Drop (ERD); in a similar way to ADT, a dynamic threshold is used to keep the delay at a required value, the main difference being that packets are now dropped probabilistically before the queue reaches full capacity. This mechanism is referred to as Adaptive Early Random Drop (AERD). The final mechanism is motivated by the well-known Random Early Detection AQM mechanism and is effectively a multi-threshold version of AERD in which packets are dropped with a linear probability function between two thresholds, the second of which is movable in order to change the slope of the dropping function.
This mechanism is called Multi Threshold Adaptive Early Random Drop (MTAERD) and is used, like the other mechanisms, to maintain delay around a specified level. The main focus of all the mechanisms is queueing delay, a significant component of end-to-end delay, along with reducing jitter (delay variation). A control algorithm is developed using an analytical model that expresses the delay as a function of the queue threshold position; this function is used in simulation to adjust the threshold to a value that maintains the delay around the specified level as the packet arrival rate changes over time. A two-state Markov Modulated Poisson Process is used as the arrival process to each of the three systems, to introduce burstiness and correlation in the packet inter-arrival times and to present sudden changes in the arrival process, as might be encountered when TCP is used as the transport protocol and step-changes the size of its congestion window. The investigations assume that the traffic source is a mixture of TCP and UDP traffic, that the mechanisms considered apply to the TCP-based data, and that this data constitutes the majority of the total traffic, so that the control mechanisms have a significant effect on the overall delay. The three mechanisms are evaluated using a Java framework, and results are presented showing the improvement in QoS that each achieves over its non-adaptive counterpart. The mechanisms are also compared with each other and conclusions drawn.
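The ADT idea of a delay-driven dynamic threshold can be sketched compactly. The thesis derives its threshold from an analytical queueing model; this sketch instead uses the simpler estimate queue_length / service_rate for the drain delay, which is an assumption made for illustration:

```python
from collections import deque

class AdaptiveDropTail:
    """Drop Tail with a dynamic threshold chosen so that the time to
    drain a full queue stays near a target queueing delay. Sketch of
    the ADT concept only, not the thesis's control algorithm."""

    def __init__(self, target_delay_s, service_rate_pps):
        self.target = target_delay_s
        self.mu = service_rate_pps  # update as measured rate changes
        self.queue = deque()

    @property
    def threshold(self):
        # Largest queue length whose drain time stays within target.
        return max(1, int(self.target * self.mu))

    def enqueue(self, pkt):
        if len(self.queue) >= self.threshold:
            return False  # tail drop keeps delay bounded
        self.queue.append(pkt)
        return True

    def dequeue(self):
        return self.queue.popleft() if self.queue else None
```

When the measured service rate changes, updating `mu` moves the threshold, which is the adaptive behavior the chapter describes; AERD and MTAERD replace the hard drop with probabilistic drops below the threshold.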
506

Micro-Network Processor: A Processor Architecture for Implementing NoC Routers

Martin Rovira, Julia, Manuel Fructoso Melero, Francisco January 2007 (has links)
Routers are probably the most important component of a NoC, as the performance of the whole network is driven by the routers' performance. The area cost of the whole network is also minimised if the router design is kept small. This master's thesis proposes a new application-specific processor architecture for implementing NoC routers, called µNP (Micro-Network Processor). The aim is to offer a solution that trades off between the high performance of routers implemented in hardware and the high flexibility achievable by running packet-routing software on a general-purpose processor (GPP). To that end, a study including the design of a hardware-based router and a GPP-based router was conducted. In this project the first version of the µNP was designed, and a complete instruction set, along with some sample programs, is also proposed. The results show that, in the best case across all implementation options, the µNP was 7.5 times slower than the hardware-based router, yet more than 100 times faster than the GPP-based router, while keeping almost the same degree of flexibility for routing within the NoC.
507

Evaluating the performance of TEWA systems

Johansson, Fredrik January 2010 (has links)
In military engagements, it is the task of air defense to protect valuable assets such as air bases from destruction by hostile aircraft and missiles. To fulfill this mission, defenders are equipped with sensors and firing units. Inferring whether a target is hostile and threatening is far from trivial; it is handled in a threat evaluation process in which targets are ranked by their estimated level of threat to the defended assets. Once the degree of threat has been estimated, the problem of weapon allocation arises: given a number of identified threatening targets, the defenders must decide whether any firing units should be allocated to the targets, and if so, which firing unit should engage which target. To complicate matters, the outcomes of such engagements are usually stochastic, and there are often tight constraints on how fast the threat evaluation and weapon allocation processes must execute. A large number of threat evaluation and weapon allocation (TEWA) systems are already in use today, i.e. decision support systems aiding military decision makers with the threat evaluation and weapon allocation processes. Despite the critical role of such systems, however, it is not clear how to evaluate the performance of the systems and their algorithms. The work in this thesis therefore focuses on the development and evaluation of TEWA systems, and of the threat evaluation and weapon allocation algorithms within them. A number of algorithms for threat evaluation and static weapon allocation are suggested and implemented, and testbeds are developed to facilitate their evaluation.
Experimental results show that particle swarm optimization is suitable for real-time target-based weapon allocation in situations involving up to approximately ten targets and ten firing units, while for larger problem sizes better results are obtained with an enhanced greedy maximum marginal return algorithm, or with a genetic algorithm seeded with the solution returned by the greedy algorithm. / Fredrik Johansson also does research at the University of Skövde, Informatics Research Centre
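The greedy maximum marginal return baseline mentioned above can be sketched under the standard target-based formulation (the thesis evaluates an enhanced variant; the plain version is shown here, with illustrative values and kill probabilities):

```python
def greedy_mmr(target_values, pk):
    """Greedy maximum marginal return for static target-based weapon
    allocation. pk[i][j] is weapon i's kill probability against target
    j; the expected surviving value of target j is its value times
    prod(1 - pk[i][j]) over the weapons assigned to it. Each step
    assigns the weapon/target pair giving the largest reduction in
    expected surviving value. Returns (assignment, surviving values)."""
    surviving = list(target_values)
    unused = set(range(len(pk)))
    assignment = {}
    while unused:
        i, j = max(((i, j) for i in unused for j in range(len(surviving))),
                   key=lambda p: surviving[p[1]] * pk[p[0]][p[1]])
        assignment[i] = j
        surviving[j] *= 1.0 - pk[i][j]
        unused.remove(i)
    return assignment, surviving
```

The inner `max` over all weapon/target pairs is what makes greedy MMR quadratic per step, which is why enhanced variants and seeded metaheuristics pay off at larger problem sizes.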
508

Improving end to end delivery of land administration business processes through performance measurement and comparison.

Chimhamhiwa, Dorman. January 2010 (has links)
The delivery of land administration (LA) systems, particularly in urban areas, underpins housing, industry and infrastructure development as well as the smooth operation of land and credit markets. However, fragmentation of LA activities across several autonomous organizations generally impairs end-to-end business process flow and delivery. To facilitate improved service from LA systems, we suggest end-to-end measurement and monitoring of their business processes across organizational boundaries. This study proposes a performance measurement system that can facilitate end-to-end measurement and comparison of cross-organizational business processes (CBPs) in LA. The research, structured in two parts, is based on a multi-site study of LA CBPs in six urban municipalities across Namibia, Zimbabwe and South Africa. First, a measurement instrument (scorecard) is presented, built on six key CBP performance measurement areas: quality and technological innovation (enablers of results), cost and time (measures of results), and customer satisfaction and society (measures of external success, or impact). To facilitate measurement across organizational boundaries, the proposed dimensions were embedded in a multi-level structural model linking process activities to sub-processes and CBPs. For five of the six municipalities, a conventional case of subdivision of privately owned land within an established township was used to develop CBP descriptions and process models for each municipality. CBP and sub-process similarities between municipalities were then compared using the similarity scenario degree. Our results showed similarities of over 60% for most CBPs, while mixed values were obtained for sub-processes. The similarity results were further used as a base for constructing a business process reference model. The second part of the research tested the applicability of the quality and time dimensions.
Using the survey examination and approval and the deeds examination and approval sub-processes, the quality of submitted work was measured with performance indicators of process yield and rejection rate at two survey examination and three deeds registration sites. Our results showed that 80% and 60% of survey records submitted at the two survey examination sites were rejected and returned for correction due to quality deficits. Based on these results, we conducted a root cause analysis at one of the survey examination sites to identify the major contributors to low process yield, and suggested several technological innovations to improve quality. Using the same sites, we then measured and compared cycle times for cadastral survey examination and approval, taking quality into account. Our results showed that 70% and 52% of survey records of good quality had approval times of 20 days or less at the first and second sites, respectively, while only 32% and 18% of records of poor quality (at the same sites) were approved within 60 days. Furthermore, shorter cycle times appeared to indicate lower process costs. After the separate analysis of the quality and time measurements, a global performance index aggregating the individual measures into a composite value was presented. Overall, the study has shown the potential of end-to-end CBP performance measurement to improve the delivery and service of land administration in a holistic manner. The results are important for initiatives directed at the integration and improvement of land administration operations. / Thesis (Ph.D.)-University of KwaZulu-Natal, Pietermaritzburg, 2010.
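The quality indicators and the composite index in this part reduce to simple ratios plus a weighted aggregation. A minimal sketch follows; the weights and the equal-weight aggregation rule are illustrative assumptions, since the abstract does not specify the thesis's scheme:

```python
def process_yield(accepted, submitted):
    """Share of submitted records passing examination first time."""
    return accepted / submitted

def rejection_rate(accepted, submitted):
    """Share of submitted records rejected and returned for correction."""
    return 1.0 - process_yield(accepted, submitted)

def global_performance_index(scores, weights):
    """Aggregate per-dimension scores in [0, 1] (e.g. quality, time,
    cost, innovation, customer, society) into one composite value via
    a normalized weighted sum. Weights are assumed, not the thesis's."""
    total = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total
```

For the first survey examination site, an 80% rejection rate corresponds to a process yield of only 0.2, which is the figure the root cause analysis set out to explain.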
509

Évolution des performances scolaires lors de la transition secondaire-Cégep : influences respectives des évaluations PISA, des performances scolaires antérieures et des contextes de vie

Cortès, Pierre-Yves 10 1900 (has links)
Dans le domaine des évaluations des performances des élèves en fin de scolarité obligatoire, et à côté des traditionnelles évaluations scolaires, s’est développé le Programme International pour le Suivi des Acquis (PISA), harmonisé par l’Organisation de Coopération et de Développements Économiques (l’OCDE). Ce programme a atteint une grande notoriété internationale et tente de s’imposer comme programme qui évalue les compétences des élèves. Ce mémoire explore dans quelle mesure les évaluations PISA permettent de prédire les performances scolaires des élèves lors de la transition de la fin des études secondaires vers les études collégiales au Québec. Nous avons construit une variable mesurant l’évolution des performances scolaires entre le secondaire et le Cégep. Nos résultats tendent à confirmer que les évaluations PISA sont en mesure de prédire en partie la continuité des bonnes performances scolaires après contrôle des variables contextuelles des parcours de vie. Cependant, les évaluations scolaires antérieures expliquent mieux cette continuité des bonnes performances scolaires réalisées en première année de postsecondaire que les évaluations PISA. Néanmoins, toujours après contrôle des variables contextuelles, les évaluations scolaires antérieures ne sont pas capables de prédire la différence entre des performances scolaires faibles et fortes lors de la transition secondaire-collégial. Seules les évaluations PISA conservent une faible part pour expliquer ces différences. / Alongside traditional academic assessments of students reaching the end of compulsory schooling, the Organisation for Economic Co-operation and Development (OECD) has developed the Programme for International Student Assessment (PISA). This program has achieved international renown and is intended to become a key program for assessing students' competencies.
The research presented here explores to what degree PISA assessments can predict the academic performance of students moving from high school into postsecondary studies in the province of Quebec. We constructed a variable measuring the change in academic performance between high school and Cégep. Our results tend to confirm that PISA assessments can partly predict the continuation of good grades once contextual life-course variables are controlled for. However, earlier school grades explain this continuation of good performance in the first year of postsecondary studies better than PISA assessments do. Nevertheless, again after controlling for contextual variables, earlier school grades cannot predict the difference between weak and strong academic performance during the secondary-to-Cégep transition; only the PISA assessments retain a small share of explanatory power for these differences.
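The comparison the study runs — how much predictive power PISA scores add over and above control variables — can be sketched as an incremental R² computation. This is a generic illustration on synthetic data, not the study's actual models or dataset:

```python
import numpy as np

def incremental_r2(y, controls, predictor):
    """R-squared gained by adding `predictor` (e.g. a PISA score) to
    an OLS model that already contains the control variables - the
    kind of hierarchical comparison the study reports. Arrays are
    observations-first: y (n,), controls (n, k), predictor (n,)."""
    def r2(X):
        X = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1.0 - resid.var() / y.var()
    base = r2(controls)
    full = r2(np.column_stack([controls, predictor]))
    return full - base
```

A near-zero gain means the predictor adds nothing once the controls are in, which is the pattern reported for prior grades when predicting weak-versus-strong transitions.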
510

Multidimensional approaches to performance evaluation of competing forecasting models

Xu, Bing January 2009 (has links)
The purpose of my research is to contribute to the field of forecasting from a methodological perspective, and to the field of crude oil as an application area in which to test the performance of these methodological contributions and assess their merits. Two main methodological contributions are presented. The first is a mathematical-programming-based approach, commonly referred to as Data Envelopment Analysis (DEA), as a multidimensional framework for the relative performance evaluation of competing forecasting models or methods. As opposed to other performance measurement and evaluation frameworks, DEA allows one to identify the weaknesses of each model relative to the best one(s), and suggests ways to improve their overall performance. DEA is a generic framework, and implementing it for a specific relative performance evaluation exercise requires a number of decisions: the choice of the units to be assessed, the choice of the relevant inputs and outputs, and the choice of the appropriate models. In order to show how one might adapt this framework to measure and evaluate the relative performance of competing forecasting models, we first survey and classify the literature on performance criteria and their measures - including statistical tests - commonly used in evaluating and selecting forecasting models or methods; this classification serves as a basis for the operationalisation of DEA. Finally, we test DEA's performance in evaluating and selecting models to forecast crude oil prices. The second contribution is a Multi-Criteria Decision Analysis (MCDA) based approach as a multidimensional framework for the relative performance evaluation of competing forecasting models or methods.
In order to show how one might adapt this framework, we first revisit MCDA methodology, propose a revised methodological framework consisting of a sequential decision-making process with feedback adjustment mechanisms, and provide guidelines for operationalising it. Finally, we adapt this framework to the problem of performance evaluation of competing forecasting models, again choosing the forecasting of crude oil prices as the application area for illustration.
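The DEA approach can be sketched in its classic input-oriented CCR multiplier form, solved as one linear program per unit. Casting forecasting models as units with, e.g., error measures as inputs and accuracy measures as outputs is an illustrative choice of operationalisation, not necessarily the thesis's; the sketch uses scipy's LP solver:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiencies(X, Y):
    """Input-oriented CCR DEA scores via the multiplier form: for each
    unit o, maximise u . y_o subject to v . x_o = 1 and
    u . y_j - v . x_j <= 0 for every unit j, with u, v >= 0.
    X: (n, m) inputs, Y: (n, s) outputs; returns n scores in (0, 1],
    where 1 means the unit lies on the efficient frontier."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables are [u (s outputs), v (m inputs)].
        c = np.concatenate([-Y[o], np.zeros(m)])           # max u . y_o
        A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]  # v . x_o = 1
        A_ub = np.hstack([Y, -X])                          # u . y_j <= v . x_j
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
        scores.append(-res.fun)
    return scores
```

Because each unit gets to pick its most favorable weights, a score below 1 means the model is dominated even under its own best-case weighting, which is the diagnostic property the thesis exploits for model selection.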
