41

Analysis of Inductor-Coupled Zero-Voltage-Transition Converters

Choi, Jae-Young 06 August 2001 (has links)
As is the case for DC-DC converters, multi-phase converters require both high-quality power control and high power density. A higher switching frequency improves the quality of the converter output and decreases the size of the converter, but it also increases switching losses and electromagnetic interference (EMI) noise. Because soft-switching topologies reduce the switching losses of the converter main switches, they make converters partially independent of the switching frequency. However, conventional soft-switching topologies have already explored most of the possible ways to improve converter performance, and trends in newly developed power devices reduce the advantages of soft-switching. This situation motivates the research question: what features of soft-switching topologies facilitate their practical application? Given this motivation, the dissertation addresses two aspects: simplifying auxiliary circuits and accounting for the effects of soft-switching operations on converter control. Engineers working with medium- and high-power multi-phase converters require simplified soft-switching topologies that offer the same level of performance as the conventional ones. This demand is the impetus behind the first research objective: simplifying the auxiliary circuits of Zero-Voltage-Transition (ZVT) inverters. Simplification yields both fewer and lower-cost auxiliary components without any negative impact on performance. This dissertation proposes two major concepts for the simplification: the Single-Switch Single-Leg (S3L) ZVT cell and the Phase-Lock (PL) concept. The S3L ZVT cell was developed through an effort to eliminate the circulating currents of inductor-coupled (IC) ZVT converters. The proposed cell allows a single auxiliary switch to achieve zero-voltage conditions for both the top and bottom main switches while matching the performance of the conventional ZVT cell, which makes IC ZVT topologies more attractive for multi-phase converter applications. Because all of the top main switches generally share identical sequences for zero-voltage turn-on commutations, one auxiliary switch can handle the commutations of all of the top main switches. This possibility leads to the PL concept, which allows two auxiliary switches to provide a zero-voltage condition for any main-switch commutation. To compensate for the restrictions of this concept, a modified space-vector modulation (SVM) scheme is also introduced. A soft-switching topology changes the duty ratios of the converter, which affects its controllability; this dissertation therefore selects the resolution of this issue as the second research objective. The dissertation derives generalized timing equations for ZVT operation, which formalize the effect of ZVT operation on both the duty ratios and the DC current. The effect of SVM schemes is also investigated. An average model of the ZVT converter is developed using both the timing analysis and the investigation of SVM schemes, and small-signal analysis of the average model predicts the steady-state characteristics of the converter. / Ph. D.
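The abstract does not reproduce the dissertation's generalized ZVT timing equations, so the following is only a minimal, hypothetical sketch of the general idea that an auxiliary commutation interval consumes part of each switching period and shifts the effective duty ratio seen by an averaged model. The function names, the buck-leg relation v_avg = d_eff · V_dc, and all numerical values are assumptions for illustration, not the dissertation's model.

```python
# Illustrative sketch only: the dissertation's generalized ZVT timing equations
# are not given in the abstract. This models the generic idea that an auxiliary
# commutation interval t_zvt consumes part of each switching period, shifting
# the effective duty ratio seen by an averaged (buck-leg) model.
# All names and values below are assumptions.

def effective_duty(d_cmd, t_zvt, f_sw):
    """Commanded duty ratio d_cmd reduced by the ZVT interval t_zvt [s]
    at switching frequency f_sw [Hz]."""
    return max(0.0, d_cmd - t_zvt * f_sw)

def averaged_output(v_dc, d_cmd, t_zvt, f_sw):
    """Cycle-averaged leg output voltage under the assumed duty-ratio correction."""
    return v_dc * effective_duty(d_cmd, t_zvt, f_sw)

if __name__ == "__main__":
    v_dc, f_sw = 400.0, 20e3            # assumed DC link voltage and switching frequency
    for t_zvt in (0.0, 1e-6, 2e-6):     # assumed ZVT commutation intervals
        print(f"t_zvt={t_zvt:.1e} s -> v_avg={averaged_output(v_dc, 0.5, t_zvt, f_sw):.1f} V")
```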
42

Lönar det sig att följa strömmen? : Påverkan av konformitet på kunskapstest / Does it pay to follow the crowd? : The impact of conformity on knowledge tests

Stanzl, Ulrika January 2016 (has links)
Much research has been done on conformity and its negative consequences. This experimental study was a partial replication of M. Rosander and O. Eriksson's (2012) study of conformity on the internet, but with a focus on how conformity can improve results. The overall aim was to examine whether it paid off to do as others do, the relationship between conformity and confidence, and the occurrence of overconfidence. Two questionnaire studies in the form of knowledge tests were conducted with 38 and 92 students, respectively. The participants were divided into a conformity group, which received information about previous participants' answers, and a control group, which did not receive this information. The results showed that conformity led to more correct answers and that there was a positive relationship between confidence and the number of correct answers. No overconfidence could be demonstrated; instead, participants underestimated their number of correct answers. The study's conclusion was that following others paid off most on moderately difficult tasks. On very difficult tasks, there was no advantage in following the crowd.
43

Performance Limits of Communication with Energy Harvesting

Znaidi, Mohamed Ridha 04 1900 (has links)
In energy harvesting communications, the transmitters have to adapt transmission to the availability of the energy harvested during communication. The performance of the transmission depends on the channel conditions, which vary randomly due to mobility and environmental changes. In this work, we consider the problem of power allocation, taking into account the energy arrivals over time and the quality of the channel state information (CSI) available at the transmitter, in order to maximize the throughput. Unlike previous work, the CSI at the transmitter is not perfect and may include estimation errors. We solve this problem subject to the energy harvesting constraints. Assuming perfect knowledge of the CSI at the receiver, we determine the optimal power policy for different models of the energy arrival process (offline and online models). Specifically, we obtain the power allocation scheme when the transmitter has either perfect CSI or no CSI. We also investigate the case of fading channels with imperfect CSI, which is of particular interest. Moreover, a study of the asymptotic behavior of the communication system is proposed: we analyze the average throughput in a system where the average recharge rate goes asymptotically to zero and where it is very high.
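To make the offline, perfect-CSI version of this power-allocation problem concrete, the sketch below maximizes the sum of log2(1 + h_t p_t) subject to the energy-causality constraints (cumulative power used up to slot t cannot exceed cumulative energy harvested up to slot t) using a generic numerical solver. The channel gains, energy arrivals, and solver choice are illustrative assumptions; the dissertation's imperfect-CSI and online formulations are not reproduced here.

```python
# Hedged sketch: offline throughput maximization with perfect CSI, i.e.
# maximize sum_t log2(1 + h_t * p_t) subject to the energy-causality
# constraints sum_{i<=t} p_i <= sum_{i<=t} E_i and p_t >= 0, solved
# numerically. h, E, and the slot count are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

def offline_power_allocation(h, E):
    T = len(h)
    neg_throughput = lambda p: -np.sum(np.log2(1.0 + h * p))
    # Energy causality: cumulative harvested energy bounds cumulative use.
    cons = [{"type": "ineq",
             "fun": (lambda p, t=t: np.sum(E[: t + 1]) - np.sum(p[: t + 1]))}
            for t in range(T)]
    res = minimize(neg_throughput, x0=np.full(T, E.mean()),
                   bounds=[(0.0, None)] * T, constraints=cons, method="SLSQP")
    return res.x, -res.fun

if __name__ == "__main__":
    h = np.array([0.8, 1.5, 0.3, 2.0, 1.0])   # assumed channel gains per slot
    E = np.array([2.0, 0.0, 3.0, 1.0, 1.0])   # assumed harvested energy per slot
    p, rate = offline_power_allocation(h, E)
    print("power per slot:", np.round(p, 3), "throughput:", round(rate, 3))
```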
44

An event study : the market reactions to share repurchase announcements on the JSE

Punwasi, Kiran 24 February 2013 (has links)
This study examines the market reactions to share repurchase announcements made by companies listed on the Johannesburg Stock Exchange from 2003 to 2012. We use an event study methodology and the Capital Asset Pricing Model to determine whether there is an announcement effect when a share repurchase announcement is made. Our analysis shows that, consistent with signalling theory and the announcement effect, share repurchase announcements are associated with positive abnormal returns. The average abnormal return and cumulative average abnormal return were 0.46% and 3.81%, respectively, for the event period (t -20, t +20). There was an observable trend of declining share prices before the share repurchase announcement; however, the decline in the share prices was not significant. We found some evidence of market timing ability in 2005 and 2010; however, taken collectively, we found no significant difference in the timing of share repurchase announcements. / Dissertation (MBA)--University of Pretoria, 2012. / Gordon Institute of Business Science (GIBS) / unrestricted
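As a rough illustration of the event-study mechanics described above, the sketch below fits a market-model (CAPM-style) regression over an estimation window, computes abnormal returns AR_t = R_t - (alpha + beta * R_m,t) over an event window, and sums them into a cumulative abnormal return. The window lengths, synthetic return series, and function names are assumptions; this is not the author's exact procedure or data.

```python
# Hedged sketch of event-study abnormal returns: market-model fit on an
# estimation window, then AR and CAR over an event window. All data here
# is synthetic and the windows are illustrative assumptions.
import numpy as np

def abnormal_returns(stock_ret, market_ret, est_end, event_start, event_end):
    """Fit alpha/beta on [0, est_end), then compute
    AR_t = R_t - (alpha + beta * R_m,t) over [event_start, event_end]."""
    beta, alpha = np.polyfit(market_ret[:est_end], stock_ret[:est_end], 1)
    window = slice(event_start, event_end + 1)
    ar = stock_ret[window] - (alpha + beta * market_ret[window])
    return ar, ar.sum()          # per-day abnormal returns and the CAR

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rm = rng.normal(0.0005, 0.01, 250)                  # synthetic market returns
    rs = 0.0002 + 1.1 * rm + rng.normal(0, 0.01, 250)   # synthetic stock returns
    ar, car = abnormal_returns(rs, rm, est_end=200, event_start=209, event_end=249)
    print(f"CAR over the 41-day event window: {car:.4%}")
```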
45

Neglecting the Predictions of Others: The Effects of Base Rate Neglect and Interhemispheric Interaction on the Above and Below Average Effects

Lanning, Michael D. January 2015 (has links)
No description available.
46

Surveillance of Poisson and Multinomial Processes

Ryan, Anne Garrett 18 April 2011 (has links)
As time passes, change occurs. With this change comes the need for surveillance. One may be a technician on an assembly line in need of a surveillance technique to monitor the number of defective components produced. On the other hand, one may be an administrator of a hospital in need of surveillance measures to monitor the number of patient falls in the hospital or to monitor surgical outcomes to detect changes in surgical failure rates. A natural choice for ongoing surveillance is the control chart; however, the chart must be constructed in a way that accommodates the situation at hand. Two scenarios involving attribute control charting are investigated here. The first scenario involves Poisson count data where the area of opportunity changes. A modified exponentially weighted moving average (EWMA) chart is proposed to accommodate the varying sample sizes. The performance of this method is compared with that of several competing control chart techniques, and recommendations are made regarding the best-performing control chart method. This research is a result of joint work with Dr. William H. Woodall (Department of Statistics, Virginia Tech). The second scenario involves monitoring a process where items are classified into more than two categories and the results for these classifications are readily available. A multinomial cumulative sum (CUSUM) chart is proposed to monitor these types of situations. The multinomial CUSUM chart is evaluated through comparisons of performance with competing control chart methods. This research is a result of joint work with Mr. Lee J. Wells (Grado Department of Industrial and Systems Engineering, Virginia Tech) and Dr. William H. Woodall (Department of Statistics, Virginia Tech). / Ph. D.
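The abstract does not give the modified EWMA statistic itself, so the sketch below shows only a standard EWMA applied to Poisson rates x_t/n_t, with time-varying control limits that reflect the changing area of opportunity n_t. The smoothing constant, control-limit width, in-control rate, and data are assumptions, and this is not the chart proposed in the dissertation.

```python
# Sketch only: a standard EWMA chart on Poisson rates x_t / n_t with
# time-varying control limits reflecting the changing area of opportunity n_t.
# Not the dissertation's modified chart; lam, L, and mu0 are assumptions.
import numpy as np

def ewma_poisson_rates(counts, areas, mu0, lam=0.1, L=3.0):
    """Return the EWMA statistics and (lower, upper) control limits per period."""
    z, var = mu0, 0.0
    stats, lcl, ucl = [], [], []
    for x, n in zip(counts, areas):
        z = lam * (x / n) + (1 - lam) * z
        # Var(x/n) = mu0 / n under an in-control Poisson rate mu0.
        var = lam**2 * (mu0 / n) + (1 - lam)**2 * var
        sd = np.sqrt(var)
        stats.append(z)
        lcl.append(max(0.0, mu0 - L * sd))
        ucl.append(mu0 + L * sd)
    return np.array(stats), np.array(lcl), np.array(ucl)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    areas = rng.uniform(0.5, 2.0, 30)      # varying areas of opportunity
    counts = rng.poisson(4.0 * areas)      # in-control rate mu0 = 4
    z, lcl, ucl = ewma_poisson_rates(counts, areas, mu0=4.0)
    print("signal periods:", np.where((z > ucl) | (z < lcl))[0])
```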
47

From Worst-Case to Average-Case Efficiency – Approximating Combinatorial Optimization Problems

Plociennik, Kai 18 February 2011 (has links) (PDF)
In theoretical computer science, various notions of efficiency are used for algorithms. The most commonly used notion is worst-case efficiency, which is defined by requiring polynomial worst-case running time. Another commonly used notion is average-case efficiency for random inputs, which is roughly defined as having polynomial expected running time with respect to the random inputs. Depending on the actual notion of efficiency one uses, the approximability of a combinatorial optimization problem can be very different. In this dissertation, the approximability of three classical combinatorial optimization problems, namely Independent Set, Coloring, and Shortest Common Superstring, is investigated for different notions of efficiency. For the three problems, approximation algorithms are given, which guarantee approximation ratios that are unachievable by worst-case efficient algorithms under reasonable complexity-theoretic assumptions. The algorithms achieve polynomial expected running time for different models of random inputs. On the one hand, classical average-case analyses are performed, using totally random input models as the source of random inputs. On the other hand, probabilistic analyses are performed, using semi-random input models inspired by the so-called smoothed analysis of algorithms. Finally, the expected performance of well-known greedy algorithms for random inputs from the considered models is investigated. Also, the expected behavior of some properties of the random inputs themselves is considered.
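As a concrete instance of the "expected performance of well-known greedy algorithms on random inputs" mentioned above, the sketch below runs the classic minimum-degree greedy heuristic for Independent Set on an Erdős–Rényi random graph G(n, p). The graph parameters are assumptions, and the dissertation's specific algorithms and semi-random input models are not reproduced.

```python
# Sketch: the classic minimum-degree greedy heuristic for Independent Set,
# run on an Erdos-Renyi random graph G(n, p). Illustrative only; n and p
# are assumptions, not the dissertation's input models.
import random

def random_graph(n, p, seed=0):
    """Adjacency sets of a G(n, p) random graph."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def greedy_independent_set(adj):
    """Repeatedly pick a remaining vertex of minimum degree and delete it
    together with its neighbours; the picked vertices form an independent set."""
    remaining = {v: set(nbrs) for v, nbrs in adj.items()}
    chosen = []
    while remaining:
        v = min(remaining, key=lambda u: len(remaining[u]))
        chosen.append(v)
        for w in remaining[v] | {v}:          # delete v and its neighbours
            for x in remaining.get(w, ()):
                remaining[x].discard(w)
            remaining.pop(w, None)
    return chosen

if __name__ == "__main__":
    adj = random_graph(200, 0.05)
    print("greedy independent set size:", len(greedy_independent_set(adj)))
```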
48

From Worst-Case to Average-Case Efficiency – Approximating Combinatorial Optimization Problems

Plociennik, Kai 27 January 2011 (has links)
In theoretical computer science, various notions of efficiency are used for algorithms. The most commonly used notion is worst-case efficiency, which is defined by requiring polynomial worst-case running time. Another commonly used notion is average-case efficiency for random inputs, which is roughly defined as having polynomial expected running time with respect to the random inputs. Depending on the actual notion of efficiency one uses, the approximability of a combinatorial optimization problem can be very different. In this dissertation, the approximability of three classical combinatorial optimization problems, namely Independent Set, Coloring, and Shortest Common Superstring, is investigated for different notions of efficiency. For the three problems, approximation algorithms are given, which guarantee approximation ratios that are unachievable by worst-case efficient algorithms under reasonable complexity-theoretic assumptions. The algorithms achieve polynomial expected running time for different models of random inputs. On the one hand, classical average-case analyses are performed, using totally random input models as the source of random inputs. On the other hand, probabilistic analyses are performed, using semi-random input models inspired by the so-called smoothed analysis of algorithms. Finally, the expected performance of well-known greedy algorithms for random inputs from the considered models is investigated. Also, the expected behavior of some properties of the random inputs themselves is considered.
49

Predicting Achievement in a Graduate School of Education

Hughes, Billie Edward, 1927- 06 1900 (has links)
The purpose of this study was to determine the relationships between selected criteria and academic achievement in graduate studies leading to certain master's degrees in education at North Texas State University. The major areas in education which were included were Elementary Education, Elementary School Administration and Supervision, Secondary Education, Secondary School Administration and Supervision, and Physical Education. Graduate grade-point average was used as the criterion for academic achievement.
50

Koncept průměrného spotřebitele v českém a evropském právu / The concept of an average consumer in Czech and European law

Oriešková, Stanislava January 2014 (has links)
Resumé The concept of an average consumer in Czech and European law. The aim of this thesis is to describe current trends in the conceptualization of the average consumer in both Czech and European law. The work focuses on examining differences in the demands placed on the average consumer depending on the area of law in which the consumer is situated. The first chapter deals with the definition of the average consumer. Because Czech law is influenced by European law, the definitions in European legislation and case law are discussed first, followed by the definitions provided by Czech law and then those found in the specialized literature. The central point of the second chapter is the protection of the average consumer. It explores the level of protection provided by European law and stresses the importance of limits to such protection. It also underlines that the level of protection depends on the determination of the target average consumer and is increased if the target group consists of particularly vulnerable consumers. The third chapter analyzes how the average consumer is conceptualized when assessing whether an action constitutes unfair competition. It deals with misleading advertising, comparative advertising, and the misleading identification of goods and services....
