  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Profit Maximizing Hedging Strategies for Managers and Members of Vertical Beef Alliances

Claus, Lora Hamerschlag 24 May 2005 (has links)
Vertical alliances are an increasingly common form of organization for participants in the beef industry. The implications of combining feeding and packing margins into one alliance are investigated. Moving-average-based selective hedging strategies are used to hedge the major inputs and outputs of cattle owners and packers in order to improve mean revenue to the alliance. The success of the hedging program is evaluated from mean-variance and cash-flow perspectives. / Master of Science
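The abstract does not detail the trading rule, but a moving-average crossover is the usual basis for such selective hedges. A minimal sketch, assuming a simple short/long crossover trigger with illustrative window lengths (none of which are taken from the thesis):

```python
import numpy as np

def selective_hedge_signal(prices, short_win=5, long_win=20):
    """Days on which a hedge would be placed under a crossover rule.

    Illustrative rule only: a cattle owner who is long the cash
    commodity shorts futures when the short moving average falls
    below the long one (a downtrend in output prices); otherwise
    the position stays unhedged.
    """
    prices = np.asarray(prices, dtype=float)
    signal = np.zeros(len(prices), dtype=bool)
    for t in range(long_win, len(prices)):
        short_ma = prices[t - short_win:t].mean()
        long_ma = prices[t - long_win:t].mean()
        signal[t] = short_ma < long_ma
    return signal

rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0.0, 1.0, 250))  # synthetic futures path
print(f"hedged on {selective_hedge_signal(prices).sum()} of {len(prices)} days")
```

Applied contract by contract to the alliance's major inputs and outputs, the resulting revenue series could then be scored on the mean-variance and cash-flow criteria the abstract mentions.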
42

Lönar det sig att följa strömmen? : Påverkan av konformitet på kunskapstest / Does it pay to follow the crowd? : The influence of conformity on knowledge tests

Stanzl, Ulrika January 2016 (has links)
Much research has examined conformity and its negative consequences. This experimental study was a partial replication of M. Rosander and O. Eriksson's (2012) study of conformity on the internet, but with a focus on how conformity can improve performance. The overall aim was to investigate whether it pays to do as others do, the relationship between conformity and confidence, and the occurrence of overconfidence. Two questionnaire studies in the form of knowledge tests were conducted with 38 and 92 students, respectively. Participants were divided into a conformity group, which received information about previous participants' answers, and a control group, which did not. The results showed that conformity led to more correct answers and that there was a positive relationship between confidence and the number of correct answers. No overconfidence could be demonstrated; instead, participants underestimated their number of correct answers. The study concluded that conforming paid off most on moderately difficult questions; on very difficult questions there was no advantage in following the crowd.
43

Performance Limits of Communication with Energy Harvesting

Znaidi, Mohamed Ridha 04 1900 (has links)
In energy harvesting communications, transmitters must adapt transmission to the availability of energy harvested during communication. The performance of the transmission depends on the channel conditions, which vary randomly due to mobility and environmental changes. In this work, we consider the problem of power allocation, taking into account the energy arrivals over time and the quality of channel state information (CSI) available at the transmitter, in order to maximize throughput. Unlike previous work, the CSI at the transmitter is not assumed perfect and may include estimation errors. We solve this problem subject to the energy harvesting constraints. Assuming perfect knowledge of the CSI at the receiver, we determine the optimal power policy for different models of the energy arrival process (offline and online). In particular, we obtain the power allocation scheme when the transmitter has either perfect CSI or no CSI, and we investigate the especially interesting case of fading channels with imperfect CSI. Moreover, we study the asymptotic behavior of the communication system; specifically, we analyze the average throughput when the average recharge rate goes asymptotically to zero and when it is very high.
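For the offline arrival model with a static channel — an idealized special case of the setting studied here — the classical throughput-optimal policy keeps power as even as energy causality allows. A sketch of that textbook "string-pulling" construction (slot length, unit noise, and all names are illustrative; the thesis's imperfect-CSI policies are more involved):

```python
import math

def offline_power_policy(energy, slot=1.0):
    """Throughput-optimal offline powers for a static channel.

    With all energy arrivals known in advance, power is held constant
    over the longest stretches the energy-causality constraints allow,
    producing a non-decreasing staircase of power levels.
    """
    n, start, powers = len(energy), 0, []
    while start < n:
        cum, best_rate, best_end = 0.0, None, start
        for end in range(start, n):
            cum += energy[end]
            rate = cum / ((end - start + 1) * slot)
            if best_rate is None or rate <= best_rate:
                best_rate, best_end = rate, end  # latest minimizer wins
        powers.extend([best_rate] * (best_end - start + 1))
        start = best_end + 1
    return powers

energy = [4.0, 1.0, 6.0, 3.0]               # harvested at each slot's start
p = offline_power_policy(energy)
print(p)                                     # non-decreasing staircase
print(sum(math.log(1.0 + pi) for pi in p))   # AWGN throughput proxy
```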
44

An event study : the market reactions to share repurchase announcements on the JSE

Punwasi, Kiran 24 February 2013 (has links)
This study examines the market reactions to share repurchase announcements made by companies listed on the Johannesburg Stock Exchange from 2003 to 2012. We use an event study methodology and the Capital Asset Pricing Model to determine whether there is an announcement effect when a share repurchase announcement is made. Our analysis shows that, consistent with signalling theory and the announcement effect, share repurchase announcements are associated with positive abnormal returns. The average abnormal return and cumulative average abnormal return were 0.46% and 3.81% respectively for the event period (t -20, t +20). There was an observable trend of declining share prices before the share repurchase announcement; however, the decline in share prices was not significant. We found some evidence of market timing ability in 2005 and 2010; however, as a collective, we found no significant difference in timing a share repurchase announcement. / Dissertation (MBA)--University of Pretoria, 2012. / Gordon Institute of Business Science (GIBS) / unrestricted
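As a rough illustration of the computation behind these numbers — this sketch substitutes the simpler market model for the study's CAPM formulation, and all returns are synthetic:

```python
import numpy as np

def car(stock, market, event_idx, est_win=120, evt_win=20):
    """Cumulative abnormal return over (t-20, t+20) around an event.

    Fit alpha/beta by OLS on a pre-event estimation window, then sum
    the abnormal returns (actual minus expected) across the event
    window.
    """
    est = slice(event_idx - evt_win - est_win, event_idx - evt_win)
    beta, alpha = np.polyfit(market[est], stock[est], 1)
    evt = slice(event_idx - evt_win, event_idx + evt_win + 1)
    return (stock[evt] - (alpha + beta * market[evt])).sum()

rng = np.random.default_rng(1)
mkt = rng.normal(0.0005, 0.01, 400)                  # synthetic market returns
stk = 0.0002 + 1.2 * mkt + rng.normal(0, 0.01, 400)  # synthetic stock returns
stk[300] += 0.03                                     # hypothetical announcement jump
print(f"CAR(-20, +20) = {car(stk, mkt, 300):.2%}")
```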
45

Neglecting the Predictions of Others: The Effects of Base Rate Neglect and Interhemispheric Interaction on the Above and Below Average Effects

Lanning, Michael D. January 2015 (has links)
No description available.
46

Surveillance of Poisson and Multinomial Processes

Ryan, Anne Garrett 18 April 2011 (has links)
As time passes, change occurs. With this change comes the need for surveillance. One may be a technician on an assembly line in need of a surveillance technique to monitor the number of defective components produced. On the other hand, one may be an administrator of a hospital in need of surveillance measures to monitor the number of patient falls in the hospital or to monitor surgical outcomes to detect changes in surgical failure rates. A natural choice for ongoing surveillance is the control chart; however, the chart must be constructed in a way that accommodates the situation at hand. Two scenarios involving attribute control charting are investigated here. The first scenario involves Poisson count data where the area of opportunity changes. A modified exponentially weighted moving average (EWMA) chart is proposed to accommodate the varying sample sizes. The performance of this method is compared with that of several competing control chart techniques, and recommendations are made regarding the best performing control chart method. This research is a result of joint work with Dr. William H. Woodall (Department of Statistics, Virginia Tech). The second scenario involves monitoring a process where items are classified into more than two categories and the results of these classifications are readily available. A multinomial cumulative sum (CUSUM) chart is proposed to monitor these types of situations. The multinomial CUSUM chart is evaluated through comparisons of performance with competing control chart methods. This research is a result of joint work with Mr. Lee J. Wells (Grado Department of Industrial and Systems Engineering, Virginia Tech) and Dr. William H. Woodall (Department of Statistics, Virginia Tech). / Ph. D.
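The dissertation's exact chart designs are not reproduced in the abstract; the following is a generic sketch of an EWMA for Poisson rates whose limits widen as the area of opportunity shrinks — the smoothing constant, limit width, and variance recursion are standard choices, not necessarily the proposed method:

```python
import math

def poisson_ewma(counts, sizes, mu0, lam=0.1, L=2.7):
    """Signal time for an EWMA chart on Poisson rates y_t / n_t.

    Tracks the EWMA of the observed rates together with its exact
    variance recursion, so the control limits adapt to the varying
    area of opportunity n_t.
    """
    z, var = mu0, 0.0
    for t, (y, n) in enumerate(zip(counts, sizes), start=1):
        z = (1 - lam) * z + lam * (y / n)
        var = (1 - lam) ** 2 * var + lam ** 2 * (mu0 / n)
        half_width = L * math.sqrt(var)
        if not (mu0 - half_width <= z <= mu0 + half_width):
            return t  # out-of-control signal
    return None

counts = [4, 6, 3, 9, 11, 14]                # e.g. defectives per shift
sizes = [2.0, 2.5, 1.5, 2.0, 2.2, 2.4]       # varying exposure per shift
print(poisson_ewma(counts, sizes, mu0=2.0))  # signals once the rate shifts up
```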
47

From Worst-Case to Average-Case Efficiency – Approximating Combinatorial Optimization Problems

Plociennik, Kai 18 February 2011 (has links) (PDF)
In theoretical computer science, various notions of efficiency are used for algorithms. The most commonly used notion is worst-case efficiency, which is defined by requiring polynomial worst-case running time. Another commonly used notion is average-case efficiency for random inputs, which is roughly defined as having polynomial expected running time with respect to the random inputs. Depending on the notion of efficiency one uses, the approximability of a combinatorial optimization problem can be very different. In this dissertation, the approximability of three classical combinatorial optimization problems, namely Independent Set, Coloring, and Shortest Common Superstring, is investigated for different notions of efficiency. For the three problems, approximation algorithms are given which guarantee approximation ratios that are unachievable by worst-case efficient algorithms under reasonable complexity-theoretic assumptions. The algorithms achieve polynomial expected running time for different models of random inputs. On the one hand, classical average-case analyses are performed, using totally random input models as the source of random inputs. On the other hand, probabilistic analyses are performed, using semi-random input models inspired by the so-called smoothed analysis of algorithms. Finally, the expected performance of well-known greedy algorithms for random inputs from the considered models is investigated, as is the expected behavior of some properties of the random inputs themselves.
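To give a flavor of the greedy-on-random-inputs experiments mentioned at the end, here is a small sketch of a greedy independent set heuristic on a G(n, p) random graph; the comment's claim is the classical G(n, 1/2) result, not a result from this dissertation:

```python
import random

def greedy_independent_set(n, p, seed=0):
    """Size of the greedy independent set on a G(n, p) random graph.

    Scans vertices in a fixed order and keeps each one with no edge
    to a previously kept vertex. On G(n, 1/2) this classically finds
    about log2(n) vertices, roughly half the optimum.
    """
    rng = random.Random(seed)
    adj = [set() for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    chosen = []
    for v in range(n):
        if all(v not in adj[u] for u in chosen):
            chosen.append(v)
    return len(chosen)

print(greedy_independent_set(1000, 0.5))  # typically close to log2(1000) ≈ 10
```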
49

Predicting Achievement in a Graduate School of Education

Hughes, Billie Edward, 1927- 06 1900 (has links)
The purpose of this study was to determine the relationships between selected criteria and academic achievement in graduate studies leading to certain master's degrees in education at North Texas State University. The major areas in education which were included were Elementary Education, Elementary School Administration and Supervision, Secondary Education, Secondary School Administration and Supervision, and Physical Education. Graduate grade-point average was used as the criterion for academic achievement.
50

Koncept průměrného spotřebitele v českém a evropském právu / The concept of an average consumer in Czech and European law

Oriešková, Stanislava January 2014 (has links)
Resumé: The concept of an average consumer in Czech and European law. The aim of this thesis is to describe current trends in the conceptualization of the average consumer in both Czech and European law. The work focuses on the differences in the demands placed on the average consumer depending on the area in which the consumer is acting. The first chapter defines the average consumer. Because Czech law is influenced by European law, the definitions in European legislation and case law are presented first, followed by those in Czech law and then by definitions from the specialized literature. The second chapter concerns the protection of the average consumer: it explores the level of protection provided by European law and stresses the importance of limits on such protection. It also notes that the level of protection depends on the target average consumer and is heightened when the target group consists of particularly vulnerable consumers. The third chapter analyzes how the average consumer is conceptualized when assessing whether conduct amounts to unfair competition, covering misleading advertising, comparative advertising, and misleading identification of goods and services....
