31

The informational efficiency of the Korean stock market : excess profits from technical speculations

Kim, Myung Soo. January 1991 (has links)
Thesis (Ph. D.)--Claremont Graduate School, 1992. / Typescript (photocopy). Includes bibliographical references (leaves [272]-276).
32

The efficiency of the Mexican stock market

Hakim Simon, Miguel. January 1988 (has links)
Thesis (Ph. D.)--Claremont Graduate School, 1988. / Includes bibliographical references (leaves 221-227).
33

Data-Efficient Learning in Image Synthesis and Instance Segmentation

Robb, Esther Anne 18 August 2021 (has links)
Modern deep learning methods have achieved remarkable performance on a variety of computer vision tasks, but frequently require large, well-balanced training datasets to achieve high-quality results. Data-efficient performance is critical for downstream tasks such as automated driving or facial recognition. We propose two methods of data-efficient learning for the tasks of image synthesis and instance segmentation. We first propose a method of high-quality and diverse image generation from finetuning to only 5-100 images. Our method factors a pretrained model into a small but highly expressive weight space for finetuning, which discourages overfitting on a small training set. We validate our method in a challenging few-shot setting of 5-100 images in the target domain. We show that our method has significant visual quality gains compared with existing GAN adaptation methods. Next, we introduce a simple adaptive instance segmentation loss which achieves state-of-the-art results on the LVIS dataset. We demonstrate that the rare categories are heavily suppressed by correct background predictions, which reduce the probability for all foreground categories with equal weight. Due to the relative infrequency of rare categories, this leads to an imbalance that biases towards predicting more frequent categories. Based on this insight, we develop DropLoss -- a novel adaptive loss to compensate for this imbalance without a trade-off between rare and frequent categories. / Master of Science / Many of the impressive results seen in modern computer vision rely on learning patterns from huge datasets of images, but these datasets may be expensive or difficult to collect. Many applications of computer vision need to learn from a very small number of examples, such as learning to recognize an unusual traffic event and behave safely in a self-driving car. In this thesis we propose two methods of learning from only a few examples. Our first method generates novel, high-quality and diverse images using a model fine-tuned on only 5-100 images. We start with an image generation model that was trained on a much larger image set (70K images) and adapt it to a smaller image set (5-100 images). We selectively train only part of the network to encourage diversity and prevent memorization. Our second method focuses on the instance segmentation setting, where the model predicts (1) what objects occur in an image and (2) their exact outline in the image. This setting commonly suffers from long-tail distributions, where some of the known objects occur frequently (e.g. "human" may occur 1000+ times) but most only occur a few times (e.g. "cake" or "parrot" may only occur 10 times). We observed that the "background" label has a disproportionate effect of suppressing the rare object labels. We use this insight to develop a method to balance suppression from background classes during training.
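The abstract's key observation is that background proposals push down the score of every foreground class equally, which disproportionately hurts rare classes. The following is a minimal sketch of that idea, assuming a per-class sigmoid loss and a frequency-based drop probability; the function name and exact weighting are illustrative and are not taken from the thesis.

    import torch
    import torch.nn.functional as F

    def droploss_sketch(logits, labels, class_freq, background_idx):
        # Hypothetical sketch based on the abstract; the thesis's actual DropLoss may differ.
        # logits: (N, C) scores for N proposals over C foreground classes.
        # labels: (N,) ground-truth class index; == background_idx for background proposals.
        # class_freq: (C,) relative frequency of each foreground class in the training set.
        N, C = logits.shape
        fg = labels != background_idx
        targets = torch.zeros_like(logits)
        targets[fg, labels[fg]] = 1.0

        # Per-class binary cross-entropy; background rows contribute only "suppression"
        # terms that push every foreground score down.
        loss = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")

        # For background rows, drop a class's suppression term with a probability that
        # grows as the class gets rarer, so rare categories are not constantly pushed
        # down by the (very frequent) background samples.
        drop_prob = (1.0 - class_freq / class_freq.max()).clamp(0.0, 1.0)
        keep = (torch.rand(N, C, device=logits.device) >= drop_prob) | fg.unsqueeze(1)
        return (loss * keep).sum() / max(int(fg.sum()), 1)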
34

Sustainability and Affordability: How Single-Family Home Retrofits Can Achieve Both

Goff, Jason January 2015 (has links)
Sustainable Built Environments Senior Capstone Project / Climate change and resource availability are arguably the two biggest challenges humanity faces going forward. An unprecedented body of scientific work compiled over the past thirty years indicates that humans have been, and continue to be, the largest driver of these environmental concerns, and therefore must also be responsible for any solutions. Buildings and their construction account for nearly 40% of the total energy consumption and greenhouse gas emissions in the United States. Water consumption by both buildings and thermoelectric power generation is also an issue, especially in the Southwest and Western United States. Green building has been gaining steam in the U.S. for the past two decades, but the primary focus has been on the commercial and industrial sectors. The residential markets have not seen the same efficiency gains, primarily due to the perception that the cost isn’t worth the benefit. This project examines the need, feasibility, and potential benefits of sustainably retrofitting existing homes as an alternative to new construction. It provides a broad definition of sustainability and then focuses on a narrower description of its application within the built environment. Using precedents, 3D modeling, and energy simulation software, it compares the energy and water savings of a retrofit versus a base case, as well as the performance of the average Southern Arizona home. Finally, this capstone project provides a professional cost estimate for the implementation of the proposed changes and a side-by-side look at the available “green” housing market, the utility cost savings for the homeowner, and the environmental benefits of individual as well as large-scale adoption of sustainable retrofitting practices.
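The homeowner-facing comparison described above boils down to weighing a one-time retrofit cost against recurring utility savings. As a rough illustration only (the figures below are made up, not taken from the project's cost estimate), a simple payback period can be computed like this:

    def simple_payback_years(retrofit_cost, annual_utility_savings):
        # Years of utility savings needed to recoup the up-front retrofit cost;
        # ignores financing, maintenance, and energy-price escalation.
        return retrofit_cost / annual_utility_savings

    # Hypothetical figures for illustration:
    print(simple_payback_years(retrofit_cost=18000, annual_utility_savings=1500))  # 12.0 years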
35

SMALL VOLUME, FEHER-PATENTED QUADRATURE PHASE SHIFT KEYING, JR VERSION, TELEMETRY TRANSMITTER

Bottenfield, Joe, Moore, Vern 10 1900 (has links)
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / This paper describes the implementation of a Feher-Patented Quadrature Phase Shift Keying (FQPSK) waveform variant that reduces overall design complexity, which in turn results in a telemetry transmitter that provides all the benefits of the existing FQPSK-B waveform in a 3.0 x 2.0 x 1.0 volume. This variant is referred to as the FQPSK-JR version. This waveform differs from the “near constant” envelope response of the qualified Herley airborne FQPSK-B telemetry transmitter in terms of the time domain wavelet transition functions and the amplitude scaling term associated with those functions. The end result is a “constant envelope” design, which employs simplified antialias filtering and more efficient digital design techniques.
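For readers unfamiliar with quadrature wavelet shaping, the sketch below shows a generic offset-QPSK modulator with half-sine pulse shaping, which yields a constant envelope. It is only an illustration of the general idea; it does not use the patented FQPSK-JR transition wavelets or the amplitude scaling term described in the paper.

    import numpy as np

    def offset_qpsk_halfsine(bits, sps=16):
        # Generic offset-QPSK with half-sine shaping (illustration only, not FQPSK-JR).
        bits = np.asarray(bits, dtype=int).reshape(-1, 2)
        i_sym = 2 * bits[:, 0] - 1                    # +/-1 on the in-phase rail
        q_sym = 2 * bits[:, 1] - 1                    # +/-1 on the quadrature rail
        pulse = np.sin(np.pi * np.arange(sps) / sps)  # half-sine transition shape
        i_wave = np.repeat(i_sym, sps) * np.tile(pulse, len(i_sym))
        q_wave = np.repeat(q_sym, sps) * np.tile(pulse, len(q_sym))
        # Offsetting Q by half a symbol keeps sqrt(I^2 + Q^2) essentially constant,
        # since sin^2 + cos^2 = 1 across the staggered transitions.
        half = sps // 2
        q_wave = np.concatenate([np.zeros(half), q_wave[:-half]])
        return i_wave + 1j * q_wave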
36

Sorting out a Profitable Strategy from IPO's : A quantitative study about underpricing and different Buy-and-Hold strategies for IPO's on the Swedish Stock Exchange

Johansson, Christoffer January 2016 (has links)
An alternative way to invest on the stock market is to invest in IPO’s. An IPO (Initial Public Offering) is the first time a company goes public on a stock market, giving out shares to private investors and financial institutions. However, there might be some uncertainties about the share price, as it has never been traded on the stock exchange before, and it could therefore be difficult to determine a reasonable value for the share price. Consequently, if the offering price for the investor is significantly lower than the “correct valued” price, it will generate a positive initial return during the first trading day; this phenomenon is labelled as underpricing, leaving more “money on the table”. Still, previous research displays an underperformance among IPO’s during a longer period after the introduction compared to already established companies within the same sector, arguing that investors should sell their shares early after the first trading day. The objective of this study is therefore to determine whether underpricing exists for IPO’s on the Swedish stock exchange and whether there are any differences amongst sectors, and also to investigate two different Buy-and-Hold strategies. A final objective of the study is to determine whether the level of underpricing is affected by some explanatory variables. With a quantitative study and a longitudinal approach, the results confirm the effect of underpricing for IPO’s on the Swedish stock exchange, with an average underpricing of 5.56%. Additionally, this study cannot display any difference in median underpricing between industry sectors. However, the results contradict the underperformance phenomenon, indicating an overperformance for longer Buy-and-Hold strategies. Lastly, a regression of explanatory variables trying to explain the level of underpricing demonstrates no statistically significant results.
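Underpricing in such studies is commonly measured as the first-day (initial) return over the offer price; the thesis reports an average of 5.56% for its Swedish sample. A minimal sketch of the calculation follows, with made-up numbers rather than the thesis data:

    def initial_return(offer_price, first_day_close):
        # Underpricing of a single IPO: first-trading-day return over the offer price.
        return (first_day_close - offer_price) / offer_price

    # Hypothetical (offer, first-day close) pairs, not from the thesis sample:
    ipos = [(52.0, 55.4), (120.0, 118.5), (31.0, 34.1)]
    avg_underpricing = sum(initial_return(o, c) for o, c in ipos) / len(ipos)
    print(f"average first-day return: {avg_underpricing:.2%}")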
37

Market reaction to bad news : the case of bankruptcy filings

Coelho, Luis January 2008 (has links)
Finance scholars disagree on how real world financial markets work. On the one hand, efficient market hypothesis (EMH) advocates claim that arbitrage ensures that market prices do not systematically deviate from their fundamental value even when some market participants are less than fully rational. Hence, in the EMH world, securities’ prices always reflect all available information. On the other hand, behavioural finance theorists argue that investors suffer important cognitive biases and that arbitrage is both risky and costly. In this alternative setting, prices may not reflect all available information and can systematically deviate from their fundamental value for long periods of time. My thesis contributes to this ongoing debate by exploring how the US equity market reacts to bankruptcy announcements. Using a set of 351 non-financial, non-utility firms filing for Chapter 11 between 1979 and 2005 that remain listed on a main exchange, I first find a strong, negative and statistically significant mean post-bankruptcy announcement drift. This ranges from -24 to -44 percent over the following 12 months depending on the benchmark adopted to measure abnormal returns. A number of robustness tests confirm that this result is not a mere statistical artefact. In fact, the post-bankruptcy drift is not subsumed by known confounding factors like the post-earnings announcement drift, the post-first-time going concern drift, the momentum effect, the book-to-market effect, industry clustering or the level of financial distress. In addition, I show that my main result is robust to different methods for conducting longer-term event studies. My empirical findings are consistent with the previous behavioural finance literature that claims that the market is unable to deal appropriately with acute bad news events. In the second part of this thesis, I investigate how limits to arbitrage impact the stock price of firms undergoing a Chapter 11 reorganization. I find that, despite the apparent large negative abnormal returns, the post-bankruptcy announcement drift offers only an illusory profit opportunity. Moreover, I show that noise trader risk is critical for the pricing of these firms’ stock. Taken together, my results suggest that limits to arbitrage issues can explain the persistence of the market-pricing anomaly I uncover. As such, the market for firms in Chapter 11 appears to be “minimally rational” (Rubinstein, 2001). My work additionally explores whether behavioural finance theory can help clarify why the post-bankruptcy announcement drift occurs in the first place. I find that the Barberis, Shleifer and Vishny (1998) and the Hong and Stein (1999) models do not account well for the typical return pattern associated with the announcement of Chapter 11. My results call into question the reliability of existing theoretical models based on behavioural concepts in explaining how real world financial markets really work. In the last part of this thesis, I show that the different motivations for filing for Chapter 11 Court protection affect the market’s reaction to this extreme event. Solvent firms addressing the Bankruptcy Court not as a last resort but as a planned business strategy characterize a strategic bankruptcy; companies on the verge of imminent failure typify a non-strategic bankruptcy. I find that for non-strategic bankruptcies, there is a negative and statistically significant post-event drift lasting at least twelve months. 
Conversely, I show that, although the initial market reaction to a strategic bankruptcy filing is similar (the market initially treats all bankruptcies as homogeneous), there is a subsequent reversal in the stock return pattern for these peculiar firms. In effect, abnormal returns become strongly positive and significant, suggesting that, over time, the market comes to recognise strategic bankruptcies as good news events. Overall, the results of my PhD allow me to make some important contributions to finance theory and the finance literature, in particular in the bad news disclosure and market pricing domains.
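Drift figures like those quoted above are typically computed as buy-and-hold abnormal returns (BHARs): compound the filing firm's returns over the post-announcement window and subtract the compounded return of the chosen benchmark. A minimal sketch under that assumption (the thesis also applies other benchmarks and robustness methods):

    import numpy as np

    def bhar(event_returns, benchmark_returns):
        # Buy-and-hold abnormal return over one post-event window.
        event_returns = np.asarray(event_returns, dtype=float)
        benchmark_returns = np.asarray(benchmark_returns, dtype=float)
        return np.prod(1.0 + event_returns) - np.prod(1.0 + benchmark_returns)

    # Mean post-bankruptcy drift across a sample of filings:
    # drift = np.mean([bhar(r_i, b_i) for r_i, b_i in sample])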
38

Efficient double auction mechanisms in the energy grid with connected and islanded microgrids

Faqiry, Mohammad January 1900 (has links)
Doctor of Philosophy / Department of Electrical and Computer Engineering / Sanjoy Das / The future energy grid is expected to operate in a decentralized fashion as a network of autonomous microgrids that are coordinated by a Distribution System Operator (DSO), which should allocate energy to them in an efficient manner. Each microgrid operating in either islanded or grid-connected mode may be considered to manage its own resources. This can take place through auctions with individual units of the microgrid as the agents. This research proposes efficient auction mechanisms for the energy grid, with islanded and connected microgrids. The microgrid level auction is carried out by means of an intermediate agent called an aggregator. The individual consumer and producer units are modeled as selfish agents. With the microgrid in islanded mode, two aggregator-level auction classes are analyzed: (i) price-heterogeneous, and (ii) price-homogeneous. Under the price heterogeneity paradigm, this research extends earlier work on the well-known, single-sided Kelly mechanism to double auctions. As in Kelly auctions, the proposed algorithm implements the bidding without using any agent-level private information (i.e. generation capacity and utility functions). The proposed auction is shown to be an efficient mechanism that maximizes the social welfare, i.e. the sum of the utilities of all the agents. Furthermore, the research considers the situation where a subset of agents act as a coalition to redistribute the allocated energy and price using any other specific fairness criterion. The price homogeneous double auction algorithm proposed in this research addresses the problem of price anticipation, where each agent tries to influence the equilibrium price of energy by placing strategic bids. As a result of this behavior, the auction’s efficiency is lowered. This research proposes a novel approach that is implemented by the aggregator, called virtual bidding, where the efficiency can be asymptotically maximized, even in the presence of price-anticipatory bidders. Next, an auction mechanism for the energy grid, with multiple connected microgrids, is considered. A globally efficient bi-level auction algorithm is proposed. At the upper level, the algorithm takes into account physical grid constraints in allocating energy to the microgrids. It is implemented by the DSO as a linear objective quadratic constraint problem that allows price heterogeneity across the aggregators. In parallel, each aggregator implements its own lower-level price homogeneous auction with virtual bidding. The research concludes with a preliminary study on extending the DSO level auction to multi-period day-ahead scheduling. It takes into account storage units and conventional generators that are present in the grid by formulating the auction as a mixed integer linear programming problem.
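The building block the dissertation extends is the classical Kelly (proportional) mechanism for a divisible resource: agents submit monetary bids, the uniform price is the total of the bids divided by the capacity, and each agent receives its bid divided by that price. The sketch below shows only this single-sided rule, not the double-auction or virtual-bidding extensions developed in the thesis:

    def kelly_allocation(bids, capacity):
        # Single-sided Kelly (proportional) allocation of a divisible resource.
        # bids: monetary bids w_i from consumer agents; capacity: total energy C.
        total = sum(bids)
        if total == 0:
            return [0.0] * len(bids), 0.0
        price = total / capacity              # uniform clearing price p = sum(w) / C
        return [w / price for w in bids], price

    # Example: three buyers bidding 10, 30, 60 for 100 units clear at p = 1.0
    # and receive 10, 30, and 60 units respectively.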
39

Design study of energy-efficient routing protocol for wireless sensor networks.

Lu, Lifang January 2009 (has links)
Recent advances in wireless sensor networks have led to an emergence of many routing protocols. The limited battery capacity of sensor nodes makes energy efficiency a major and challenging problem in wireless sensor networks. Thus, routing protocols for wireless sensor networks must be energy efficient in order to maximise the network lifetime. In this thesis, we developed a centralised clustering, energy-efficient routing protocol for wireless sensor networks. Our protocol consists of a cluster head selection algorithm, a cluster formation scheme and a routing algorithm for the data transmission between cluster heads and the base station. The cluster head selection algorithm is performed by the base station using global information about the network. This algorithm aims at choosing cluster heads that ensure both intra-cluster and inter-cluster data transmission are energy-efficient. The cluster formation scheme is accomplished by exchanging messages between non-cluster-head nodes and the cluster head to ensure a balanced energy load among cluster heads. The routing algorithm is based on the optimal transmission range for the data transmission between cluster heads and the base station using multi-hop communication. The performance of our routing protocol is evaluated by comparing it with three existing routing protocols on a simulation platform. The simulation results show that our protocol can achieve better performance in terms of energy efficiency and network lifetime. Because of the centralised algorithm and multi-hop routing, there is a small communication overhead and transmission delay when using our protocol. Since our protocol can save energy and prolong network lifetime, it is well suited for applications where energy and network lifetime are the primary considerations and a small overhead and time delay can be tolerated. / http://proxy.library.adelaide.edu.au/login?url= http://library.adelaide.edu.au/cgi-bin/Pwebrecon.cgi?BBID=1456494 / Thesis (M.Eng.Sc.) - University of Adelaide, School of Electrical and Electronic Engineering, 2009
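As an illustration of what a centralised, base-station-driven cluster head selection can look like (a LEACH-C-style sketch under assumed node fields, not the thesis's actual algorithm), nodes with above-average residual energy are favoured and non-head nodes then join the nearest head:

    import math

    def dist(a, b):
        return math.hypot(a["x"] - b["x"], a["y"] - b["y"])

    def select_cluster_heads(nodes, k, base_station):
        # The base station uses global knowledge: consider only nodes with above-average
        # residual energy, then prefer energetic nodes close to the base station.
        avg_energy = sum(n["energy"] for n in nodes) / len(nodes)
        candidates = [n for n in nodes if n["energy"] >= avg_energy]
        candidates.sort(key=lambda n: (-n["energy"], dist(n, base_station)))
        return candidates[:k]

    def form_clusters(nodes, heads):
        # Each non-head node joins its nearest cluster head (message exchange omitted).
        clusters = {i: [] for i in range(len(heads))}
        for n in nodes:
            if n in heads:
                continue
            nearest = min(range(len(heads)), key=lambda i: dist(n, heads[i]))
            clusters[nearest].append(n)
        return clusters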
40

Repurchases on the Swedish Stock Market : A good long-term investment?

Tran, Nguyen, Weigardh, Anton January 2013 (has links)
The purpose of this paper is to investigate the long-term share price effects for Swedish companies that employed repurchase programs during 2000 - 2012. This paper applies a trading strategy where the investor invests in stocks of companies that engage in repurchases of their own equity. We test buy-and-hold abnormal returns versus two different proxies for the control firm, using small-sample t-statistics. Abnormal returns for one to five years are insignificant under statistical tests using the supersector indices. In contrast, they are significant using a proxy for the market index as the control firm. Factors hypothesized to contribute to this result are inconclusive using our method of comparison. As a whole, we suggest that investing in companies that repurchase stock is a solid strategy: it is on par with or better than the index.
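The "small-sample t-statistics" mentioned above are commonly implemented as a one-sample t-test of whether the mean buy-and-hold abnormal return differs from zero. A minimal sketch of that test follows; it is not necessarily the exact statistic used in the thesis:

    import numpy as np
    from scipy import stats

    def mean_bhar_test(bhars):
        # One-sample t-test of the mean buy-and-hold abnormal return against zero.
        bhars = np.asarray(bhars, dtype=float)
        t_stat, p_value = stats.ttest_1samp(bhars, popmean=0.0)
        return bhars.mean(), t_stat, p_value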
