  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Optimal machinery use intensity for a large farm in west central Manitoba

Gerrard, William 26 August 2011
Farmers in Western Canada are continually assessing where to invest their next dollar, and in considering an expansion they must decide whether their machinery assets should match their current farm size or the larger one. This study attempts to find the optimal farm size by creating a farm budget model that maximizes profit over a range of farm sizes. As farm size increases, there is greater risk that inclement weather will lengthen the time needed for crop operations; previous studies have shown that both seeding and harvest operations have optimal time windows in which they should occur for the best yield results. The results of this research showed that mean net profit was maximized at around a 9,000 acre grain farm. For farm sizes above 9,000 acres, losses associated with the lack of field operation time could not be compensated for by cropping additional acres.
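The budget-model search described above can be sketched as a toy optimization (all numbers below are hypothetical placeholders, not figures from the thesis): the per-acre margin grows linearly with acreage, while the expected timeliness loss from missed seeding and harvest windows grows faster, so profit peaks at a finite farm size.

```python
def net_profit(acres, margin_per_acre=120.0, machinery_cost=450_000.0,
               timeliness_penalty=0.00667):
    """Toy farm budget model: per-acre margin minus a fixed machinery
    cost and a timeliness loss that grows quadratically with acreage,
    reflecting the rising risk of missing seeding/harvest windows.
    All parameter values are illustrative, not from the thesis."""
    revenue = margin_per_acre * acres
    weather_loss = timeliness_penalty * acres ** 2  # grows faster than revenue
    return revenue - machinery_cost - weather_loss

# Search a grid of farm sizes for the profit-maximizing acreage.
sizes = range(1_000, 20_001, 500)
best = max(sizes, key=net_profit)  # peaks near 9,000 acres with these toy numbers
```

With these placeholder parameters the peak happens to land at 9,000 acres, mirroring the qualitative result of the study; the real model would derive the timeliness penalty from weather data and machinery capacity rather than a quadratic guess.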
52

Efficient estimation of parameters of the extreme value distribution

Saha, Sathi Rani January 2014
The problem of efficient estimation of the parameters of the extreme value distribution has not been addressed in the literature. We obtain efficient estimators of the parameters of the type I (maximum) extreme value distribution without solving the likelihood equations. This research provides, for the first time, simple expressions for the elements of the information matrix under type II censoring. Using these, we construct efficient estimators of the parameters as linear combinations of the order statistics of a random sample drawn from the population, with additional weights on the smallest and largest available order statistics in the censored case. We consider numerical examples to illustrate the application of the estimators, and we perform an extensive Monte Carlo simulation study to examine their performance for different sample sizes.
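For a concrete baseline, the classical method-of-moments estimators for the type I (maximum, Gumbel) extreme value distribution can be written down directly. To be clear, these are not the efficient order-statistics estimators constructed in the thesis; they are just a simple, well-known sketch of parameter estimation for the same distribution, using the facts that the mean is mu + gamma*beta and the variance is (pi^2/6)*beta^2.

```python
import math
import random

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_moment_estimates(sample):
    """Method-of-moments estimates (mu_hat, beta_hat) for the type I
    (maximum) extreme value distribution. NOT the efficient
    order-statistic estimators of the thesis -- a classical baseline.
    Uses: mean = mu + gamma*beta, variance = (pi^2 / 6) * beta^2."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / n
    beta = math.sqrt(6.0 * var) / math.pi   # scale estimate
    mu = mean - EULER_GAMMA * beta          # location estimate
    return mu, beta

# Quick check on a synthetic sample with mu=2, beta=3, via the inverse CDF
# x = mu - beta * ln(-ln(u)) for u uniform on (0, 1).
rng = random.Random(0)
data = [2.0 - 3.0 * math.log(-math.log(rng.random())) for _ in range(20_000)]
mu_hat, beta_hat = gumbel_moment_estimates(data)
```

The thesis's estimators instead weight the individual order statistics of the sample, which attains full asymptotic efficiency; the moment estimators above do not.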
53

Honest Approximations to Realistic Fault Models and Their Applications to Efficient Simulation of Quantum Error Correction

Puzzuoli, Daniel January 2014
Understanding the performance of realistic noisy encoded circuits is an important task for the development of large-scale practical quantum computers. Specifically, proposals for quantum computation must be well informed both by the qualities of the chosen low-level physical system and by the properties of the high-level quantum error correction and fault-tolerance schemes. Gaining insight into how a particular computation will play out on a physical system is in general a difficult problem, as the classical simulation of arbitrary noisy quantum circuits is inefficient. Nevertheless, important classes of noisy circuits can be simulated efficiently, and such simulations have led to numerical estimates of threshold error rates and resource requirements in topological codes subject to efficiently simulable error models. This thesis describes and analyzes a method that my collaborators and I have introduced for leveraging efficient simulation techniques to understand the performance of large quantum processors subject to errors lying outside the applicability of the efficient simulation algorithms. The idea is to approximate an arbitrary gate error with an error from the efficiently simulable set in a way that "honestly" represents the original error's ability to preserve or distort quantum information. After introducing and analyzing the individual gate approximation method, its utility as a means for estimating circuit performance is studied. In particular, the method is tested within the use case for which it was originally conceived: understanding the performance of a hypothetical physical implementation of a quantum error-correction protocol. The method is found to perform exactly as desired in all cases; that is, the circuits composed of the approximated error models honestly represent the circuits composed of the errors derived from the physical models.
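For context, the efficiently simulable set used in such studies is typically the Pauli channels. A standard way to map an arbitrary single-qubit error into that set, though generally not "honest" in the thesis's sense, is Pauli twirling, which keeps only the diagonal of the channel's Pauli transfer matrix. A minimal sketch, assuming the channel is given by Kraus operators:

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [I, X, Y, Z]

def pauli_twirl_probs(kraus_ops):
    """Twirl a single-qubit channel (given by Kraus operators) over the
    Pauli group, returning the probabilities (p_I, p_X, p_Y, p_Z) of the
    resulting Pauli channel. Uses the Pauli transfer matrix diagonal
    lambda_k = (1/2) Tr[P_k E(P_k)]."""
    lam = []
    for P in PAULIS:
        out = sum(K @ P @ K.conj().T for K in kraus_ops)  # E(P)
        lam.append(0.5 * np.trace(P @ out).real)
    lI, lx, ly, lz = lam  # lI == 1 for a trace-preserving channel
    return np.array([
        (lI + lx + ly + lz) / 4,
        (lI + lx - ly - lz) / 4,
        (lI - lx + ly - lz) / 4,
        (lI - lx - ly + lz) / 4,
    ])

# Example: amplitude damping with decay probability g -- a non-Pauli error.
g = 0.1
K0 = np.array([[1, 0], [0, np.sqrt(1 - g)]], dtype=complex)
K1 = np.array([[0, np.sqrt(g)], [0, 0]], dtype=complex)
probs = pauli_twirl_probs([K0, K1])  # p_X = p_Y = g/4 for this channel
```

The thesis's "honest" construction goes further than a plain twirl: the approximating channel is additionally constrained so that it never understates the original error's ability to distort quantum information.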
55

Conventions and the stock market game

Fuggetta, Massimo January 1991
Forecasting stock price movements is a notoriously difficult job. Were it not so, it would be easy to get richer; in that case, however, nobody would get poorer, and if nobody gets poorer, nobody will get richer. There are two ways out of this vicious circle. The first, and better-trodden, is the Efficient Market Theory (EMT), or: Everybody Understands Everything. The second is the Casino Market Theory (CMT), or: Nobody Understands Anything. This work is an attempt to bridge the gap between these two theories. In the first chapter the EMT is analysed in its fundamental constituents, while Chapter 2 contains a discussion of several empirical tests of the theory. Chapter 3 extends the EMT to incorporate variable risk premia and rational speculative bubbles, and Chapter 4 presents the available empirical evidence on the extended model. The line of research based on the EMT paradigm is abandoned in Chapter 5, where the central principle of the EMT - the assumption of homogeneous investors with common priors - is investigated and challenged. The basis is laid there for an alternative view of the stock market game, one which emphasises the conventional nature of investors' beliefs about future returns and is consistent with the view that stock market prices reflect more than the fundamental value of the underlying companies. In Chapter 6, the hypothesis that non-fundamental information (in particular, past information) may influence current stock prices is evaluated against monthly data for the US, UK, Japanese and Italian stock markets. Contrary to popular wisdom, we find that past information has a significant effect on current stock returns. Our evidence indicates that, as Keynes suggested in the General Theory, conventional beliefs play a crucial role in the stock market game.
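The Chapter 6 style of test can be illustrated with a simple OLS regression of current returns on lagged returns. The data below are synthetic (an AR(1) series with persistence 0.3), not the thesis's US, UK, Japanese, or Italian samples; the point is only the mechanics of the predictability test.

```python
import numpy as np

def lag_regression(returns, lags=1):
    """Regress r_t on a constant and r_{t-1}..r_{t-lags} by OLS and
    return the coefficient vector [alpha, beta_1, ..., beta_lags].
    Under the EMT null, every beta_k should be indistinguishable from
    zero; a significant beta_k means past returns predict current ones."""
    r = np.asarray(returns, dtype=float)
    y = r[lags:]
    X = np.column_stack([np.ones(len(y))] +
                        [r[lags - k:-k] for k in range(1, lags + 1)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Synthetic returns with true persistence 0.3: the regression should
# recover a slope near 0.3.
rng = np.random.default_rng(0)
r = [0.0]
for _ in range(5_000):
    r.append(0.3 * r[-1] + rng.normal(scale=0.01))
alpha, beta1 = lag_regression(r, lags=1)
```

A full study would of course add standard errors and robustness checks across markets; this sketch only shows the regression the hypothesis reduces to.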
56

Two essays on stock market anomalies

Lam, Eric Campbell Full Yet. January 2009
Includes bibliographical references.
57

Precision farming in South Africa

Rusch, Peter C 07 January 2004
Precision farming is by far the most exciting new agricultural technology developed during the past decade, and although technology transfer is especially difficult in agriculture for a number of reasons, this technology has survived its initial stages of implementation. Historically, field boundaries often ran along natural soil boundaries, leading to small fields that were treated homogeneously. As agricultural machinery developed and grew ever larger, fields were often combined to allow more efficient cultivation. As a result, fields with varying properties were created, leading to inefficiencies. Precision farming was developed to overcome this problem. In this paper some results of initial research undertaken in South Africa under a variety of circumstances are shown. / Dissertation (MEng)--University of Pretoria, 2005. / Civil Engineering / Unrestricted
58

Robust Target Detection Methods: Performance Analysis and Experimental Validation

January 2020
Constant false alarm rate (CFAR) detection is one of the essential algorithms in a radar system. It allows the radar to set thresholds dynamically, based on the power level of the data, to distinguish targets from interfering noise and clutter. To better assess the performance of constant false alarm rate approaches, three clutter models, Gamma, Weibull, and Log-normal, were introduced to evaluate the detection capability of each algorithm. The order statistic constant false alarm rate approach outperforms other conventional methods, especially in heavily cluttered environments; however, it requires more power because of repeated sorting. In an automotive radar system the computational complexity of the algorithms is critical, because the system runs in real time: the algorithms must be fast and efficient to keep power consumption and processing time low. Reduced-complexity implementations of the cell-averaging and order statistic constant false alarm rate detectors were explored, lowering both their big-O complexity and their processing time. / Dissertation/Thesis / Masters Thesis Electrical Engineering 2020
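A minimal sketch of the two detectors compared above (window sizes and the scaling factor are illustrative choices, not values from the thesis): cell-averaging (CA) CFAR estimates the noise level as the mean of the training cells around the cell under test, while order-statistic (OS) CFAR uses the k-th smallest training cell, which is more robust to interfering targets inside the window but costs a sort per cell.

```python
def cfar_threshold(samples, cut, guard=2, train=8, k=None, scale=4.0):
    """Detection threshold for the cell under test (index `cut`).
    `train` training cells are taken on each side, skipping `guard`
    guard cells next to the cell under test.
    k is None   -> cell-averaging (CA) CFAR: mean of training cells.
    k is an int -> order-statistic (OS) CFAR: k-th smallest training
                   cell (0-based); robust to interferers, needs a sort."""
    left = samples[max(0, cut - guard - train):max(0, cut - guard)]
    right = samples[cut + guard + 1:cut + guard + 1 + train]
    training = list(left) + list(right)
    if k is None:
        noise = sum(training) / len(training)  # CA-CFAR noise estimate
    else:
        noise = sorted(training)[k]            # OS-CFAR noise estimate
    return scale * noise

# Flat noise floor of 1.0 with a strong target at index 20.
power = [1.0] * 40
power[20] = 30.0
ca = power[20] > cfar_threshold(power, 20)         # CA-CFAR detection
os_ = power[20] > cfar_threshold(power, 20, k=11)  # OS-CFAR detection
```

The sort is what drives OS-CFAR's higher cost: CA-CFAR can be updated in O(1) per cell with a sliding sum, whereas a naive OS-CFAR pays O(N log N) per cell for N training cells, which is the kind of gap the reduced-complexity implementations target.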
59

Breaching the perimeter: Designing for more economically feasible, durable, and sustainable construction within the United States military

January 2018
As Americans witness the slow dissociation of the military from the civilian public, the need for a strong design initiative within military installations proves as applicable and necessary as it has always been. The role of the designer within the military is a longstanding and vigorously debated duty; the superficially disparate natures of the professions separate themselves on the premise of individual superiority, and isolate their fields of expertise from one another. However, the two microcosms retain an identity that may serve traditionally different clientele, but their purposes reflect and complement one another. This notion is best exemplified by the pedagogy often associated with architecture and the military: a community working tirelessly to construct a system best adapted to the public, regularly working with a client who does not have a clear vision of the resolution, but instead relies on the services of both occupations to not only visualize the outcome but to design the process as well. The contingent of accredited designers within a typical military hierarchy has been tasked with creating a conducive living environment centralized around "the mission". While they have toiled endlessly to produce such a product, the unfortunate reality is that the weight of schematics has typically been relegated to grandsons of civil engineers and civilians with unrelated degrees and very little experience, working in a headquarters building hundreds of miles away. Bearing this in mind, the purpose of this thesis is to discover the greater organization of a military base, and to standardize it not according to chance doctrine, but according to soundly informed and localized knowledge of the surrounding environment.
Such a design must be informed by a few key aspects. Principally, the macro intention of such a layout must be centralized around "the mission", which in the case of most military bases resembles a training and living environment conducive to deployment and combat effectiveness. Similarly, the design must remain within the scope of economic feasibility: the budget, although seemingly generous at first glance, is meagerly distributed across the separate branches and thus the country. Lastly, the design must be durable, as the ebb and flow of active-duty populations produces arbitrary fluctuation, while the life expectancy of such buildings is often projected within a fifty- to sixty-year time frame. Through careful research, and the benefit of personal interviews with clients who have spent collective centuries in the modern military, a design solution for the improved daily lives and increased combat effectiveness of the American military will serve to discuss the ways in which we can inform the macroevolution of military installations by dissecting the micro.
60

Thin-Film Photothermal Materials and Their Potentials on Energy Applications

Zhao, Yuan 01 October 2019
No description available.
