131 |
Development of Concepts of Capital and Income in Financial Reporting in the Nineteenth Century. Rowles, Thomas (Tom), January 2007 (has links)
The study is concerned with the conception of capital and income in the changing economic circumstances of the late nineteenth century. The issue arises from the confusing accounting for capital assets then practised, which has become the subject of a small but significant literature. Methodologically, the issue and the literature it has provoked provide a 'set' in which an accounting calculation is identified, its context considered and its consequences evaluated. It introduces the idea that accounting had macroeconomic implications, and meets Hopwood's (1983) injunction that accounting ought to be considered in the context in which it arises. The study illustrates the capacity of flawed accounting, founded on an inadequate definition of capital, to adversely affect economic life by reference to the legal debate and litigation in English courts at the end of the nineteenth century over the definition of profit available for distribution as dividends. The study explores nineteenth-century understanding of the concept of capital in economic philosophy, on the basis that it is in that body of philosophic literature that such ideas would have to be examined. The study finds that, for most of the nineteenth century, understanding of the nature of capital and income derived from the works of William Petty and Adam Smith, and held that capital and income were separate states of wealth. This conception of capital continued in the work of David Ricardo, Marx and J. S. Mill, and is evident also in the work of Alfred Marshall. The modern, twentieth-century understanding of capital and income as antithetical states of wealth is identified in the study as deriving from the work of the American economist Irving Fisher in 1896. The contribution of this thesis is threefold: to establish that the crisis in late nineteenth-century financial reporting derived from the prevailing conception of capital and its relationship to income; to note that the conception embodied in the legislative requirements for determining profit was consistent with that definition; and to identify the origin of the modern, twentieth-century understanding of capital and income as antithetical states of wealth. The study provides an in-principle view that nineteenth-century capital accounting had the capacity to cause misallocation of resources within an economy.
|
132 |
Higher-Dimensional Gravitational Objects with External Fields. Abdolrahimi, Shohreh, 11 1900 (has links)
This thesis summarizes a study of higher-dimensional distorted objects such as a distorted 5-dimensional Schwarzschild-Tangherlini black hole. It considers a particular type of distortion corresponding to an external, static distribution of matter and fields around this object. The corresponding spacetime can be presented in the generalized Weyl form, which has an R×U(1)×U(1) group of isometries.
This is a natural generalization of the 4-dimensional Weyl form presented in the paper by Emparan and Reall [1]. Within this generalized Weyl framework one can derive an exact analytic solution of the Einstein equations which describes the non-linear interaction of the black hole with external matter and gravitational fields. This research focuses on the effects of such interaction on the event horizon and the interior of the black hole. A similar study was presented in [2] for 4-dimensional neutral black holes, where special duality relations between a neutral black hole horizon and singularity were derived. In relation to that work it is interesting to study which properties of distorted black holes remain present in the 5-dimensional case.
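For orientation, the five-dimensional generalized Weyl form of Emparan and Reall [1] can be written as below; the notation here is illustrative rather than the thesis's own.

```latex
\[
  ds^2 = -e^{2U_1}\,dt^2 + e^{2U_2}\,d\phi^2 + e^{2U_3}\,d\psi^2
         + e^{2\nu}\left(d\rho^2 + dz^2\right),
  \qquad U_1 + U_2 + U_3 = \ln\rho ,
\]
```

where each potential U_i is an axisymmetric harmonic function on an auxiliary flat three-dimensional space and ν follows from the U_i by quadrature, so external distorting matter enters through additional harmonic contributions to the U_i.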
This thesis also investigates the d-dimensional Fisher solution, which represents a static, spherically symmetric, asymptotically flat spacetime with a massless scalar field. This solution has a naked singularity. It is shown that the d-dimensional Schwarzschild-Tangherlini solution and the Fisher solution are dual to each other.
[1] R. Emparan and H. S. Reall, Phys. Rev. D, 65, 084025 (2002).
[2] V. P. Frolov and A. A. Shoom, Phys. Rev. D, 76, 064037 (2007).
|
133 |
The Impact of Sectoral Change on Income Distribution in Taiwan. Chu, Chiu-Hui, 30 July 2012 (has links)
Abstract
This thesis approaches the question from the angle of economic and industrial development and shows that Taiwan's industrial structure and its income distribution are positively correlated. By tracking changes in Taiwanese industry, it investigates how income distribution has shifted with economic development and structural change. Most national economic development is accompanied by some degree of uneven income distribution; the growing unevenness of income distribution, however, is a concern in current worldwide economic development. The thesis traces Taiwan's industrial transformation from an early agricultural economy, through rapid industrialization, to expansion into services, and examines which stage is associated with greater income inequality.
This research takes the theories of Fisher and Clark (1939, 1940) as its foundation and, based on productivity and GDP, builds an empirical regression model of how changes across Taiwan's three industrial sectors affect income distribution. The empirical evidence finds that a 1% increase in the service sector raises Taiwan's economic growth by 0.769%, compared with 0.103% for the agricultural sector; a 1% increase in the service sector is also accompanied by a worsening of income distribution and a 0.11% decline in the agricultural sector.
We can therefore deduce that service-sector growth has brought economic growth to Taiwan but has also significantly worsened income distribution. This study also treats foreign trade as an important economic lifeline for Taiwan; according to the IMF, among East Asian nations Taiwan has the closest relations with the United States, with China coming up right behind. The study therefore adds United States and China variables to the regression analysis to establish whether China or the USA has the greater impact on Taiwan's economic growth, as a reference for policy makers.
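The abstract does not spell out the regression, but a stylized specification consistent with the description might look as follows; the variable names and functional form are assumptions for illustration, not the thesis's own model.

```latex
\[
  \Delta GDP_t = \beta_0 + \beta_1\,\Delta Serv_t + \beta_2\,\Delta Agr_t + \beta_3\,\Delta Ind_t + \varepsilon_t ,
  \qquad
  Gini_t = \gamma_0 + \gamma_1\,\Delta Serv_t + \gamma_2\,\Delta Agr_t + \gamma_3\,\Delta Ind_t + u_t ,
\]
```

In a specification of this kind, the reported figures of 0.769% and 0.103% growth per 1% sectoral increase would correspond to estimates of β₁ and β₂.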
|
134 |
Essays on monetary economics and financial economics. Kim, Sok Won, 02 June 2009 (has links)
This dissertation analyzes three economic issues. The first is whether monetary policy rules can improve the forecasting accuracy of inflation. The second is whether the preferences of a central bank are symmetric. The third is whether the behavior of aggregate dividends is asymmetric. The issues are considered in Chapters II, III, and IV, respectively.
The linkage between monetary policy rules and the prediction of inflation is explored in Chapter II. Our analysis finds that the prediction performance of the term structure model hinges on monetary policy rules, which involve the manipulation of the federal funds rate in response to changes in the price level. As the Fed's reaction to inflation becomes stronger, the predictive information contained in the term structure becomes weaker. Using the long-run Taylor rule, a new assessment of the prediction performance regarding future changes in inflation is provided. The empirical results indicate that the long-run Taylor rule improves forecasting accuracy.
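For context, a Taylor-type rule of the kind referred to here sets the policy rate as a function of inflation and the output gap. The form below is the textbook version, not necessarily the exact specification estimated in the chapter.

```latex
\[
  i_t = r^{*} + \pi_t + \phi_\pi\,(\pi_t - \pi^{*}) + \phi_y\, y_t ,
\]
```

where a larger φ_π means the nominal rate moves more than one-for-one with inflation, capturing the "stronger reaction to inflation" that the chapter links to weaker predictive content in the term structure.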
In Chapter III, the asymmetric preferences of the central bank of Korea are examined within a New Keynesian, forward-looking, sticky-price framework. To this end, the chapter adopts a linear-exponential objective function for the central bank instead of the standard quadratic function. The monetary policy reaction function is derived and the asymmetric preference parameters are estimated over the inflation-targeting period 1998:9-2005:12. The empirical evidence supports that while the objective of output stability is symmetric, the objective of price stability is not. Specifically, the central bank of Korea appears to respond more aggressively to positive inflation gaps than to negative ones.
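The linear-exponential (linex) loss mentioned above is commonly parameterized as follows for the inflation gap x_t = π_t − π*; this is the standard form in the asymmetric-preference literature and is shown for illustration rather than as the chapter's exact specification.

```latex
\[
  L(x_t) \;=\; \frac{e^{\gamma x_t} - \gamma x_t - 1}{\gamma^{2}} ,
\]
```

which reduces to the quadratic loss x_t²/2 as γ → 0, while γ > 0 penalizes positive inflation gaps more heavily than negative ones, matching the estimated behavior of the central bank of Korea.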
Chapter IV examines the nonlinear dividend behavior of the aggregate stock market. We propose a nonlinear dividend model that assumes managers minimize the regime-dependent adjustment costs associated with being away from their target dividend payout. Using a threshold vector error correction model, we find significant evidence of a threshold effect in the aggregate dividends of the S&P 500 Index in quarterly data when real stock prices are used as the target. We also find that when dividends are relatively higher than the target, the adjustment cost of dividends is much smaller than when they are lower.
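A threshold vector error correction model of this kind lets the speed of adjustment toward the target payout depend on the regime. A stylized two-regime version is sketched below in illustrative notation, not the thesis's exact model.

```latex
\[
  \Delta d_t \;=\;
  \begin{cases}
    \alpha_1\,(d_{t-1} - \theta\, p_{t-1}) + \varepsilon_t , & d_{t-1} - \theta\, p_{t-1} \le \kappa ,\\[4pt]
    \alpha_2\,(d_{t-1} - \theta\, p_{t-1}) + \varepsilon_t , & d_{t-1} - \theta\, p_{t-1} > \kappa ,
  \end{cases}
\]
```

where d and p are (log) dividends and real stock prices, the cointegrating relation d − θp defines the target, κ is the threshold, and a larger |α₂| than |α₁| corresponds to faster adjustment, i.e. a lower adjustment cost, when dividends are above target.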
|
135 |
A Comparative Analysis of the Risk Model for Portfolios that Contain Equity Derivatives. Lin, Wan-Chun, 23 June 2004 (has links)
none
|
136 |
Revisiting The Fisher Effect For Developed And Developing Countries: A Bounds Test Approach. Baci, Duygu, 01 April 2007 (has links) (PDF)
This study investigates the Fisher Effect for a sample of ten developed countries and ten developing countries. The study examines whether the nominal interest rate adjusts to the expected inflation rate in the long run. The distinction between developed and developing countries also makes it possible to identify the conditions under which the Fisher Effect is more likely to hold. To analyze the long-run relationship between the nominal interest rate and the expected inflation rate, the bounds test approach of Pesaran et al. (2001) is utilized. Estimation results show that the adjustment of the nominal interest rate to expected inflation is encountered mostly in the developing countries, which have a history of inflation in their economies.
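For reference, the Fisher relation being tested is conventionally written as a long-run regression of the nominal interest rate on expected inflation, with a full Fisher Effect corresponding to a unit slope; the form below is the textbook version rather than the study's exact equation.

```latex
\[
  i_t \;=\; \alpha + \beta\,\pi_t^{e} + \varepsilon_t , \qquad H_0:\ \beta = 1 ,
\]
```

estimated here within the bounds-testing (ARDL) framework of Pesaran et al. (2001), which is applicable whether the underlying series are I(0), I(1), or mutually cointegrated.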
|
137 |
Applied estimation theory on power cable as transmission line. Mansour, Tony; Murtaja, Majdi, January 2015 (has links)
This thesis presents how to estimate the length of a power cable using the maximum likelihood estimation (MLE) technique in Matlab. The model of the power cable is evaluated in the time domain with additive white Gaussian noise. Statistics are used to evaluate the performance of the estimator by repeating the experiment for a large number of samples, with the random additive noise generated anew for each sample. The estimated sample variance is compared to the theoretical Cramér-Rao Lower Bound (CRLB) for unbiased estimators. At the end of the thesis, numerical results are presented that show when the resulting sample variance is close to the CRLB, and hence when the performance of the estimator is most accurate.
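The general workflow described above, delay-based length estimation under additive white Gaussian noise with a Monte Carlo comparison against the CRLB, can be sketched as follows. This is an illustrative Python sketch, not the thesis's Matlab code, and all signal parameters are assumed for illustration.

```python
import numpy as np

# Estimate cable length from the round-trip delay of a reflected probe pulse in
# AWGN, then compare the Monte Carlo variance of the ML estimate with the CRLB
# for time-delay estimation. Parameter values below are assumptions.

fs = 1e8                        # sampling rate [Hz]
v = 2e8                         # assumed propagation speed in the cable [m/s]
L_true = 150.0                  # true cable length [m]
tau_true = 2 * L_true / v       # true round-trip delay [s]
sigma = 0.2                     # noise standard deviation
t = np.arange(0.0, 4e-6, 1 / fs)

def pulse(t0, width=5e-8):
    """Gaussian probe pulse centred at delay t0."""
    return np.exp(-((t - t0) ** 2) / (2 * width ** 2))

# CRLB for the delay of a known waveform in AWGN: var(tau_hat) >= sigma^2 / sum(s'(t_k)^2),
# mapped to length via L = v * tau / 2.
s_dot = np.gradient(pulse(tau_true), 1 / fs)
crlb_len = (sigma ** 2 / np.sum(s_dot ** 2)) * (v / 2) ** 2

# Under AWGN the ML delay estimate maximizes correlation with the template,
# here over a grid of candidate delays (a finer search would approach the CRLB more closely).
taus = np.arange(0.0, 3e-6, 1 / fs)
templates = np.array([pulse(tau) for tau in taus])

rng = np.random.default_rng(0)
estimates = []
for _ in range(1000):
    x = pulse(tau_true) + sigma * rng.standard_normal(t.size)
    tau_hat = taus[int(np.argmax(templates @ x))]
    estimates.append(v * tau_hat / 2)

print(f"sample variance of length estimate: {np.var(estimates):.3e} m^2")
print(f"CRLB on the length estimate:        {crlb_len:.3e} m^2")
```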
|
138 |
A critical study of Hammer Film Production’s brand of Gothic Horror from 1956 to 1972. O'Brien, Morgan Clark, 20 November 2013 (has links)
Hammer Film Production’s brand of melodramatic Gothic Horror reinvented horror cinema in 1957. Despite tremendous financial success throughout the 1950s and 1960s, Hammer's Gothic had run its course by the early 1970s, and cinematic production ceased altogether by 1975. It is generally agreed that, after establishing multiple iterations of a markedly recognizable house style, Hammer failed to adapt to the demands of a changing marketplace. This thesis investigates the circumstances surrounding Hammer's demise by conducting neoformal analysis of case-study films and examining how they were affected by cultural, historical, and industrial factors. Looking at Hammer's films themselves helps determine to what extent they were responsible for Hammer's misfortune, and why. This thesis demonstrates how Hammer's own production setup and early genre success contributed to the studio's eventual downfall, and the outside factors that underscored this process. I argue that Hammer did experiment with its house formula, but the studio's attempts to renegotiate the 1970s horror landscape were unsuccessful because of changing audience demographics, an industry in transition, and Hammer's own perceived corporate identity.
|
139 |
Photon Statistics in Scintillation Crystals. Bora, Vaibhav Joga Singh, January 2015 (has links)
Scintillation-based gamma-ray detectors are widely used in medical imaging, high-energy physics, astronomy and national security. Scintillation gamma-ray detectors are field-tested, relatively inexpensive, and have good detection efficiency. Semiconductor detectors are gaining popularity because of their superior capability to resolve gamma-ray energies. However, they are relatively hard to manufacture and are therefore, at this time, not available in formats as large as, and are much more expensive than, scintillation gamma-ray detectors. Scintillation gamma-ray detectors consist of a scintillator, a material that emits optical (scintillation) photons when it interacts with ionizing radiation, and an optical detector that detects the emitted scintillation photons and converts them into an electrical signal. Compared to semiconductor gamma-ray detectors, scintillation gamma-ray detectors have relatively poor capability to resolve gamma-ray energies. This is in large part attributed to the "statistical limit" on the number of scintillation photons. The origin of this statistical limit is the assumption that scintillation photons are either Poisson distributed or super-Poisson distributed. This statistical limit is often defined by the Fano factor. The Fano factor of an integer-valued random process is defined as the ratio of its variance to its mean; therefore, a Poisson process has a Fano factor of one. The classical theory of light limits the Fano factor of the number of photons to a value greater than or equal to one (the Poisson case); however, the quantum theory of light allows for Fano factors less than one. We used two methods, based on the correlations between two detectors viewing the same scintillation pulse, to estimate the Fano factor of the scintillation photons. The relationship between the Fano factor and the correlation between the integrals of the two detected signals was derived analytically, and the Fano factor was estimated from measurements of SrI₂:Eu, YAP:Ce and CsI:Na. We also found an empirical relationship between the Fano factor and the covariance, as a function of time, between two detectors viewing the same scintillation pulse. This empirical model was used to estimate the Fano factor of LaBr₃:Ce and YAP:Ce from the experimentally measured timing covariance. The estimates of the Fano factor from the timing-covariance results were consistent with the estimates from the correlation between the integrated signals. We found scintillation light from some scintillators to be sub-Poisson. For the same mean number of total scintillation photons, sub-Poisson light has lower noise. We then conducted a simulation study to investigate whether this low-noise sub-Poisson light can be used to improve spatial resolution. We calculated the Cramér-Rao bound for different detector geometries, positions of interaction and Fano factors. The Cramér-Rao calculations were verified by generating simulated data and estimating the variance of the maximum likelihood estimator. We found that the Fano factor has no impact on the spatial resolution in gamma-ray imaging systems.
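The analytically derived relationship mentioned in the abstract is not reproduced here, but a simple binomial-partition model illustrates why the covariance between two detectors viewing the same pulse carries Fano-factor information; the exact expression used in the thesis may differ.

```latex
\[
  F = \frac{\operatorname{Var}(N)}{\mathbb{E}[N]},
  \qquad
  \operatorname{Cov}(N_1,N_2) = p(1-p)\bigl(\operatorname{Var}(N)-\mathbb{E}[N]\bigr)
  = p(1-p)\,\mathbb{E}[N]\,(F-1),
\]
```

where N is the total number of scintillation photons and each photon reaches detector 1 with probability p and detector 2 otherwise. Sub-Poisson light (F < 1) then produces a negative covariance between the two integrated signals, Poisson light (F = 1) none, and super-Poisson light (F > 1) a positive one.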
|
140 |
Subpixel Image Co-Registration Using a Novel Divergence Measure. Wisniewski, Wit Tadeusz, January 2006 (has links)
Sub-pixel image alignment estimation is desirable for co-registration of objects in multiple images to a common spatial reference and as alignment input to multi-image processing. Applications include super-resolution, image fusion, change detection, object tracking, object recognition, video motion tracking, and forensics. Information-theoretical measures are commonly used for co-registration in medical imaging. The published methods apply Shannon's entropy to the Joint Measurement Space (JMS) of two images. This work introduces into the same context a new set of statistical divergence measures derived from Fisher Information. The new methods described in this work are applicable to uncorrelated imagery and to imagery that becomes statistically least dependent upon co-alignment. Both characteristics occur with multi-modal imagery and cause cross-correlation methods, as well as maximum-dependence indicators, to fail. Fisher Information-based estimators, together as a set with an entropic estimator, provide substantially independent information about alignment. This increases the statistical degrees of freedom, allowing for improved precision and reduced estimator failure rates compared with the entropic estimator alone. The new Fisher Information methods are tested on real remotely sensed imagery, including Landsat TM multispectral imagery and ESR SAR imagery, as well as on randomly generated synthetic imagery. On real imagery, the co-registration cost function is qualitatively examined for features that reveal the correct point of alignment; the alignment estimates agree with manual alignment to within manual alignment precision. Alignment truth in synthetic imagery is used to quantitatively evaluate co-registration accuracy. The results from the new Fisher Information-based algorithms are compared to entropy-based mutual information and correlation methods, revealing equal or superior precision and lower failure rates at signal-to-noise ratios below one.
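The entropy-based baseline that the abstract compares against, mutual information computed on the joint measurement space (joint histogram) of two images, can be sketched as follows. This is an illustrative Python sketch of the standard baseline over integer shifts only; the thesis's Fisher-information divergence measures and its sub-pixel search are not reproduced here, and the toy data below are assumptions.

```python
import numpy as np

def mutual_information(a, b, bins=64):
    """Shannon mutual information of the joint histogram of two equally sized images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of image a
    py = pxy.sum(axis=0, keepdims=True)          # marginal of image b
    nz = pxy > 0                                 # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def best_integer_shift(ref, mov, max_shift=5):
    """Brute-force search for the integer (dy, dx) shift of mov that maximizes MI with ref."""
    best, best_mi = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(mov, dy, axis=0), dx, axis=1)
            mi = mutual_information(ref, shifted)
            if mi > best_mi:
                best_mi, best = mi, (dy, dx)
    return best, best_mi

# Toy multi-modal example: the moving image is a nonlinear remapping of the
# reference (so cross-correlation would be unreliable), circularly shifted by (-2, +3).
rng = np.random.default_rng(1)
ref = rng.normal(size=(128, 128)).cumsum(axis=0).cumsum(axis=1)
mov = np.roll(np.roll(np.cos(ref), -2, axis=0), 3, axis=1) + 0.05 * rng.normal(size=ref.shape)
print(best_integer_shift(ref, mov))   # expected best shift: (2, -3)
```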
|