  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
121

An Optimization Based Approach to Visual Odometry Using Infrared Images

Nilsson, Emil January 2010
The goal of this work has been to improve the accuracy of a pre-existing algorithm for vehicle pose estimation, which uses intrinsic measurements of vehicle motion and measurements derived from far-infrared images.

Estimating the pose of a vehicle, based on images from an on-board camera and intrinsic measurements of vehicle motion, is a problem of simultaneous localization and mapping (SLAM), and it can be solved using the extended Kalman filter (EKF). The EKF is a causal filter, so if the pose estimation problem is to be solved off-line, acausal methods are expected to increase estimation accuracy significantly. In this work the EKF has been compared with an acausal method for solving the SLAM problem called smoothing and mapping (SAM), an optimization-based method that minimizes process and measurement noise.

Analyses of how improvements to the vehicle motion model, using a number of different model extensions, affect the accuracy of pose estimates have also been performed.
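The contrast between causal filtering and acausal, optimization-based smoothing can be illustrated on a toy one-dimensional problem. The sketch below is illustrative only, not the thesis's SLAM formulation: it smooths a scalar random-walk state by minimizing weighted process and measurement residuals over the whole trajectory at once, which is the core idea behind SAM.

```python
import numpy as np

def smooth_trajectory(y, q=1.0, r=0.1):
    """Batch smoothing of a 1-D state from noisy measurements y.

    Minimizes  sum_k (x[k+1]-x[k])^2 / q   (process residuals, random-walk model)
         plus  sum_k (y[k]-x[k])^2 / r     (measurement residuals)
    over the whole trajectory -- a linear least-squares problem. Unlike a
    causal filter, every estimate uses all data, past and future.
    """
    n = len(y)
    rows, rhs = [], []
    for k in range(n):  # measurement residuals, weighted by 1/sqrt(r)
        row = np.zeros(n)
        row[k] = 1.0 / np.sqrt(r)
        rows.append(row)
        rhs.append(y[k] / np.sqrt(r))
    for k in range(n - 1):  # process residuals, weighted by 1/sqrt(q)
        row = np.zeros(n)
        row[k], row[k + 1] = -1.0 / np.sqrt(q), 1.0 / np.sqrt(q)
        rows.append(row)
        rhs.append(0.0)
    x, *_ = np.linalg.lstsq(np.vstack(rows), np.array(rhs), rcond=None)
    return x
```

Since the whole trajectory is solved in one least-squares pass, the estimate at every time step benefits from later measurements, which is exactly what a causal EKF cannot do.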
122

Resultatutjämning : en studie av taktiska åtgärder i de finansiella rapporterna - / Income smoothing : a study of tactical actions in the financial reports -

Svensson, Susanna, Wahlberg, Karolina January 2005
Background: The financial reports that companies publish form the basis for investors' economic decisions. It is therefore important to know and understand what effects different accounting alternatives give rise to. Income planning means that a company's reported result is deliberately influenced in a direction the company desires. This planning can take the form of both open and hidden tactical actions. The choices offered by rules and recommendations create room for companies to actively select the accounting alternative that suits them best.

Purpose: The purpose of this study is to examine income smoothing as a phenomenon in Swedish public companies and corporate groups from an external perspective.

Method: A case study has been carried out in the form of a small-N study in order to gain a deep understanding of income smoothing. The study also contains two document studies. The first is a purely empirical document study reviewing public companies' annual reports for 2003, complemented with interviews with auditors at the audit firms KPMG and Ernst & Young and with the Swedish Tax Agency (Skatteverket). The second is a theoretical study of the existing literature on income smoothing.

Results: The study shows that income-smoothing measures are present in the financial reports presented to external stakeholders. Over time, the development of accounting standards has been mirrored by the tactical actions companies take. A more restrictive application of rules and recommendations counteracts openly reported measures aimed at influencing the reported result.
123

An Extension to the Tactical Planning Model for a Job Shop: Continuous-Time Control

Teo, Chee Chong, Bhatnagar, Rohit, Graves, Stephen C. 01 1900
We develop an extension to the tactical planning model (TPM) for a job shop by the third author. The TPM is a discrete-time model in which all transitions occur at the start of each time period. The time period must be defined appropriately in order for the model to be meaningful. Each period must be short enough so that a job is unlikely to travel through more than one station in one period. At the same time, the time period needs to be long enough to justify the assumptions of continuous workflow and Markovian job movements. We build an extension to the TPM that overcomes this restriction of period sizing by permitting production control over shorter time intervals. We achieve this by deriving a continuous-time linear control rule for a single station. We then determine the first two moments of the production level and queue length for the workstation. / Singapore-MIT Alliance (SMA)
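The discrete-time smoothing rule that the TPM family builds on can be sketched in a few lines (an illustrative simulation under assumed parameters, not the authors' continuous-time extension): each period the station clears a fixed fraction of its queue, and the first two moments of production and queue length can then be estimated empirically.

```python
import numpy as np

def simulate_station(alpha, arrivals):
    """Discrete-time proportional production control at one workstation:
    each period the station clears a fixed fraction alpha of its current
    queue. Returns per-period production and end-of-period queue length,
    from which the first two moments can be estimated."""
    q = 0.0
    prod = np.empty(len(arrivals))
    queue = np.empty(len(arrivals))
    for t, a in enumerate(arrivals):
        q += a            # work arriving this period
        p = alpha * q     # produce a fraction alpha of the queue
        q -= p
        prod[t], queue[t] = p, q
    return prod, queue
```

With this rule, long-run mean production equals mean arrivals (conservation of work), and the steady-state mean queue is (1 - alpha)/alpha times the mean arrival rate; a smaller alpha smooths production more at the cost of longer queues.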
125

Data Filtering and Control Design for Mobile Robots

Karasalo, Maja January 2009
In this thesis, we consider problems connected to navigation and tracking for autonomous robots under the assumption of constraints on sensors and kinematics. We study formation control as well as techniques for filtering and smoothing of noise-contaminated input. The scientific contributions of the thesis comprise five papers.

In Paper A, we propose three cascaded, stabilizing formation controls for multi-agent systems. We consider platforms with non-holonomic kinematic constraints and directional range sensors. The resulting formation is a leader-follower system, where each follower agent tracks its leader agent at a specified angle and distance. No inter-agent communication is required to execute the controls. A switching Kalman filter is introduced for active sensing, and robustness is demonstrated in experiments and simulations with Khepera II robots.

In Paper B, an optimization-based adaptive Kalman filtering method is proposed. The method produces an estimate of the process noise covariance matrix Q by solving an optimization problem over a short window of data. The algorithm recovers the observations h(x) from a system ẋ = f(x), y = h(x) + v without a priori knowledge of the system dynamics. The algorithm is evaluated in simulations, and a tracking example is included for a target with coupled and nonlinear kinematics.

In Paper C, we consider the problem of estimating a closed curve in R² based on noise-contaminated samples. A recursive control-theoretic smoothing spline approach is proposed that yields an initial estimate of the curve and subsequently refines the estimate iteratively. Periodic splines are generated by minimizing a cost function subject to constraints imposed by a linear control system. The optimal control problem is shown to be proper, and sufficient optimality conditions are derived for a special case of the problem using Hamilton-Jacobi-Bellman theory.

Paper D continues the study of recursive control-theoretic smoothing splines. A discretization of the problem is derived, yielding an unconstrained quadratic programming problem. A proof of convexity for the discretized problem is provided, and the recursive algorithm is evaluated in simulations and experiments using a SICK laser scanner mounted on a PowerBot from ActivMedia Robotics.

Finally, in Paper E we explore the issue of optimal smoothing for control-theoretic smoothing splines. The output of the control-theoretic smoothing spline problem is essentially a tradeoff between faithfulness to measurement data and smoothness, regulated by the so-called smoothing parameter. A method is developed for estimating the optimal value of this smoothing parameter. The procedure is based on generalized cross validation and requires no a priori information about the underlying curve or the level of noise in the measurements. / QC 20100722
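The kind of discretization described for Paper D — an unconstrained convex quadratic program trading data fidelity against roughness — can be sketched as follows. This is a generic second-difference discretization, assumed for illustration rather than taken from the paper:

```python
import numpy as np

def discrete_smoothing_spline(y, lam):
    """Discretized smoothing spline: minimize ||y - x||^2 + lam * ||D2 x||^2,
    where D2 is the second-difference operator. This is an unconstrained
    convex quadratic program with the closed-form solution
        x = (I + lam * D2^T D2)^{-1} y."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    D2 = np.diff(np.eye(n), 2, axis=0)   # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)
```

The objective is a sum of two convex quadratics, so the problem is convex and the normal equations give the unique minimizer directly; larger `lam` penalizes curvature more heavily and yields a smoother estimate.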
126

Managerial Incentives and Earnings Management : An Empirical Examination of the Income Smoothing in the Nordic Banking Industry

Tsitinidis, Alexandros, Duru, Kenneth January 2013
Prior empirical research, mainly conducted in the US under US GAAP, has indicated that managers in listed banks use loan loss provisions as a primary tool for income smoothing. Since 2005 the accounting environment in the European Union (EU) has changed, as all listed companies are required to comply with International Financial Reporting Standards (IFRS). Some argue that IFRS is a set of high-quality standards that resolves some inconsistencies relative to national Generally Accepted Accounting Principles (GAAP). The overall objective of the present study is to examine earnings management, and in particular income smoothing through the use of loan loss provisions (LLP), under IFRS and national GAAPs. The sample consists of twenty large commercial banks listed in the Nordic countries (Denmark, Finland, Norway and Sweden) for the years 2004-2012 (including early adopters) and sixteen banks for the years 1996-2003 under each country's national reporting regime. Furthermore, we present the body of earnings management literature in conjunction with agency theory in order to grasp managers' opportunistic behavior. Finally, we assess the institutional role of financial reporting standards and the argument, advanced by some authors, that IFRS could restrict earnings management activities. Overall, our results indicate some degree of income smoothing through loan loss provisions by bank managers both under national GAAPs and under IFRS. The study contributes to the broad literature on earnings management by testing income-smoothing activities in a single industry, in contrast to previous studies whose samples comprise a variety of firms in different industries.
127

The Indirect Effects of Conditional Cash Transfer Programs: An Empirical Analysis of Familias En Accion

Ospina, Monica P 15 May 2010
Conditional cash transfer (CCT) programs have become the most important social policy in Latin America, and their influence has spread to countries around the world. A number of studies provide strong evidence of the positive impacts of these programs on the main targeted outcomes, education and health, and they have proved successful in other outcomes such as nutrition, household income, and child labor. As we expect CCT programs to remain a permanent aspect of social policy for the foreseeable future, demand for evidence of their indirect effects has grown beyond the initial emphasis of these programs. My research pays particular attention to these relevant but unintended outcomes, which have been discussed less extensively in the literature. Familias en Accion (FA), a CCT program in Colombia, started operating in 2002 and has benefited approximately 1,500,000 households since its beginning. The program's evaluation survey, representative of poor rural households in Colombia, is a very good source for investigating not only the unintended effects of the program but also the microeconomic behavior of poor households and social policy issues in the country. Using a panel dataset from FA, I address three empirical policy questions: (i) to what extent is consumption of beneficiary households better insured against income shocks? (ii) has the program displaced child labor as a risk-coping instrument? and (iii) are there any incentive effects of the cash transfers and the associated conditionalities on the labor supply of adults in recipient households? Each of my research questions is addressed separately; taken together, however, the results are informative in understanding the safety-net value of the program and its potential to reduce poverty in the long term.

I find that the program serves as an instrument for consumption smoothing. In particular, FA is effective in protecting food consumption, but not nonfood consumption, and it reduces consumption fluctuations in response to idiosyncratic shocks but not to covariate shocks. Results also reveal that FA works as insurance for the schooling of the poor but is not able to completely displace child labor. Finally, the results show that beneficiary mothers are devoting more time to household chores and that girls' labor and female adult labor are complementary. Male labor supply has increased, while boys have increased their leisure time in response to the program.
128

Analysing stochastic call demand with time varying parameters

Li, Song 25 November 2005
In spite of increasingly sophisticated workforce management tools, a significant gap remains between the goal of effective staffing and the present difficulty of predicting the stochastic demand of inbound calls. We have investigated a hypothesized nonhomogeneous Poisson process model for modem pool callers of the University community. In our case, we tested whether the arrivals could be approximated by a piecewise constant rate over short intervals. For each of 1- and 10-minute intervals, based on the close relationship between the Poisson process and the exponential distribution, the test results did not show any sign of a homogeneous Poisson process. We then examined the hypothesis of a nonhomogeneous Poisson process using a transformed statistic. Quantitative and graphical goodness-of-fit tests confirmed a nonhomogeneous Poisson process.

Further analysis of the intensity function revealed that a linear rate intensity was woefully inadequate for predicting time-varying arrivals. For the sinusoidal rate model, difficulty arose in setting the period parameter. Spline models, as an alternative to parametric modelling, gave more control over the balance between data fitting and smoothness, which was appealing for our analysis of the call arrival process.
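One standard way to check a hypothesized nonhomogeneous Poisson intensity is time rescaling: map each arrival time through the integrated intensity, after which the rescaled interarrivals should be i.i.d. Exponential(1). The sketch below is a generic construction of this kind, not necessarily the transformed statistic used in the thesis:

```python
import numpy as np

def rescaled_interarrivals(times, intensity, grid):
    """Time-rescaling check for a nonhomogeneous Poisson model: map each
    arrival time t to Lambda(t), the integral of the intensity up to t
    (computed numerically on `grid`). If the hypothesized intensity is
    correct, the rescaled interarrivals are i.i.d. Exponential(1), and
    can be fed to any goodness-of-fit test for that distribution."""
    lam = intensity(grid)
    # cumulative trapezoidal integral of the intensity along the grid
    Lam = np.concatenate(
        [[0.0], np.cumsum(0.5 * (lam[1:] + lam[:-1]) * np.diff(grid))]
    )
    transformed = np.interp(times, grid, Lam)
    return np.diff(np.concatenate([[0.0], transformed]))
```

A quick sanity check is to simulate arrivals from a known sinusoidal intensity by thinning and verify that the rescaled interarrivals have mean close to 1.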
129

Multiscale Feature-Preserving Smoothing of Images and Volumes on the GPU

Jibai, Nassim 24 May 2012
Two-dimensional images and three-dimensional volumes have become a staple ingredient of our artistic, cultural, and scientific appetite. Images capture and immortalize an instant, such as a natural scene, through a camera. They can also capture details inside biological subjects through CT (computed tomography) scans, X-rays, ultrasound, etc. Three-dimensional volumes of objects are likewise of high interest in medical imaging, engineering, and the analysis of cultural heritage. They are produced using tomographic reconstruction, a technique that combines a large series of 2D scans captured from multiple views. Typically, penetrative radiation is used to obtain each 2D scan: X-rays for CT scans, radio-frequency waves for MRI (magnetic resonance imaging), electron-positron annihilation for PET scans, etc. Unfortunately, acquisition is affected by noise from several sources. Noise in two-dimensional images can be caused by low-light illumination, electronic defects, a low dose of radiation, and a mispositioned tool or object. Noise in three-dimensional volumes also comes from a variety of sources: the limited number of views, lack of sensor sensitivity, high contrasts, the reconstruction algorithms, etc. The constraint that data acquisition be noiseless is unrealistic. It is desirable to reduce, or eliminate, noise at the earliest stage in an application. However, removing noise while preserving the sharp features of an image or volume remains a challenging task.

We propose a multi-scale method to smooth 2D images and 3D tomographic data while preserving features at a specified scale. Our algorithm is controlled by a single user parameter: the minimum scale of features to be preserved. Any variation smaller than the specified scale is treated as noise and smoothed, while discontinuities such as corners, edges, and detail at larger scales are preserved. We demonstrate that our smoothed data produces clean images and clean contour surfaces of volumes using standard surface-extraction algorithms, and we compare our results with those of previous approaches. Our method is inspired by anisotropic diffusion; we compute our diffusion tensors from local continuous histograms of gradients around each pixel in the image.
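A scalar cousin of this idea, Perona-Malik diffusion, shows the edge-preserving mechanism in a few lines. This is a simplified sketch: the thesis computes diffusion tensors from local histograms of gradients, which is not reproduced here.

```python
import numpy as np

def perona_malik(img, n_iter=30, kappa=0.1, dt=0.2):
    """Edge-preserving smoothing via Perona-Malik diffusion: at each step,
    diffuse toward the four neighbours, weighted by an edge-stopping
    function g that shuts diffusion down across large gradients."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)  # edge-stopping function
    for _ in range(n_iter):
        # differences to the four neighbours, with zero flux at the borders
        dn = np.roll(u, -1, axis=0) - u; dn[-1, :] = 0.0
        ds = np.roll(u, 1, axis=0) - u;  ds[0, :] = 0.0
        de = np.roll(u, -1, axis=1) - u; de[:, -1] = 0.0
        dw = np.roll(u, 1, axis=1) - u;  dw[:, 0] = 0.0
        u = u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

On a noisy step image, the flat regions are smoothed while the step edge survives, because the weight g(d) is near 1 for small neighbour differences (noise) and near 0 across the edge. Here `kappa` plays the role of the feature-scale threshold.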
130

Statistical methods for function estimation and classification

Kim, Heeyoung 20 June 2011
This thesis consists of three chapters. The first chapter focuses on adaptive smoothing splines for fitting functions with varying roughness. In the first part of the first chapter, we study an asymptotically optimal procedure to choose the value of a discretized version of the variable smoothing parameter in adaptive smoothing splines. With the choice given by the multivariate version of the generalized cross validation, the resulting adaptive smoothing spline estimator is shown to be consistent and asymptotically optimal under some general conditions. In the second part, we derive the asymptotically optimal local penalty function, which is subsequently used for the derivation of the locally optimal smoothing spline estimator. In the second chapter, we propose a Lipschitz regularity based statistical model, and apply it to coordinate measuring machine (CMM) data to estimate the form error of a manufactured product and to determine the optimal sampling positions of CMM measurements. Our proposed wavelet-based model takes advantage of the fact that the Lipschitz regularity holds for the CMM data. The third chapter focuses on the classification of functional data which are known to be well separable within a particular interval. We propose an interval based classifier. We first estimate a baseline of each class via convex optimization, and then identify an optimal interval that maximizes the difference among the baselines. Our interval based classifier is constructed based on the identified optimal interval. The derived classifier can be implemented via a low-order-of-complexity algorithm.
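Generalized cross validation for a linear smoother can be sketched as follows. This is a generic single-parameter version, assumed for illustration; the chapter's multivariate criterion for a variable smoothing parameter is more involved:

```python
import numpy as np

def gcv_choose_lambda(y, lambdas):
    """Choose the smoothing parameter of the linear smoother
    x_hat = S(lam) y, with S(lam) = (I + lam * D2^T D2)^{-1}, by
    minimizing the generalized cross validation score
        GCV(lam) = n * ||(I - S) y||^2 / (n - tr S)^2."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    D2 = np.diff(np.eye(n), 2, axis=0)   # second-difference operator
    best_score, best_lam = np.inf, None
    for lam in lambdas:
        S = np.linalg.inv(np.eye(n) + lam * D2.T @ D2)
        resid = y - S @ y
        score = n * (resid @ resid) / (n - np.trace(S)) ** 2
        if score < best_score:
            best_score, best_lam = score, lam
    return best_lam
```

The trace term penalizes smoothers with many effective degrees of freedom, so GCV balances residual size against model flexibility without needing the noise level; for data that is pure noise around a constant, it favors heavy smoothing.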
