231

Visualizing and modeling partial incomplete ranking data

Sun, Mingxuan 23 August 2012 (has links)
Analyzing ranking data is an essential component in a wide range of important applications, including web search and recommendation systems. Rankings are difficult to visualize or model due to the computational difficulties associated with the large number of items. Partial or incomplete rankings introduce further difficulties, since approaches that work well for typical ranking types do not apply generally to all types. While the analysis of ranking data has a long history in statistics, the construction of an efficient framework for analyzing incomplete ranking data (with or without ties) is currently an open problem. This thesis addresses the problem of scalability in visualizing and modeling partial incomplete rankings. In particular, we propose a distance measure for top-k rankings with three properties: (1) it is a metric, (2) it emphasizes top ranks, and (3) it is computationally efficient. Given the distance measure, the data can be projected into a low-dimensional continuous vector space via multi-dimensional scaling (MDS) for easy visualization. We further propose a non-parametric model for estimating distributions of partial incomplete rankings. For the non-parametric estimator, we use a triangular kernel that is a direct analogue of the Euclidean triangular kernel. The computational difficulties for large n are simplified using combinatorial properties and generating functions associated with symmetric groups. We show that our estimator is computationally efficient for rankings of arbitrary incompleteness and tie structure. Moreover, we propose an efficient learning algorithm for constructing a preference elicitation system from partial incomplete rankings, which can be used to solve the cold-start problem in ranking recommendations. The proposed approaches are examined in experiments with real search-engine and movie-recommendation data.
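A rough illustration of the visualization step described above: a toy top-k dissimilarity (not the thesis's metric) fed into multi-dimensional scaling to obtain 2-D coordinates. The distance function, weighting scheme, and example rankings are all assumptions made for this sketch.

```python
# Minimal sketch: embed top-k rankings in 2-D with MDS over a toy dissimilarity.
import numpy as np
from sklearn.manifold import MDS

def topk_distance(a, b):
    """Toy dissimilarity between two top-k lists (tuples of item ids).
    Penalizes rank displacement for shared items and presence of non-shared
    items; earlier ranks contribute more (crude emphasis on the top)."""
    k = len(a)
    d = 0.0
    for i, item in enumerate(a):
        w = 1.0 / (i + 1)                     # weight decays down the list
        if item in b:
            d += w * abs(i - b.index(item)) / k
        else:
            d += w                            # item missing from the other list
    for j, item in enumerate(b):
        if item not in a:
            d += 1.0 / (j + 1)
    return d

rankings = [(1, 2, 3), (2, 1, 3), (4, 5, 1), (3, 2, 1)]   # hypothetical top-3 lists
n = len(rankings)
D = np.array([[topk_distance(rankings[i], rankings[j]) for j in range(n)]
              for i in range(n)])
D = 0.5 * (D + D.T)                           # symmetrize before MDS

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
print(coords)                                  # 2-D points ready for plotting
```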
232

Target Classification Based on Kinematics / Klassificering av flygande objekt med hjälp av kinematik

Hallberg, Robert January 2012 (has links)
Modern aircraft are getting more and better sensors. As a result, pilots receive more information than they can handle. To address this, one can automate the information processing and instead provide pilots with conclusions drawn from the sensor information. An aircraft's movement can be used to determine which class (e.g. commercial aircraft, large military aircraft or fighter) it belongs to. This thesis focuses on comparing three classification schemes: a Bayesian classification scheme with uniform priors, the Transferable Belief Model, and a Bayesian classification scheme with entropic priors. The target is modeled by a jump Markov linear system that switches between different modes (fly straight, turn left, etc.) over time. A marginalized particle filter that spreads its particles over the possible mode sequences is used for state estimation. Simulations show that the results from the Bayesian classification scheme with uniform priors and the Bayesian classification scheme with entropic priors are almost identical. The results also show that the Transferable Belief Model is less decisive than the Bayesian classification schemes. This effect is argued to come from the least committed principle within the Transferable Belief Model. A fixed-lag smoothing algorithm is introduced to the filter and it is shown that the classification results are improved. The advantage of having a filter that remembers the full mode sequence (such as the marginalized particle filter) rather than one that only determines the current mode (such as an interacting multiple model filter) is also discussed.
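To make the classification idea concrete, here is a minimal sketch of a recursive Bayesian class update from kinematic observations, starting from uniform priors. The class-conditional turn-rate models and the measurements are hypothetical; the thesis's jump Markov model, marginalized particle filter, and entropic priors are not reproduced.

```python
# Sketch: recursive Bayesian class posterior from observed turn rates, uniform prior.
import numpy as np
from scipy.stats import norm

classes = ["commercial", "large military", "fighter"]
# Hypothetical class-conditional turn-rate models (deg/s): (mean, std)
models = {"commercial": (1.0, 0.5), "large military": (2.0, 1.0), "fighter": (8.0, 4.0)}

posterior = np.full(len(classes), 1.0 / len(classes))     # uniform prior
observed_turn_rates = [0.8, 1.5, 6.0, 9.0]                 # hypothetical measurements

for z in observed_turn_rates:
    lik = np.array([norm.pdf(z, *models[c]) for c in classes])  # class likelihoods
    posterior = posterior * lik
    posterior /= posterior.sum()                           # normalize to a distribution
    print(dict(zip(classes, posterior.round(3))))
```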
233

內部控制、法律環境與盈餘平穩化之關聯:以中國證券市場為例 / The Relationship between Internal Control, Legal Environment and Income Smoothing: An Empirical Study of Listed Corporations in China

潘俞自, Pan, Yu Tzu Unknown Date (has links)
本論文以 2009 至2011 年中國上海及深圳上市A股公司為研究樣本，探討內部控制品質、法律環境與盈餘平穩化之間的關聯，本論文採用中國財政部與深圳迪博企業風險技術公司共同研究建立的上市公司內部控制指數來衡量企業之內部控制品質。本研究並進一步探討，中國市場法律環境與盈餘平穩化之間的關係。實證結果發現，中國上市公司之內部控制品質越好，其管理當局藉由盈餘管理使盈餘平穩化之程度越低。亦發現公司所在地市場法律環境發展程度高，律師人口比率高的地區，因可能遭受訴訟的風險增加，管理階層越會利用損益平穩化之方式進行盈餘管理。 / This thesis investigates the relationship between internal control, legal environment and income smoothing. The sample consists of A-share companies listed on the Shanghai and Shenzhen stock exchanges during 2009-2011. First, the empirical results show that internal control quality is significantly related to income smoothing, indicating that earnings smoothing is more prevalent in firms with poorer internal control quality. Next, the development of the local legal environment (measured by the population of lawyers) may influence the litigation risk faced by firms that smooth income, so I further investigate how the extent of legal environment development affects income-smoothing behavior. The results show that the extent of legal environment development is significantly related to income-smoothing behavior: managers are more likely to smooth earnings to reduce litigation risk where legal environment development is high.
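As a hedged illustration of the kind of test described in the English abstract, the sketch below computes one common income-smoothing proxy (the standard deviation of earnings scaled by the standard deviation of operating cash flow) and regresses it on an internal-control index and a lawyer-ratio variable. The proxy, the variable names, and the simulated data are assumptions, not the thesis's exact measures.

```python
# Sketch: a generic income-smoothing proxy regressed on firm-level characteristics.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "ic_index": rng.normal(650, 80, n),        # hypothetical internal-control index
    "lawyer_ratio": rng.normal(2.0, 0.5, n),   # assumed lawyers-per-capita measure
    "earnings_std": rng.lognormal(0, 0.4, n),
    "cfo_std": rng.lognormal(0.2, 0.4, n),
})
# Lower values of the ratio indicate smoother earnings relative to cash flows.
df["smoothing"] = df["earnings_std"] / df["cfo_std"]

X = sm.add_constant(df[["ic_index", "lawyer_ratio"]])
ols = sm.OLS(df["smoothing"], X).fit()
print(ols.summary().tables[1])                  # coefficient table for the sketch
```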
234

Demand Forecasting : A study at Alfa Laval in Lund

Lobban, Stacey, Klimsova, Hana January 2008 (has links)
Accurate forecasting is a real problem at many companies, and Alfa Laval in Lund is no exception. Alfa Laval experiences problems forecasting future raw material demand. Management is aware that the forecasting methods used today can be improved or replaced by others. A change could lead to better forecasting accuracy and lower errors, which means less inventory, shorter cycle times and better customer service at lower costs. The purpose of this study is to analyze Alfa Laval's current forecasting models for the demand of raw material used for pressed plates, and then determine whether other models are better suited to take trends and seasonal variation into consideration.
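One candidate model for demand with trend and seasonal variation is triple exponential smoothing (Holt-Winters). The sketch below fits such a model to a simulated monthly demand series with statsmodels; the data and parameters are illustrative, not Alfa Laval's.

```python
# Sketch: Holt-Winters (additive trend + seasonality) on simulated monthly demand.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)
months = pd.date_range("2005-01-01", periods=48, freq="MS")
seasonal = 10 * np.sin(2 * np.pi * np.arange(48) / 12)      # yearly seasonal pattern
demand = pd.Series(100 + 0.8 * np.arange(48) + seasonal + rng.normal(0, 3, 48),
                   index=months)

model = ExponentialSmoothing(demand, trend="add", seasonal="add",
                             seasonal_periods=12).fit()
forecast = model.forecast(6)                                 # next six months of demand
print(forecast.round(1))
```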
235

Exponential Smoothing for Forecasting and Bayesian Validation of Computer Models

Wang, Shuchun 22 August 2006 (has links)
Despite their success and widespread use in industry and business, exponential smoothing (ES) methods have received little attention from the statistical community. We investigate three types of statistical models that have been found to underpin ES methods: ARIMA models, state space models with multiple sources of error (MSOE), and state space models with a single source of error (SSOE). We establish the relationship among the three classes of models and conclude that the class of SSOE state space models is broader than the other two and provides a formal statistical foundation for ES methods. To better understand ES methods, we investigate their behavior for time series generated from different processes, focusing mainly on time series of ARIMA type. ES methods forecast a time series using only the series' own history. To include covariates in ES methods for better forecasting, we propose a new forecasting method, Exponential Smoothing with Covariates (ESCov). ESCov uses an ES method to model what is left unexplained in a time series by the covariates. We establish the optimality of ESCov, identify the SSOE state space models underlying it, and derive analytically the variances of its forecasts. Empirical studies show that ESCov outperforms ES methods and regression with ARIMA errors. We suggest a model selection procedure for choosing appropriate covariates and ES methods in practice. Computer models are commonly used to investigate complex systems for which physical experiments are highly expensive or very time-consuming. Before using a computer model, we need to address an important question: "How well does the computer model represent the real system?" The process of addressing this question is called computer model validation, and it generally involves the comparison of computer outputs and physical observations. In this thesis, we propose a Bayesian approach to computer model validation. This approach integrates computer outputs and physical observations to give a better prediction of the real system output; this prediction is then used to validate the computer model. We investigate the impact of several factors on the performance of the proposed approach and propose a generalization of it.
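A rough sketch of the ESCov idea as described above, under assumed data: regress the series on its covariates, apply simple exponential smoothing to the residuals, and form the forecast as the regression prediction plus the smoothed residual. The data, covariates, and next-step values are invented for illustration and this is not the thesis's exact estimator.

```python
# Sketch: regression on covariates + simple exponential smoothing of the residuals.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

rng = np.random.default_rng(2)
T = 120
x = rng.normal(size=(T, 2))                            # two hypothetical covariates
y = 5 + x @ np.array([1.5, -0.8]) + np.cumsum(rng.normal(0, 0.3, T))

X = sm.add_constant(x)
reg = sm.OLS(y, X).fit()                               # covariate part
resid_fit = SimpleExpSmoothing(reg.resid).fit()        # ES on what covariates leave

x_next = np.array([[1.0, 0.2, -0.1]])                  # [const, x1, x2] for next step
y_hat = reg.predict(x_next) + resid_fit.forecast(1)    # combined one-step forecast
print(float(y_hat[0]))
```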
236

Chaotic Demodulation Under Interference

Erdem, Ozden 01 September 2006 (has links) (PDF)
Chaotically modulated signals are used in various engineering areas such as communication systems, signal processing applications and automatic control systems. Because chaotically modulated signal sequences are broadband and noise-like, they are used to carry binary signals, especially in secure communication systems. In this thesis, a target tracking problem under interference in chaotic communication systems is investigated. Simulating the chaotic communication system, noise-like signal sequences are generated to carry binary signals. These signal sequences are affected by Gaussian channel noise and interference while passing through the communication channel. At the receiver side, target tracking is performed using the Optimum Decoding Based Smoothing Algorithm. The estimation performance of the optimum decoding based smoothing algorithm for one-dimensional chaotic systems and a nonlinear chaotic algorithm map is presented and compared with the performance of an Extended Kalman Filter implementation.
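The toy example below illustrates the general setting (not the thesis's exact scheme): binary symbols spread by a chaotic tent-map carrier, corrupted by sinusoidal interference and Gaussian noise, and recovered with a simple correlation receiver that assumes the carrier is known. All parameters are assumptions.

```python
# Sketch: antipodal chaos shift keying over a noisy channel with interference.
import numpy as np

rng = np.random.default_rng(3)

def tent_map(x0, n, mu=1.99):
    """Generate n samples of the tent map, a simple chaotic sequence on [0, 1)."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = mu * min(x[i - 1], 1 - x[i - 1])
    return x

bits = rng.integers(0, 2, 8)                  # hypothetical binary message
spread = 32                                   # chaotic samples per bit
chaos = tent_map(0.37, bits.size * spread)
carrier = chaos - chaos.mean()                # zero-mean chaotic carrier

# bit 1 -> +carrier segment, bit 0 -> -carrier segment
tx = np.concatenate([(1 if b else -1) * carrier[i * spread:(i + 1) * spread]
                     for i, b in enumerate(bits)])

t = np.arange(tx.size)
rx = tx + 0.2 * np.sin(0.05 * t) + rng.normal(0, 0.1, tx.size)   # interference + noise

# Coherent correlation receiver (assumes the chaotic carrier is known)
decisions = [int(rx[i * spread:(i + 1) * spread] @ carrier[i * spread:(i + 1) * spread] > 0)
             for i in range(bits.size)]
print(bits.tolist(), decisions)
```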
237

Direction Finding For Coherent, Cyclostationary Signals Via A Uniform Circular Array

Atalay Cetinkaya, Burcu 01 October 2009 (has links) (PDF)
In this thesis work, the Cyclic Root MUSIC method is integrated with spatial smoothing and interpolation techniques to estimate the directions of arrival of coherent, cyclostationary signals received via a Uniform Circular Array (UCA). A cyclostationary process is a random process whose probabilistic parameters, such as the autocorrelation function, vary periodically with time. Most man-made communication signals exhibit cyclostationarity due to the periodicity arising from their carrier frequencies, chip rates, baud rates, etc. The Cyclic Root MUSIC algorithm exploits the cyclostationarity properties of signals to achieve signal-selective direction of arrival estimation. Spatial smoothing is presented to overcome the coherent-signals problem in a multipath propagation environment; both forward spatial smoothing and forward-backward spatial smoothing techniques are investigated. An interpolation method is presented to cope with the restrictions that spatial smoothing places on the array structure: although the array considered in this thesis, the Uniform Circular Array, is not suitable for applying spatial smoothing directly, the interpolation method makes it possible. The performance of the Cyclic Root MUSIC and Conventional Root MUSIC algorithms is compared under variation of several factors by computer simulations, and the effect of signal type on the performance of the algorithms is examined using different signal scenarios.
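For reference, a brief sketch of forward spatial smoothing on a uniform linear array covariance, showing how averaging overlapping subarray covariances restores the signal-subspace rank lost to coherent sources. The UCA-to-virtual-ULA interpolation step and the cyclic statistics from the thesis are not reproduced; geometry and signal parameters are assumed.

```python
# Sketch: forward spatial smoothing decorrelating two coherent sources on a ULA.
import numpy as np

def forward_spatial_smoothing(R, sub_size):
    """Average the covariance matrices of all forward subarrays of size sub_size."""
    m = R.shape[0]
    n_sub = m - sub_size + 1
    R_ss = np.zeros((sub_size, sub_size), dtype=complex)
    for k in range(n_sub):
        R_ss += R[k:k + sub_size, k:k + sub_size]
    return R_ss / n_sub

rng = np.random.default_rng(4)
m, snapshots = 8, 200
theta = np.deg2rad([10.0, 40.0])
A = np.exp(1j * np.pi * np.outer(np.arange(m), np.sin(theta)))   # half-wavelength ULA
s = rng.normal(size=(1, snapshots)) + 1j * rng.normal(size=(1, snapshots))
S = np.vstack([s, 0.9 * s])                  # second source is a scaled copy: coherent
noise = 0.05 * (rng.normal(size=(m, snapshots)) + 1j * rng.normal(size=(m, snapshots)))
X = A @ S + noise
R = X @ X.conj().T / snapshots

eig_full = np.sort(np.linalg.eigvalsh(R))[::-1]
eig_ss = np.sort(np.linalg.eigvalsh(forward_spatial_smoothing(R, 5)))[::-1]
print(eig_full[:3].round(3))   # coherent pair collapses to roughly one dominant eigenvalue
print(eig_ss[:3].round(3))     # a second dominant eigenvalue emerges after smoothing
```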
239

Die Re-Analyse von Monitor-Schwellenwerten und die Entwicklung ARIMA-basierter Monitore für die exponentielle Glättung / The re-analysis of monitor threshold values and the development of ARIMA-based monitors for exponential smoothing

Becker, Claudia. January 2006 (has links) (PDF)
Katholische Universität, Diss.--Eichstätt-Ingolstadt, 2006.
240

Nedskrivning av goodwill : Reella företagsekonomiska omständigheter, som verktyg för resultatmanipulering eller påverkad av finanskrisen? / Goodwill impairment: Real business circumstances, a tool for earnings manipulation, or influenced by the financial crisis?

Gustafsson, Jonas, Sjöbom, Oscar January 2015 (has links)
This study examines goodwill impairment in Swedish listed companies and considers four possible explanations for it. The relevant financial reporting framework is IFRS. Using a quantitative method and a deductive approach, we address the problem that prior research has produced differing explanations for why an impairment takes place.

The study covers 1,260 firm-year observations spanning 2006 to 2013. Our empirical material is collected from databases and comprises financial information for each of the companies across the years. The goodwill impairment charge is collected manually from those companies where goodwill as an asset item decreased from year t-1 to year t.

From our theoretical framework, hypotheses are deduced that offer possible reasons why an impairment may have taken place. The study first examines whether an impairment can be traced to the firm's prevailing economic conditions, using established measures based on key ratios computed from annual reports. It then examines whether goodwill impairment can be linked to earnings management through big-bath accounting or income smoothing. These components are examined using methodology provided by similar research carried out in other geographic markets. Finally, based on an operationalization developed for this study, the impact of the financial crisis on goodwill impairment is tested.

The statistical part of the study consists of binary regression analyses that use a number of explanatory variables to test what can explain whether or not an impairment takes place, together with a multiple regression that seeks determinants of the size of the impairment.

The results show a significant negative relationship between goodwill impairment and return on total assets, implying that companies with weaker returns are more likely to recognize a goodwill impairment. Companies with a high proportion of goodwill relative to total assets can also be expected to recognize an impairment, as shown by a significant positive relationship between the dependent and explanatory variables. The same parameter, the proportion of goodwill, also explains the size of an impairment.

Furthermore, the study shows, with statistical significance, that goodwill impairment occurs in connection with big-bath accounting, meaning that companies with an already poor result worsen it further through an impairment. The opposite of big baths, income smoothing, cannot be shown to occur as an earnings-management measure, and a possible effect of the financial crisis on our dependent variable cannot be confirmed.

Overall, the study shows that companies with weak returns tend to impair goodwill more often than other companies, which may indicate that they follow the recommendations of the IFRS standard. Indications that earnings management through big baths occurs on the Stockholm Stock Exchange can also be identified.
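A hedged sketch of the two regression steps described above, on simulated firm-year data: a binary (logit) model for whether an impairment is recognized and an OLS model for its size among impairing firms. The variable definitions (return on assets, goodwill share, big-bath indicator, crisis dummy) and the simulated coefficients are assumptions, not the study's data or results.

```python
# Sketch: logit for impairment occurrence, OLS for impairment size, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 1260
df = pd.DataFrame({
    "roa": rng.normal(0.05, 0.08, n),          # return on total assets
    "gw_share": rng.uniform(0.0, 0.4, n),      # goodwill / total assets
    "big_bath": rng.integers(0, 2, n),         # assumed pre-impairment loss indicator
    "crisis": rng.integers(0, 2, n),           # assumed financial-crisis year dummy
})
# Simulated impairment decision (signs chosen only to make the sketch well-behaved)
logit_lin = -2 - 8 * df["roa"] + 4 * df["gw_share"] + 1.0 * df["big_bath"]
df["impair"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_lin))).astype(int)

X = sm.add_constant(df[["roa", "gw_share", "big_bath", "crisis"]])
logit = sm.Logit(df["impair"], X).fit(disp=0)
print(logit.params.round(2))                   # whether an impairment occurs

impaired = df[df["impair"] == 1].copy()
impaired["size"] = 0.05 + 0.3 * impaired["gw_share"] + rng.normal(0, 0.02, len(impaired))
ols = sm.OLS(impaired["size"], sm.add_constant(impaired[["roa", "gw_share"]])).fit()
print(ols.params.round(2))                     # determinants of impairment size
```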
