201

Improving the Production Forecasts: Developing a Forecasting Model Using Exponential Smoothing

Ada Fatemeh, Rezai January 2024 (has links)
This research is motivated by identified gaps in contemporary planning practices and production processes within firms. Relying solely on experiential knowledge has proven limiting, necessitating a more systematic approach. Previous data anomalies, and in particular ongoing challenges in achieving satisfactory delivery reliability, have underlined the need for deeper insight into underlying patterns. The objectives of this study are:
• To identify and analyze the specific obstacles and challenges affecting load balance and delivery security in Borlänge's production system.
• To explore methods and strategies for generating reliable capacity forecasts.
Both primary and secondary research methods were employed. Primary methods included interviews and the development of a forecast model, while the secondary study reviewed the latest research in the field. The thesis revealed five primary factors hindering capacity attainment:
1. Shortages of WIP (work-in-progress) and slab material disrupt production flow and escalate costs, since slabs must then be sourced externally.
2. Transport issues, including incorrect internal deliveries and adverse weather conditions, pose challenges.
3. Personnel shortages hinder efficient utilization of production capacity.
4. Machine breakdowns interrupt production, leading to capacity loss and inefficiency.
5. Inventory problems, such as insufficient capacity and poor management, impede smooth production operations.
The second objective was addressed by implementing exponential smoothing for capacity-planning forecasts. By updating the forecasts every 13 weeks, this study improves the production forecast.
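As a concrete illustration of the forecasting technique named in the abstract, the sketch below applies simple exponential smoothing with a 13-week re-forecast cycle. The smoothing constant, the hypothetical capacity data, and the update schedule details are assumptions made for illustration, not values taken from the thesis.

```python
import numpy as np

def exponential_smoothing(series, alpha=0.3):
    """Simple exponential smoothing: each new forecast is a weighted
    average of the latest observation and the previous forecast."""
    forecast = [series[0]]                 # initialise with the first observation
    for y in series[1:]:
        forecast.append(alpha * y + (1 - alpha) * forecast[-1])
    return np.array(forecast)

# Hypothetical weekly capacity figures (tonnes); the actual thesis data are not public.
rng = np.random.default_rng(1)
weekly_capacity = rng.normal(1000, 80, size=52)

# Re-estimate the forecast at the end of every 13-week block, as the abstract suggests.
for end in range(13, len(weekly_capacity) + 1, 13):
    fitted = exponential_smoothing(weekly_capacity[:end], alpha=0.3)
    print(f"After week {end}: next-period forecast = {fitted[-1]:.1f}")
```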
202

Cure Rate Models with Nonparametric Form of Covariate Effects

Chen, Tianlei 02 June 2015 (has links)
This thesis focuses on the development of spline-based hazard estimation models for cure rate data. Such data can be found in survival studies with long-term survivors, so the population consists of susceptible and non-susceptible sub-populations, with the latter termed "cured". Modeling both the cure probability and the hazard function of the susceptible sub-population is of practical interest. Here we propose two smoothing-spline-based models falling respectively into the popular classes of two-component mixture cure rate models and promotion time cure rate models. Under the framework of the two-component mixture cure rate model, Wang, Du and Liang (2012) developed a nonparametric model in which the covariate effects on both the cure probability and the hazard component are estimated by smoothing splines. Our first development falls under the same framework but estimates the hazard component with the accelerated failure time model instead of the proportional hazards model of Wang, Du and Liang (2012); the new model has a more natural interpretation in practice. The promotion time cure rate model, motivated by a simplified biological interpretation of cancer metastasis, was first proposed only a few decades ago; nonetheless, it has quickly become a competitor to the mixture models. Our second development aims to provide a nonparametric alternative to the existing parametric and semiparametric promotion time models. / Ph. D.
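For reference, the two-component mixture cure rate model discussed in the abstract is conventionally written as below; the notation is generic, and the AFT parameterisation shown is one common form, not necessarily the thesis's exact specification.

```latex
% Standard two-component mixture cure rate model (notation illustrative, not
% taken verbatim from the thesis): \pi(x) is the cure probability and
% S_u(t \mid x) the survival function of the susceptible sub-population.
\begin{align*}
  S_{\mathrm{pop}}(t \mid x) &= \pi(x) + \bigl(1 - \pi(x)\bigr)\, S_u(t \mid x)
  \intertext{Accelerated failure time (AFT) form for the susceptible component,
  in place of the proportional hazards form of Wang, Du and Liang (2012):}
  S_u(t \mid x) &= S_0\!\bigl(t\, e^{-x^{\top}\beta}\bigr)
\end{align*}
```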
203

Dividend policy in the banking sector in G-7 and GCC countries: A comparative study

Hanifa, H., Hamdan, M., Haffar, Mohamed November 2018 (has links)
Dividend policy has been a puzzling question for many years. This study attempts to identify the key factors affecting it in the financial sector, a sector that has been neglected in the dividend literature. Using panel data on 621 Group of Seven (G-7) banks and 68 Gulf Cooperation Council (GCC) banks, five main factors, namely banks' size, profitability, growth, leverage, and the previous year's dividend, were empirically tested for their impact on dividend payout ratios. In addition to comparing the two economies descriptively, the researchers employed panel data analysis using multiple regression with random effects. The findings revealed that the dividend payout ratio of the GCC banks was higher than that of the G-7 banks in every year of the examined period (2010-2015). Furthermore, for both G-7 and GCC banks, profitability and the previous year's dividend had a significant positive influence, and banks' leverage a significant negative influence, on the dividend payout. Banks' size was found to be an important dividend determinant in the G-7 countries only.
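The panel specification implied by the abstract can be sketched as follows; the variable names and functional form are assumptions for illustration rather than the authors' exact model.

```latex
% Illustrative random-effects panel specification implied by the abstract.
\begin{equation*}
  DPR_{it} = \beta_0 + \beta_1\, SIZE_{it} + \beta_2\, PROF_{it} + \beta_3\, GROWTH_{it}
           + \beta_4\, LEV_{it} + \beta_5\, DPR_{i,t-1} + u_i + \varepsilon_{it},
\end{equation*}
% where u_i is the bank-specific random effect and \varepsilon_{it} the idiosyncratic error.
```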
204

Aspects of bivariate time series

Seeletse, Solly Matshonisa 11 1900 (has links)
Exponential smoothing algorithms are very attractive in practical settings such as industry. When considering bivariate exponential smoothing methods, properties beyond those of the univariate methods give insight into the relationship between the two components of a process and into the overall structure of the model. These properties are important to study, but whatever the merits of bivariate exponential smoothing algorithms, exponential smoothing is nonstatistical/nonstochastic, and studying the properties within exponential smoothing itself may be of little value. As an alternative approach, the (bivariate) ARIMA and structural models, which are classes of statistical models, are shown to generalize the exponential smoothing algorithms. We study these properties within those classes, since the results have implications for exponential smoothing algorithms. Forecast properties are studied using the state space model and the Kalman filter. A comparison of the ARIMA and structural models completes the study. / Mathematical Sciences / M. Sc. (Statistics)
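As background to the claim that structural models generalise exponential smoothing, the local level model is the standard example; the notation below is generic, not taken from the thesis.

```latex
% Local level structural model: the standard example of a stochastic model
% that generalises simple exponential smoothing.
\begin{align*}
  y_t   &= \mu_t + \varepsilon_t, & \varepsilon_t &\sim N(0,\sigma^2_\varepsilon)\\
  \mu_t &= \mu_{t-1} + \eta_t,    & \eta_t        &\sim N(0,\sigma^2_\eta)
\end{align*}
% In the steady state the Kalman filter forecast takes the exponential smoothing form
\begin{equation*}
  \hat{y}_{t+1\mid t} = \hat{y}_{t\mid t-1} + \alpha\,\bigl(y_t - \hat{y}_{t\mid t-1}\bigr),
\end{equation*}
% with \alpha fixed by the signal-to-noise ratio q = \sigma^2_\eta / \sigma^2_\varepsilon;
% the reduced form of the model is ARIMA(0,1,1).
```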
205

Ανάπτυξη μεθόδων ψηφιακής ισοστάθμισης για ηλεκτρακουστικές εφαρμογές / Development of digital equalization methods for audio applications

Χατζηαντωνίου, Παναγιώτης 25 June 2007 (has links)
This doctoral dissertation studies the problem of digital audio equalization, aiming to develop effective methods for eliminating the audio distortions introduced during sound reproduction by either the loudspeaker response (anechoic equalization) or the listening-room response (dereverberation). Novel methods are introduced that, on the one hand, ensure precise measurements of anechoic electroacoustic responses inside reverberant enclosures and, on the other hand, achieve appropriately smoothed acoustic responses for use in digital equalization and in other room-acoustics applications that require analysis of specific properties of these systems. A systematic study of dereverberation based on ideal inverse filtering of room responses leads to the novel conclusion that the audible benefits of applying the method in real time are significantly inferior to those expected from the corresponding simulation experiments. The dereverberation problem is addressed for the first time in a practically viable way through the introduction of a novel equalization method based on Complex Smoothing of the room responses.
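The Complex Smoothing idea mentioned in the abstract can be illustrated with a minimal fractional-octave smoothing sketch: the complex transfer function is averaged over a window whose width grows with frequency. The window shape, the fraction, and the simulated response below are assumptions for illustration, not the thesis implementation.

```python
import numpy as np

def complex_smooth(H, freqs, fraction=3):
    """Fractional-octave complex smoothing of a transfer function H(f).

    Real and imaginary parts are averaged over a window whose width grows
    proportionally to frequency (here 1/`fraction` octave), preserving fine
    detail at low frequencies while smoothing heavily at high frequencies."""
    H_smooth = np.empty_like(H)
    for i, f in enumerate(freqs):
        if f <= 0:
            H_smooth[i] = H[i]
            continue
        lo, hi = f * 2 ** (-0.5 / fraction), f * 2 ** (0.5 / fraction)
        idx = (freqs >= lo) & (freqs <= hi)
        H_smooth[i] = H[idx].mean()        # average the complex values directly
    return H_smooth

# Example: smooth the spectrum of a simulated (noise-like, decaying) room response.
fs, n = 48000, 4096
rng = np.random.default_rng(0)
impulse_response = rng.normal(0, 1, n) * np.exp(-np.arange(n) / 800.0)
H = np.fft.rfft(impulse_response)
freqs = np.fft.rfftfreq(n, 1 / fs)
H_smoothed = complex_smooth(H, freqs, fraction=3)
```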
207

Séparation et détection des trajets dans un guide d'onde en eau peu profonde / multi-dimensional source separation algorithm and application

Jiang, Long Yu 22 November 2012 (has links)
As studies on shallow-water acoustics have again become an active field, this dissertation focuses on the separation and detection of raypaths in the context of shallow-water ocean acoustic tomography. As a first step, we give a brief review of the existing array processing techniques in underwater acoustics in order to identify the difficulties these methods still face. We conclude that the separation resolution still needs to be improved in order to provide more useful information for the inverse step of ocean acoustic tomography. A survey of high-resolution methods is therefore provided to identify techniques that can be extended to separate the raypaths in our application. Finally, we propose a high-resolution method called smoothing-MUSICAL (MUSIC Active Large band), which combines spatial-frequency smoothing with the MUSICAL algorithm, for efficient separation of coherent or fully correlated raypaths. However, this method relies on prior knowledge of the number of raypaths, so we introduce an exponential fitting test (EFT) using short-length samples to determine the number of raypaths. Both methods are applied to synthetic data and to real data acquired in a small-scale tank, and their performances are compared with the relevant conventional methods.
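The core idea of combining smoothing with a subspace method to separate coherent arrivals can be illustrated with a narrowband sketch: forward spatial smoothing restores the rank of the covariance matrix, after which MUSIC resolves fully correlated arrivals. MUSICAL itself works jointly across space and frequency on active wideband signals, so the code below is a simplified stand-in, with array geometry, noise level, and angles chosen purely for illustration.

```python
import numpy as np

def spatial_smoothing(R, sub_len):
    """Forward spatial smoothing: average the covariance matrices of
    overlapping subarrays to restore rank when arrivals are coherent."""
    n = R.shape[0]
    n_sub = n - sub_len + 1
    Rs = np.zeros((sub_len, sub_len), dtype=complex)
    for k in range(n_sub):
        Rs += R[k:k + sub_len, k:k + sub_len]
    return Rs / n_sub

def music_spectrum(R, n_src, angles, d=0.5):
    """Narrowband MUSIC pseudospectrum for a uniform linear array
    with element spacing d (in wavelengths)."""
    m = R.shape[0]
    _, vecs = np.linalg.eigh(R)                  # eigenvalues in ascending order
    En = vecs[:, :m - n_src]                     # noise subspace
    P = np.empty(len(angles))
    for i, th in enumerate(angles):
        a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(th))
        P[i] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
    return P

# Two fully coherent arrivals (same waveform) at -10 and 20 degrees, 12-element array.
rng = np.random.default_rng(2)
m, d, snaps = 12, 0.5, 200
s = rng.normal(size=snaps) + 1j * rng.normal(size=snaps)
doas = np.deg2rad([-10.0, 20.0])
A = np.exp(-2j * np.pi * d * np.outer(np.arange(m), np.sin(doas)))
X = A @ np.vstack([s, 0.8 * s]) + 0.05 * (rng.normal(size=(m, snaps))
                                          + 1j * rng.normal(size=(m, snaps)))
R = X @ X.conj().T / snaps
Rs = spatial_smoothing(R, sub_len=8)             # smoothing decorrelates the arrivals
angles = np.deg2rad(np.linspace(-90, 90, 361))
P = music_spectrum(Rs, n_src=2, angles=angles)
peaks = [i for i in range(1, len(P) - 1) if P[i] > P[i - 1] and P[i] > P[i + 1]]
top = sorted(sorted(peaks, key=lambda i: P[i])[-2:])
print("Estimated DOAs (deg):", np.rad2deg(angles[top]))
```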
208

Application of Modern Principles to Demand Forecasting for Electronics, Domestic Appliances and Accessories

Noble, Gregory Daniel 30 June 2009 (has links)
No description available.
209

Non-global regression modelling

Huang, Yunkai 21 June 2016 (has links)
In this dissertation, a new non-global regression model - the partial linear threshold regression model (PLTRM) - is proposed. Various issues related to the PLTRM are discussed. In the first main section of the dissertation (Chapter 2), we define what is meant by the term “non-global regression model”, and we provide a brief review of the current literature associated with such models. In particular, we focus on their advantages and disadvantages in terms of their statistical properties. Because there are some weaknesses in the existing non-global regression models, we propose the PLTRM. The PLTRM combines non-parametric modelling with the traditional threshold regression models (TRMs), and hence can be thought of as an extension of the latter models. We verify the performance of the PLTRM through a series of Monte Carlo simulation experiments. These experiments use a simulated data set that exhibits partial linear and partial nonlinear characteristics, and the PLTRM outperforms several competing parametric and non-parametric models in terms of the Mean Squared Error (MSE) of the within-sample fit. In the second main section of this dissertation (Chapter 3), we propose a method of estimation for the PLTRM. This requires estimating the parameters of the parametric part of the model; estimating the threshold; and fitting the non-parametric component of the model. An “unbalanced penalized least squares” approach is used. This involves using restricted penalized regression spline and smoothing spline techniques for the non-parametric component of the model; the least squares method for the linear parametric part of the model; together with a search procedure to estimate the threshold value. This estimation procedure is discussed for three mutually exclusive situations, which are classified according to the way in which the two components of the PLTRM “join” at the threshold. Bootstrap sampling distributions of the estimators are provided using the parametric bootstrap technique. The various estimators appear to have good sampling properties in most of the situations considered. Inference issues such as hypothesis testing and confidence interval construction for the PLTRM are also investigated. In the third main section of the dissertation (Chapter 4), we illustrate the usefulness of the PLTRM, and the application of the proposed estimation methods, by modelling various real-world data sets. These examples demonstrate both the good statistical performance and the great application potential of the PLTRM. / Graduate
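A minimal sketch of the threshold-search idea behind the PLTRM estimation procedure is given below: for each candidate threshold a linear fit is used on one side and a smoothing spline on the other, and the threshold minimising the total residual sum of squares is retained. This is an illustration under simplifying assumptions, not the thesis's unbalanced penalized least squares implementation.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def fit_pltrm(x, y, candidate_thresholds, spline_s=None):
    """Grid-search sketch for a partial linear threshold regression:
    a straight line below the threshold, a smoothing spline above it."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best = None
    for tau in candidate_thresholds:
        left, right = x <= tau, x > tau
        if left.sum() < 3 or right.sum() < 5:
            continue
        # Linear (parametric) part below the threshold.
        beta = np.polyfit(x[left], y[left], 1)
        sse_left = np.sum((y[left] - np.polyval(beta, x[left])) ** 2)
        # Smoothing spline (nonparametric) part above the threshold.
        spline = UnivariateSpline(x[right], y[right], s=spline_s)
        sse_right = np.sum((y[right] - spline(x[right])) ** 2)
        sse = sse_left + sse_right
        if best is None or sse < best[0]:
            best = (sse, tau, beta, spline)
    return best  # (sse, threshold, linear coefficients, spline)

# Simulated data: linear up to x = 2, smooth nonlinear beyond it.
rng = np.random.default_rng(3)
x = rng.uniform(0, 5, 300)
y = np.where(x <= 2, 1 + 0.5 * x, 2 + np.sin(2 * (x - 2))) + rng.normal(0, 0.1, 300)
sse, tau_hat, beta_hat, spline_hat = fit_pltrm(x, y, np.linspace(0.5, 4.5, 81))
print(f"Estimated threshold: {tau_hat:.2f}")
```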
210

Statistical Methods for Dating Collections of Historical Documents

Tilahun, Gelila 31 August 2011 (has links)
The problem in this thesis was originally motivated by problems presented by the Documents of Early England Data Set (DEEDS). The central problem with these medieval documents is the lack of methods for assigning accurate dates to documents that bear no date. With the problems of the DEEDS documents in mind, we present two methods to impute missing features of texts. In the first method, we suggest a new class of metrics for measuring distances between texts and show how to combine the distances between texts using statistical smoothing. This method can be adapted to settings where the features of the texts are ordered or unordered categoricals (as, for example, in authorship attribution problems). In the second method, we estimate the probability of occurrence of words in texts using the nonparametric regression technique of local polynomial fitting with kernel weights applied to generalized linear models. We combine the estimated probabilities of occurrence of the words of a text to estimate the probability of occurrence of the text as a function of its feature, the feature in this case being the date on which the text was written. The application of our methods to the DEEDS documents and the results are presented.
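The second method described in the abstract, kernel-weighted local polynomial fitting within a generalized linear model, can be sketched for a single word and a single target date as a locally weighted logistic regression. The kernel, the bandwidth, and the simulated data below are assumptions for illustration.

```python
import numpy as np

def local_logistic(dates, occur, t0, bandwidth, iters=25):
    """Kernel-weighted (local linear) logistic regression: estimate the
    probability that a given word occurs in a document written at date t0."""
    u = (dates - t0) / bandwidth
    w = np.exp(-0.5 * u ** 2)                 # Gaussian kernel weights
    X = np.column_stack([np.ones_like(dates), dates - t0])
    beta = np.zeros(2)
    for _ in range(iters):                    # Newton-Raphson / IRLS updates
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (w * (occur - p))
        hess = X.T @ (X * (w * p * (1 - p))[:, None])
        beta += np.linalg.solve(hess, grad)
    return 1.0 / (1.0 + np.exp(-beta[0]))     # fitted probability at t0

# Hypothetical data: the year of each dated charter and whether the word appears in it.
rng = np.random.default_rng(4)
years = rng.uniform(1100, 1300, 500)
p_true = 1.0 / (1.0 + np.exp(-(years - 1200) / 25.0))
occurrences = rng.binomial(1, p_true)
print(local_logistic(years, occurrences, t0=1250.0, bandwidth=30.0))
```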
