1

Topics in univariate time series analysis with business applications

Khachatryan, Davit 01 January 2010 (has links)
Recent technological advances in sensor and computer technology allow the observation of business and industrial processes at fairly high frequencies. For example, data used for monitoring critical parameters of industrial furnaces, conveyor belts or chemical processes may be sampled every minute or second. A high sampling rate is also possible in business-related processes such as mail order distribution, fast food restaurant operations, and electronic commerce. Data obtained from frequently monitored business processes are likely to be autocorrelated time series that may or may not be stationary. If left alone, processes will typically not be stable, and hence they will usually not possess a fixed mean, thus exhibiting homogeneous non-stationarity. For monitoring, control, and forecasting of such potentially non-stationary processes it is often important to develop an understanding of their dynamic properties. However, it is sometimes difficult, if not impossible, to conduct deliberate experiments on full-scale industrial plants or business processes to gain the necessary insight into their dynamic properties. Fortunately, intentional or inadvertent process changes that occur in the course of normal operation sometimes offer an opportunity to identify and estimate aspects of the dynamic behavior.

To determine if a time series is stationary, the standard exploratory data analytic approach is to check that the sample autocorrelation function (ACF) fades out relatively quickly. An alternative, and at times sounder, approach is to use the variogram, an exploratory tool widely used in spatial (geo)statistics for the investigation of spatial correlation of data. The first objective of this dissertation is to derive the basic properties of the variogram and to review the literature on confidence intervals for the variogram. We then show how to use the multivariate Delta method to derive asymptotic confidence intervals for the variogram that are both practical and computationally appealing.

The second objective of this dissertation is to review the theory of dynamic process modeling based on time series intervention analysis and to show how this theory can be used for an assessment of the dynamic properties of business and industrial processes. This is accompanied by a detailed example of the study of a large-scale ceramic plant that was exposed to an intentional but unplanned structural change (a quasi-experiment).

The third objective of this dissertation concerns the analysis of multiple interventions. Multiple interventions occur either as a result of multiple changes made to the same process or because of a single change having non-homogeneous effects on the time series. For evaluating the effects of undertaken structural changes, it is important to assess and compare the effects, such as gains or losses, of multiple interventions. A statistical hypothesis test for comparing the effects of multiple interventions on process dynamics is developed. Further, we investigate the statistical power of the suggested test and elucidate the results with examples.
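To make the variogram-based stationarity check concrete, the sketch below computes the standardized sample variogram G_m = Var(z_{t+m} - z_t) / Var(z_{t+1} - z_t). This particular definition and the toy random-walk versus white-noise comparison are illustrative assumptions, not reproductions of the dissertation's derivations or confidence intervals.

```python
import numpy as np

def sample_variogram(z, max_lag=20):
    """Standardized sample variogram G_m = Var(z_{t+m} - z_t) / Var(z_{t+1} - z_t)."""
    z = np.asarray(z, dtype=float)
    d1 = np.var(z[1:] - z[:-1], ddof=1)          # lag-1 difference variance
    return np.array([np.var(z[m:] - z[:-m], ddof=1) / d1
                     for m in range(1, max_lag + 1)])

# illustrative check: a random walk (non-stationary) vs. white noise (stationary)
rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=500))
noise = rng.normal(size=500)
print(sample_variogram(walk, 5))    # grows roughly like the lag m
print(sample_variogram(noise, 5))   # stays near 1
```

For a stationary series the variogram levels off (near 1 for white noise), whereas for a homogeneously non-stationary series such as a random walk it keeps increasing with the lag, which is the diagnostic contrast the abstract describes.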
2

Understanding the low volatility anomaly in the South African equity market

Khuzwayo, Bhekinkosi January 2015 (has links)
The Capital Asset Pricing Model (CAPM) advocates that expected return has a linear, proportional relationship with beta (and hence with systematic volatility): the higher the systematic risk of a security, the higher the CAPM expected return. However, empirical results have hardly supported this view, as argued as early as Black (1972). Instead, an anomaly has been evidenced across a multitude of developed and emerging markets, where portfolios constructed to have lower volatility have outperformed their higher-volatility counterparts, as found by Baker and Haugen (2012). This result has been found to exist in most equity markets globally. In the South African market, the studies of Khuzwayo (2011), Panulo (2014) and Oladele (2014) focused on establishing whether low volatility portfolios had outperformed market-cap-weighted portfolios. While they found this to be the case, it is important to understand whether this is truly an anomaly or just a result of prevailing market conditions that rewarded lower-volatility stocks over the back-test period. If the latter, those conditions might not exist in the future and low volatility portfolios might then underperform. This research does not aim to show, yet again, the existence of this 'anomaly'; instead, the aim is to examine whether there is any theoretical backing for low volatility portfolios to outperform high volatility portfolios. If this can be uncovered, it should help one understand whether the 'anomaly' truly exists and whether it can be expected to continue into the future.
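The kind of comparison the cited back-tests perform can be sketched as follows: each month, rank stocks on trailing volatility and compare the subsequent returns of the low- and high-volatility buckets. The function name, window length, quantile cut-offs, and simulated return matrix below are illustrative assumptions, not the data or methodology of the thesis.

```python
import numpy as np
import pandas as pd

def low_vs_high_vol(returns: pd.DataFrame, window: int = 36, quantile: float = 0.2):
    """Average next-month return of equal-weighted low- and high-volatility buckets.

    `returns` is a (months x stocks) frame of simple returns. Each month, stocks
    are ranked on trailing `window`-month volatility, and the bottom/top
    `quantile` buckets are held for the following month.
    """
    vol = returns.rolling(window).std()
    low, high = [], []
    for t in range(window, len(returns) - 1):
        ranks = vol.iloc[t].rank(pct=True)
        low.append(returns.iloc[t + 1][ranks <= quantile].mean())
        high.append(returns.iloc[t + 1][ranks >= 1 - quantile].mean())
    return np.nanmean(low), np.nanmean(high)

# toy illustration with simulated i.i.d. returns (no anomaly built in)
rng = np.random.default_rng(1)
rets = pd.DataFrame(rng.normal(0.01, 0.05, size=(240, 50)))
print(low_vs_high_vol(rets))
```

With i.i.d. simulated returns the two averages should be close; any persistent gap observed in real market data is what has to be explained rather than assumed, which is the question the thesis takes up.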
3

Improving non-linear approaches to anomaly detection, class separation, and visualization

Paciencia, Todd J. 17 January 2015 (has links)
Linear approaches for multivariate data analysis are popular due to their lower complexity, reduced computational time, and easier interpretation. In many cases, linear approaches produce adequate results; however, non-linear methods may generate more robust transformations, features, and decision boundaries. Of course, these non-linear methods present their own unique challenges that often inhibit their use.

In this research, improvements to existing non-linear techniques are investigated for the purposes of providing better, timely class separation and improved anomaly detection on various multivariate datasets, culminating in application to anomaly detection in hyperspectral imagery. Primarily, kernel-based methods are investigated, with some consideration towards other methods. Improvements to existing linear-based algorithms are also explored. Here, it is assumed that classes in the data have minimal overlap in the originating space or can be made to have minimal overlap in a transformed space, and that class information is unknown a priori. Further, improvements are demonstrated for global anomaly detection on a variety of hyperspectral imagery, utilizing fusion of spatial and spectral information, factor analysis, clustering, and screening. Additionally, new approaches for n-dimensional visualization of data and decision boundaries are developed.
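For context, the sketch below implements the global RX detector, the standard Mahalanobis-distance baseline that kernel and other non-linear anomaly detectors for hyperspectral imagery are commonly compared against. It is not the improved methods developed in the dissertation, and the toy cube with an implanted anomaly is purely illustrative.

```python
import numpy as np

def global_rx(cube: np.ndarray) -> np.ndarray:
    """Global RX anomaly scores for a hyperspectral cube (rows, cols, bands).

    Each pixel's score is its Mahalanobis distance from the global background
    mean/covariance; large scores flag spectrally anomalous pixels.
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))   # pinv guards against singularity
    centered = X - mu
    scores = np.einsum('ij,jk,ik->i', centered, cov_inv, centered)
    return scores.reshape(rows, cols)

# toy cube with one implanted anomalous pixel
rng = np.random.default_rng(2)
cube = rng.normal(size=(32, 32, 10))
cube[5, 7] += 8.0
print(np.unravel_index(np.argmax(global_rx(cube)), (32, 32)))   # -> (5, 7)
```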
4

Analyzing Fact Based Preventive Approach to Address Foreign Material Contamination in the Food Industry

Osuagwu, Stanley 29 November 2017 (has links)
This research analyzes a fact-based preventive approach to foreign material contamination in the food industry, using statistical evaluations to examine the disturbing trend of foreign material food recalls in the meat and poultry industry today. Even though recalls due to microbiological contamination and allergens seem to get more media attention, foreign material contamination of meat and poultry products is also on the rise, and the market is showing disturbing trends of increasing volumes of products recalled due to potential adulteration.

Unfortunately, tighter regulatory oversight and new food safety modernization acts have not demonstrated significant success in reducing the occurrence of these foreign material food recalls. The incident rate appears to have remained somewhat flat year over year, but the volume of product destroyed due to extraneous material contamination continues to increase exponentially.

Food producers cannot continue to conduct business as usual in a world that is constantly changing. They must begin to adapt and invest in technology and in fact-based foreign material prevention initiatives, in order to close the disparity in technology between production equipment and foreign material detection equipment.

The outcome of my research suggests that maintaining the status quo in foreign material prevention has not proven successful in limiting recall occurrences in the food industry. The research supports the view that a switch to a fact-based prevention approach can yield superior outcomes that are beneficial to both food manufacturers and food consumers.
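The contrast between a flat incident rate and exponentially growing recall volume can be made concrete with a pair of simple trend fits; the yearly figures below are invented for illustration and are not data from the study.

```python
import numpy as np

# Hypothetical yearly figures (illustrative only -- not data from the thesis):
years = np.arange(2010, 2018)
incidents = np.array([21, 19, 23, 20, 22, 21, 24, 22])            # roughly flat
volume_lbs = np.array([0.4, 0.7, 1.1, 2.3, 3.9, 7.2, 14.5, 30.1]) * 1e6

# Linear trend on incident counts: a slope near zero suggests a flat rate.
slope, _ = np.polyfit(years, incidents, 1)

# Log-linear fit on volume: slope b implies growth by a factor exp(b) per year.
b, _ = np.polyfit(years, np.log(volume_lbs), 1)

print(f"incidents per extra year: {slope:+.2f}")
print(f"implied annual volume growth factor: {np.exp(b):.2f}x")
```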
5

Critical systems thinking, theory and practice : a case study of an intervention in two British local authorities

Munlo, Isaac January 1997 (has links)
This thesis reports an intervention informed by critical systems thinking. The intervention drew upon a variety of systems and operational research methods to systemically explore the problems facing housing services for older people. Stakeholders were then supported in developing a response to these problems in the form of an integrated model of user involvement and multi-agency working. The methods used in this study included Cognitive Mapping, Critical Systems Heuristics, Interactive Planning and Viable System Modelling. Following a description of the project and its outcomes, the author's practical experiences are used to reflect back on critical systems thinking.

Five innovations are presented in the thesis. First, a new method called 'Problem Mapping' is developed. This has five stages: (i) interviewing stakeholders to surface problems and identify further potential interviewees; (ii) listing the problems as seen through the eyes of the various stakeholders; (iii) consolidating the list by removing duplicate problems and synthesising similar problems into larger 'problem statements'; (iv) mapping the relationships between problems; and (v) presenting the results back to stakeholders to inform the development of proposals for improvement. Reflection upon the use of this method indicates that it is particularly valuable where there are multiple stakeholders who are not initially visible to researchers, each of whom sees different aspects of a problem situation. Second, Problem Mapping is used to systemically express the problems facing housing services for older people in two geographical areas in the UK. This shows how problems in the areas of assessment, information provision and planning are mutually reinforcing, making a strong case for change. Third, a process of evolving an integrated model of user involvement and multi-agency working is presented. The model was designed in facilitated workshops by managers from statutory agencies, based on specifications developed by a variety of stakeholders (including service users and carers). Fourth, the strengths and weaknesses of Cognitive Mapping (one of the methods used in the project) are discussed, and significant limitations of the method are highlighted. Fifth, contributions and reflections on the theoretical and practical basis of the research are presented. These focus, among other things, on the theory of boundary critique, which is an important aspect of critical systems thinking. It is often assumed that boundary critique is only undertaken at the start of an intervention to ensure that its remit has been adequately defined. However, this project shows that it is both possible and desirable to use the theory of boundary critique on an ongoing basis during interventions to inform the creative design of methods.
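Stage (iv) of Problem Mapping, relating consolidated problem statements to one another, amounts to building a directed graph and asking which problems reinforce the most others. The sketch below illustrates that idea with hypothetical problem statements invented for the example, not findings from the two local authorities.

```python
from collections import deque

# Hypothetical consolidated problem statements (stage iii) and the
# 'reinforces' relationships mapped between them (stage iv).
reinforces = {
    "assessments not shared between agencies": ["older people re-assessed repeatedly",
                                                "planning based on incomplete data"],
    "older people re-assessed repeatedly": ["users disengage from services"],
    "planning based on incomplete data": ["housing stock mismatched to need"],
}

def downstream(problem, graph):
    """All problems directly or indirectly reinforced by `problem` (input to stage v)."""
    seen, queue = set(), deque([problem])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

for p in reinforces:
    print(f"{p!r} feeds {len(downstream(p, reinforces))} downstream problem(s)")
```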
6

Explicit alternating direction methods for problems in fluid dynamics

Al-Wali, Azzam Ahmad January 1994 (has links)
Recently an iterative method was formulated employing a new splitting strategy for the solution of tridiagonal systems of difference equations. The method was successful in solving the systems of equations arising from one-dimensional initial-boundary value problems, and a theoretical analysis proving the convergence of the method for systems whose constituent matrices are positive definite was presented by Evans and Sahimi [22]. The method is known as the Alternating Group Explicit (AGE) method and is referred to as AGE-1D. The explicit nature of the method means that its implementation on parallel machines can be very promising. The method was also extended to solve systems arising from two- and three-dimensional initial-boundary value problems, but the AGE-2D and AGE-3D algorithms proved to be too demanding in computational cost, which largely reduces the advantages of their parallel nature.

In this thesis, further theoretical analyses and experimental studies are pursued to establish the convergence and suitability of the AGE-1D method for a wider class of systems arising from univariate and multivariate differential equations with symmetric and non-symmetric difference operators. The possibility of a Chebyshev acceleration of the AGE-1D algorithm is also considered. For two- and three-dimensional problems it is proposed to couple the use of the AGE-1D algorithm with an ADI scheme or an ADI iterative method in what is called the Explicit Alternating Direction (EAD) method. It is then shown through experimental results that the EAD method retains the parallel features of the AGE method and moreover leads to savings of up to 83% in the computational cost of solving some of the model problems.

The thesis also includes applications of the AGE-1D algorithm and the EAD method to some problems of fluid dynamics, such as the linearized shallow water equations and the Navier-Stokes equations for the flow in an idealized one-dimensional planetary boundary layer. The thesis terminates with conclusions and suggestions for further work, together with a comprehensive bibliography and an appendix containing some selected programs.
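As a rough illustration of the splitting idea, the sketch below implements one common form of an AGE-type iteration for a tridiagonal system: A is split into two matrices of alternating 2x2 blocks, each cheap to invert, and the two half-steps alternate in Peaceman-Rachford fashion. The particular splitting, the fixed acceleration parameter r, and the dense solves (a production code would invert the 2x2 blocks explicitly, which is what makes the method "explicit") are simplifying assumptions, not the exact formulation used in the thesis.

```python
import numpy as np

def age_1d(diag, lower, upper, b, r=1.0, iters=1000, tol=1e-10):
    """AGE-type iteration for a tridiagonal system A u = b.

    A = G1 + G2: each G gets half of the diagonal, and the off-diagonal
    couplings are grouped into alternating 2x2 blocks, so each half-step is
    easy to solve. Dense solves are used here purely for brevity.
    """
    n = len(b)
    G1, G2 = np.diag(diag / 2.0), np.diag(diag / 2.0)
    for i in range(n - 1):
        G = G1 if i % 2 == 0 else G2    # couple (0,1),(2,3),... in G1; (1,2),(3,4),... in G2
        G[i, i + 1], G[i + 1, i] = upper[i], lower[i]
    I = np.eye(n)
    u = np.zeros(n)
    for _ in range(iters):
        u_half = np.linalg.solve(G1 + r * I, b + (r * I - G2) @ u)
        u_next = np.linalg.solve(G2 + r * I, b + (r * I - G1) @ u_half)
        if np.linalg.norm(u_next - u, np.inf) < tol:
            return u_next
        u = u_next
    return u

# toy check on the 1D Poisson stencil (-1, 2, -1)
n = 8
d, off = np.full(n, 2.0), np.full(n - 1, -1.0)
A = np.diag(d) + np.diag(off, -1) + np.diag(off, 1)
b = np.ones(n)
u = age_1d(d, off, off, b)
print(np.linalg.norm(A @ u - b))    # small residual
```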
7

The viability of Weibull analysis of small samples in process manufacturing

Abughazaleh, Tareq Ali Ibrahim January 2002 (has links)
This research deals with statistical quality control (SQC) methods used in quality testing. It investigates the problems encountered with statistical process control (SPC) tools when small sample sizes are used. Small sample size testing is a particular concern for expensive (or large) products produced in small batches (low volume production). A critical literature review and analysis of current technologies and methods for SPC with small sample testing showed that conventional SPC techniques do not cope well, as the confidence limits for averages and standard deviations become too wide. Using such sample sizes therefore yields unreliable results that lack accuracy. The current research demonstrates these problems in manufacturing by means of examples, in order to show the shortcomings and difficulties faced with conventional SPC tools (control charts).

The Weibull distribution has consistently shown clear and acceptable prediction of failure and life behaviour with small sample size batches. Using this distribution enables the accuracy needed with small sample sizes to be obtained. With small samples, conventional control charts generate inaccurate confidence limits; on the contrary, Weibull theory suggests that using small samples still enables accurate confidence limits to be achieved. This research highlights these two aspects and explains their features in more depth. An outline of the overall problem and solution points to the success of Weibull analysis when the Weibull distribution is modified to overcome the problems encountered when small sample sizes are used.

This work shows the viability of the Weibull distribution as a quality tool for constructing new control charts, which provide accurate results and detect nonconformance and variability with small sample sizes. The proposed Weibull deduction control charts are therefore shown to be a successful replacement for conventional control charts, compensating for the errors in quality testing that arise when small samples are used.
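One way to base control limits on a Weibull model rather than normal theory is sketched below: fit a two-parameter Weibull to a small sample by maximum likelihood and take percentile-based limits from the fitted distribution. The sample values and the choice of percentiles matching the usual 3-sigma tail areas are illustrative assumptions, not the Weibull deduction charts proposed in the thesis.

```python
import numpy as np
from scipy import stats

# Hypothetical small sample (n = 8) of a process measurement -- illustrative only.
sample = np.array([12.1, 13.4, 11.8, 14.0, 12.7, 13.1, 12.3, 13.8])

# Fit a two-parameter Weibull (location fixed at 0) by maximum likelihood.
shape, loc, scale = stats.weibull_min.fit(sample, floc=0)

# Percentile-based limits from the fitted distribution instead of the
# normal-theory +/- 3 sigma limits of a conventional control chart.
lcl = stats.weibull_min.ppf(0.00135, shape, loc=loc, scale=scale)
ucl = stats.weibull_min.ppf(0.99865, shape, loc=loc, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.2f}, limits=({lcl:.2f}, {ucl:.2f})")
```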
8

Modelling of maximal and submaximal oxygen uptake in men and women

Johnson, Patrick J. January 2002 (has links)
No description available.
