101

EM-Based Joint Detection and Estimation for Two-Way Relay Network

Yen, Kai-wei 01 August 2012 (has links)
In this paper, the channel estimation problem for a two-way relay network (TWRN) is considered under two different wireless channel assumptions. Previous works have proposed training-based channel estimation methods to obtain the channel state information (CSI). In practice, however, the channel changes from one data block to another, and the resulting outdated CSI may degrade performance; counteracting this requires inserting more training signals. To improve bandwidth efficiency, we propose a joint channel estimation and data detection method based on the expectation-maximization (EM) algorithm. Simulation results show that the proposed method can combat the effects of the fading channel, with MSE results remaining very close to the Cramer-Rao lower bound (CRLB) in the high signal-to-noise ratio (SNR) region. Additionally, compared with previous work, the proposed scheme also achieves better detection performance for both time-varying and time-invariant channels.
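
The joint estimation-and-detection idea can be illustrated with a much simpler toy model than the TWRN considered above. The sketch below is a minimal, assumed single-link real-valued BPSK setup, not the thesis's relay model: an E-step computes soft symbol estimates given the current channel estimate, and an M-step re-estimates the channel, starting from a short pilot (training) block.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-link model y_n = h * s_n + w_n with real BPSK symbols s_n in {-1, +1};
# this is a stand-in for the two-way relay channel, not the TWRN model of the thesis.
N, h_true, sigma = 200, 0.8, 0.5
s = rng.choice([-1.0, 1.0], size=N)
y = h_true * s + sigma * rng.normal(size=N)

# Initialize the channel estimate from a short pilot (training) block.
n_pilot = 8
h = np.mean(y[:n_pilot] * s[:n_pilot])

for _ in range(20):
    # E-step: posterior mean of each symbol given the current channel estimate.
    s_soft = np.tanh(y * h / sigma**2)
    s_soft[:n_pilot] = s[:n_pilot]              # pilot symbols are known exactly
    # M-step: re-estimate the channel (E[s_n^2] = 1 for BPSK).
    h = np.mean(y * s_soft)

s_hat = np.sign(y * h)                          # hard symbol decisions after convergence
print(f"h_est = {h:.3f}, symbol errors = {int(np.sum(s_hat != s))}")
```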
102

A Study of Optimal Portfolio Decision and Performance Measures

Chen, Hsin-Hung 03 June 2004 (has links)
Since most financial institutions use the Sharpe Ratio to evaluate the performance of mutual funds, the objective of most fund managers is to select the portfolio that generates the highest Sharpe Ratio. Traditionally, they revise the objective function of the Markowitz mean-variance portfolio model and solve a non-linear program to obtain the maximum Sharpe Ratio portfolio. For the scenario with short sales allowed, this project proposes a closed-form solution for the optimal Sharpe Ratio portfolio by applying Cauchy-Schwarz maximization. Because it requires no non-linear programming software, this method is easier to implement than the traditional approach and saves computing time and cost. For the scenario with short sales disallowed, we use the Kuhn-Tucker conditions to find the optimal Sharpe Ratio portfolio. On the other hand, an efficient frontier generated by the Markowitz mean-variance portfolio model normally exhibits a higher-risk, higher-return characteristic, which often poses a dilemma for the decision maker. This research applies a generalized loss function to create a family of decision-aid performance measures, called IRp, that trade off return against risk. We compare IRp with the Sharpe Ratio and with utility functions to confirm that IRp measures are appropriate for evaluating portfolio performance on the efficient frontier and for improving asset allocation decisions. In addition, empirical data on domestic and international investment instruments are used to examine the feasibility and fitness of the proposed method and the IRp measures. This study applies Cauchy-Schwarz maximization from multivariate statistical analysis and the loss function from quality engineering to portfolio decisions. We believe these new applications complement portfolio model theory and will be meaningful for both academic and business fields.
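
As a hedged illustration of the closed-form result for the short-sales-allowed case, the sketch below computes the tangency (maximum Sharpe Ratio) portfolio with weights proportional to Sigma^-1 (mu - rf), which is the solution implied by the Cauchy-Schwarz bound; the asset inputs are made up for the example and are not taken from the thesis.

```python
import numpy as np

def max_sharpe_portfolio(mu, Sigma, rf=0.0):
    """Closed-form maximum Sharpe Ratio (tangency) portfolio with short sales allowed.

    Weights are proportional to Sigma^{-1} (mu - rf), normalized to sum to one."""
    excess = mu - rf
    w = np.linalg.solve(Sigma, excess)
    return w / w.sum()

# Illustrative (made-up) expected returns and covariance matrix for three assets.
mu = np.array([0.08, 0.12, 0.10])
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])
rf = 0.03

w = max_sharpe_portfolio(mu, Sigma, rf)
sharpe = (w @ mu - rf) / np.sqrt(w @ Sigma @ w)
bound = np.sqrt((mu - rf) @ np.linalg.solve(Sigma, mu - rf))   # Cauchy-Schwarz bound
print("weights:", np.round(w, 4), " Sharpe:", round(sharpe, 4), " bound:", round(bound, 4))
```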
103

Advanced control for power density maximization of the brushless DC generator

Lee, Hyung-Woo 17 February 2005 (has links)
This dissertation proposes a novel control technique for power density maximization of the brushless DC (BLDC) generator, which is a nonsinusoidal power supply system. For a generator of given rating, the weight and size of the system directly affect fuel consumption; power density is therefore one of the most important issues in a stand-alone generator. Conventional rectification methods cannot achieve the maximum possible power because of a distorted or unsuitable current waveform. The optimal current waveform for maximizing power density and minimizing machine size and weight in a nonsinusoidal power supply system is proposed theoretically and verified by simulation and experimental work. In addition, various attributes of practical interest are analyzed and simulated to investigate their impact on real systems.
104

The biological and economical analysis of the resource of South Pacific albacore

Chiu, Szu-wei 14 June 2009 (has links)
This study used the Gordon-Schaefer model to conduct a resource-economic analysis of the South Pacific albacore fishery over 1967-2007. The equilibrium levels of the open-access model and the present-value-maximization model were evaluated and then compared with the observed data. The results indicate that the fishing yield, resource stock, effort, and catch-per-unit-effort of South Pacific albacore are close to the equilibrium level of the present-value-maximization model after 2002, which suggests that the fishery is being developed appropriately. A sensitivity analysis was then performed to understand the impact of parameter changes on stock size and effort. Finally, simulation analysis was applied to both models: under open access, the resource will face an extinction crisis if the fishery is not well controlled, whereas under present value maximization the albacore fishery can be managed sustainably. These results are valuable to fishery management authorities for maintaining the development of the fishery while conserving ocean resources.
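
To make the Gordon-Schaefer comparison concrete, the sketch below evaluates the open-access and rent-maximizing (maximum economic yield) equilibria of the surplus-production model. The parameter values are illustrative, not the albacore estimates, and the static MEY solution shown is the zero-discount limit of the present-value-maximization problem used in the study.

```python
import numpy as np

# Gordon-Schaefer surplus-production model with illustrative (made-up) parameters.
r, K, q = 0.4, 1.0e6, 2.0e-5      # intrinsic growth, carrying capacity, catchability
p, c = 2.0, 5.0                    # price per unit catch, cost per unit effort

def sustainable_biomass(E):        # biomass at which growth balances harvest
    return K * (1.0 - q * E / r)

def sustainable_yield(E):          # equilibrium catch at effort E
    return q * E * sustainable_biomass(E)

def rent(E):                       # resource rent (profit) at effort E
    return p * sustainable_yield(E) - c * E

# Open-access equilibrium: effort expands until rent is fully dissipated (rent = 0).
E_oa = (r / q) * (1.0 - c / (p * q * K))
# Static maximum economic yield: rent-maximizing effort; the zero-discount limit of
# the present-value-maximization solution.
E_mey = E_oa / 2.0

for name, E in [("open access", E_oa), ("MEY", E_mey)]:
    print(f"{name:>11}: effort={E:,.0f}, stock={sustainable_biomass(E):,.0f}, "
          f"yield={sustainable_yield(E):,.0f}, rent={rent(E):,.0f}")
```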
105

Bioeconomic analysis of Argentine shortfin squid, Illex argentinus in the Southwest Atlantic.

Wang, Bi-yi 14 June 2009 (has links)
This research is based on the Gordon-Schaefer model and uses FAO statistical data from 1983 to 2007 to assess the Argentine shortfin squid, Illex argentinus. First, the equilibrium levels of the open-access fishery and the present-value-maximization fishery are calculated and compared; the stock size of Illex argentinus is then estimated and the equilibrium levels of the two models are compared with the observed statistics. The results show that Illex argentinus exhibits no sign of depletion, but the fishery has not yet reached its optimal state of development. A sensitivity analysis is used to understand how effort and stock respond to changes in the model parameters. Finally, by simulating the stock size under the open-access and present-value-maximization models, we find that unrestricted development can exhaust the resource, whereas Illex argentinus can be developed sustainably if the fishery is effectively managed.
106

Computer aided diagnosis in digital mammography: classification of mass and normal tissue

Shinde, Monika. January 2003 (has links)
Thesis (M.S.C.S.)--University of South Florida, 2003. Includes bibliographical references. / ABSTRACT: The work presented here is an important component of an ongoing project to develop an automated mass classification system for breast cancer screening and diagnosis in digital mammogram applications. Specifically, this work investigates the task of automatically separating mass tissue from normal breast tissue given a region of interest in a digitized mammogram. This is a crucial stage in developing a robust automated classification system, because classification depends on accurate assessment of the tumor-normal tissue border as well as on information gathered from the tumor area. The Expectation Maximization (EM) method is developed and applied to high-resolution digitized screen-film mammograms with the aim of segmenting normal tissue from mass tissue. Both the raw data and summary data generated by Laws' texture analysis are investigated. Since the ultimate goal is robust classification, the merits of the tissue segmentation are assessed by their impact on overall classification performance. Based on a 300-image dataset consisting of 97 malignant and 203 benign cases, a sensitivity of 63% and a specificity of 89% were achieved. Although the segmentation requires further investigation, the development and related computer coding of the EM algorithm were successful. The method was developed to take the correlation of the input features into account, which allows other researchers at this facility to investigate various input features without an intricate understanding of the EM approach.
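
The segmentation step can be sketched with a generic two-component Gaussian mixture fitted by EM. The example below uses synthetic per-pixel feature vectors (stand-ins for intensity and Laws' texture measures), not mammogram data, and scikit-learn's EM implementation rather than the code developed in the thesis; full covariances are used so that correlation between features is modeled.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Synthetic stand-ins for per-pixel features from a mammogram ROI (intensity plus
# a few texture energy measures); these are not real mammogram data.
n_normal, n_mass, n_feat = 4000, 1000, 4
normal = rng.multivariate_normal(mean=np.zeros(n_feat),
                                 cov=0.5 * np.eye(n_feat), size=n_normal)
mass = rng.multivariate_normal(mean=1.5 * np.ones(n_feat),
                               cov=np.full((n_feat, n_feat), 0.3) + 0.7 * np.eye(n_feat),
                               size=n_mass)
X = np.vstack([normal, mass])

# Two-component Gaussian mixture fitted by EM; full covariances capture correlation
# between the input features.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
labels = gmm.predict(X)

# In this toy example, the component with the larger mean norm is taken as "mass".
mass_comp = int(np.argmax(np.linalg.norm(gmm.means_, axis=1)))
print("pixels labelled as mass:", int(np.sum(labels == mass_comp)))
```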
107

Analysis of circular data in the dynamic model and mixture of von Mises distributions

Lan, Tian 10 December 2013 (has links)
Analysis of circular data is becoming increasingly popular in many fields of study. In this report, I present two statistical analyses of circular data using von Mises distributions. First, the expectation-maximization algorithm is reviewed and used to classify and estimate circular data from a mixture of von Mises distributions. Second, the Forward Filtering Backward Smoothing method via particle filtering is reviewed and implemented for circular data arising in dynamic state-space models.
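
A compact sketch of EM for a mixture of von Mises distributions is given below; it is an assumed minimal implementation (using a standard approximation for the concentration update), not the code from the report, and it is run on synthetic angles.

```python
import numpy as np
from scipy.special import i0

def vm_pdf(theta, mu, kappa):
    """von Mises density on the circle."""
    return np.exp(kappa * np.cos(theta - mu)) / (2.0 * np.pi * i0(kappa))

def em_vonmises_mixture(theta, k=2, n_iter=100, seed=0):
    """EM for a k-component von Mises mixture (compact sketch)."""
    rng = np.random.default_rng(seed)
    n = len(theta)
    w = np.full(k, 1.0 / k)                    # mixing weights
    mu = rng.uniform(-np.pi, np.pi, size=k)    # mean directions
    kappa = np.ones(k)                         # concentrations
    for _ in range(n_iter):
        # E-step: responsibility of each component for each angle.
        dens = np.stack([w[j] * vm_pdf(theta, mu[j], kappa[j]) for j in range(k)], axis=1)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, mean directions, and concentrations.
        nk = resp.sum(axis=0)
        w = nk / n
        C = resp.T @ np.cos(theta)
        S = resp.T @ np.sin(theta)
        mu = np.arctan2(S, C)
        rbar = np.clip(np.sqrt(C**2 + S**2) / nk, 1e-6, 1 - 1e-6)
        kappa = rbar * (2.0 - rbar**2) / (1.0 - rbar**2)   # standard approximation
    return w, mu, kappa

# Synthetic angles drawn from two von Mises components.
rng = np.random.default_rng(3)
theta = np.concatenate([rng.vonmises(0.5, 8.0, 300), rng.vonmises(-2.0, 4.0, 200)])
print(em_vonmises_mixture(theta))
```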
108

Weakly supervised part-of-speech tagging for Chinese using label propagation

Ding, Weiwei 02 February 2012 (has links)
Part-of-speech (POS) tagging is one of the most fundamental and crucial tasks in Natural Language Processing. Chinese POS tagging is challenging because it also involves word segmentation. In this report, research focuses on how to improve unsupervised POS tagging using Hidden Markov Models and the Expectation Maximization parameter estimation approach (EM-HMM). The traditional EM-HMM system uses a dictionary to constrain possible tag sequences and initialize the model parameters. This is a very crude initialization: the emission parameters are set uniformly in accordance with the tag dictionary. To improve this, word alignments can be used. Word alignments are the word-level translation correspondence pairs generated from parallel text between two languages; in this report, Chinese-English word alignment is used. The performance is expected to improve, as the two resources are complementary: the dictionary provides information on word types, while word alignment provides information on word tokens. However, this is found to be of limited benefit. This report therefore proposes another method. To improve dictionary coverage and obtain better POS distributions, Modified Adsorption, a label propagation algorithm, is used. We construct a graph connecting word tokens to feature types (such as word unigrams and bigrams) and connecting those tokens to information from knowledge sources, such as a small tag dictionary, Wiktionary, and word alignments. The core idea is to use a small amount of supervision, in the form of a tag dictionary, to acquire POS distributions for each word (both known and unknown) and provide these as an improved initialization for EM learning of the HMM. We find this strategy works very well, especially when the tag dictionary is small: label propagation provides a better initialization for the EM-HMM method because it greatly increases the coverage of the dictionary. In addition, label propagation is quite flexible in incorporating many kinds of knowledge. However, results also show that some resources, such as the word alignments, are not easily exploited with label propagation.
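
The graph-propagation idea can be illustrated with plain iterative label propagation on a tiny token-feature graph. Modified Adsorption itself adds per-node weights and an abandonment term, so the sketch below (with made-up nodes and a two-tag dictionary seed) only shows the basic mechanism of spreading tag distributions from a small dictionary through the graph.

```python
import numpy as np

# Tiny made-up graph: word-token nodes connected to feature-type nodes (e.g. unigrams).
nodes = ["tok1", "tok2", "tok3", "feat_wordA", "feat_wordB"]
edges = [("tok1", "feat_wordA"), ("tok2", "feat_wordA"),
         ("tok2", "feat_wordB"), ("tok3", "feat_wordB")]
tags = ["NOUN", "VERB"]

idx = {v: i for i, v in enumerate(nodes)}
A = np.zeros((len(nodes), len(nodes)))
for u, v in edges:
    A[idx[u], idx[v]] = A[idx[v], idx[u]] = 1.0

# Seed distributions from a small tag dictionary: tok1 is listed as NOUN, tok3 as VERB.
seeds = {"tok1": np.array([1.0, 0.0]), "tok3": np.array([0.0, 1.0])}

Y = np.full((len(nodes), len(tags)), 1.0 / len(tags))          # start uniform
for _ in range(100):
    # Each node takes the average distribution of its neighbours; seeds stay clamped.
    Y = A @ Y / np.maximum(A.sum(axis=1, keepdims=True), 1e-12)
    for v, dist in seeds.items():
        Y[idx[v]] = dist

for v in nodes:
    print(v, np.round(Y[idx[v]], 3))   # tag distributions usable to initialize EM-HMM
```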
109

Statistical Analysis of Operational Data for Manufacturing System Performance Improvement

Wang, Zhenrui January 2013 (has links)
The performance of a manufacturing system relies on four types of elements: operators, machines, the computer system, and the material handling system. To ensure the performance of these elements, operational data containing various aspects of information are collected for monitoring and analysis. This dissertation focuses on operator performance evaluation and machine failure prediction. The proposed research is motivated by the following challenges in analyzing operational data: (i) the complex relationships between the variables, (ii) the implicit information important to failure prediction, and (iii) data with outliers and with missing or erroneous measurements. To overcome these challenges, the following research has been conducted. To compare operator performance, a methodology combining regression modeling and a multiple comparisons technique is proposed. The regression model quantifies and removes the complex effects of other impacting factors on operator performance. A robust zero-inflated Poisson (ZIP) model is developed to reduce the impact of excessive zeros and outliers in the performance metric, i.e. the number of defects (NoD), on the regression analysis. The model residuals are plotted in non-parametric statistical charts for performance comparison, and the estimated model coefficients are also used to identify under-performing machines. To detect temporal patterns in operational data sequences, an algorithm is proposed for detecting interval-based asynchronous periodic patterns (APP); it detects patterns effectively and efficiently through a modified clustering and a convolution-based template matching method. To predict machine failures from covariates with erroneous measurements, a new method is proposed for statistical inference of the proportional hazards model under a mixture of classical and Berkson errors. The method estimates the model coefficients with an expectation-maximization (EM) algorithm whose expectation step is carried out by Monte Carlo simulation. A model estimated with the proposed method improves the accuracy of inference on machine failure probability. The research presented in this dissertation provides a package of solutions to improve manufacturing system performance; the effectiveness and efficiency of the proposed methodologies have been demonstrated and justified with both numerical simulations and real-world case studies.
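
The zero-inflated Poisson component can be illustrated in isolation. The sketch below fits a plain ZIP distribution (inflation probability and Poisson mean only, without the regression covariates or the robustness modifications of the dissertation) to simulated defect counts by maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def zip_negloglik(params, y):
    """Negative log-likelihood of a zero-inflated Poisson (no covariates)."""
    pi = 1.0 / (1.0 + np.exp(-params[0]))   # logit-parameterized zero-inflation probability
    lam = np.exp(params[1])                  # log-parameterized Poisson mean
    zero = (y == 0)
    ll_zero = np.log(pi + (1.0 - pi) * np.exp(-lam))            # log P(Y = 0)
    ll_pos = (np.log(1.0 - pi) - lam                            # log P(Y = y), y > 0
              + y[~zero] * np.log(lam) - gammaln(y[~zero] + 1.0))
    return -(zero.sum() * ll_zero + ll_pos.sum())

# Simulated number-of-defects counts: excess zeros plus a Poisson component.
rng = np.random.default_rng(2)
n, pi_true, lam_true = 2000, 0.4, 2.5
y = np.where(rng.random(n) < pi_true, 0, rng.poisson(lam_true, size=n))

res = minimize(zip_negloglik, x0=np.zeros(2), args=(y,), method="BFGS")
pi_hat, lam_hat = 1.0 / (1.0 + np.exp(-res.x[0])), np.exp(res.x[1])
print(f"pi_hat = {pi_hat:.3f}, lambda_hat = {lam_hat:.3f}")   # roughly 0.4 and 2.5
```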
110

The Model of Company Value Maximisation in the Context of Value Drivers

Bačkytė, Agnė 17 June 2010 (has links)
The analysis of company value maximization examined the importance of maximizing company value, identified which value drivers have the greatest influence on company value, and determined which valuation method is most suitable for incorporating value drivers. Based on this research, a model of company value maximisation in the context of value drivers was created.
