501

Construct validity of a managerial assessment centre

Nako, Zovuyo Chulekazi 12 1900 (has links)
This was a correlational study exploring the relationships between scores on various dimensions within and across different exercises in the leadership assessment and development centre (LADC) of an auditing firm in Johannesburg. The study specifically aimed to investigate the discriminant and convergent validity of the LADC. LADC ratings collected from a sample of 138 were analysed using the Pearson product-moment correlation (r), and principal component analysis (PCA) was conducted to discover the main dimensions or constructs. Twenty-one dimensions were measured using six different exercises in the LADC. The large correlations found in the study showed a lack of discriminant validity among the majority of different dimensions measured in the same exercises, whilst the PCA showed some convergent validity among various dimensions measured across exercises. Lastly, the PCA findings supported a two-factor structure, indicating that assessors are able to differentiate between interpersonal and performance-related dimensions. / Industrial and Organisational Psychology / M.A. (Industrial and Organisational Psychology)
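The two analysis steps this abstract describes (Pearson correlations among dimension ratings, then a PCA to recover the main constructs) can be sketched as follows. The rating matrix is synthetic and the two-construct loading structure is an illustrative assumption, not the LADC data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ratings: 138 candidates x 6 dimensions (a hypothetical stand-in
# for the 21-dimension, 6-exercise LADC design described in the abstract).
n_candidates, n_dims = 138, 6
latent = rng.normal(size=(n_candidates, 2))                # two underlying constructs
loadings = np.array([[1.0, 0.1], [0.9, 0.2], [0.8, 0.0],   # "interpersonal" block
                     [0.1, 1.0], [0.2, 0.9], [0.0, 0.8]])  # "performance" block
ratings = latent @ loadings.T + 0.3 * rng.normal(size=(n_candidates, n_dims))

# Pearson correlation matrix across dimensions.
R = np.corrcoef(ratings, rowvar=False)

# PCA via eigendecomposition of the correlation matrix, components
# sorted by explained variance.
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
explained = eigvals / eigvals.sum()

print("variance explained by first two components:", round(explained[:2].sum(), 2))
```

With a genuine two-factor structure in the data, the first two components absorb most of the variance, mirroring the two-factor result reported above.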
502

Characterisation and classification of protein sequences by using enhanced amino acid indices and signal processing-based methods

Chrysostomou, Charalambos January 2013 (has links)
Protein sequencing has produced an overwhelming amount of protein sequences, especially in the last decade. Nevertheless, the functional and structural classes of the majority of proteins are still unknown, and the experimental methods currently used to determine these properties are very expensive, laborious and time-consuming. Automated computational methods are therefore urgently required to predict the functional and structural classes of proteins accurately and reliably. Several bioinformatics methods have been developed to determine such properties directly from sequence information. Methods involving signal processing have recently become popular in bioinformatics; they have been investigated for the analysis of DNA and protein sequences and shown to help characterise the sequences better. However, various technical issues need to be addressed to overcome problems associated with applying signal processing methods to protein sequences. The amino acid indices used to transform protein sequences into signals have various applications and can represent diverse features of protein sequences and amino acids. As the majority of indices have similar features, this project proposes a new set of computationally derived indices that better represent the original group of indices. A study is also carried out that results in a unique and universal set of best-discriminating amino acid indices for the characterisation of allergenic proteins. This analysis extracts features directly from the protein sequences by using the Discrete Fourier Transform (DFT) to build a classification model for allergenic proteins based on Support Vector Machines (SVM). The proposed predictive model yields a higher and more reliable accuracy than existing methods. A new method is proposed for performing multiple sequence alignment. 
For this method, a DFT-based approach is used to construct a new distance matrix, in combination with multiple amino acid indices used to encode the protein sequences into numerical sequences. Additionally, a new type of substitution matrix is proposed in which the physicochemical similarities between any given amino acids are calculated. These similarities were computed from the 25 selected amino acid indices, each of which represents a unique biological protein feature. The proposed multiple sequence alignment method yields a better and more reliable alignment than existing methods. In order to evaluate the complex information generated by the DFT, Complex Informational Spectrum Analysis (CISA) is developed and presented. The results show that when protein classes present similarities or differences in the Common Frequency Peak (CFP) for specific amino acid indices, these classes are probably related to the protein feature that the specific index represents. Using only the absolute spectrum in informational spectrum analysis of protein sequences is shown to be insufficient, as biologically related features can appear individually in either the real or the imaginary spectrum. This is successfully demonstrated in the analysis of influenza neuraminidase protein sequences. Upon identification of a new protein, it is important to single out the amino acids responsible for the structural and functional classification of the protein, as well as the amino acids contributing to its specific biological characterisation. In this work, a novel approach is presented to identify and quantify the relationship between individual amino acids and the protein, again demonstrated on influenza neuraminidase protein sequences. 
The characterisation and identification problem for Influenza A virus protein sequences is tackled through a Subgroup Discovery (SD) algorithm, which can provide ancillary knowledge to experts. The main objective of this case study was to derive interpretable knowledge for the influenza A virus problem and consequently to describe the relationships between subtypes of this virus better. Finally, using DFT-based sequence-driven features, a Support Vector Machine (SVM)-based classification model was built and tested that yields higher predictive accuracy than SD. The methods developed and presented in this study yield promising results and can be easily applied in proteomic fields.
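The core encoding step this abstract relies on can be sketched as follows: a protein sequence is mapped to a numerical signal through an amino acid index and analysed with the DFT. The Kyte-Doolittle hydrophobicity index used here is a standard published index standing in for the thesis's computationally derived index sets, and the sequence is an arbitrary example:

```python
import numpy as np

# Kyte-Doolittle hydrophobicity index (a standard published amino acid index;
# the thesis derives its own index sets, which are not reproduced here).
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def power_spectrum(sequence, index=KD):
    """Encode a protein sequence with an amino acid index and return the
    DFT power spectrum (zero-frequency term excluded)."""
    signal = np.array([index[aa] for aa in sequence], dtype=float)
    signal -= signal.mean()                 # remove the DC component
    spectrum = np.fft.fft(signal)
    return np.abs(spectrum[1 : len(signal) // 2 + 1]) ** 2

spec = power_spectrum("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")  # arbitrary example
print("dominant frequency bin:", int(np.argmax(spec)) + 1)
```

Spectra like this one are the raw material for the common-frequency-peak comparisons the abstract describes; the complex (real/imaginary) spectrum rather than only the absolute spectrum is what CISA then examines.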
503

Discussions of Risks in Currency Markets from the Perspective of Principal Component Analysis

郭芝岑, Kuo, Chih Chin Unknown Date (has links)
This paper investigates whether the common risk factors in currency markets proposed by Lustig, Roussanov and Verdelhan (2011), the dollar and carry trade risks, still exist over shorter periods and under different financial backgrounds. The full sample is split into eight subsamples according to major financial events. A total of 39 currencies is used to build six portfolios sorted on forward discounts from November 1983 to October 2015. The whole paper takes the view of an American investor. The findings suggest that, over both long-run and short-run periods, the dollar return is always the common factor in currency markets. However, the same does not hold for the carry trade return. During banking crises, the carry trade return is strongly negatively correlated with the second principal component; its correlation with the second component fell from 0.8~0.9 in the earlier subsamples to -0.80 during and after the subprime crisis. This indicates that carry trade activity appears to have changed since the 2008 subprime crisis. In addition, the carry trade risk factor has little power to explain the variation of momentum returns.
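The portfolio construction and principal component step described in this abstract can be sketched on synthetic data; the currency returns, forward discounts, and the single common shock below are illustrative assumptions, not the study's sample:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly data: 120 months x 39 currencies (stand-ins for the
# November 1983 - October 2015 sample described in the abstract).
T, N, n_pf = 120, 39, 6
fwd_discount = rng.normal(size=(T, N))               # forward discounts
common = rng.normal(size=(T, 1))                      # a "dollar"-like common shock
excess_ret = common + 0.5 * rng.normal(size=(T, N))   # currency excess returns

# Each month, sort currencies by forward discount and form 6 portfolios.
pf_returns = np.empty((T, n_pf))
for t in range(T):
    order = np.argsort(fwd_discount[t])
    for k, bucket in enumerate(np.array_split(order, n_pf)):
        pf_returns[t, k] = excess_ret[t, bucket].mean()

# Principal components of the portfolio returns; with a strong common
# shock the first component behaves like the average ("dollar") return.
centered = pf_returns - pf_returns.mean(axis=0)
eigvals = np.sort(np.linalg.eigvalsh(np.cov(centered, rowvar=False)))
share_first = eigvals[-1] / eigvals.sum()
print("variance share of first component:", round(share_first, 2))
```

In this toy setting the first component dominates by construction; in the paper the interest lies in how the second component's link to carry trade returns shifts across subsamples.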
504

Pricing and Hedging of Multi-Asset Structured Products Using Quasi-Monte Carlo Simulation

粘哲偉 Unknown Date (has links)
Structured products, whose risk lies between that of fixed-income securities and equities, have been popular with investors since their introduction, as they offer both principal protection and participation in stock market gains. Since 2004, with the economy gradually recovering and stock markets expected to rise, equity-linked structured products have been issued continuously; their embedded options increasingly link multiple assets and carry exotic path-dependent clauses, so their valuation poses a high-dimensional problem. Such problems are generally handled with traditional Monte Carlo simulation, whose slow convergence is its greatest practical drawback, and the simulation time required becomes even more significant in high dimensions. This thesis makes two main contributions. First, it applies the quasi-Monte Carlo method to the pricing of multi-asset structured products, adopting the approaches of Silva (2003) and Acworth, Broadie, and Glasserman (1998) to improve the quasi-Monte Carlo method, and prices two structured products available on the market: a high-yield lock-in note and a preferred lock-in note. The results show that the improved quasi-Monte Carlo method is more efficient for pricing than both the plain Monte Carlo and the antithetic-variate Monte Carlo methods. Second, the thesis proposes a delta hedging strategy for the high-yield lock-in note: the option's delta with respect to the underlying returns is computed first and then converted into the required stock positions. The resulting hedge portfolio is found to fully cover the coupons payable to investors at each annual maturity as well as the borrowing required during hedging, indicating that the strategy is feasible and can serve as a hedging reference for issuers.
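The quasi-Monte Carlo pricing idea summarised in this abstract can be sketched for a simple multi-asset payoff; the three-asset basket call, its parameters, the independence of the assets, and the use of a scrambled Sobol sequence are illustrative assumptions, not the notes priced in the thesis:

```python
import numpy as np
from scipy.stats import norm, qmc

# Illustrative parameters for a 3-asset basket call (hypothetical; the
# thesis prices actual multi-asset structured notes, not reproduced here).
n_assets, S0, K, r, sigma, T = 3, 100.0, 100.0, 0.02, 0.2, 1.0

def basket_call_price(z):
    """Discounted average payoff given standard-normal draws z (n x d),
    assuming independent geometric Brownian motions."""
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    payoff = np.maximum(ST.mean(axis=1) - K, 0.0)
    return np.exp(-r * T) * payoff.mean()

# Quasi-Monte Carlo: scrambled Sobol points mapped to normals via the
# inverse CDF, versus plain pseudo-random Monte Carlo.
sobol = qmc.Sobol(d=n_assets, scramble=True, seed=0)
z_qmc = norm.ppf(sobol.random_base2(m=12))       # 2**12 = 4096 points
z_mc = np.random.default_rng(0).standard_normal((4096, n_assets))

price_qmc = basket_call_price(z_qmc)
price_mc = basket_call_price(z_mc)
print("QMC price:", round(price_qmc, 3))
print("MC  price:", round(price_mc, 3))
```

The low-discrepancy points fill the unit cube more evenly than pseudo-random draws, which is where the faster convergence reported in the thesis comes from.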
505

A Study of "Spaghetti" PCA for Time-Dependent Interval Data Applied to Stock Prices in Taiwan

邱大倞, Chiu, Ta Ching Unknown Date (has links)
Interval data are generally defined by the upper and lower values assumed by a unit for a continuous variable. In this study, we introduce a special type of interval description that depends on time. The original idea (Irpino, 2006, Pattern Recognition Letters, 27, 504-513) is that each observation is characterised by an oriented interval of values with a starting and a closing value for each period of observation: for example, the opening and closing price of a stock in a week. Several factorial methods have been developed to treat interval data, but not yet for oriented intervals. Irpino presented an extension of principal component analysis to time-dependent interval data or, more generally, to oriented intervals. From a geometrical point of view, the approach can be considered an analysis of oriented segments (nicely called "spaghetti") defined in a multidimensional space identified by periods. In this paper, we extend the method further, using not only the opening and closing values but also the highest and lowest values in a week, to extract information that cannot be obtained from the original idea. We also replace the uniform distribution used by Irpino with a beta distribution to see whether this yields any improvement. These extensions require many complicated computations, including the mean, variance and covariance, in order to obtain the correlation matrix for PCA. With regard to the value of the information, we conclude that the extended method incorporating the highest and lowest prices is the better choice.
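A loose sketch of the oriented-interval idea described above: each weekly (open, close) pair is summarised by a midpoint and a signed width, and an ordinary PCA is run on the resulting variables. The data are synthetic, and Irpino's exact interval moment formulas are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic weekly stock data: 52 weeks x 4 stocks, each week described by
# an oriented (open, close) interval. Stand-in data for illustration only.
weeks, stocks = 52, 4
open_p = 100 + np.cumsum(rng.normal(0, 1, size=(weeks, stocks)), axis=0)
close_p = open_p + rng.normal(0, 2, size=(weeks, stocks))

# Summarise each oriented interval by its midpoint and signed width
# (close - open keeps the direction of the weekly movement).
midpoint = (open_p + close_p) / 2.0
signed_width = close_p - open_p

# Stack the two descriptions and run an ordinary PCA on the correlation
# matrix of the resulting 2 * stocks variables.
X = np.hstack([midpoint, signed_width])
R = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
explained = eigvals / eigvals.sum()
print("first two components explain:", round(explained[:2].sum(), 2))
```

The paper's extension adds the weekly high and low (and a beta weighting) to this picture, which would enlarge the variable set in the same way.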
506

Biogeochemical Defluoridation

Evans-Tokaryk, Kerry 09 June 2011 (has links)
Fluoride in drinking water can lead to a crippling disease called fluorosis. As there is no cure for fluorosis, prevention is the only means of controlling the disease, and research into fluoride remediation is critical. This work begins by providing a new approach to assessing fluoride remediation strategies using a combination of groundwater chemistry, saturation indices, and multivariate statistics, based on the results of a large groundwater survey performed in a fluoride-contaminated region of India. From the Indian groundwater study, it was noted that one technique recommended for defluoridation involved using hydrous ferric oxide (HFO) as a solid-phase sorbent for fluoride. This prompted investigation of bacteriogenic iron oxides (BIOS), a biogenic form of HFO, as a means of approaching bioremediation of fluoride. Batch sorption experiments at ionic strengths ranging from 0.001 to 0.1 M KNO3 and time-course kinetic studies with BIOS and synthetic HFO were conducted to ascertain total sorption capacities (ST), sorption constants (Ks), and orders of reaction (n), as well as forward (kf) and reverse (kr) rate constants. Microcosm titration experiments were also conducted with BIOS and HFO in natural spring water from a groundwater discharge zone to evaluate fluoride sorption under field conditions. This thesis contributes significant new information regarding the interaction between fluoride and BIOS, advancing knowledge of fluoride remediation and covering new ground in the uncharted field of fluoride bioremediation.
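A hedged sketch of the batch-sorption fitting described above, assuming a common Langmuir-type isotherm (the thesis's actual model and measured data are not reproduced); it recovers a total sorption capacity and a sorption constant from synthetic data:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_eq, s_total, k_s):
    """Langmuir isotherm: sorbed amount vs equilibrium concentration,
    with total sorption capacity s_total and sorption constant k_s."""
    return s_total * k_s * c_eq / (1.0 + k_s * c_eq)

# Synthetic batch-sorption data (hypothetical; the thesis fits measured
# fluoride sorption onto BIOS and synthetic HFO).
c_eq = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])   # equilibrium conc., mg/L
true_st, true_ks = 8.0, 0.4
sorbed = langmuir(c_eq, true_st, true_ks) * (1 + 0.02 * np.array([1, -1, 1, -1, 1, -1]))

(st_fit, ks_fit), _ = curve_fit(langmuir, c_eq, sorbed, p0=(5.0, 0.1))
print(f"fitted S_T = {st_fit:.2f}, K_s = {ks_fit:.3f}")
```

The same least-squares machinery extends to the time-course data mentioned in the abstract by fitting forward and reverse rate constants to a kinetic model instead of an isotherm.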
508

Vibrational spectroscopy in pharmaceutical analysis

Průchová, Kristýna January 2012 (has links)
The aim of this diploma thesis is the application of vibrational spectroscopy to pharmaceutical analysis, in the study of solid pharmaceutical forms. The surfaces of tablet samples containing an active substance from the statin group were studied mainly by infrared microscopy. After determining the optimal measurement conditions, spectral maps of the samples were collected using the techniques of specular reflection, attenuated total reflection (ATR) and "inverse" ATR. One-dimensional analysis and principal component analysis were used to evaluate the measured maps. As the same tablet samples were also measured by Raman microscopy, a comparison of the methods is provided. The measured distribution maps enable both the determination of the substances present in a sample and conclusions about the method of tablet preparation; in this case the method is granulation, which was determined from a comparison of the maps of the generic and original medicaments. By weighing the particular methods against each other, with consideration of their advantages and disadvantages in measurement and data processing, specular reflection was selected as the most appropriate technique for obtaining maps of a tablet's surface.
509

Considering the flexibility of human resources in planning and scheduling industrial activities

Atalla El-Awady Attia, El-Awady 05 April 2013 (has links)
The growing need for responsiveness among manufacturing companies facing market volatility raises a strong demand for flexibility in their organisation. This flexibility can be used to enhance the robustness of a baseline schedule for a given programme of activities. Since company personnel are increasingly seen as the core of organisational structures, they provide decision-makers with a source of renewable and viable flexibility. First, this work models the problem of multi-period workforce allocation on industrial activities with two degrees of flexibility. The first is the annualisation of working time, which offers opportunities to change schedules, individually as well as collectively. The second is the versatility of operators, which induces a dynamic view of their skills and the need to predict changes in individual performance as a result of successive assignments. The dynamic nature of workforce experience was modelled as a function of learning-by-doing and of skill loss during work interruption periods. We therefore set ourselves firmly in a context where the expected durations of activities are no longer deterministic, but result from the number and experience levels of the workers assigned to perform them. 
The research then turned to the question: "What kind, or what size, of problem does the project we have to schedule pose?" The different dimensions of the problem are accordingly inventoried and analysed so that they can be measured. For each dimension, the most relevant assessment method is proposed; the resulting correlated measures are then aggregated through a factor analysis to produce the principal components of an instance. Consequently, the complexity (or easiness) of solving a given scheduling problem, that is, of building a satisfactory schedule for it, can be evaluated. To that end, we developed a software platform, based on a genetic algorithm, to solve the problem and construct the project baseline schedule with the associated resource allocation. The model was validated, and its parameters were tuned via designed experiments to guarantee the best performance. 
The robustness of this performance was also investigated through the complete solving of a sample of four hundred projects, ranked according to the number of their tasks. Because of the dynamic aspect of operator efficiency, this work examines a set of factors that influence the development of operator versatility. The results logically conclude that a firm seeking flexibility must accept extra costs to develop the versatility of its operators. To control these additional costs, the number of operators who follow a skill development programme should be optimised, as should, for each of them, the degree of similarity between the newly developed skills and the initial ones, the number of these additional skills, and the way the operators' working hours are distributed over the skill acquisition period. Finally, this work opens the door to the future consideration of human factors and workforce flexibility in the construction of a baseline schedule.
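The dynamic efficiency model described in this abstract (learning-by-doing during assignments, skill loss during interruptions) can be sketched with standard exponential learning and forgetting curves; the functional form and the rates here are illustrative assumptions, not the thesis's calibrated model:

```python
import math

def update_efficiency(eff, periods, working, learn_rate=0.3, forget_rate=0.1,
                      max_eff=1.0, min_eff=0.2):
    """Exponential learning while assigned to a task, exponential forgetting
    while idle. Illustrative rates, not the thesis's calibrated parameters."""
    for _ in range(periods):
        if working:
            # Efficiency converges upward toward max_eff while practising.
            eff = max_eff - (max_eff - eff) * math.exp(-learn_rate)
        else:
            # Efficiency decays toward a floor min_eff during interruptions.
            eff = min_eff + (eff - min_eff) * math.exp(-forget_rate)
    return eff

e1 = update_efficiency(0.5, 6, working=True)    # six periods on the task
e2 = update_efficiency(e1, 4, working=False)    # four periods interrupted
print("after practice:", round(e1, 3))
print("after interruption:", round(e2, 3))
```

Because operator efficiency evolves this way, activity durations become a function of who is assigned and for how long, which is precisely why the scheduling problem in the thesis is no longer deterministic.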
510

Classification of Carpiodes Using Fourier Descriptors: A Content Based Image Retrieval Approach

Trahan, Patrick 06 August 2009 (has links)
Taxonomic classification has always been important to the study of any biological system. At the current rate of classification, many biological species will go unclassified and be lost forever. The current state of computer technology makes image storage and retrieval possible on a global level; as a result, computer-aided taxonomy is now possible. Content-based image retrieval techniques utilize visual features of the image for classification. By utilizing image content and computer technology, the gap between taxonomic classification and species destruction is shrinking. This content-based study utilizes the Fourier descriptors of fifteen known landmark features on three Carpiodes species: C. carpio, C. velifer, and C. cyprinus. The classification analysis involves both unsupervised and supervised machine learning algorithms. Fourier descriptors of the fifteen known landmarks provide strong classification power on image data. Feature reduction analysis indicates that feature reduction is possible, which proves useful for increasing the generalization power of classification.
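The Fourier descriptor computation described above can be sketched as follows: landmark coordinates are treated as a complex signal, and normalised DFT magnitudes give translation-, scale- and rotation-invariant shape features. The 15-point elliptical contour is a hypothetical stand-in for measured Carpiodes landmarks:

```python
import numpy as np

def fourier_descriptors(x, y, n_desc=10):
    """Translation-, scale- and rotation-invariant Fourier descriptors of
    a closed 2-D contour sampled at (x, y) landmark points."""
    z = np.asarray(x, dtype=float) + 1j * np.asarray(y, dtype=float)
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                        # drop the centroid term (translation)
    mags = np.abs(coeffs)
    mags = mags / mags[1]                  # normalise by the first harmonic (scale)
    return mags[1 : n_desc + 1]            # magnitudes discard phase (rotation)

# Hypothetical 15-landmark outline: an ellipse standing in for a fish body
# shape (the thesis uses measured landmarks on Carpiodes specimens).
t = np.linspace(0, 2 * np.pi, 15, endpoint=False)
fd = fourier_descriptors(3 * np.cos(t) + 5, 1 * np.sin(t) - 2)
print("first descriptors:", np.round(fd[:3], 3))
```

Descriptor vectors like `fd` are what the supervised and unsupervised classifiers in the study would consume as features.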
