101

Geostatistical three-dimensional modeling of the subsurface unconsolidated materials in the Göttingen area / The transitional-probability Markov chain versus traditional indicator methods for modeling the geotechnical categories in a test site.

Ranjineh Khojasteh, Enayatollah 27 June 2013 (has links)
The aim of this study was to build a three-dimensional subsurface model of the Göttingen area based on a geotechnical classification of the unconsolidated materials. The investigated materials range from loose sediments to solid rock, but are referred to here as soils, soil classes or soil categories. The study evaluates different ways of capturing heterogeneous subsurface conditions by means of geostatistical methods and simulations. Such models are a fundamental tool in geotechnical engineering, mining, oil exploration and hydrogeology, among other fields. Detailed modeling of the required continuous parameters, such as porosity, permeability or hydraulic conductivity of the subsurface, presupposes an exact determination of the boundaries between facies and soil categories. The focus of this work is the three-dimensional modeling of unconsolidated materials and their classification based on geostatistically derived parameters. Conventional pixel-based methods as well as transition-probability-based Markov chain models were used. After a general statistical evaluation of the parameters, the presence or absence of a soil category along the boreholes is described by indicator variables. The indicator of a category at a sample point is one if the category is present and zero if it is absent; intermediate states can also be defined, for example a value of 0.5 where two categories are present but their exact proportions are unknown. To improve the stationarity of the indicator variables, the initial coordinates are transformed into a new system proportional to the top and bottom of the corresponding model layer. In the new coordinate space, the indicator variograms of each category are computed for several spatial directions. For readability, semi-variograms are also referred to as variograms in this work. Indicator kriging is then used to compute the probability of each category at each model node, and based on these probabilities the most probable category is assigned to the node. The indicator variogram models and indicator kriging parameters were validated and optimized. The effect of reducing the number of model nodes on model precision was also examined. To resolve small-scale variations of the categories, two simulation methods were applied and compared: Sequential Indicator Simulation (SISIM) and the Transition Probability Markov Chain (TP/MC). The studies show that the TP/MC method generally yields good results, particularly in comparison with SISIM. Alternative methods for similar problems are evaluated for comparison and their inefficiency is demonstrated. An improvement of the TP/MC method is also described and supported with results, and further modifications of the method are suggested. Based on the results, the method is recommended for similar problems; simulation selection, tests and evaluation schemes are proposed, and further research priorities are outlined.
A computer-aided implementation of the procedure covering all simulation steps could be developed in the future to increase efficiency. The results of this study and subsequent investigations could be relevant to a wide range of problems in mining, the petroleum industry, geotechnical engineering and hydrogeology.
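To make the indicator coding concrete, here is a minimal Python sketch (ours, not the thesis code) of the rule described above: the indicator of a category is 1 where it is present, 0 where it is absent, and a kriged node is assigned its most probable category. The borehole log and node probabilities are invented toy data.

```python
import numpy as np

# Invented borehole log: one geotechnical category code per sample depth.
categories = ["clay", "silt", "sand", "gravel"]
borehole_log = np.array(["clay", "clay", "silt", "sand", "sand", "gravel"])

def indicator_transform(log, category):
    """Indicator variable: 1 where the category is present, 0 elsewhere.
    (A value of 0.5 could mark samples where two categories coexist.)"""
    return (log == category).astype(float)

indicators = {c: indicator_transform(borehole_log, c) for c in categories}

# After indicator kriging, each grid node carries one occurrence probability
# per category; the node is assigned the most probable category.
node_probs = np.array([0.15, 0.10, 0.55, 0.20])   # toy kriged probabilities
most_probable = categories[int(np.argmax(node_probs))]
print(most_probable)  # sand
```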
102

Fast uncertainty reduction strategies relying on Gaussian process models

Chevalier, Clément 18 September 2013 (has links) (PDF)
This thesis deals with sequential and batch-sequential evaluation strategies for real-valued functions under a limited evaluation budget, using Gaussian process models. Optimal stepwise uncertainty reduction (SUR) strategies are studied for two different problems, motivated by applications in nuclear safety. First, we address the problem of identifying the excursion set above a threshold T of a real-valued function f. Second, we study the problem of identifying the set of "robust, controlled" configurations, that is, the set of controlled inputs for which the function remains below T whatever the values of the uncontrolled inputs. New SUR strategies are presented, together with efficient procedures and formulas that make them usable in concrete applications. Fast formulas for recomputing the posterior mean and covariance function of a Gaussian process (the "kriging update formulas") do not only provide substantial computational savings; they are also one of the key ingredients for obtaining closed-form expressions that make computationally expensive evaluation strategies usable in practice. A contribution to batch-sequential optimization using the multi-points Expected Improvement is also presented.
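The kriging update formulas mentioned above condition a Gaussian process on one additional observation without refactorizing the full covariance matrix. A minimal numpy sketch of that idea, with an invented squared-exponential kernel and toy data (not the thesis implementation):

```python
import numpy as np

def k(a, b, ell=0.3):
    """Squared-exponential covariance (noise-free GP prior)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

# Current design and a grid where we track the posterior.
X = np.array([0.1, 0.5, 0.9]); y = np.sin(6 * X)
grid = np.linspace(0, 1, 5)

Kinv = np.linalg.inv(k(X, X) + 1e-10 * np.eye(len(X)))
m = k(grid, X) @ Kinv @ y                            # posterior mean on grid
C = k(grid, grid) - k(grid, X) @ Kinv @ k(X, grid)   # posterior covariance

# Kriging update: fold in one new observation (x_new, y_new) using only the
# current posterior quantities, with no new matrix factorization.
x_new, y_new = np.array([0.3]), np.sin(6 * 0.3)
c_new = k(grid, x_new) - k(grid, X) @ Kinv @ k(X, x_new)       # cross-cov
s2 = (k(x_new, x_new) - k(x_new, X) @ Kinv @ k(X, x_new)).item()
m_at_new = (k(x_new, X) @ Kinv @ y).item()

m_updated = m + (c_new[:, 0] / s2) * (y_new - m_at_new)
C_updated = C - np.outer(c_new[:, 0], c_new[:, 0]) / s2
```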
103

A Comparison between Lasso Significance Test and Forward Stepwise Selection Method

鄒昀庭, Tsou, Yun Ting Unknown Date (has links)
Variable selection for a regression model is an essential topic. In 1996, Tibshirani proposed the Lasso (Least Absolute Shrinkage and Selection Operator), whose main feature is that it selects the variable set while estimating the parameters. However, the original version of the Lasso does not provide a way of making inferences. The significance test for the Lasso proposed by Lockhart et al. in 2014 is therefore an important breakthrough. Because the Lasso significance test statistic is constructed much like that of traditional forward stepwise regression, we continue the comparison of the two methods begun by Lockhart et al. (2014) and propose an improved version of forward selection based on the bootstrap. In the second half of the research, we compare the variable selection results of five methods: the Lasso, the Lasso significance test, traditional forward selection, forward selection by AIC, and forward selection improved by the bootstrap. We find that although the Type I error probability of the Lasso significance test is small, the test is too conservative about including new variables. The Type I error probability of forward selection by bootstrap is also small, yet it is bolder in including new variables. Based on our simulation results, forward selection improved by the bootstrap is therefore an attractive variable selection method.
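As a rough illustration of the two selectors being compared, here is a sketch on synthetic data; `SequentialFeatureSelector` stands in for traditional forward stepwise, and the loop sketches the bootstrap idea (the thesis's exact procedure and stopping rules may differ).

```python
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression
from sklearn.feature_selection import SequentialFeatureSelector

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
y = 2 * X[:, 0] - 1.5 * X[:, 3] + rng.standard_normal(n)  # only x0, x3 matter

# Lasso: variables with nonzero coefficients at the CV-chosen penalty.
lasso = LassoCV(cv=5).fit(X, y)
print("lasso picks:", np.flatnonzero(lasso.coef_))

# Traditional forward stepwise: greedily add the predictor that helps most.
fwd = SequentialFeatureSelector(LinearRegression(), n_features_to_select=2,
                                direction="forward").fit(X, y)
print("forward picks:", np.flatnonzero(fwd.get_support()))

# Bootstrap-stabilized forward selection (sketch of the proposed refinement):
# rerun the forward step on resamples and keep often-chosen variables.
counts = np.zeros(p)
for _ in range(50):
    idx = rng.integers(0, n, n)
    counts += SequentialFeatureSelector(
        LinearRegression(), n_features_to_select=2,
        direction="forward").fit(X[idx], y[idx]).get_support()
print("selection frequency:", counts / 50)
```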
104

Logistic regression to determine significant factors associated with share price change

Muchabaiwa, Honest 19 February 2014 (has links)
This thesis investigates the factors that are associated with annual changes in the share price of Johannesburg Stock Exchange (JSE) listed companies. In this study, an increase in the value of a share occurs when the share price of a company is higher at the end of the financial year than in the previous year. Secondary data sourced from the McGregor BFA website, covering 2004 to 2011, was used. Deciding which share to buy is the biggest challenge faced by both investment companies and individuals when investing on the stock exchange. This thesis uses binary logistic regression to identify the variables that are associated with a share price increase. The dependent variable was the annual change in share price (ACSP) and the independent variables were the assets per capital employed ratio, debt per assets ratio, debt per equity ratio, dividend yield, earnings per share, earnings yield, operating profit margin, price earnings ratio, return on assets, return on equity and return on capital employed. Different variable selection methods were used and it was established that the backward elimination method produced the best model. It was established that the probability of success of a share is higher if the shareholders are anticipating a higher return on capital employed and high earnings per share. It was, however, noted that the share price is negatively impacted by dividend yield and earnings yield. Since the odds of an increase in share price are higher if there is a higher return on capital employed and high earnings per share, investors and investment companies are encouraged to choose companies with high earnings per share and the best returns on capital employed. The final model had a classification rate of 68.3% and the validation sample produced a classification rate of 65.2%. / Mathematical Sciences / M.Sc. (Statistics)
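A minimal sketch of backward elimination for a binary logit, assuming statsmodels and invented stand-ins for the financial ratios named above (the actual data, ratios, and significance threshold may differ):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
# Toy stand-ins for the financial ratios (names are illustrative only).
cols = ["roce", "eps", "dividend_yield", "earnings_yield", "debt_assets"]
X = rng.standard_normal((n, len(cols)))
logit_p = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 0.9 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)  # 1 = rose

def backward_eliminate(X, y, names, alpha=0.05):
    """Drop the least significant predictor until all p-values < alpha."""
    keep = list(range(X.shape[1]))
    while keep:
        fit = sm.Logit(y, sm.add_constant(X[:, keep])).fit(disp=0)
        pvals = fit.pvalues[1:]              # skip the intercept
        worst = int(np.argmax(pvals))
        if pvals[worst] < alpha:
            return [names[i] for i in keep], fit
        keep.pop(worst)                      # eliminate the weakest variable
    return [], None

selected, model = backward_eliminate(X, y, cols)
print("retained:", selected)
```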
105

The association between working capital measures and the returns of South African industrial firms

Smith, Marolee Beaumont 12 1900 (has links)
This study investigates the association between traditional and alternative working capital measures and the returns of industrial firms listed on the Johannesburg Stock Exchange. Twenty-five variables for all industrial firms listed for the most recent 10 years were derived from standardised annual balance sheet data of the University of Pretoria's Bureau of Financial Analysis. Traditional liquidity ratios measuring working capital position, activity and leverage, and alternative liquidity measures, were calculated for each of the 135 participating firms for the 10 years. These working capital measures were tested for association with five return measures for every firm over the same period. This was done by means of a chi-square test for association, followed by stepwise multiple regression undertaken to quantify the underlying structural relationships between the return measures and the working capital measures. The results of the tests indicated that the traditional working capital leverage measures, in particular total current liabilities divided by funds flow, and to a lesser extent long-term loan capital divided by net working capital, displayed the greatest associations and explained the majority of the variance in the return measures. A t-test, undertaken to analyse the size effect on the working capital measures employed by the participating firms, compared firms according to total assets. The results revealed significant differences between the means of the top quartile of firms and the bottom quartile for eight of the 13 working capital measures included in the study. A nonparametric test was applied to evaluate the sector effect on the working capital measures employed by the participating firms. The rank scores indicated significant differences in the means across the sectors for six of the 13 working capital measures. A decrease in the working capital leverage measures of current liabilities divided by funds flow, and long-term loan capital divided by net working capital, should signal an increase in returns, and vice versa. It is recommended that financial managers consider these findings when forecasting firm returns. / Business Management / D. Com. (Business Management)
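The size-effect comparison described above, top versus bottom quartile by total assets, can be sketched as a two-sample t-test; the firm data below are invented and the measure name is illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Toy firm data: total assets and one working capital leverage measure
# (current liabilities / funds flow); 135 firms as in the study.
total_assets = rng.lognormal(mean=10, sigma=1, size=135)
cl_funds_flow = 1.5 + 0.2 * np.log(total_assets) + 0.3 * rng.standard_normal(135)

q1, q3 = np.quantile(total_assets, [0.25, 0.75])
small = cl_funds_flow[total_assets <= q1]   # bottom quartile by size
large = cl_funds_flow[total_assets >= q3]   # top quartile by size

t, p = stats.ttest_ind(large, small, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")
```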
106

CRISTA: a computational support for code inspection and comprehension activities

Porto, Daniel de Paula 18 May 2009 (has links)
Software inspection is a key activity of software quality assurance that can be applied throughout the development process, since it is a static activity essentially based on reading. Depending on the artifact being inspected, an appropriate reading technique must be applied. Stepwise Abstraction (SA) is a reading technique commonly used in code inspections; however, its application is laborious and time consuming. Aiming to support and facilitate the application of SA, this work presents CRISTA (Code Reading Implemented with Stepwise Abstraction), a tool to support SA-based inspection processes. The tool uses a visual metaphor to facilitate code navigation and has several resources to help program understanding and documentation. Owing to these resources, CRISTA is also helpful for reverse engineering, re-engineering and maintenance activities. Three experimental studies were carried out to get feedback on the tool's usability and its usefulness for inspection and maintenance activities. The results suggest that CRISTA is easy to use and adequately supports the inspection process as well as code reading by Stepwise Abstraction. Moreover, in the context of maintenance, its resources make this activity less time-consuming. / Financiadora de Estudos e Projetos
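As a toy illustration of Stepwise Abstraction itself, the reading technique CRISTA supports, consider abstracting a small function bottom-up; this example is ours, not taken from the tool:

```python
# Stepwise Abstraction in miniature: read the lowest-level blocks first,
# abstract each into a statement of *what* it does, then compose upward.
def mystery(xs):
    total = 0
    count = 0
    for x in xs:           # block 1 abstracts to: "sum and count the items"
        total += x
        count += 1
    if count == 0:         # block 2 abstracts to: "guard against empty input"
        return None
    return total / count   # whole function abstracts to: "arithmetic mean"
```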
107

Geography of College Opportunity: Situating Community College Baccalaureates across Demographic Differences

Leonard, Michael B. January 2020 (has links)
No description available.
108

Assessment of lignin content in needles of Norway Spruce (Picea abies L. Karst.) using laboratory and image spectroscopy

Suchá, Renáta January 2013 (has links)
The master thesis deals with the determination of the content of selected biochemicals (lignin, carotenoids, water) in Norway spruce needles using laboratory and imaging spectroscopy. The first part of the thesis summarizes the literature on methods of estimating lignin and other biochemical contents. Three types of data are used: (1) spectra measured by a contact probe with an ASD FieldSpec 4 Wide Res spectroradiometer, (2) spectra measured by an integrating sphere with the spectroradiometer, and (3) aerial hyperspectral image data acquired by the APEX sensor. The most useful transformation methods, first derivative and continuum removal, are applied to the spectra. The linear relationship between the measured spectra and biochemical content is then analysed. Stepwise multiple linear regression is applied to select suitable wavelengths for modeling the biochemical content of spruce needles. The model is also calculated and applied at the level of hyperspectral image data. Maps of lignin content in Norway spruce are the final output of this part of the thesis. The next part of the thesis compares spectra measured by the contact probe with spectra measured by the integrating sphere. Differences between the studied areas based on the biochemical content of spruce needles and several chemical elements in the soil and based on...
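Continuum removal, one of the transformations named above, divides a spectrum by its upper convex hull. A self-contained sketch of the standard approach (our implementation on an invented toy spectrum, not the thesis code):

```python
import numpy as np

def continuum_removal(wavelengths, reflectance):
    """Divide a spectrum by its upper convex hull (the 'continuum'),
    a common transform before relating absorption features to biochemistry."""
    # Build the upper hull with a monotone-chain sweep over the points.
    hull = [0]
    for i in range(1, len(wavelengths)):
        while len(hull) >= 2:
            x1, x2 = wavelengths[hull[-2]], wavelengths[hull[-1]]
            y1, y2 = reflectance[hull[-2]], reflectance[hull[-1]]
            # Pop the last hull point if it lies on or below the new chord.
            if (y2 - y1) * (wavelengths[i] - x1) <= (reflectance[i] - y1) * (x2 - x1):
                hull.pop()
            else:
                break
        hull.append(i)
    continuum = np.interp(wavelengths, wavelengths[hull], reflectance[hull])
    return reflectance / continuum

wl = np.linspace(400, 2500, 211)                       # nm, toy band grid
spec = 0.5 + 0.3 * np.sin(wl / 700) - 0.1 * np.exp(-((wl - 1700) / 60) ** 2)
cr = continuum_removal(wl, spec)   # values in (0, 1]; hull points equal 1
```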
109

Antenna Optimization in Long-Term Evolution Networks

Deng, Qichen January 2013 (has links)
The aim of this master thesis is to study algorithms for automatically tuning antenna parameters to improve the performance of the radio access part of a telecommunication network and the user experience. Four different optimization algorithms are applied to a model of a radio access network: the Stepwise Minimization Algorithm, the Random Search Algorithm, the Modified Steepest Descent Algorithm and a Multi-Objective Genetic Algorithm. The performance of each algorithm is evaluated in this thesis. Moreover, a graphical user interface developed to facilitate the antenna tuning simulations is presented in the appendix of the report.
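As a sketch of the simplest of the four, a random-search step over antenna downtilts might look like the following; the objective function is an invented stand-in for the radio network model, and the parameter ranges are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def network_quality(tilts):
    """Stand-in objective: real use would query the radio network simulator.
    Here, a smooth toy surface with an optimum near 6 degrees per antenna."""
    return -np.sum((tilts - 6.0) ** 2)

n_antennas, budget = 12, 200
bounds = (0.0, 15.0)                        # assumed downtilt range, degrees

best_t = rng.uniform(*bounds, n_antennas)   # random initial tilt plan
best_q = network_quality(best_t)
for _ in range(budget):
    cand = np.clip(best_t + rng.normal(0, 1.0, n_antennas), *bounds)
    q = network_quality(cand)
    if q > best_q:                          # keep a candidate only if better
        best_t, best_q = cand, q
print(best_q, best_t.round(1))
```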
110

Human Fatigue in Prolonged Mentally Demanding Work-Tasks: An Observational Study in the Field

Ahmed, Shaheen 17 August 2013 (has links)
Worker fatigue has been the focus of research for many years. However, there is limited research available on the evaluation and measurement of fatigue in prolonged mentally demanding activities. The objectives of the study are (1) to evaluate fatigue for prolonged, mentally demanding work-tasks by considering task-dependent, task-independent and personal factors, (2) to identify effective subjective and objective fatigue measures, (3) to establish a relationship between time and the factors that affect fatigue, and (4) to develop models to predict fatigue. A total of 16 participants, eight with western cultural backgrounds and eight with eastern cultural backgrounds, currently employed in mentally demanding work-tasks (e.g., programmers, computer simulation experts), completed the study protocols. Each participant was evaluated during normal working hours in their workplace for a 4-hour test session, with a 15-minute break provided after two hours. Fatigue was evaluated using subjective questionnaires (the Borg Perceived Level of Fatigue Scale and the Swedish Occupational Fatigue Index (SOFI)) and objective measures (change in resting heart rate and salivary cortisol excretion). Workload was also assessed using the NASA-TLX. Fatigue and workload scales were collected every 30 minutes, cortisol at the start and finish of each 2-hour work block, and heart rate throughout the test session. Fatigue significantly increased over time (p-value < 0.0001). All measures except cortisol returned to near baseline levels following the 15-minute break (p-value < 0.0001). Ethnicity was found to have limited effects on fatigue development. Poor to moderate (Rho = 0.35 to 0.75) significant correlations were observed between the subjective and objective measures. Time and fatigue load (a factor that impacts fatigue development) significantly interact to explain fatigue, represented by a hyperbolic relationship. Predictive models explained a maximum of 87% of the variation in the fatigue measures. As expected, fatigue develops over time, especially when considering other factors that can impact fatigue (e.g. hours slept, hours of work), providing further evidence of the complex nature of fatigue. As the 15-minute break was found to reduce all measures of fatigue, the development of appropriate rest breaks may mitigate some of the negative consequences of fatigue.
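The reported hyperbolic relationship between time and fatigue can be illustrated by fitting a saturating curve; the functional form and the ratings below are assumptions for illustration, not the study's data or model:

```python
import numpy as np
from scipy.optimize import curve_fit

def hyperbolic(t, a, b):
    """Fatigue rises with time but saturates: f(t) = a * t / (b + t)."""
    return a * t / (b + t)

# Invented ratings standing in for 30-minute Borg scale scores.
t = np.array([30, 60, 90, 120, 150, 180, 210, 240], dtype=float)
f = np.array([2.0, 3.1, 3.9, 4.4, 4.8, 5.1, 5.3, 5.4])

(a, b), _ = curve_fit(hyperbolic, t, f, p0=(6.0, 60.0))
print(f"asymptote a = {a:.2f}, half-rise time b = {b:.1f} min")
```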
