1

Legacy system upgrade for software risk assessment

Alexander, Byron Vernon Terry 12 1900 (has links)
Over the past 40 years, limited progress has been made in helping practitioners estimate the risk and the effort required to deliver software solutions. Recent developments improve this outlook; one in particular is the research of Juan Carlos Nogueira. Dr. Nogueira developed a formal model for risk assessment that can be used to estimate a software project's risk when examined against a desired development timeline. He developed his model based on data collected from a series of experiments conducted on the Vité Project simulation. This unique approach provides a starting point toward a proven formal model for risk assessment. Another issue in software development, especially in the Department of Defense (DoD), is dealing with aging legacy software systems. These systems perform the functionality they were designed for, but their interfaces are obsolete and changing requirements limit their functional usefulness. This thesis is an exercise in upgrading a legacy system licensed to the DoD, Vité Project, for use with ongoing DoD research that seeks to discern truly quantifiable criteria for more accurately estimating the time needed to complete any software project. Accurate projections of software development time and cost have eluded software developers for decades. / US Navy (USN) author
2

Legacy system upgrade for software risk assessment /

Alexander, Byron Vernon Terry. January 2001 (has links) (PDF)
Thesis (M.S. in Computer Science) Naval Postgraduate School, December 2001. / Thesis Advisor(s): Berzins, Valdis; Murrah, Michael. "December 2001." Includes bibliographical references (p. 91). Also available online.
3

Large-scale snowpack estimation using ensemble data assimilation methodologies, satellite observations and synthetic datasets

Su, Hua 03 June 2010 (has links)
This work focuses on a series of studies that contribute to the development and testing of advanced large-scale snow data assimilation methodologies. Compared to existing snow data assimilation methods and strategies, which are limited in domain size and landscape coverage, number of satellite sensors, and accuracy and reliability of the product, the present work covers the continental domain, compares single- and multi-sensor data assimilation, and explores uncertainties in parameters and model structure. In the first study, a continental-scale snow water equivalent (SWE) data assimilation experiment is presented, which incorporates Moderate Resolution Imaging Spectroradiometer (MODIS) snow cover fraction (SCF) data into Community Land Model (CLM) estimates via the ensemble Kalman filter (EnKF). The greatest improvements of the EnKF approach are centered in the mountainous West, the northern Great Plains, and the west and east coast regions, with the magnitude of corrections (compared to the use of the model only) greater than one standard deviation (calculated from SWE climatology) in those areas. Relatively poor performance of the EnKF, however, is found in the boreal forest region. In the second study, snowpack-related parameter and model structure errors were explicitly considered through a group of synthetic EnKF simulations which integrate synthetic datasets with model estimates. The inclusion of a new parameter estimation scheme augments the EnKF performance, for example increasing the Nash-Sutcliffe efficiency of season-long SWE estimates from 0.22 (without parameter estimation) to 0.96. In this study, the model structure error is found to significantly impact the robustness of parameter estimation. In the third study, a multi-sensor snow data assimilation system over North America was developed and evaluated. It integrates both Gravity Recovery and Climate Experiment (GRACE) terrestrial water storage (TWS) and MODIS SCF information into CLM using the EnKF and the ensemble Kalman smoother (EnKS). This GRACE/MODIS data assimilation run achieves significantly better performance than the MODIS-only run in the Saint Lawrence, Fraser, Mackenzie, Churchill & Nelson, and Yukon river basins. These improvements demonstrate the value of integrating complementary information for continental-scale snow estimation. / text
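The EnKF analysis step this abstract relies on can be sketched in a few lines. This is a minimal scalar illustration assuming direct, noisy observation of the state itself; the thesis assimilates SCF and TWS into CLM, which involves an observation operator and a full land model not shown here. All numbers are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_var):
    """One stochastic EnKF analysis step: nudge each ensemble member
    toward a perturbed copy of the observation, weighted by the
    Kalman gain computed from the ensemble spread."""
    ens_var = ensemble.var(ddof=1)
    gain = ens_var / (ens_var + obs_var)  # Kalman gain in [0, 1]
    # Perturb the observation once per member (Burgers et al. scheme)
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=ensemble.shape)
    return ensemble + gain * (perturbed - ensemble)

# 50-member ensemble of model SWE estimates (mm), biased high,
# assimilating a more trustworthy observation of 100 mm
ensemble = rng.normal(120.0, 15.0, size=50)
analysis = enkf_update(ensemble, obs=100.0, obs_var=4.0)
```

Because the observation error variance (4) is much smaller than the ensemble spread (~225), the gain is close to 1 and the analysis collapses toward the observation, which is the mechanism behind the large corrections reported in the first study.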
4

Should you optimize your portfolio? : On portfolio optimization: The optimized strategy versus the naïve and market strategy on the Swedish stock market

Ramilton, Alan January 2014 (has links)
In this paper, I evaluate the out-of-sample performance of the portfolio optimizer relative to the naïve and market strategies on the Swedish stock market from January 1998 to December 2012. Recent studies suggest that simpler strategies, such as the naïve strategy, outperform optimized strategies and should be preferred in the absence of better estimation models. Of the 12 strategies I evaluate, 11 significantly outperform both benchmark strategies in terms of Sharpe ratio. I find that the no-short-sales constrained minimum-variance strategy is preferred over the mean-variance strategy, and that the historical sample estimator creates better minimum-variance portfolios than the single-factor and three-factor models. My results suggest that there are considerable gains to optimization, in terms of both risk reduction and return, in the context of portfolio selection.
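The minimum-variance strategy the abstract favors can be sketched as follows. This is a generic global-minimum-variance computation from a sample covariance matrix, not the thesis's actual estimation pipeline; the covariance numbers are invented, and the clip-and-renormalize step is only a crude stand-in for a proper no-short-sales optimization.

```python
import numpy as np

def min_variance_weights(cov, allow_short=False):
    """Global minimum-variance portfolio: w proportional to inv(Cov) @ 1,
    normalized to sum to 1. With allow_short=False, negative weights
    are clipped to zero and renormalized (a rough approximation of
    the no-short-sales constraint)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)  # solve Cov @ w = 1 instead of inverting
    w /= w.sum()
    if not allow_short:
        w = np.clip(w, 0.0, None)
        w /= w.sum()
    return w

# Hypothetical annualized covariance matrix for three assets
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = min_variance_weights(cov)
portfolio_var = w @ cov @ w
```

By construction the resulting portfolio variance is no larger than that of the least-volatile single asset, which is the "risk reduction" channel the abstract refers to.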
5

A Software Benchmarking Methodology For Effort Estimation

Nabi, Mina 01 September 2012 (has links) (PDF)
Software project managers usually use benchmarking repositories to estimate the effort, cost, and duration of software development, estimates that are then used to plan, monitor, and control project activities. The precision of a benchmarking repository is a critical factor in the software effort estimation process, which in turn plays a critical role in the success of a software development project. To construct such a precise benchmarking data repository, it is important to define the benchmarking data attributes and data characteristics and to collect project data accordingly. Studies also show that the data characteristics of benchmark data sets affect how well studies based on those data sets generalize. The quality of a data repository depends not only on the quality of the collected data but also on how the data are collected. In this thesis, a benchmarking methodology is proposed for organizations to collect benchmarking data for effort estimation purposes. The methodology consists of three main components: benchmarking measures, benchmarking data collection processes, and a benchmarking data collection tool, and it draws on results of previous studies from the literature. To verify and validate the methodology, project data were collected in two mid-sized software organizations and one small organization using the automated benchmarking data collection tool. Effort estimation models were then constructed and evaluated for these project data, and the impact of different project characteristics on the effort estimation models was inspected.
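The way a benchmarking repository feeds an effort estimation model can be sketched with a classic power-law fit (effort = a * size^b, the functional form behind models like COCOMO). The repository data below are invented for illustration and are not from the thesis, which builds its models from data collected in the three participating organizations.

```python
import numpy as np

# Hypothetical benchmark repository: project size (KLOC) and the
# actual effort spent (person-months) -- illustrative numbers only.
size = np.array([10, 25, 40, 60, 100, 150], dtype=float)
effort = np.array([24, 55, 95, 140, 230, 330], dtype=float)

# Fit effort = a * size^b by ordinary least squares in log-log space:
# log(effort) = log(a) + b * log(size)
b, log_a = np.polyfit(np.log(size), np.log(effort), 1)
a = np.exp(log_a)

def estimate_effort(kloc):
    """Predict effort (person-months) for a new project of given size."""
    return a * kloc ** b

predicted = estimate_effort(80.0)
```

The quality of such a fit depends directly on how consistently the repository's size and effort attributes were measured, which is the thesis's motivation for defining the data collection process and tool rather than just the measures.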
6

PERDAS DE ÁGUA POR ESCOAMENTO SUPERFICIAL DE UM SOLO COM DIFERENTES NÍVEIS DE RESÍDUOS VEGETAIS E DECLIVIDADES DO TERRENO / WATER LOSSES THROUGH SURFACE RUNOFF OF A SOIL WITH DIFFERENT LEVELS OF CROP RESIDUE AND SLOPE STEEPNESS

Santa, Cleiton Dalla 26 February 2010 (has links)
The search for information and technologies that contribute to adequate soil management and water use is increasingly necessary, as soil and water conservation is of great importance for sustainable agriculture. The goal of the present work was to determine and model water losses through surface runoff from a soil with different levels of crop residue and slope steepness, using simulated rainfall. The work was conducted in an experimental area of the Departamento de Engenharia Rural da UFSM, at four locations with slopes of zero, 2.5, 5 and 8%, respectively. The experiment used a completely randomized design with three levels of oat crop residue on the surface (0, 2.5 and 5 Mg ha-1) and three replicates. The 0.5 m2 experimental plots were delimited by galvanized metal sheets driven into the soil, with a gutter at the bottom to collect the surface runoff water (measured at 5-minute intervals). Rainfall intensities of 30, 80 and 120 mm h-1 were applied using a rainfall simulator with multiple oscillating nozzles, with two simulations (Rainfall 1 and 2, applied on alternate days) at each slope for each intensity. For each simulated rainfall, the starting time and rate of runoff were determined, along with the rain characteristics (amount, duration and intensity) and the initial and saturation moisture of the soil. The modified Smith model was used to estimate surface runoff, and the model parameters were adjusted through multivariate equations. In Rainfall 1, at an intensity of 30 mm h-1 there was no runoff at zero slope; at the remaining slopes, runoff represented 1.0, 8.8 and 11.5% of the rain applied. At intensities of 80 and 120 mm h-1, water losses through surface runoff represented on average 59 and 53% of the rain applied, respectively. In Rainfall 2, water losses through surface runoff represented 33, 45 and 73% of the amount applied for the intensities of 30, 80 and 120 mm h-1, respectively. The presence of crop residue delays the start of surface runoff and reduces the steady runoff rate across the tested rain intensities (30, 80 and 120 mm h-1) and slopes (zero, 2.5, 5 and 8%). The multivariate equations generated from the rain characteristics and soil moisture estimated the steady runoff rate and the runoff starting time with good precision.
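The "multivariate equations" step of this abstract — predicting runoff onset time from rain and soil variables — can be sketched as an ordinary least-squares fit. The observations below are invented to mimic the qualitative trends reported (higher intensity hastens onset, residue delays it); they are not the thesis data, and the thesis's actual equations may include other predictors such as soil moisture.

```python
import numpy as np

# Hypothetical plot observations: rain intensity (mm/h), slope (%),
# residue (Mg/ha), and measured time to runoff onset (min).
X = np.array([
    [30.0, 5.0, 2.5],
    [80.0, 5.0, 2.5],
    [120.0, 5.0, 2.5],
    [80.0, 2.5, 2.5],
    [80.0, 8.0, 2.5],
    [80.0, 5.0, 0.0],
    [80.0, 5.0, 5.0],
])
t_onset = np.array([40.0, 15.0, 8.0, 12.0, 17.0, 10.0, 20.0])

# Multivariate linear model: t = b0 + b1*intensity + b2*slope + b3*residue
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, t_onset, rcond=None)
pred = A @ coef
```

With data following the abstract's trends, the fitted intensity coefficient comes out negative (heavier rain triggers runoff sooner) and the residue coefficient positive (residue delays onset), matching the study's conclusions.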
7

Yield curve estimation models with real market data implementation and performance observation

Cheng Andersson, Penny Peng January 2020 (has links)
Different methods and models exist for building a yield curve from a set of observed market rates, even when the curve exactly reproduces the prices of the given instruments, and creating an accurate and smooth interest rate curve has always been challenging. The purpose of this thesis is to use real market data to construct yield curves with the bootstrapping method and the Smith-Wilson model, in order to observe and compare the performance of the two models. Furthermore, the extended Nelson-Siegel (ENS) model is introduced without implementation; instead, I compare the ENS model and the traditional bootstrapping method from a more theoretical perspective in order to assess their performance capabilities.
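The bootstrapping method the thesis implements can be sketched for the simplest case: annual-coupon par rates, solved one maturity at a time. Each par instrument prices to 1, so 1 = c_n * (df_1 + ... + df_n) + df_n, which gives df_n in closed form once the shorter discount factors are known. The par rates below are invented; real implementations must also handle day counts, compounding conventions, and interpolation between quoted maturities.

```python
import math

# Hypothetical par rates for 1y, 2y, 3y, 4y annual-coupon instruments
par_rates = [0.010, 0.015, 0.020, 0.024]

# Bootstrap discount factors maturity by maturity:
# 1 = c * (sum of earlier dfs) + (1 + c) * df_n  =>  solve for df_n
dfs = []
for c in par_rates:
    annuity = sum(dfs)
    df = (1.0 - c * annuity) / (1.0 + c)
    dfs.append(df)

# Continuously compounded zero rates implied by the discount factors
zero_rates = [-math.log(df) / (i + 1) for i, df in enumerate(dfs)]
```

By construction the bootstrapped curve reprices every input instrument exactly, which is the property the abstract alludes to; smoothness between and beyond the quoted maturities is where models like Smith-Wilson differ.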
8

Indirect emissions estimation model for investments in the automobile sector, fossil fuel sector and utilities sector / Estimering av indirekta emissioner i fordonssektorn, fossila-bränslen-sektorn och energisektorn

Thungström, Kerstin January 2018 (has links)
To combat climate change, multiple initiatives have been launched to steer the financial market onto a more sustainable and resilient path; for example, the Montreal Pledge has committed over 120 investors to measure and disclose the carbon footprints of their portfolios. ISS-Ethix Climate Solution provides climate-change-related services to investors. To evaluate companies' sustainability, ISS-Ethix Climate Solution estimates companies' direct and indirect greenhouse gas emissions. To simplify these estimations, corporate emissions are divided into three scopes: scopes 1 and 2 cover the emissions from fuel combustion within the company and from electricity generation, while scope 3 corresponds to all other emissions generated upstream and downstream in the companies' supply chains. The aim of this study was to help ISS-Ethix Climate Solution develop a model that estimates the indirect scope-3 emission intensity of companies in the automobile sector, fossil fuel sector and utility sector. The first objective was to examine whether the variations within the sectors could be explained and categorized. To carry this out, each sector was defined and its emission sources identified. The emissions could be explained and categorized for the automobile and fossil fuel sectors; the emissions for the utility sector could only partly be explained and categorized. The second objective was to examine which parameters and subcategories were relevant for estimating the emissions. Two methods were investigated for this: correlation analysis and the average-data method. No correlations could be found between any of the sectors and the selected parameters. The emissions estimated with the average-data method were verified against the companies' reported emissions. For the automobile and fossil fuel companies, the estimated emissions followed the same trend as the reported data; however, no trend could be found for the utility companies. Estimating emissions with the average-data method requires a certain corporate structure: the method can be used for corporations with a specific output, but does not suit corporations with a more complex structure. The largest limitation of the models was the shortage of information from the corporations; increased transparency from companies is therefore a necessity for developing the models further.
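The average-data method described above can be sketched as: scope-3 emissions = sector-average emission factor × the company's physical output, optionally normalized by revenue to get an intensity. The emission factors and company figures below are entirely made up for illustration; the thesis derives its factors from sector data not reproduced here.

```python
# Hypothetical sector-average emission factors (tCO2e per unit of
# physical output) -- illustrative values, not the thesis's factors.
SECTOR_FACTORS = {
    "automobile": 45.0,    # per vehicle sold (dominated by use phase)
    "fossil_fuel": 0.43,   # per barrel of oil equivalent produced
}

def estimate_scope3(sector, output_units, revenue_musd):
    """Average-data estimate: (total tCO2e, intensity per M$ revenue)."""
    total = SECTOR_FACTORS[sector] * output_units
    return total, total / revenue_musd

# A hypothetical carmaker: 2 million vehicles, 150,000 M$ revenue
total, intensity = estimate_scope3("automobile", 2_000_000, 150_000)
```

The method's dependence on a single well-defined output per sector is exactly the limitation the study found for utilities, whose heterogeneous generation mixes do not reduce to one output unit.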
9

Spatially Correlated Data Accuracy Estimation Models in Wireless Sensor Networks

Karjee, Jyotirmoy January 2013 (has links) (PDF)
One of the major applications of wireless sensor networks is to sense accurate and reliable data from the physical environment, with or without a priori knowledge of data statistics. To extract accurate data from the physical environment, we investigate spatial data correlation among sensor nodes to develop data accuracy models. We propose three data accuracy models, namely the Estimated Data Accuracy (EDA) model, the Cluster based Data Accuracy (CDA) model and the Distributed Cluster based Data Accuracy (DCDA) model, all of which assume a priori knowledge of data statistics. Because sensor nodes are deployed at high density, the observed data are highly correlated among nodes, which form distributed clusters in space. We describe two clustering algorithms, the Deterministic Distributed Clustering (DDC) algorithm and the Spatial Data Correlation based Distributed Clustering (SDCDC) algorithm, implemented under the CDA model and the DCDA model respectively. Moreover, because of data correlation in the network, the data collected by sensor nodes are redundant; hence it is not necessary for all sensor nodes to transmit their highly correlated data to the central node (sink node or cluster head node). An optimal subset of sensor nodes is capable of measuring accurate data and transmitting accurate, precise data to the central node. This reduces data redundancy, energy consumption and data transmission cost, increasing the lifetime of the sensor network. Finally, we propose a fourth accuracy model, the Adaptive Data Accuracy (ADA) model, which does not require any a priori knowledge of data statistics. The ADA model senses a continuous data stream at regular time intervals to estimate accurate data from the environment and selects an optimal set of sensor nodes for data transmission to the network. Data transmission can be reduced further for these optimal sensor nodes by transmitting a subset of sensor data using a methodology called the Spatio-Temporal Data Prediction (STDP) model under data reduction strategies. Furthermore, we implement the data accuracy models when the network is under threat of a malicious attack.
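The core idea — using spatial correlation to pick an optimal subset of nodes so the rest need not transmit — can be sketched with a simple greedy selection over the correlation matrix. This is an illustrative stand-in, not the thesis's EDA/CDA/DCDA models; the threshold, signals, and greedy rule are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic readings from 6 densely deployed sensors: nodes 0-2 share
# one local signal and nodes 3-5 another, each plus small sensor noise,
# so readings within each group are highly correlated.
t = np.linspace(0, 10, 200)
readings = np.stack(
    [np.sin(t) + 0.05 * rng.normal(size=t.size) for _ in range(3)]
    + [np.cos(t) + 0.05 * rng.normal(size=t.size) for _ in range(3)]
)

def select_representatives(data, threshold=0.9):
    """Greedy subset selection: keep a node only if its readings are
    not already strongly correlated with an already-kept node."""
    corr = np.corrcoef(data)
    kept = []
    for i in range(data.shape[0]):
        if all(abs(corr[i, j]) < threshold for j in kept):
            kept.append(i)
    return kept

kept = select_representatives(readings)
```

Here the greedy pass keeps one representative per correlated cluster, so only a third of the nodes transmit while the sink can still reconstruct both local signals — the redundancy-reduction effect the abstract describes.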
