42

Development of LCF life prediction model for wrinkled steel pipes

Zhang, Jianmin 06 1900 (has links)
This research program focused on the low cycle fatigue (LCF) behaviour of wrinkled pipes and was designed to develop LCF life prediction models for such pipes. It consisted of three phases of work: strip tests, full-scale pipe tests, and finite element analysis (FEA). In the strip tests, 39 strip specimens were tested under fully reversed, stroke-controlled loading to investigate the effects of bend angle, bend radius, and stroke range on LCF life. The LCF behaviour was also explored by examining the spectra of key variables and their corresponding hysteresis loops, and the failure mechanism was discussed by examining the fracture surfaces. Two LCF life prediction models, one life-based and one deterioration-rate-based, were developed and their predictions evaluated. In the full-scale pipe tests, two specimens were tested under a combined loading of axial load, bending moment, and internal pressure, applied in a monotonic loading stage followed by a cyclic loading stage. Based on these two tests, the global and local behaviour were investigated, the failure mechanism was studied, and the application of the developed LCF life prediction models was discussed. In the FEA, three numerical models were developed: a strip model, a half-pipe model, and a full-scale pipe model. In the strip model, the residual stresses and strains were analyzed and discussed. In the half-pipe model, the effects of pipe geometry, internal pressure, and global deformation on the wrinkle geometry were studied. In the full-scale pipe model, the full-scale pipe tests were simulated and both the global and local behaviour were discussed. Several important conclusions were obtained. The wrinkle geometry is strongly related to the pipe geometry, internal pressure, and global deformation. The global deformation becomes localized once the wrinkle is fully developed. The opening deformation cycle is more detrimental to wrinkled pipes than the closing deformation cycle. The test results also show that the seam weld governs the failure of wrinkled pipes subjected to cyclic axial deformation. The LCF life prediction models developed in this research demonstrate good predictive capacity when applied to both the strip tests and the full-scale pipe tests. / Structural Engineering
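The abstract does not give the mathematical form of the two life prediction models. As a point of reference only, the sketch below shows a generic Coffin-Manson/Basquin strain-life relation, a common starting point for LCF life estimation; the material constants are illustrative placeholders and are not the models or values from this thesis.

```python
import numpy as np

def coffin_manson_basquin(strain_amplitude, E=200e3,
                          sigma_f=900.0, b=-0.09,
                          eps_f=0.6, c=-0.6):
    """Solve the generic strain-life relation
        strain_amplitude = sigma_f/E * (2N)^b + eps_f * (2N)^c
    for the cycles to failure N (all constants are assumed placeholders)."""
    two_n = np.logspace(0, 7, 2000)                # candidate reversals 2N
    eps_model = sigma_f / E * two_n**b + eps_f * two_n**c
    idx = np.argmin(np.abs(eps_model - strain_amplitude))
    return two_n[idx] / 2.0                        # cycles to failure

print(coffin_manson_basquin(0.01))                 # hypothetical strain amplitude of 1%
```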
43

Immunologic Risk Prediction Model for Kidney Graft Function

Bishop, Christina Diane 01 August 2011 (has links)
Clinicians lack appropriate non-invasive methods to predict, diagnose, and reduce the risk of rejection in the years following kidney transplantation. Protocol biopsies and monitoring of serum creatinine levels are the most common methods of monitoring graft function after transplant; however, they have several drawbacks. Use of traditional donor and recipient factors such as Human Leukocyte Antigen (HLA) DNA typing, pre-transplant anti-HLA antibody levels, and basic demographics (age, ethnicity/race, gender) has proved inadequate for post-transplant graft monitoring beyond the first few years. We propose that a non-invasive model for predicting renal graft outcome, built from immunologic factors available to clinicians across the United States, will provide a useful tool for post-transplant patient monitoring. We advocate an expanded model that incorporates both the traditional factors and new factors that have shown promise in predicting kidney outcome and are widely available for testing using commercial kits. These additional factors include major histocompatibility complex class I chain-related gene A (MICA) typing of donor and recipient, degree of matching for killer cell immunoglobulin-like receptors (KIRs) between donor and recipient, detection of MICA antibodies, and soluble CD30 level (sCD30). This proposed graft-function prediction model is the first to include all of these factors. Using multi-center data from adult recipients of standard-criteria deceased-donor (SCD) kidneys, we constructed models containing the traditional factors only for prediction of outcome at 1 year and 3 years post-transplant. Using single-center data from adult recipients of SCD kidneys, we developed comparison models containing traditional factors only, as well as expanded models containing the new suggested variables. When incorporated into the expanded models, the additional variables provided greater positive predictive values, greater negative predictive values, and lower false negative rates for graft outcome at 1 year and at 3 years post-transplant than the models using traditional factors only. Our results indicate that evaluation of sCD30, MICA, and KIR as part of routine protocol testing is helpful to clinicians for predicting the risk of kidney graft rejection.
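The abstract compares models by positive predictive value, negative predictive value, and false negative rate. The minimal sketch below shows how those metrics are computed from a binary graft-outcome prediction; the confusion-matrix counts are made up for illustration, not taken from the study.

```python
def prediction_metrics(tp, fp, tn, fn):
    """Return PPV, NPV, and false negative rate for a binary
    graft-outcome classifier (counts are illustrative only)."""
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    fnr = fn / (fn + tp)          # false negative rate
    return ppv, npv, fnr

# Hypothetical confusion-matrix counts for 1-year graft failure
print(prediction_metrics(tp=40, fp=10, tn=120, fn=15))
```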
44

Definition and validation of requirements management measures

Loconsole, Annabella January 2007 (has links)
The quality of software systems depends on early activities in the software development process, of which the management of requirements is one. When requirements are not managed well, a project can fail or become more costly than intended, and the quality of the software developed can decrease. Among requirements management practices, it is particularly important to quantify and predict requirements volatility, i.e., how much the requirements are likely to change over time. Software measures can help in quantifying and predicting requirements attributes like volatility. However, few such measures have been defined, because the early phases are hard to formalise. Furthermore, very few requirements measures have been validated, which would be needed to demonstrate that they are useful. The approach to requirements management in this thesis is quantitative, i.e., to monitor requirements management activities and requirements volatility through software measurement. A set of 45 requirements management measures is presented. The measures were defined using the Goal Question Metric (GQM) framework for the two predefined goals of the requirements management key process area of the Capability Maturity Model for software. A subset of these measures was validated theoretically and empirically in four case studies. Furthermore, an analysis of validated measures in the literature was performed, showing that there is a lack of validated process, project, and requirements measures in software engineering. The studies presented in this thesis show that size measures are good estimators of requirements volatility. The important result is that size is relevant: increasing the size of a requirements document implies that the number of changes to requirements increases as well. Furthermore, subjective estimations of volatility were found to be inaccurate assessors of requirements volatility. These results suggest that practitioners should complement subjective estimations of volatility with objective ones. Requirements engineers and project managers will benefit from the research presented in this thesis because the measures defined, which proved to be predictors of volatility, can help in understanding how much requirements will change. By deploying the measures, practitioners would be prepared for possible changes in the schedule and cost of a project, giving them the possibility of creating alternative plans, new cost estimates, and new software development schedules.
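The key empirical finding is that requirements document size tracks the number of subsequent requirement changes. The minimal sketch below shows the kind of correlation analysis this implies; the per-project numbers are fabricated stand-ins for the case-study data.

```python
from scipy import stats

# Hypothetical per-project measurements: requirements document size
# (number of requirements) and number of requirement changes observed later.
doc_size    = [48, 73, 120, 95, 150, 60, 210, 130]
num_changes = [10, 15, 31, 22, 40, 11, 58, 35]

rho, p_value = stats.spearmanr(doc_size, num_changes)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```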
45

An Empirical Study on Corporate Governance and the Financial Failure Prediction Model – Considering Industry Relative Ratios

Siao, Yu-Cing 27 June 2011 (has links)
A financial failure prediction model should be dynamic, incorporating the latest information to improve its predictive power, and it should be applicable across industries and periods; that is, it should show strong goodness of fit and stable parameters. In this study, I test whether modified independent variables, industry relative ratios, improve the prediction rate of a logistic regression model. The research is based on public information. Two kinds of model are constructed: Model I uses the original financial ratios and Model II uses industry relative ratios; both models also incorporate variables related to corporate governance. The empirical results suggest that industry relative ratios enhance the predictive power of financial failure prediction within three partially overlapping periods. A further study focusing on Model II isolates firms that encountered financial difficulties but cannot be discriminated from normal firms by the prediction model. The main difference between these undetected firms and the detected ones is the debt/equity ratio: the undetected firms carry less debt. In addition, when these undetected firms are compared with their control group, they can still be distinguished using a logit model, with a prediction accuracy of 92.42%. Finally, an event study is used to examine the link between firms' default probabilities and their stock prices; the results show that default probabilities may cause abnormal returns.
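The abstract does not spell out how the industry relative ratios are built. The minimal sketch below shows one plausible construction, dividing each firm's ratio by its industry median and feeding the result to a logistic regression; the column names, the toy data, and the median-scaling choice are assumptions for illustration, not the thesis's exact definition.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical firm-year data: raw financial ratios, an industry code,
# and a failure flag (1 = financial failure within the horizon).
df = pd.DataFrame({
    "industry":    ["A", "A", "A", "B", "B", "B", "B", "A"],
    "debt_equity": [0.8, 1.5, 2.4, 0.5, 1.1, 3.0, 0.9, 1.9],
    "roa":         [0.07, 0.02, -0.05, 0.10, 0.04, -0.08, 0.06, 0.01],
    "failed":      [0, 0, 1, 0, 0, 1, 0, 1],
})

# One plausible "industry relative ratio": firm ratio / industry median.
for col in ["debt_equity", "roa"]:
    df[f"rel_{col}"] = df[col] / df.groupby("industry")[col].transform("median")

X = df[["rel_debt_equity", "rel_roa"]]
model = LogisticRegression().fit(X, df["failed"])
print(model.predict_proba(X)[:, 1])   # estimated failure probabilities
```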
46

Prediction of THMs formation in water distribution system of Downtown Kaohsiung City

Shin, Jian-Ming 09 July 2012 (has links)
Chlorine is routinely used in the disinfection process of the water treatment plants (WTPs) in Kaohsiung, and the major disinfection by-products (DBPs) are THMs. THM formation has been found to depend on many water quality factors. We therefore used SPSS to analyse the relationship between water quality and THM content in order to build a prediction model of THM formation in the water distribution system of downtown Kaohsiung city. The aim is to estimate THM concentration with a simple method and to reduce THMs by controlling selected water quality parameters. When the variance of the TOC value is low, regression shows that THM concentration is related to temperature, pH, and residual chlorine, with residual chlorine showing the highest correlation. When the variance of the TOC value is high, the correlation between these three parameters and THMs is weaker than in the low-variance case. This suggests that TOC is the best single fitting factor for predicting THMs. We also found that the R-squared value of the multivariate regression prediction is higher than that of the univariate regression prediction, probably because THM formation is a complex reaction, so the formation model is hard to capture with only one water quality parameter.
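A minimal sketch of the comparison the abstract describes, fitting univariate and multivariate linear regressions of THM concentration on temperature, pH, and residual chlorine and comparing R-squared values, is shown below; the numbers are fabricated placeholders, not measurements from the Kaohsiung system.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical distribution-system samples.
temp      = np.array([24.0, 26.5, 28.0, 29.5, 31.0, 27.0, 25.5, 30.0])   # deg C
ph        = np.array([7.2, 7.4, 7.1, 7.6, 7.5, 7.3, 7.2, 7.7])
res_cl    = np.array([0.9, 0.7, 0.6, 0.4, 0.3, 0.8, 0.85, 0.35])         # mg/L
thms_ug_l = np.array([28.0, 33.0, 38.0, 45.0, 51.0, 31.0, 29.0, 48.0])   # ug/L

uni   = LinearRegression().fit(res_cl.reshape(-1, 1), thms_ug_l)
multi = LinearRegression().fit(np.column_stack([temp, ph, res_cl]), thms_ug_l)

print("univariate R^2:  ", uni.score(res_cl.reshape(-1, 1), thms_ug_l))
print("multivariate R^2:", multi.score(np.column_stack([temp, ph, res_cl]), thms_ug_l))
```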
47

Toward an Improved Model of Asphalt Binder Oxidation in Pavements

Prapaitrakul, Nikornpon 2009 December 1900 (has links)
Asphalt binder oxidation in pavements has been shown to be an ongoing process throughout a pavement's service life. Understanding the nature of the oxidation process is a critical step toward better pavement design and greater pavement durability. The main component of asphalt binder oxidation in pavements is binder oxidative hardening. As the aromatic compounds in asphalt binders are oxidized, more polar carbonyl compounds are created, which results in stronger associations between asphalt components and eventually leads to an increase in asphalt elastic modulus and viscosity. Consequently, the performance of pavements is affected directly by asphalt binder hardening. A recent study also suggests that the level of accessible air voids in a pavement is related to binder oxidation. When a pavement has sufficiently high accessible air voids (4 percent or greater), the oxidation rate is largely determined by the temperature in the pavement. On the other hand, when the percentage of accessible air voids is considerably lower (2 percent or less), the hardening rate of binders in the pavement is reduced significantly. Field evidence is mounting that asphalt binder oxidation in pavements produces a binder that is more susceptible to thermal and fatigue cracking. While the fundamentals of this oxidation process are fairly well known, predicting quantitatively the rate of oxidation as a function of depth in the pavement is not straightforward. A thermal and oxygen transport model, coupled with binder reaction kinetics, provides the basis for such calculations. A one-dimensional thermal transport model, coupled with site-specific model parameters and recent improvements in the availability of required input climate data, enables calculation of pavement temperatures throughout the year, which are then used in an asphalt binder oxidation and transport model to calculate binder properties in the pavement over time. Calculated binder property changes with depth and time are compared to measurements of binder oxidation in the field. The work in this study is aimed at understanding the oxidation kinetics of asphalt binders in pavements, determining the impact of accessible air void levels on asphalt hardening, and ultimately developing an improved model of asphalt binder oxidation in pavements.
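The abstract describes coupling pavement temperature with binder reaction kinetics. The minimal sketch below shows one common form of that coupling, an Arrhenius-type carbonyl growth rate integrated over an hourly pavement temperature history; the rate constants, temperatures, and one-year horizon are illustrative assumptions, not values calibrated in this work.

```python
import numpy as np

R  = 8.314         # J/(mol*K)
A  = 5.0e9         # pre-exponential factor, 1/hour (assumed)
Ea = 80_000.0      # activation energy, J/mol (assumed)

# Hypothetical hourly pavement temperatures at one depth (deg C),
# a single daily cycle repeated over a year.
day_temps_c = 20 + 15 * np.sin(np.linspace(0, 2 * np.pi, 24))
temps_k = np.tile(day_temps_c, 365) + 273.15

# Constant-rate oxidation: dCA/dt = A * exp(-Ea / (R*T)).
rates = A * np.exp(-Ea / (R * temps_k))
carbonyl_growth = np.sum(rates * 1.0)    # 1-hour steps, arbitrary CA units
print(f"carbonyl area growth over one year: {carbonyl_growth:.3f}")
```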
48

The prediction of bus arrival time using Automatic Vehicle Location Systems data

Jeong, Ran Hee 17 February 2005 (has links)
Advanced Traveler Information Systems (ATIS) are one component of Intelligent Transportation Systems (ITS), and a major component of ATIS is travel time information. The provision of timely and accurate transit travel time information is important because it attracts additional ridership and increases the satisfaction of transit users. The cost of electronics and components for ITS has decreased, and ITS deployment is growing nationwide. Automatic Vehicle Location (AVL) systems, which are a part of ITS, have been adopted by many transit agencies, allowing them to track their transit vehicles in real time. The need for models and techniques to predict transit travel time using AVL data is therefore increasing. While some research on this topic has been conducted, more is still required. The objectives of this research were 1) to develop and apply a model to predict bus arrival time using AVL data, and 2) to identify the prediction interval of bus arrival time and the probability of a bus being on time. The travel time prediction model developed here explicitly includes dwell times, schedule adherence by time period, and traffic congestion, all of which are critical to predicting accurate bus arrival times. The test bed was a bus route running through downtown Houston, Texas. A historical-data-based model, regression models, and artificial neural network (ANN) models were developed to predict bus arrival time. The ANN models performed considerably better than either the historical-data-based models or the multiple linear regression models; it was hypothesized that the ANN was able to identify the complex non-linear relationship between travel time and the independent variables, and this led to the superior results. Because variability in travel time (both waiting and on-board) is extremely important for transit choices, it is also useful to extend the model to provide not only estimates of travel time but also prediction intervals. Prediction intervals of bus arrival time were therefore calculated for the ANN models. Because ANN models are non-parametric, conventional techniques for prediction intervals cannot be used; consequently, a computer-intensive resampling method, the bootstrap technique, was used to obtain prediction intervals of bus arrival time. On-time performance is very important to transit operators in providing quality service to passengers, and measuring it requires the probability of a bus being on time. In addition to the prediction interval of bus arrival time, the probability that a given bus is on time was therefore calculated. The probability density function of schedule adherence appeared to follow either a gamma or a normal distribution; a chi-squared goodness-of-fit test was used to determine which distribution fits best, and the normal distribution was found to describe schedule adherence well. With the normal distribution, the probability of a bus being on time, ahead of schedule, or behind schedule can be estimated.
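A minimal sketch of the two quantities discussed at the end of the abstract, a bootstrap prediction interval for arrival time and the probability of being on time under a normal model of schedule adherence, is shown below; the residuals, tolerance window, and distribution parameters are illustrative assumptions rather than values from the Houston test bed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical ANN prediction errors (seconds) from a validation set.
residuals = rng.normal(loc=5.0, scale=60.0, size=200)
point_prediction = 900.0            # predicted arrival in 900 s

# Bootstrap 95% prediction interval around the point prediction.
samples = point_prediction + rng.choice(residuals, size=5000, replace=True)
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"95% prediction interval: [{lo:.0f} s, {hi:.0f} s]")

# Probability of being "on time": schedule adherence assumed normal,
# on time = arriving between 1 min early and 3 min late (assumed window).
mu, sigma = 30.0, 90.0              # mean/std of adherence in seconds (assumed)
p_on_time = stats.norm.cdf(180, mu, sigma) - stats.norm.cdf(-60, mu, sigma)
print(f"P(on time) = {p_on_time:.2f}")
```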
49

Autonomous Hopping Rotochute

Aksaray, Derya 05 April 2011 (has links)
The Hopping Rotochute is a promising micro vehicle capable of exploring rough and complex terrain with minimal energy consumption. While it can fly over obstacles using thrust produced by its coaxial rotor, its physical architecture, inspired by the "Weeble Wobble" toy, re-orients it whenever it hits the ground. This aerial and ground vehicle is therefore a potential hybrid platform for reconnaissance and surveillance missions in complex environments. The most recent version of the Hopping Rotochute is manually controlled to follow a trajectory. The control commands, listed in a file prior to the mission, are executed exactly as defined, like a "batch job," regardless of uncertain external events. This control scheme is likely to cause large deviations from the route, so the vehicle may finish the mission far from the desired end point. If, however, the vehicle could receive control commands during a mission, "interactive processing" would be possible and efficient path tracking could be achieved. The development of a Hopping Rotochute that follows a trajectory autonomously therefore forms the foundation of this thesis. Two control approaches inspired the proposed methodology for the autonomous trajectory-following algorithm. The first is rule-based control, which enables decision making through conditional statements; here, rule-based control is used to select a target point for a particular hop based on the presence of obstacles and/or wind in the environment. The second is model predictive control, employed to predict future outputs from hop performance models; in other words, it provides intelligence about how a particular hop will end up before it is attempted, so the optimum control commands can be selected based on the predicted performance of the hop. This research demonstrates that the autonomous Hopping Rotochute can be realized by rule-based control embedded with performance models. Under the assumption of known boundaries such as wall and ceiling locations, the study has two aims: (1) to avoid obstacles by creating a smaller operational volume inside the real boundaries, so that the vehicle is restricted to the operational volume and never violates the real boundaries; and (2) to estimate the wind from previous hops and select the next hopping point with respect to the estimated wind. Based on the developed methodology, simulations are conducted for four different scenarios in the presence of obstacles and/or wind, and the results are analyzed. Finally, based on the statistics of the simulation results, the effectiveness of the proposed methodology is discussed.
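A minimal sketch of the rule-based idea described above, picking a hop target that stays inside a shrunken operational volume and offsets the aim point against an estimated wind, is shown below; the geometry, margins, and wind-compensation gain are illustrative assumptions rather than the thesis's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Bounds:
    x_min: float; x_max: float
    y_min: float; y_max: float

def shrink(b: Bounds, margin: float) -> Bounds:
    """Operational volume: real boundaries pulled in by a safety margin."""
    return Bounds(b.x_min + margin, b.x_max - margin,
                  b.y_min + margin, b.y_max - margin)

def select_hop_target(pos, goal, wind, bounds, hop_len=1.0, k_wind=0.5):
    """Rule-based choice of the next hop target (illustrative only):
    step toward the goal, offset against estimated wind drift, then
    clamp the target into the shrunken operational volume."""
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    dist = max((dx**2 + dy**2) ** 0.5, 1e-9)
    step = min(hop_len, dist)
    tx = pos[0] + step * dx / dist - k_wind * wind[0]
    ty = pos[1] + step * dy / dist - k_wind * wind[1]
    tx = min(max(tx, bounds.x_min), bounds.x_max)
    ty = min(max(ty, bounds.y_min), bounds.y_max)
    return tx, ty

room = shrink(Bounds(0, 10, 0, 10), margin=1.0)
print(select_hop_target(pos=(2, 2), goal=(8, 9), wind=(0.3, 0.0), bounds=room))
```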
50

A framework for developing road risk indices using quantile regression based crash prediction model

Wu, Hui, doctor of civil engineering 13 October 2011 (has links)
Safety reviews of existing roads are becoming a popular practice of many agencies nationally and internationally. Road safety information is of great importance both to policymakers in addressing safety concerns and to travelers in managing their trips. There have been various efforts to develop methodologies for measuring and assessing road safety effectively; however, existing research and practices are still constrained by their subjective and reactive nature. The goal of this research is to develop a framework of Road Risk Indices (RRIs) to assess the risks of existing highway infrastructure, for both road users and agencies, based on road geometrics, traffic conditions, and historical crash data. The proposed RRIs are intended to give a comprehensive and objective view of road safety, so that safety problems can be identified at an early stage, before they emerge in the form of crashes. A methodological framework for formulating RRIs that integrates results from crash prediction models with historical crash data is proposed, and Linear Referencing tools in the ArcGIS software are used to develop digital maps that publish the estimated RRIs. These maps provide basic Geographic Information System (GIS) functions, including viewing and querying RRIs and performing spatial analysis tasks. A semi-parametric count model and quantile regression based estimation are proposed to capture the specific characteristics of crash data and provide more robust and accurate predictions of crash counts. Crash data collected on Interstate Highways in Washington State for the year 2002 were extracted from the Highway Safety Information System (HSIS) and used for the case study. The results show that the proposed framework is capable of capturing statistical correlations between traffic crashes and influencing factors, leading to the effective integration of safety information in composite indices.
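A minimal sketch of a quantile-regression-based crash model of the kind described above is shown below, fit with statsmodels' QuantReg on fabricated segment data; the predictors, the 0.9 quantile, and the linear form are assumptions for illustration rather than the model specified in the dissertation.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Hypothetical highway-segment data: annual average daily traffic (AADT,
# in thousands), segment length (miles), and observed crash counts.
n = 200
aadt = rng.uniform(5, 80, n)
length = rng.uniform(0.2, 3.0, n)
crashes = rng.poisson(0.05 * aadt * length)

X = sm.add_constant(pd.DataFrame({"aadt": aadt, "length": length}))
model = sm.QuantReg(crashes, X).fit(q=0.9)     # upper-tail crash risk
print(model.params)
```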
