1 |
TEST CASES REDUCTION IN SOFTWARE PRODUCT LINE USING REGRESSION TESTING. 28 March 2012 (has links)
Application Engineering is a field in which software organizations develop software products from a predefined Software Product Line. The time and cost allotted to producing a new product variant are limited, and a lack of systematic support in testing leads to redundancy. Redundancy in this context refers to test cases that do not contribute to fault detection; executing them increases the testing effort. This thesis proposes a framework to reduce the testing effort by avoiding such redundancy. Feature Model diagrams have been constructed from the assumed specification requirements and used to derive test models, namely an Object Model diagram and a State Chart diagram. Unit testing and system testing have been performed on these test models to obtain test cases, which have been stored in a repository. Regression testing has then been applied to these test cases to classify them as Reusable, Re-testable, or Obsolete.
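The classification step described above can be sketched in a few lines. This is a hedged illustration, not the thesis's actual implementation: it assumes each stored test case is tagged with the features it exercises, and all feature and test-case names below are hypothetical.

```python
# Sketch (illustrative only): classify stored test cases for a new product
# variant. A test case is Obsolete if any feature it exercises was removed,
# Re-testable if any of them was modified, and Reusable otherwise.

def classify_test_case(covered_features, removed, modified):
    """Return 'Obsolete', 'Re-testable', or 'Reusable'."""
    if any(f in removed for f in covered_features):
        return "Obsolete"
    if any(f in modified for f in covered_features):
        return "Re-testable"
    return "Reusable"

# Hypothetical repository: test case -> features it covers.
repository = {
    "tc1": {"payment", "login"},
    "tc2": {"login"},
    "tc3": {"legacy_export"},
}
removed = {"legacy_export"}   # features dropped in the new variant
modified = {"payment"}        # features changed in the new variant

labels = {tc: classify_test_case(features, removed, modified)
          for tc, features in repository.items()}
```

Under this sketch, only the Re-testable cases need to be re-executed for the new variant, which is where the effort reduction comes from.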
|
2 |
Minimax Design for Approximate Straight Line Regression. Daemi, Maryam. Unknown Date
No description available.
|
3 |
Algorithms for Scalable On-line Machine Learning on Regression Tasks. Schoenke, Jan H., 25 April 2019 (has links)
In a realm of ever-increasing data volume and traffic, processing data as a stream is key to building flexible and scalable data processing engines. On-line machine learning provides powerful algorithms for extracting predictive models from such data streams even when the modeled relation is time-variant. Modeling real-valued data in on-line regression tasks is especially important, as it connects to modeling and system identification tasks in engineering domains and bridges to other fields of machine learning such as classification and reinforcement learning. This thesis therefore considers the problem of on-line regression on time-variant data streams and introduces a new multi-resolution perspective for tackling it.
The proposed incremental learning system, called AS-MRA, comprises a new interpolation scheme for symmetric simplicial input segmentations, a layered approximation structure of sequential local refinement layers, and a learning architecture for efficiently training the layer structure. A key concept for making these components work together is a differential parameter encoding between subsequent refinement layers, which makes it possible to decompose the target function into independent additive components, each represented as an individual refinement layer. The whole AS-MRA approach is designed to form a smooth approximation whose computational demands scale linearly with the input dimension, while its overall expressiveness, and therefore its potential storage demands, scale exponentially with the input dimension.
The AS-MRA has no mandatory design parameters, but it lets the user state tolerance parameters for the expected prediction performance, which automatically and adaptively shape the resulting layer structure during learning. Other optional design parameters allow restricting the resource consumption with respect to computational and memory demands. The effect of these parameters, and the learning behavior of the AS-MRA as such, are investigated for various learning issues and compared to related on-line learning approaches. The merits and contributions of the AS-MRA are shown experimentally and linked to general considerations about the relation between its key concepts and fundamental results in machine learning.
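The general idea of sequential refinement layers with a differential (residual) encoding can be illustrated with a minimal sketch. This is a generic residual-refinement scheme on a one-dimensional grid, not the AS-MRA algorithm itself (which uses simplicial segmentations and on-line training); every name and parameter below is illustrative.

```python
import numpy as np

# Generic residual refinement (illustrative, not AS-MRA): each layer is a
# piecewise-constant approximator on a grid twice as fine as the previous
# one, and is fitted only to the residual left by all coarser layers.

def fit_layer(x, residual, n_bins):
    """One refinement layer: mean residual per bin on [0, 1)."""
    idx = np.minimum((x * n_bins).astype(int), n_bins - 1)
    layer = np.zeros(n_bins)
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            layer[b] = residual[mask].mean()
    return layer

def predict(x, layers):
    """Sum the contributions of all refinement layers."""
    y = np.zeros_like(x)
    for layer in layers:
        n_bins = len(layer)
        idx = np.minimum((x * n_bins).astype(int), n_bins - 1)
        y += layer[idx]
    return y

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 2000)
y = np.sin(2 * np.pi * x)          # target function to decompose

layers = []
residual = y.copy()
for depth in range(6):             # resolutions 2, 4, ..., 64 bins
    layers.append(fit_layer(x, residual, 2 ** (depth + 1)))
    residual = y - predict(x, layers)   # finer layers see only the residual
```

Because each layer stores only the difference to the coarser layers, the layers form independent additive components, which mirrors the decomposition idea described in the abstract.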
|
4 |
Efficiency Comparison of Distribution-Free Transformations in the Straight-Line Regression Problem / 非参直线回归问题中不同变换方法的有效性比较. Zhang, Ling, January 2010 (has links)
In statistical inference for the distribution-free straight-line regression problem, two common transformations, the rank transformation and the sign transformation, are used to construct test statistics. When a transformation is needed, and which transformation is more efficient, are two questions commonly met by researchers. This thesis compares the efficiencies of the statistics before and after the rank or sign transformation, both theoretically and practically. Simulation is also used to compare the efficiencies of the statistics under different distributions. Based on the conclusions drawn from this work, recommendations are put forward about when to use a transformation and which one to choose.
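A simulation of the kind the abstract describes can be sketched as follows. This is a hedged illustration, not the thesis's actual statistics: it contrasts the least-squares slope with the rank/median-based Theil-Sen slope for y = a + b·x + e under normal and heavy-tailed errors, using mean squared error around the true slope as the efficiency measure.

```python
import numpy as np

def theil_sen_slope(x, y):
    """Median of all pairwise slopes -- a rank/median-type estimator."""
    n = len(x)
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(n) for j in range(i + 1, n)]
    return np.median(slopes)

def simulate(error_sampler, n=30, reps=500, true_slope=1.0, seed=0):
    """MSE of the OLS and Theil-Sen slopes over `reps` simulated datasets."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0, 1, n)
    ols, ts = [], []
    for _ in range(reps):
        y = true_slope * x + error_sampler(rng, n)
        ols.append(np.polyfit(x, y, 1)[0])   # least-squares slope
        ts.append(theil_sen_slope(x, y))
    return (np.mean((np.array(ols) - true_slope) ** 2),
            np.mean((np.array(ts) - true_slope) ** 2))

normal = lambda rng, n: rng.normal(0.0, 1.0, n)
heavy = lambda rng, n: rng.standard_t(1.5, n)   # heavy-tailed errors

mse_ols_normal, mse_ts_normal = simulate(normal)
mse_ols_heavy, mse_ts_heavy = simulate(heavy)
```

Under heavy-tailed errors the rank/median-based estimator is far more efficient than least squares, while under normal errors least squares is close to optimal; this is the kind of trade-off the recommendations in the thesis address.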
|