
Criticality Assessments for Improving Algorithmic Robustness

Though computational models typically assume that every program step executes flawlessly, not all steps are equally important should one fail. In the "constrained reliability allocation" problem, operations whose failure would force eventual program termination are guaranteed sufficient resources, while operations whose failure only corrupts the output share a limited budget of some vital resource, insufficient to guarantee correct operation for all of them.

In this dissertation, I present a novel representation of failures based on their timing and location, combined with criticality assessments, a method used to predict the behavior of systems operating outside their design criteria. I observe that strictly correct error measures hide interesting failure relationships, that failure importance is often determined by failure timing, and that recursion plays an important role in structuring output error. I apply these observations to reduce the output error of two matrix multiplication methods through an economization procedure that moves failures from worse to better locations, offering a possible solution to the constrained reliability allocation problem. The result is a 38% to 63% decrease in absolute value error on matrix multiplication algorithms, despite nearly identical failure counts between the control and experimental runs. Finally, I show that efficient sorting algorithms are less robust at large scale than less efficient sorting algorithms.
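To make the central idea concrete, the following is a minimal sketch, not the dissertation's actual experimental harness, of injecting a single silent failure into naive matrix multiplication and measuring how absolute output error varies with the failure's timing. The failure model (a skipped multiply-add) and all names such as matmul_with_failure and fail_step are illustrative assumptions.

```python
# Hypothetical sketch: one silent failure per run (equal failure counts),
# but error depends on *which* step fails, since the skipped product
# terms differ in magnitude.

import random

def matmul_with_failure(A, B, fail_step=None):
    """Naive n^3 multiply; the multiply-add at step fail_step (if any)
    is skipped, modeling a silent failure that only corrupts output."""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    step = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                if step != fail_step:  # the failed step contributes nothing
                    C[i][j] += A[i][k] * B[k][j]
                step += 1
    return C

def abs_error(C, C_ref):
    """Total absolute difference from the failure-free reference."""
    return sum(abs(c - r) for row, ref in zip(C, C_ref)
               for c, r in zip(row, ref))

if __name__ == "__main__":
    n = 8
    random.seed(0)
    A = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
    B = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
    C_ref = matmul_with_failure(A, B)  # failure-free reference

    # Same failure count (one), different timings/locations.
    for step in (0, n**3 // 2, n**3 - 1):
        err = abs_error(matmul_with_failure(A, B, fail_step=step), C_ref)
        print(f"failure at step {step:4d}: absolute error = {err:.4f}")
```

Under this toy model, an economization procedure in the dissertation's sense would steer failures toward steps whose skipped contributions are small, reducing total error without reducing the failure count.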

Identifier: oai:union.ndltd.org:PROQUEST/oai:pqdtoai.proquest.com:10980232
Date: 09 April 2019
Creators: Jones, Thomas Berry
Publisher: The University of New Mexico
Source Sets: ProQuest.com
Language: English
Detected Language: English
Type: thesis
