201
Performance understanding and tuning of iterative computation using profiling techniques. Ozarde, Sarang Anil, 18 May 2010.
Most applications spend a significant amount of time in the iterative parts of a computation. They typically iterate over the same set of operations with different values, which either depend on the inputs or on values calculated in previous iterations. While loops capture some iterative behavior, in many cases such behavior is spread over the whole program, sometimes through recursion. Understanding the iterative behavior of a computation can be very useful for fine-tuning it. In this thesis, we present a profiling-based framework for understanding and improving the performance of iterative computation. We capture the state of iterations in two aspects: 1) Algorithmic State and 2) Program State. We demonstrate the applicability of our framework for capturing algorithmic state by applying it to SAT solvers, and for capturing program state by applying it to a variety of benchmarks exhibiting completely parallelizable loops. Further, we show that such a performance characterization can be successfully used to improve the performance of the underlying application.
Many high-performance combinatorial optimization applications involve SAT solving. A variety of SAT solvers have been developed that employ different data structures and different propagation methods for converging on a fixed point that yields a satisfiable solution. Debugging and tuning the performance of SAT solvers for a given domain is an important problem encountered in practice. Unfortunately, not much work has been done to quantify the iterative efficiency of SAT solvers. In this work, we develop quantifiable measures for calculating the convergence efficiency of SAT solvers. We capture the algorithmic state of the application by tracking the assignment of variables at each iteration. A compact representation of the profile data is developed to track the rate of progress and convergence. The novelty of this approach is that it is independent of the specific strategies used in individual solvers, yet it gives key insights into the "progress" and "convergence behavior" of the solver in terms of the specific implementation at hand. An analysis tool interprets the profile data and extracts metrics such as average convergence rate, iteration efficiency, and variable stabilization. Finally, using this system we present a study of four well-known SAT solvers, comparing their iterative efficiency on random as well as industrial benchmarks. Using the framework, iterative inefficiencies that lead to slow convergence are identified. We also show how to fine-tune the solvers by adapting their key steps.
We also show that a similar profile-data representation can easily be applied to loops in general to capture their program state. One of the key attributes of the program state inside loops is their branch behavior. We demonstrate the applicability of the framework by profiling completely parallelizable loops (with no cross-iteration dependences) and storing the branching behavior of each iteration. The branch behavior across a group of iterations is important for devising thread warps from parallel loops for efficient execution on GPUs. We show how some loops can be effectively parallelized on GPUs using this information.
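A minimal sketch of the idea in the preceding paragraph: grouping iterations of a parallelizable loop by their recorded branch signature so that iterations placed in the same warp do not diverge. The warp width of 32 and the profile layout are assumptions for illustration, not details from the thesis.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

WARP_SIZE = 32  # assumed GPU warp width

def group_iterations_into_warps(
    branch_profile: Dict[int, Tuple[bool, ...]]
) -> List[List[int]]:
    """Group loop iterations that took identical branch paths into warps.

    branch_profile maps an iteration index to the tuple of branch outcomes it
    took (its "branch signature"). Iterations sharing a signature can execute
    in the same warp without divergence.
    """
    buckets: Dict[Tuple[bool, ...], List[int]] = defaultdict(list)
    for iteration, signature in sorted(branch_profile.items()):
        buckets[signature].append(iteration)

    warps: List[List[int]] = []
    for iterations in buckets.values():
        # Split each bucket into warp-sized chunks.
        for start in range(0, len(iterations), WARP_SIZE):
            warps.append(iterations[start:start + WARP_SIZE])
    return warps

# Example: iterations 0 and 2 take the same path, iteration 1 diverges.
profile = {0: (True, False), 1: (False, False), 2: (True, False)}
print(group_iterations_into_warps(profile))  # [[0, 2], [1]]
```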
202
Heuristic multi-sequence search methods. Jochumsson, Thorvaldur, January 2001.
With the increasing size of sequence databases, heuristic search approaches have become necessary. Hidden Markov models are the best-performing search methods known today with respect to discriminative power, but their time complexity makes them impractical for searching large sequence databases. In this report, heuristic algorithms that reduce the search space before searching with the traditional search algorithms of hidden Markov models are presented and experimentally validated. The results of the validation show that the heuristic search algorithms speed up the searches without decreasing their discriminative power.
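As a hedged illustration of the general strategy (cheaply pruning the database before running the expensive hidden-Markov-model search), the sketch below filters sequences by shared k-mers with a consensus string and only then applies a caller-supplied HMM scoring function. The k-mer length, the threshold, and the `hmm_score` placeholder are assumptions for the example and do not reproduce the report's algorithms.

```python
from typing import Callable, Dict, List, Set, Tuple

def kmers(seq: str, k: int = 3) -> Set[str]:
    """Return the set of overlapping k-mers in a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def prefilter_then_hmm(
    query_consensus: str,
    database: Dict[str, str],
    hmm_score: Callable[[str], float],
    k: int = 3,
    min_shared_kmers: int = 5,
) -> List[Tuple[str, float]]:
    """Score with the expensive HMM only those database sequences that pass a
    cheap k-mer overlap filter against a consensus string for the model."""
    query_words = kmers(query_consensus, k)
    hits: List[Tuple[str, float]] = []
    for name, seq in database.items():
        # Cheap heuristic: require enough shared k-mers before full scoring.
        if len(query_words & kmers(seq, k)) >= min_shared_kmers:
            hits.append((name, hmm_score(seq)))  # expensive step, run rarely
    return sorted(hits, key=lambda pair: pair[1], reverse=True)
```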
203
Exemplification in newspapers: a content analysis and case studies. Weaver, Dustin A., January 2009.
Thesis (M.S.)--Ohio University, August, 2009. / Title from PDF t.p. Includes bibliographical references.
204
A phenomenological exploration of ecological consciousness development. White, Peter, January 2009.
Thesis (Ph.D.)--University of Western Sydney, 2009. / A thesis presented to the University of Western Sydney, College of Arts, in fulfilment of the requirements for the degree of Doctor of Philosophy. Includes bibliographies.
205
A heuristic method for a rostering problem with the objective of equal accumulated flying time. Ye, Xugang; Blumsack, Steve, January 2003.
Thesis (M.S.)--Florida State University, 2003. / Advisor: Dr. Steve Blumsack, Florida State University, College of Arts and Sciences, Dept. of Mathematics. Title and description from dissertation home page (viewed Mar. 3, 2004). Includes bibliographical references.
206
Stochastic Optimization Models for Rapid Detection of Viruses in Cellphone Networks. Lee, Jinho, Doctor of Operations Research and Industrial Engineering, 20 November 2012.
We develop a class of models to represent the dynamics of a virus spreading in a cellphone network, employing a taxonomy that includes five key characteristics. Based on the resulting dynamics governing the spread, we present optimization models to rapidly detect the virus, subject to resource limitations. We consider two goals, maximizing the probability of detecting a virus by a time threshold and minimizing the expected time to detection, which can be applied to all spread models we consider. We establish a submodularity result for these two objective functions that ensures that a greedy heuristic yields a well-known constant-factor (1 - 1/e, roughly 63%) approximation. We relate the latter optimization problem, under a specific virus-spread mechanism from our class of models, to a classic facility-location model. Using data from a large carrier, we build several base cellphone contact networks of different scale. We then rescale these base networks using the so-called c-core decomposition that removes vertices of low degree in a recursive way. We further show that this down-sampling strategy preserves, in general, the topological properties of the base networks, based on testing several measures. For the objective that maximizes the probability that we detect a virus by a time threshold, we provide a sample-average optimization model that yields an asymptotically optimal design for locating the detection devices as the number of samples grows large. To choose a relevant time threshold, we perform a simulation for some spread models. We then test the performance of our proposed solution methods by solving the presented optimization models for some spread dynamics using some of the contact networks built after the c-core decomposition. The computational results show that the greedy algorithm is an efficient way to solve the corresponding sample-average approximation model, and the greedy solutions outperform some other simple solution approaches.
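A minimal sketch of the greedy placement heuristic that the submodularity result justifies. The coverage-style objective over simulated spread samples stands in for the dissertation's detection-probability objective; the data layout and names are assumptions for illustration.

```python
from typing import List, Set

def greedy_placement(
    candidates: Set[str],
    spread_samples: List[Set[str]],
    budget: int,
) -> List[str]:
    """Greedily pick detector locations to maximize the fraction of simulated
    spread samples that touch at least one chosen location.

    Because this objective is monotone and submodular, the greedy solution is
    within a factor 1 - 1/e (~63%) of optimal.
    """
    chosen: List[str] = []
    covered = [False] * len(spread_samples)

    for _ in range(budget):
        best_node, best_gain = None, 0
        for node in candidates - set(chosen):
            gain = sum(
                1 for i, sample in enumerate(spread_samples)
                if not covered[i] and node in sample
            )
            if gain > best_gain:
                best_node, best_gain = node, gain
        if best_node is None:  # no remaining node adds coverage
            break
        chosen.append(best_node)
        for i, sample in enumerate(spread_samples):
            if best_node in sample:
                covered[i] = True
    return chosen

# Example: three simulated spreads over a toy contact network.
samples = [{"a", "b"}, {"b", "c"}, {"d"}]
print(greedy_placement({"a", "b", "c", "d"}, samples, budget=2))  # ['b', 'd']
```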
207
Parallel machine scheduling with time windows. Rojanasoonthon, Siwate, 28 August 2008.
Not available.
208
Connecting the Old with the New: Developing a Podcast Usability Heuristic from the Canons of Rhetoric. Wolfram, Laurissa J., 07 May 2011.
Though a relatively new form of communication technology, the podcast serves as a remediated form of the classical orator, merging the classical practices of oration with current methods of production and delivery. This study draws connections between the historical five canons of rhetoric and current usability studies to build a heuristic for developing and evaluating usable podcast design.
209
Relevance and rationalisation in the Wason selection task. Lucas, Erica Jane, January 2007.
Evans' (e.g., 2006) heuristic-analytic theory of the selection task proposes that card selections are triggered by relevance-determining heuristics, with analytic processing serving merely to rationalise heuristically-cued decisions. Evans (1996) provided evidence for the theory by setting up an inspection-time paradigm. He used computer-presented selection tasks and instructed participants to indicate (with a mouse pointer) cards under consideration. The theory predicts that longer inspection times should be associated with selected cards (which are subjected to rationalisation) than with rejected cards. Evans found support for this idea. Roberts (1998b), however, argued that mouse pointing gives rise to artefactual support for Evans' predictions because of biases associated with the task format and the use of mouse pointing. In the present thesis, all sources of artefact were eradicated by combining careful task constructions with eye-movement tracking to measure on-line attentional processing directly. Across a series of experiments, good evidence was produced for the robustness of the inspection-time effect, supporting the predictions of the heuristic-analytic account. It was notable, however, that the magnitude of the inspection-time effect was always small. A further experiment separated the presentation of rules from the associated cards to avoid possible dilution of the inspection-time effect arising from parallel rule and card presentation. However, the observed inspection-time effect remained small. A series of experiments utilising think-aloud methods were then employed to test further the predictions concerning relevance effects and rationalisation processes in the selection task. The predictions for these experiments were that selected cards should be associated with more references to both their facing and their hidden sides than rejected cards, which are not subjected to analytic rationalisation. Support was found for all heuristic-analytic predictions, even where 'select/don't select' decisions were enforced for all cards. These experiments also clarify the role played by secondary heuristics in cueing the consideration of hidden card values during rationalisation. It is suggested that whilst Oaksford and Chater's (e.g., 2003) information gain theory can provide a compelling account of our protocol findings, Evans' heuristic-analytic theory provides the best account of the full findings of the thesis. The mental models theory (e.g., Johnson-Laird & Byrne, 2002) fares less well as an explanation of the full dataset.
210
Optimal transit route network design problem: algorithms, implementations, and numerical results. Fan, Wei (1974-), 02 August 2011.
Not available.