1 |
Determining the Factors Influential in the Validation of Computer-based Problem Solving Systems
Morehead, Leslie Anne, 01 January 1996
Examination of the literature on methodologies for verifying and validating complex computer-based Problem Solving Systems led to a general hypothesis that there exist measurable features of systems that are correlated with the best testing methods for those systems. Three features (Technical Complexity, Human Involvement, and Observability) were selected as the basis of the current study. A survey of systems operating in over a dozen countries explored relationships between these system features, test methods, and the degree to which systems were considered valid. Analysis of the data revealed that certain system features and certain test methods are indeed related to reported levels of confidence in a wide variety of systems. A set of hypotheses was developed, formulated so that each corresponds to a linear equation that can be estimated and tested for significance using statistical regression analysis. Of 24 tested hypotheses, 17 were accepted, resulting in 49 significant models predicting validation and verification percentages, using 37 significant variables. These models explain between 28% and 86% of total variation. Interpretation of these models (equations) leads directly to useful recommendations regarding the system features and types of validation methods most directly associated with the verification and validation of complex computer systems. The key result of the study is the identification of a set of sixteen system features and test methods that are multiply correlated with reported levels of verification and validation. Representative examples:
• People are more likely to trust a system if it models a real-world event that occurs frequently.
• A system is more likely to be accepted if users were involved in its design.
• Users prefer systems that give them a large choice of output.
• The longer the code, the greater the number of modules, or the more programmers involved on the project, the less likely people are to believe a system is error-free and reliable.
From these results, recommendations are developed that bear strongly on proper resource allocation for testing computer-based Problem Solving Systems. Furthermore, they provide useful guidelines on what should reasonably be expected from the validation process.
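The abstract's linear models can be illustrated with a minimal regression sketch. Everything below is hypothetical: the predictor names echo the three features named above, but the data, coefficients, and resulting R² are simulated, not the study's.

```python
import numpy as np

# Hypothetical data: one row per surveyed system; the three predictors
# echo the features named in the abstract, but all values are simulated.
rng = np.random.default_rng(0)
n = 40
technical_complexity = rng.uniform(0, 10, n)
human_involvement = rng.uniform(0, 10, n)
observability = rng.uniform(0, 10, n)

# Simulated response: a reported validation percentage.
validation_pct = (
    50.0
    - 2.0 * technical_complexity
    + 3.0 * human_involvement
    + 1.5 * observability
    + rng.normal(0.0, 5.0, n)
)

# Ordinary least squares fit of one linear model:
# validation_pct ~ b0 + b1*complexity + b2*involvement + b3*observability
X = np.column_stack(
    [np.ones(n), technical_complexity, human_involvement, observability]
)
coef, _, _, _ = np.linalg.lstsq(X, validation_pct, rcond=None)

# R^2: the fraction of total variation the model explains (the abstract's
# accepted models explain between 28% and 86%).
predicted = X @ coef
ss_res = float(np.sum((validation_pct - predicted) ** 2))
ss_tot = float(np.sum((validation_pct - validation_pct.mean()) ** 2))
r_squared = 1.0 - ss_res / ss_tot
print(f"coefficients: {np.round(coef, 2)}, R^2 = {r_squared:.2f}")
```

Each of the study's 49 models would correspond to one such equation, with significance tests on the individual coefficients deciding which variables are retained.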
|
2 |
Evaluating the development and effectiveness of grit and growth mindset among high school students in a computer programming project
Kench, Delia Joan, January 2017
A dissertation submitted to the Faculty of Science, University of the Witwatersrand, in fulfilment of the requirements for the degree of Master of Science
Johannesburg 2016. / This dissertation investigates grit, "passion and perseverance" for a long-term goal, and growth mindset in grade 11 high school students as they code a non-trivial programming project in Java over a six-week period. Students are often challenged by the complexities of programming and can be overwhelmed when they encounter errors, causing them to give up rather than persevere. The programming project includes scaffolding with frequent feedback to increase students' motivation. The study used a mixed methods design drawing on both quantitative and qualitative data to answer the research questions. Whilst the correlations between grit, mindset, and the project results were moderate, the fact that students submitted their projects numerous times indicated perseverance. The data gathered from the interviews further indicated that the students' perseverance led them to employ their own problem-solving strategies when they encountered problems. / MT 2017
|
3 |
Data-oriented specification of exception handling. January 1990
by Cheng Kar Wai. / Thesis (M.Phil.)--Chinese University of Hong Kong, 1990. / Bibliography: leaves 195-199.
ABSTRACT
CHAPTER 1 --- INTRODUCTION --- p.1
1.1 --- Problem --- p.2
1.2 --- Approach --- p.3
1.2.1 --- Programming Approach --- p.4
1.2.2 --- Specification Approach --- p.5
1.3 --- Thesis Organization --- p.11
CHAPTER 2 --- MODEL SPECIFICATION APPROACH --- p.12
2.1 --- Overview --- p.14
2.2 --- Compilation Phases --- p.17
2.3 --- Array Graph --- p.22
2.4 --- Scheduling --- p.25
CHAPTER 3 --- SURVEY --- p.31
3.1 --- Goodenough's Proposal --- p.31
3.2 --- Exception Handling Models --- p.34
3.3 --- Programming Languages --- p.40
3.4 --- Data-Oriented Exception Handling --- p.49
3.5 --- Specification Languages --- p.50
CHAPTER 4 --- EXCEPTION HANDLING SPECIFICATION --- p.55
4.1 --- Data-Oriented Exceptions Specification --- p.55
4.2 --- Assertions for Exception Handling --- p.59
4.2.1 --- User-defined Exception Condition Assertion --- p.61
4.2.2 --- Fatal Condition Assertion --- p.62
4.2.3 --- Replacement Assertion --- p.64
4.2.3.1 --- Scenario 1: Immediate Replacement --- p.68
4.2.3.2 --- Scenario 2: Direct Dependency --- p.71
4.2.3.3 --- Scenario 3: Indirect Dependency --- p.72
4.2.3.4 --- Scenario 4: Lower Dimensionality --- p.74
4.2.4 --- Message Vector Assertion --- p.76
CHAPTER 5 --- ARRAY GRAPH FOR EXCEPTION HANDLING --- p.78
5.1 --- Subgraph Embedding --- p.78
5.1.1 --- User-Defined Exception Conditions --- p.80
5.1.2 --- Fatal Conditions --- p.82
5.1.3 --- Pre-Defined Exception Conditions --- p.83
5.1.4 --- Replacement Assertions --- p.85
5.1.5 --- Message Vector Assertions --- p.89
5.2 --- Data Dependency Interpretation --- p.91
5.2.1 --- Immediate Replacement --- p.92
5.2.2 --- Direct Dependency --- p.92
5.2.3 --- Indirect Dependency --- p.93
5.2.4 --- Shared Data Variable --- p.99
CHAPTER 6 --- SCHEDULING FOR EXCEPTION HANDLING --- p.104
6.1 --- Backward Path Tracing --- p.106
6.1.1 --- Forward Versus Backward Tracing --- p.106
6.1.2 --- Assertion-Marking Strategy --- p.113
6.2 --- Grain Scheduling --- p.116
6.2.1 --- New Constraints --- p.120
6.3 --- Delayed Exception Raise Event --- p.125
6.4 --- Enhancement of Scheduling Algorithm --- p.126
6.5 --- Control Flow Issues --- p.128
6.5.1 --- Immediate Replacement --- p.130
6.5.2 --- Direct Dependency --- p.131
6.5.3 --- Indirect Dependency --- p.132
6.5.4 --- Lower Dimensionality --- p.133
CHAPTER 7 --- MORE COMPLICATED SCHEDULING --- p.135
7.1 --- Multiple Exception Handling Assertions --- p.137
7.1.1 --- Overlapped Scopes of Exception Grain --- p.138
7.1.2 --- Priorities in Scheduling --- p.156
7.2 --- Single Replacement Assertion --- p.160
7.2.1 --- Multiple Exception Conditions --- p.160
7.2.2 --- Conditional Replacement --- p.163
7.3 --- Loop Optimization --- p.164
7.4 --- Modifications to the Scheduling Algorithm --- p.177
7.5 --- Implementation --- p.180
7.5.1 --- Syntax Checking --- p.180
7.5.2 --- Array Graph Construction --- p.182
7.5.3 --- Array Graph Analysis --- p.185
7.5.4 --- Generation of Schedule with Exception Handling Subgraph --- p.186
CHAPTER 8 --- CONCLUSIONS --- p.187
8.1 --- Future Work --- p.188
APPENDIX
1. Backward Tracing/Assertion Marking Strategy
REFERENCE
|
4 |
Effects of language features, templates, and procedural skills on problem solving in programming. January 1988
by Kong Siu Cheung. / Thesis (M.Phil.)--Chinese University of Hong Kong, 1988. / Bibliography: leaves 114-122.
|
5 |
Individuals solving problems: the effects of problem solving strategies and problem solving technologies on generating solutions
Welsh, Kimberly D., January 1997
This experiment was designed to compare two problem solving strategies, brainstorming and the hierarchical technique, and two problem solving technologies, computer software and pencil and paper. The first purpose of this study was to explore what effects computer software and pencil and paper have on the facilitation of solutions for individual problem solvers. Subjects generated solutions either by recording ideas on a computer or by writing ideas down on paper. The second purpose was to examine how individuals evaluate the solutions they have generated. Specifically, we expected solution evaluations to differ according to which problem solving strategy subjects were trained on, brainstorming or the hierarchical technique. Solutions were rated on overall quality, practicality, and originality on a scale ranging from 0 (the lowest possible score) to 4 (the highest possible score). Subjects who used a computer to record ideas generated significantly more solutions than subjects recording ideas on paper. Subjects trained with the hierarchical technique generated ideas higher in quality than those trained with brainstorming. Subjects trained with brainstorming generated more original ideas than those trained with the hierarchical technique. Finally, subjects' ratings of practicality did not differ according to problem solving strategy. / Department of Psychological Science
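The group comparisons reported above (quality and originality ratings on the 0-4 scale) are the kind of analysis a two-sample test supports. A minimal sketch with invented ratings, not the experiment's data:

```python
import numpy as np

# Invented 0-4 quality ratings for two hypothetical training groups,
# purely to illustrate the comparison; not the study's data.
hierarchical_quality = np.array([3.1, 2.8, 3.4, 3.0, 2.9, 3.3])
brainstorm_quality = np.array([2.4, 2.6, 2.2, 2.7, 2.5, 2.3])

def welch_t(a, b):
    """Welch's t statistic: mean difference scaled by the combined
    standard error, without assuming equal group variances."""
    var_a = a.var(ddof=1) / len(a)
    var_b = b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(var_a + var_b)

diff = hierarchical_quality.mean() - brainstorm_quality.mean()
t = welch_t(hierarchical_quality, brainstorm_quality)
print(f"mean difference = {diff:.2f}, Welch t = {t:.2f}")
```

A positive t here corresponds to the reported pattern that hierarchical-technique subjects produced higher-quality ideas; the same machinery, with groups swapped, would apply to the originality finding.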
|
6 |
Task Domain Knowledge as a Moderator of Information System Usage
Marshall, Thomas E. (Thomas Edward), 1954- 05 1900
Information system (IS) support of human problem solving during the complex task of auditing within a computer environment was investigated. Seventy-four computer audit specialists from nine firms participated in the field experiment. Task accomplishment behavior was recorded via a computerized activity-logging technique. Theoretical constructs of interest included: 1) IS problem-solving support, 2) task domain knowledge, and 3) decision-making behavior. It was theorized that task domain knowledge influences the type of IS most functionally appropriate for usage by that individual. IS task presentation served as the treatment variable. Task domain knowledge was investigated as a moderating factor of task accomplishment. Task accomplishment, the dependent variable, was defined as search control strategy and quality of task performance. A subject's task domain knowledge was assessed over seven theoretical domains. Subjects were assigned to higher or lower task domain knowledge groups based on performance on professional competency examination questions. Research hypothesis one investigated the effects of task domain knowledge on task accomplishment behavior. Several task domain knowledge bases were found to influence both search control strategy and task performance. Task presentation ordering effects, hypothesis two, were not found to significantly influence search control strategy or task performance. The third hypothesis investigated interaction effects of a subject's task domain knowledge and task presentation ordering treatments on task accomplishment behavior. An interaction effect was found to influence the subject's search control strategy. The computer-specific knowledge base and task presentation ordering treatments were found to interact as joint moderators of search control strategy. Task performance was not found to be significantly influenced by interaction effects. Users' task accomplishment was modeled based upon problem-solving behavior.
A subject's level of task domain knowledge was found to serve as a moderating factor of IS usage. Human information-processing strategies, IS usage, and task domain knowledge were integrated into a comprehensive IS user task model. This integrated model provides a robust characterization scheme for IS problem-solving support in a complex task environment.
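A moderator in a design like this is, in statistical terms, an interaction term in a model of task performance. A minimal illustrative sketch with simulated data; only the sample size (74) is taken from the abstract, and the variable coding and effect sizes are assumptions:

```python
import numpy as np

# Moderation as interaction: performance ~ b0 + b1*treatment
# + b2*knowledge + b3*(treatment * knowledge). A non-zero b3 means the
# moderator (domain knowledge) changes the treatment's effect.
rng = np.random.default_rng(1)
n = 74  # matches the study's sample size; everything else is invented
treatment = rng.integers(0, 2, n).astype(float)  # task presentation ordering
knowledge = rng.integers(0, 2, n).astype(float)  # lower/higher knowledge group

# Simulated outcome with a built-in interaction effect of 1.2.
performance = (
    1.0 + 0.5 * treatment + 0.8 * knowledge
    + 1.2 * treatment * knowledge
    + rng.normal(0.0, 1.0, n)
)

# Design matrix: intercept, both main effects, and the interaction.
X = np.column_stack([np.ones(n), treatment, knowledge, treatment * knowledge])
coef, *_ = np.linalg.lstsq(X, performance, rcond=None)
print("estimated interaction coefficient:", round(float(coef[3]), 2))
```

In the study's terms, a significant estimate of the interaction coefficient is what "joint moderators of search control strategy" amounts to.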
|
7 |
Massive parallelism for combinatorial problems by hardware acceleration with an application to the label switching problem
Steere, Edward, January 2016
A dissertation submitted to the Faculty of Engineering and the Built Environment, University
of the Witwatersrand, in fulfilment of the requirements for the degree of Master of Science in
Engineering. / This dissertation proposes an approach to solving hard combinatorial problems in massively
parallel architectures using parallel metaheuristics.
Combinatorial problems are common in many scientific fields. Scientific progress is constrained
by the fact that, even using state of the art algorithms, solving hard combinatorial
problems can take days or weeks. This is the case with the Label Switching Problem (LSP)
in the field of Bioinformatics.
In this field, prior work to solve the LSP has resulted in the program CLUMPP (CLUster
Matching and Permutation Program). CLUMPP focuses solely on the use of a sequential,
classical heuristic, and has had success on smaller, low-complexity problems.
By contrast this dissertation proposes the Parallel Solvers model for the acceleration of
hard combinatorial problems. This model draws on the commonalities evident in algorithms
and strategies in metaheuristics.
After investigating the effectiveness of the mechanisms apparent in the Parallel Solvers
model with regards to the LSP, the author developed DePermute, an algorithm which can be
used to solve the LSP significantly faster. Results were generated from time based testing of
simulated data, as well as data freely available on the Internet as part of various projects.
An investigation into the effectiveness of DePermute was carried out on a CPU (Central
Processing Unit) based computer. The time based testing was carried out on a CPU based
computer and on a Graphics Processing Unit (GPU) attached to a CPU host computer. The
dissertation also proposes the design of a Field Programmable Gate Array (FPGA) based
implementation of DePermute.
Using Parallel Solvers, in the DePermute algorithm, the time taken for population group
sizes, K, ranging from K = 5 to 20 was improved by up to two orders of magnitude using the
GPU implementation and aggressive settings for CLUMPP. The CPU implementation, while
slower than the GPU implementation, still marginally outperforms CLUMPP on aggressive
settings, usually with better quality. In addition, it outperforms CLUMPP by at least
an order of magnitude when CLUMPP is set to use higher-quality settings.
Combinatorial problems can be very difficult. Parallel Solvers has been effective in the
field of Bioinformatics in solving the LSP. This dissertation proposes that it might assist in
the reasoning and design of algorithms in other fields. / MT2017
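The Parallel Solvers idea of running many metaheuristic instances concurrently and keeping the best answer can be sketched on a toy stand-in for label matching. The problem, scoring function, and solver below are deliberately simplified illustrations, not DePermute or the real LSP:

```python
import random
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for label switching: find a permutation of labels that
# maximises agreement with a reference labelling. The real LSP and
# DePermute are far more involved; this only shows the parallel pattern.
REFERENCE = [0, 1, 2, 3, 4]

def score(perm):
    """Number of positions where the permutation matches the reference."""
    return sum(1 for a, b in zip(perm, REFERENCE) if a == b)

def random_restart_solver(seed, iterations=200):
    """One solver instance: random swaps, keeping non-worsening moves."""
    rng = random.Random(seed)
    best = list(range(len(REFERENCE)))
    rng.shuffle(best)
    for _ in range(iterations):
        cand = best[:]
        i = rng.randrange(len(cand))
        j = rng.randrange(len(cand))
        cand[i], cand[j] = cand[j], cand[i]
        if score(cand) >= score(best):
            best = cand
    return best

# Parallel Solvers pattern: launch several independent solver instances
# concurrently and take the best result any of them finds.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(random_restart_solver, range(8)))
best = max(results, key=score)
print("best permutation:", best, "score:", score(best))
```

On a GPU, each solver instance (or each candidate evaluation) would map to its own thread block, which is where the reported orders-of-magnitude speedup comes from.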
|
8 |
Contribution to the methodology of the study of individual differences in problem solving: a simulation approach and the question of its validation; a statistical approach
Karnas, Guy, January 1974
Doctorate in psychological sciences / Not published
|
9 |
Classification of the difficulty in accelerating problems using GPUs
Tristram, Uvedale Roy, January 2014
Scientists continually require additional processing power, as this enables them to compute larger problem sizes, use more complex models and algorithms, and solve problems previously thought computationally impractical. General-purpose computation on graphics processing units (GPGPU) can help in this regard, as there is great potential in using graphics processors to accelerate many scientific models and algorithms. However, some problems are considerably harder to accelerate than others, and it may be challenging for those new to GPGPU to ascertain the difficulty of accelerating a particular problem or seek appropriate optimisation guidance. Through what was learned in the acceleration of a hydrological uncertainty ensemble model, large numbers of k-difference string comparisons, and a radix sort, problem attributes have been identified that can assist in the evaluation of the difficulty in accelerating a problem using GPUs. The identified attributes are inherent parallelism, branch divergence, problem size, required computational parallelism, memory access pattern regularity, data transfer overhead, and thread cooperation. Using these attributes as difficulty indicators, an initial problem difficulty classification framework has been created that aids in GPU acceleration difficulty evaluation. This framework further facilitates directed guidance on suggested optimisations and required knowledge based on problem classification, which has been demonstrated for the aforementioned accelerated problems. It is anticipated that this framework, or a derivative thereof, will prove to be a useful resource for new or novice GPGPU developers in the evaluation of potential problems for GPU acceleration.
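One way to picture a classification framework over the seven attributes the abstract names is as a simple scoring scheme. The ratings, weights, and category thresholds below are invented for illustration; the actual framework is richer than this:

```python
# Illustrative only: a toy difficulty score over the attributes named in
# the abstract. The real framework's categories and guidance differ.
ATTRIBUTES = [
    "inherent parallelism",
    "branch divergence",
    "problem size",
    "required computational parallelism",
    "memory access pattern regularity",
    "data transfer overhead",
    "thread cooperation",
]

def classify_difficulty(ratings):
    """ratings: dict mapping attribute -> 0 (favourable) .. 2 (unfavourable).

    Missing attributes default to a neutral 1. The thresholds are
    arbitrary illustrative cut-offs, not the dissertation's.
    """
    total = sum(ratings.get(attr, 1) for attr in ATTRIBUTES)
    if total <= 4:
        return "likely straightforward to accelerate"
    if total <= 9:
        return "moderate effort; targeted optimisation needed"
    return "hard; may resist GPU acceleration"

# Example: abundant parallelism everywhere, but heavy data transfer.
example = {attr: 0 for attr in ATTRIBUTES}
example["data transfer overhead"] = 2
print(classify_difficulty(example))  # -> likely straightforward to accelerate
```

The useful part of such a scheme is less the final label than the per-attribute ratings, which point a developer at the specific optimisation topics (e.g. reducing transfer overhead) worth studying first.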
|
10 |
Concepts in parallel problem solving
Kornfeld, William A, January 1982
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1982. / MICROFICHE COPY AVAILABLE IN ARCHIVES AND ENGINEERING / Bibliography: leaves 180-184. / by William Arthur Kornfeld. / Ph.D.
|