11 |
Computer simulation of the two-body abrasive wear process. Naicker, Theo. January 2002 (has links)
New computer technologies are applied to the classical material engineering two-body
abrasive wear process. The computer simulation provides an interactive and visual
representation of the wear process. The influence of grit size, grit tip radius and load (at
constant workpiece hardness and tool path) on the wear rate, wear coefficient and wear
surface topography is predicted. The simulation implements microcutting and
microploughing with material displacement to the sides of the groove. The validation of
the simulation is demonstrated by comparing with the previous modelling literature and
with experiments. / Thesis (M.Sc.)-University of Natal, Durban, 2002.
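The microcutting/microploughing split described above can be illustrated with a minimal single-grit model. All names, formulas and parameter values below are assumptions chosen for illustration, not the thesis's implementation:

```python
import math

def groove_wear_increment(load, hardness, tip_radius, f_ab, slide_dist):
    """Volume removed by one pass of a single spherical-tipped grit.

    The contact radius follows from supporting the load over the
    projected contact area at the workpiece hardness (a plasticity
    assumption).  A fraction f_ab of the groove cross-section is removed
    as chips (microcutting); the remainder is displaced to the groove
    sides (microploughing) and does not count as wear.
    """
    a = math.sqrt(load / (math.pi * hardness))   # contact radius
    a = min(a, tip_radius)                       # cannot exceed the grit tip
    # Groove cross-section approximated as a circular segment of the tip.
    theta = 2.0 * math.asin(a / tip_radius)
    area = 0.5 * tip_radius ** 2 * (theta - math.sin(theta))
    return f_ab * area * slide_dist              # removed volume only

# A sharper grit (smaller tip radius) cuts a deeper, narrower groove under
# the same load, so it removes more material per unit sliding distance.
blunt = groove_wear_increment(load=1.0, hardness=2000.0, tip_radius=0.5,
                              f_ab=0.6, slide_dist=10.0)
sharp = groove_wear_increment(load=1.0, hardness=2000.0, tip_radius=0.1,
                              f_ab=0.6, slide_dist=10.0)
```

The sketch reproduces the qualitative trend the abstract predicts: grit tip radius and load change the wear rate even at constant workpiece hardness.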
|
12 |
A support environment for the teaching of programming. Stewart, Rosanne. January 1996 (has links)
This thesis examines the effectiveness of a specially constructed computer based support
environment for the teaching of computer programming to novice programmers. In order
to achieve this, the following distinct activities were pursued. Firstly, an in-depth
investigation of programming misconceptions and techniques used for overcoming them
was carried out. Secondly, the educational principles gained from this investigation were
used to design and implement a computer based environment to support novice
programmers learning the Pascal language. Finally, several statistical methods were used
to compare students who made use of the support environment to those who did not and
the results are discussed. / Thesis (M.Sc.)-University of Natal, Pietermaritzburg, 1996.
|
13 |
Application of genetic algorithms to the travelling salesperson problem. McKenzie, Peter John Campbell. January 1996 (has links)
Genetic Algorithms (GAs) can be easily applied to many different problems since
they make few assumptions about the application domain and perform relatively well.
They can also be modified with some success for handling a particular problem. The
travelling salesperson problem (TSP) is a famous NP-hard problem in combinatorial
optimization; as a result, it has no known polynomial-time solution. The aim of this
dissertation will be to investigate the application of a number of GAs to the TSP.
These results will be compared with those of traditional solutions to the TSP and
with the results of other applications of the GA to the TSP. / Thesis (M.Sc.)-University of Natal, Pietermaritzburg, 1996.
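A minimal sketch of a GA applied to the TSP, using permutation chromosomes, tournament selection, order crossover and swap mutation. The operator choices and parameters here are illustrative assumptions, not necessarily those investigated in the dissertation:

```python
import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def order_crossover(p1, p2):
    """OX: copy a slice from parent 1, fill the rest in parent 2's order."""
    n = len(p1)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = p1[i:j]
    fill = [c for c in p2 if c not in child]
    for idx in range(n):
        if child[idx] is None:
            child[idx] = fill.pop(0)
    return child

def ga_tsp(dist, pop_size=60, gens=150, p_mut=0.3):
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda t: tour_length(t, dist))
        new = pop[:2]                            # elitism: keep the two best
        while len(new) < pop_size:
            # tournament selection of two parents
            p1 = min(random.sample(pop, 3), key=lambda t: tour_length(t, dist))
            p2 = min(random.sample(pop, 3), key=lambda t: tour_length(t, dist))
            child = order_crossover(p1, p2)
            if random.random() < p_mut:          # swap mutation
                a, b = random.sample(range(n), 2)
                child[a], child[b] = child[b], child[a]
            new.append(child)
        pop = new
    return min(pop, key=lambda t: tour_length(t, dist))

# Toy instance: eight cities on a unit circle, where the optimal tour
# simply visits them in circular order.
random.seed(0)
pts = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8)) for k in range(8)]
dist = [[math.hypot(ax - bx, ay - by) for bx, by in pts] for ax, ay in pts]
best = ga_tsp(dist)
```

Order crossover is used because plain one-point crossover on permutations produces invalid tours with repeated cities.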
|
14 |
A study of genetic algorithms for solving the school timetabling problem. Raghavjee, Rushil. 17 December 2013 (has links)
The school timetabling problem is a common optimization problem faced by many primary and secondary schools. Each school has its own set of requirements and constraints, dependent on factors such as the number of resources available and the rules specified by that country's department of education. This study has two objectives. In previous studies, genetic algorithms have been used to solve only a single type of school timetabling problem; the first objective is therefore to test the effectiveness of a genetic algorithm approach in solving more than one type of school timetabling problem. The second objective is to evaluate a genetic algorithm that uses an indirect representation (IGA) for the school timetabling problem, comparing its performance with that of a genetic algorithm using a direct representation (DGA). Such a comparison has been made in other domains, such as job shop scheduling, but not for the school timetabling problem.
Both the DGA and IGA were tested on five school timetabling problems. Both the algorithms were initially developed based on findings in the literature. They were then improved iteratively based on their performance when tested on the problems. The processes of the genetic algorithms that were improved were the method of initial population creation, the selection methods and the genetic operators used.
Both the DGA and the IGA were found to produce timetables that were competitive with, and in some cases superior to, those of other methods such as simulated annealing and tabu search. It was found that different processes (i.e. the method of initial population creation, selection methods and genetic operators) were needed for each problem in order to produce the best results. When comparing the two approaches, the IGA outperformed the DGA on all of the tested school timetabling problems. / Thesis (M.Sc.)-University of KwaZulu-Natal, Pietermaritzburg, 2013.
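The direct/indirect distinction can be sketched as follows: a direct chromosome is the timetable itself, while an indirect chromosome is a priority ordering that a greedy decoder turns into a timetable. This toy decoder (the lesson data and decoding rule are invented for illustration, not taken from the thesis) shows the indirect side:

```python
def decode(ordering, lessons, n_slots):
    """Greedy decoder for an indirect timetabling chromosome (a sketch).

    The chromosome is a priority ordering of lesson indices; the decoder
    places each lesson into the first period where neither its teacher
    nor its class is already busy, so every chromosome maps to a
    clash-free (though possibly incomplete) timetable.  A direct
    chromosome, by contrast, would encode the slot assignments
    themselves and need repair or penalty schemes to handle clashes.
    """
    busy = [set() for _ in range(n_slots)]   # resources in use per period
    timetable = {}
    for li in ordering:
        teacher, klass = lessons[li]
        for slot in range(n_slots):
            if teacher not in busy[slot] and klass not in busy[slot]:
                busy[slot].update((teacher, klass))
                timetable[li] = slot
                break
    return timetable

# Hypothetical instance: four lessons as (teacher, class) pairs, two periods.
lessons = [("T1", "A"), ("T1", "B"), ("T2", "A"), ("T2", "B")]
tt = decode([0, 1, 2, 3], lessons, n_slots=2)
```

Because the decoder guarantees feasibility, the GA's genetic operators can stay simple permutation operators, which is one commonly cited attraction of indirect representations.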
|
15 |
Shot classification in broadcast soccer video. Guimaraes, Lionel. January 2013 (has links)
Event understanding, the automatic generation of human-relatable event descriptions
from video sequences, is an open problem in computer vision research with many applications in the sports
domain, such as indexing and retrieval systems for sports video. Background modelling and shot classification
of broadcast video are important steps in event understanding in video sequences. Shot classification seeks
to label shots, i.e. continuous frame sequences captured by a single camera action, with classes such
as long shot, close-up and audience shot, while background modelling seeks to classify pixels in an image
as foreground/background. Many features used for shot classification are built upon the background model;
background modelling is therefore an essential part of shot classification.
This dissertation reports on an investigation into techniques and procedures for background modelling and
classification of shots in broadcast soccer videos. Broadcast video refers to video which would typically be
viewed by a person at home on their television set and imposes constraints that are often not considered in
many approaches to event detection. In this work we analyse the performances of two background modelling
techniques appropriate for broadcast video, the colour distance model and Gaussian mixture model. The
performance of the background models depends on correctly set parameters. Some techniques offer better
updating schemes and thus adapt better to the changing conditions of a game, some are shown to be more
robust to changes in broadcast technique and are therefore of greater value in shot classification. Our results
show that the colour distance model slightly outperformed the Gaussian mixture model, with both techniques
performing similarly to results reported in the literature.
Many features useful for shot classification are proposed in the literature. This dissertation identifies these
features and presents a detailed analysis and comparison of various features appropriate for shot classification
in broadcast soccer video. Once a feature set is established, a classifier is required to determine a shot class
based on the extracted features. We establish the feature set and decision tree parameters
that yield the best performance, and then use a combined feature set to train a neural network to
classify shots. The combined feature set in conjunction with the neural network classifier proved effective in
classifying shots and in some situations outperformed techniques found in the literature. / Thesis (M.Sc.)-University of KwaZulu-Natal, Durban, 2012.
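The colour distance model mentioned above can be sketched as follows: estimate the dominant pitch colour, then threshold each pixel's distance from it. The histogram granularity and threshold value below are assumptions for illustration, not the dissertation's settings:

```python
import numpy as np

def dominant_colour(frame, bins=16):
    """Estimate the dominant (pitch) colour as the peak of a coarse RGB histogram."""
    q = (frame // (256 // bins)).reshape(-1, 3)
    colours, counts = np.unique(q, axis=0, return_counts=True)
    peak = colours[counts.argmax()]
    # Return the centre of the winning histogram bin.
    return (peak * (256 // bins) + (256 // bins) // 2).astype(float)

def pitch_background_mask(frame, dominant, threshold=40.0):
    """Label as background every pixel within `threshold` of the dominant colour."""
    dist = np.linalg.norm(frame.astype(float) - np.asarray(dominant, float), axis=-1)
    return dist < threshold   # True = playing field, False = players/crowd/ball

# Synthetic frame: a green pitch with a small white (player-like) patch.
frame = np.full((10, 10, 3), (40, 160, 40), dtype=np.uint8)
frame[2:4, 2:4] = (255, 255, 255)
mask = pitch_background_mask(frame, dominant_colour(frame))
```

Shot-classification features such as grass-pixel ratio per region can then be computed directly from this mask.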
|
16 |
Modelling with Mathematica. Murrell, Hugh. January 1994 (has links)
In this thesis a number of mathematical models are investigated with the aid of the modelling
package Mathematica. Some of the models are of a mechanical nature and some of the
models are laboratories that have been constructed for the purpose of assisting researchers
in a particular field.
In the early sections of the thesis mechanical models are investigated. After the equations
of motion for the model have been presented, Mathematica is employed to generate solutions
which are then used to drive animations of the model. The frames of the animations
are graphical snapshots of the model in motion. Mathematica proves to be an ideal tool
for this type of modelling since it combines algebraic, numeric and graphics capabilities on
one platform.
In the later sections of this thesis, Mathematica laboratories are created for investigating
models in two different fields. The first laboratory is a collection of routines for performing
Phase-Plane analysis of planar autonomous systems of ordinary differential equations. A
model of a mathematical concept called a bifurcation is investigated and an animation of
this mathematical event is produced.
The second laboratory is intended to help researchers in the tomography field. A standard
filtered back-projection algorithm for reconstructing images from their projections is implemented.
In the final section of the thesis an indication of how the tomography laboratory
could be used is presented. Wavelet theory is used to construct a new filter that could be
used in filtered back-projection tomography. / Thesis (Ph.D.)-University of Natal, Durban, 1994.
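A minimal filtered back-projection of the kind the tomography laboratory implements can be sketched in a few lines. This is an illustrative reconstruction with an ideal ramp filter and nearest-neighbour back-projection, not the thesis's Mathematica code:

```python
import numpy as np

def filtered_back_projection(sinogram, angles):
    """Minimal filtered back-projection (a sketch).

    Each projection is ramp-filtered in the Fourier domain, then smeared
    back across the image plane along its acquisition angle; summing the
    smears over all angles reconstructs the image.
    """
    n_det = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(n_det))                  # ideal ramp filter
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    mid = n_det // 2
    xs = np.arange(n_det) - mid
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n_det, n_det))
    for proj, theta in zip(filtered, angles):
        t = X * np.cos(theta) + Y * np.sin(theta)         # detector coordinate
        idx = np.clip(np.round(t).astype(int) + mid, 0, n_det - 1)
        recon += proj[idx]                                # back-project one view
    return recon * np.pi / (2 * len(angles))

# Sinogram of a point source at the centre: a constant spike on the
# central detector for every view; the reconstruction should peak there.
angles = np.linspace(0, np.pi, 60, endpoint=False)
sino = np.zeros((60, 64))
sino[:, 32] = 1.0
recon = filtered_back_projection(sino, angles)
```

Swapping the ramp array for another frequency response is all that is needed to experiment with alternative filters, such as the wavelet-derived filter the final section proposes.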
|
17 |
A practical investigation of meteor-burst communications. Melville, Stuart William. January 1991 (has links)
This study considers the meteor-burst communication (MBC) environment at
three levels. At the lowest level, the trails themselves are studied and analysed.
Then individual links are studied in order to determine the data throughput and
wait time that might be expected at various data rates. Finally, at the top level,
MBC networks are studied in order to provide information on the effects of
routing strategies, topologies, and connectivity in such networks.
A significant amount of theoretical work has been done in the classification of
meteor trails, and the analysis of the throughput potential of the channel. At the
same time the issues of wait time on MBC links, and MBC network strategies,
have been largely ignored. The work presented here is based on data captured
on actual monitoring links, and is intended to provide both an observational
comparison to theoretical predictions in the well-researched areas, and a source
of base information for the others.
Chapter 1 of this thesis gives an overview of the field of meteor-burst communications.
Prior work in the field is discussed, as are the advantages and disadvantages
of the channel, and current application areas.
Chapter 2 describes work done on the classification of observed meteor trails
into distinctive 'families'. The rule-based system designed for this task is discussed
as well as the eventual classification schema produced, which is far more
comprehensive and consistent than previously proposed schemas.
Chapter 3 deals with the throughput potential of the channel, based on the
observed trails. A comparison to predicted results, both as regards fixed and
adaptive data-rates, is made with some notable differences between predicted
results and observed results highlighted. The trail families with the largest
contribution to the throughput capacity of the channel are identified.
Chapter 4 deals with wait time in meteor-burst communications. The data rates
at which wait time is minimised in the links used are found, and compared to the
rates at which throughput was optimised. These are found to be very different,
as indeed are the contributions of the various trail families at these rates.
Chapter 5 describes a software system designed to analyse the effect of routing
strategies in MBC networks, and presents initial results derived from this
system. Certain features of the channel, in particular its sporadic nature, are
shown to have significant effects on network performance.
Chapter 6 continues the presentation of network results, specifically concentrating
on the effect of topologies and connectivity within MBC networks.
Chapter 7 concludes the thesis, highlighting suggested areas for further research
as well as summarising the more important results presented. / Thesis (Ph.D.)-University of Natal, Durban, 1991.
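The throughput/wait-time trade-off studied in Chapters 3 and 4 can be illustrated with a small Monte-Carlo model. Trail arrivals are assumed Poisson with exponentially distributed usable durations, which are common modelling assumptions in the MBC literature; all parameter values are invented for illustration, not the thesis's measured link data:

```python
import random

def mean_wait_time(rate_bps, msg_bits, trail_rate=0.1, mean_dur=0.5,
                   trials=2000, seed=0):
    """Monte-Carlo estimate of wait time on a meteor-burst link (a sketch).

    Trails arrive as a Poisson process (trail_rate trails per second)
    with exponentially distributed usable durations; a trail carries the
    message only if it lasts long enough at the chosen data rate.
    """
    rng = random.Random(seed)
    needed = msg_bits / rate_bps            # seconds of trail time required
    total = 0.0
    for _ in range(trials):
        t = 0.0
        while True:
            t += rng.expovariate(trail_rate)            # next trail arrival
            if rng.expovariate(1.0 / mean_dur) >= needed:
                break                                   # trail long enough
        total += t
    return total / trials

# Raising the data rate shortens the trail duration a message needs, so
# far more trails become usable and the expected wait drops sharply.
fast = mean_wait_time(9600, msg_bits=4800)
slow = mean_wait_time(2400, msg_bits=4800)
```

This captures, in miniature, the thesis's observation that the rate minimising wait time need not be the rate maximising throughput: higher rates waste channel capacity per trail but reach a usable trail much sooner.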
|
18 |
Ontology driven multi-agent systems : an architecture for sensor web applications. Moodley, Deshendran. January 2009 (has links)
Advances in sensor technology and space science have resulted in the availability of vast quantities of
high quality earth observation data. This data can be used for monitoring the earth and to enhance our
understanding of natural processes. Sensor Web researchers are working on constructing a worldwide
computing infrastructure that enables dynamic sharing and analysis of complex heterogeneous earth observation
data sets. Key challenges that are currently being investigated include data integration; service
discovery, reuse and composition; semantic interoperability; and system dynamism. Two emerging technologies
that have shown promise in dealing with these challenges are ontologies and software agents.
This research investigates how these technologies can be integrated into an Ontology Driven Multi-Agent
System (ODMAS) for the Sensor Web.
The research proposes an ODMAS framework and an implemented middleware platform, i.e. the
Sensor Web Agent Platform (SWAP). SWAP deals with ontology construction, ontology use, and agent
based design, implementation and deployment. It provides a semantic infrastructure, an abstract architecture,
an internal agent architecture and a Multi-Agent System (MAS) middleware platform. Distinguishing
features include: the incorporation of Bayesian Networks to represent and reason about uncertain
knowledge; ontologies to describe system entities such as agent services, interaction protocols and agent
workflows; and a flexible adapter based MAS platform that facilitates agent development, execution and
deployment. SWAP aims to guide and ease the design, development and deployment of dynamic alerting
and monitoring applications. The efficacy of SWAP is demonstrated by two satellite image processing
applications, viz. wildfire detection and informal settlement monitoring. This approach can provide significant
benefits to a wide range of Sensor Web users. These include: developers for deploying agents
and agent-based applications; end users for accessing, managing and visualising information provided by
real-time monitoring applications; and scientists who can use the Sensor Web as a scientific computing
platform to facilitate knowledge sharing and discovery.
An Ontology Driven Multi-Agent Sensor Web has the potential to change forever the way in which
geospatial data and knowledge are accessed and used. This research describes this far-reaching vision,
identifies key challenges and provides a first step towards the vision. / Thesis (Ph.D.)-University of KwaZulu-Natal, 2009.
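The abstract notes that SWAP incorporates Bayesian networks to reason about uncertain knowledge. A toy two-node network (Fire → Hotspot), evaluated by enumeration, shows the kind of update a wildfire-alerting agent might perform; the priors and sensor rates are invented for illustration and are not SWAP's actual parameters:

```python
def p_fire_given_hotspot(p_fire=0.01, p_hot_if_fire=0.9, p_hot_if_clear=0.05):
    """Posterior fire probability after a hotspot observation (toy example).

    Bayes' rule over a two-node network: the hotspot observation is
    explained either by a real fire (hit rate p_hot_if_fire) or by a
    false alarm on clear ground (p_hot_if_clear).
    """
    joint_fire = p_fire * p_hot_if_fire             # P(fire, hotspot)
    joint_clear = (1 - p_fire) * p_hot_if_clear     # P(no fire, hotspot)
    return joint_fire / (joint_fire + joint_clear)

posterior = p_fire_given_hotspot()
```

Even with a 90% hit rate, a rare event plus a modest false-alarm rate leaves the posterior around 15%, which is why an alerting agent typically fuses several observations before raising an alarm.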
|
19 |
Speech recognition and blackboard expert systems. Loureiro, Guy Marchand. January 1992 (has links)
Spoken language is used by people to communicate naturally with one another. A simplistic
view of the communication process is as follows. Person A wishes to communicate an idea
to person B. The idea, initiated in the mind/brain of person A, is encoded into speech
signals by person A's speech production mechanism, the vocal apparatus in
the vocal tract. Various kinds of noise may interfere with the speech signals as they travel
to person B. The resulting signal is captured by person B's speech receiving mechanism,
the ear. It is then analysed and decoded into a meaningful message by the brain of
person B.
This thesis concerns itself with the investigation of and attempt to automate the receiving
and decoding of English sentences using a machine, that is, to perform the task of
person B in the above scenario using a computer. The aim is not only to produce a
sequence of phonetic sounds, but to look at the problems of building in the 'mind of the
machine', a picture of the meanings, intentions, absurdities and realities of the spoken
message.
The various models, algorithms and techniques of speech recognition and speech
understanding systems are examined. Speech signals are captured and digitised by
hardware. The digital samples are analysed and the important distinguishing features of all
speech sounds are identified. These are then used to classify speech sounds in subsequent
spoken words. The way speech sounds are joined together to form syllables and words
introduces difficult problems to the automatic recognition process. Speech sounds are
blurred, overlapped or left out due to the effects of coarticulation. Finally, natural language
processing issues, such as the importance of syntax (the structure) and semantics (the
meaning) of sentences, are studied.
A system to control and unite all the above processing is considered. The blackboard expert
system model of the widely reported HEARSAY-II speech recognition system is reviewed
as the system with the best potential for the above tasks. / Thesis (M.Sc.)-University of KwaZulu-Natal, 1992.
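The HEARSAY-II control model reviewed above can be sketched as a shared blackboard with independent knowledge sources. The levels, knowledge sources and vocabulary below are invented for illustration, not HEARSAY-II's actual components:

```python
class Blackboard:
    """A toy HEARSAY-style blackboard (a sketch of the control model only).

    Independent knowledge sources watch a shared data structure divided
    into levels and post hypotheses at higher levels when they recognise
    something at their input level; a simple scheduler keeps firing
    sources until nobody contributes anything new.
    """
    def __init__(self):
        self.levels = {"signal": [], "phoneme": [], "word": []}

    def run(self, sources):
        changed = True
        while changed:
            changed = False
            for ks in sources:
                before = sum(len(v) for v in self.levels.values())
                ks(self.levels)                 # may post new hypotheses
                if sum(len(v) for v in self.levels.values()) != before:
                    changed = True
        return self.levels

# Hypothetical knowledge sources: a phoneme classifier and a lexical matcher.
def phoneme_ks(levels):
    table = {"s1": "k", "s2": "ae", "s3": "t"}  # invented acoustic labels
    for seg in levels["signal"]:
        if table[seg] not in levels["phoneme"]:
            levels["phoneme"].append(table[seg])

def word_ks(levels):
    if levels["phoneme"] == ["k", "ae", "t"] and "cat" not in levels["word"]:
        levels["word"].append("cat")

bb = Blackboard()
bb.levels["signal"] = ["s1", "s2", "s3"]
result = bb.run([phoneme_ks, word_ks])
```

The appeal of the architecture for speech is visible even in this toy: acoustic, lexical and (in a fuller system) syntactic and semantic sources cooperate without calling one another directly.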
|
20 |
The development of a swarm intelligent simulation tool for sugarcane transport logistics systems. McDonald, Brendon Clyde. 14 November 2013 (has links)
Transport logistics systems typically evolve as networks over time, which may result
in system rigidity and cause changes to become expensive and time consuming. In
this study a logistics model, named TranSwarm, was developed to simulate sugarcane
harvesting, transport and mill-yard activities for a mill supply area. The aim was to
simulate produce flow, and allow individual working entities to make decisions,
driven by rules and protocols, based on their micro-environments. Noodsberg mill
was selected as a case study because of its currently low level of synchronization. Growers
were assumed to operate independent harvesting and transport systems, causing
inconsistent convergences at the mill. This diverse and fragmented system provided a
suitable environment to construct a model that would consider interactions between
individual growers and their respective transport systems. Ideally, by assessing the
micro-decisions of individuals and how they influence the larger holistic supply chain,
TranSwarm quantifies the impacts of different types of transport practices, such as
staggering shift changes, transport scheduling, core sampling and consortium-based
logistics. TranSwarm is visual, mechanistic and represents key entities, such as roads,
farm groupings and the mill. The system uses discrete events to create a dynamic and
stochastic environment from which observations and conclusions can be drawn. This
approach potentially allows stakeholders to identify key components and interactions
that may jeopardize overall efficiency and to use the system to test new working
protocols and logistics rules for improving the supply chain. / Thesis (M.Sc.)-University of KwaZulu-Natal, 2008.
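The discrete-event core of a model like TranSwarm can be sketched with an event queue: vehicles arrive as events and queue when the shared mill resource is busy. Entity names and times below are illustrative, not Noodsberg's data or TranSwarm's implementation:

```python
import heapq

def simulate_deliveries(travel_times, offload_time=0.5):
    """Discrete-event sketch of grower vehicles converging on one mill.

    Each grower despatches a vehicle at time zero with its own travel
    time; the mill offloads one vehicle at a time, so unsynchronised
    arrivals queue.  Returns each vehicle's waiting time at the mill.
    """
    events = [(t, i) for i, t in enumerate(travel_times)]  # arrival events
    heapq.heapify(events)
    mill_free_at = 0.0
    waits = {}
    while events:
        arrival, i = heapq.heappop(events)       # earliest arrival first
        start = max(arrival, mill_free_at)       # queue if the mill is busy
        waits[i] = start - arrival
        mill_free_at = start + offload_time
    return waits

# Two vehicles arriving 0.1 h apart clash; the third arrives after the
# mill has cleared and does not wait.
waits = simulate_deliveries([1.0, 1.1, 3.0])
```

Staggering despatch times or scheduling transport, the interventions the abstract names, would show up here directly as changed `travel_times` and reduced waiting.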
|