41

A feasibility study of combining expert system technology and linear programming techniques in dietetics / Annette van der Merwe

Van der Merwe, Annette January 2014 (has links)
Linear programming is widely used to solve complex problems with many variables, subject to multiple constraints. Expert systems provide expertise on complex problems by applying inference procedures and advanced expert knowledge to facts relevant to the problem. The diet problem is well known for its contribution to the development of linear programming. Over the years many variations and facets of the diet problem have been solved by means of linear programming techniques and expert systems respectively. In this study the feasibility of combining expert system technology and linear programming techniques to solve a diet problem topical to South Africa is examined. A computer application is created that incorporates goal programming and multi-objective linear programming models as the inference engine of an expert system. The program is successfully applied to test cases obtained through knowledge acquisition. The system delivers an eating plan for an individual that conforms to the nutritional requirements of a healthy diet, reflects that individual's personal food preferences, and favours the food items that result in the lowest total cost. It further allows the food-preference and least-cost factors to be prioritized through the use of weights. Based on the results, recommendations and contributions to the linear programming and expert system fields are presented. / MSc (Computer Science), North-West University, Potchefstroom Campus, 2014
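
For illustration, the weighted trade-off between cost and preference described above can be sketched as a small linear program. Everything below (food items, nutrient values, preference scores and weights) is invented for the sketch and is not the thesis's actual goal programming or multi-objective model:

    # A minimal weighted diet LP: minimise a blend of total cost and (negated)
    # preference score, subject to nutrient minimums. All data is hypothetical.
    from scipy.optimize import linprog

    foods = ["maize meal", "beans", "milk"]
    cost = [2.0, 3.5, 4.0]      # cost per serving (invented)
    pref = [0.2, 0.9, 0.6]      # preference score per serving (invented)
    energy = [350, 340, 150]    # kJ per serving (invented)
    protein = [8, 21, 8]        # g per serving (invented)

    w_cost, w_pref = 0.7, 0.3   # priority weights for the two objectives

    # Weighted single objective: w_cost * cost - w_pref * preference.
    c = [w_cost * ci - w_pref * pi for ci, pi in zip(cost, pref)]

    # Nutrient minimums (energy >= 8000 kJ, protein >= 55 g), written as
    # -A x <= -b because linprog only accepts <= constraints.
    A_ub = [[-e for e in energy], [-p for p in protein]]
    b_ub = [-8000, -55]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 15)] * 3, method="highs")
    assert res.success
    for name, servings in zip(foods, res.x):
        print(f"{name}: {servings:.1f} servings")

A goal programming variant would instead minimise weighted deviations from nutrient and cost targets, with the weights attached to those deviations rather than to the raw objectives.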
42

DFM – Weldability analysis and system development

Pabolu, Venkata Krishna Rao January 2015 (has links)
This thesis work focuses on the processes involved in manufacturing aircraft engine components, in particular welding and welding methods. Background on welding, and support for the thesis, was provided by GKN Aerospace Sweden AB, a global aerospace product supplier. The basic objective of this thesis work is to improve the usability of an automation system developed for evaluating the weldability of a part; the long-run maintainability of this automation system has also been considered. The thesis work addresses the problems arising during the use of a computerised automated system, such as process transparency, recognisability and traceability of details, as well as maintenance aspects such as maintainability and upgradability of the system over time. An action research methodology has been used to address these problems, and different approaches have been tried to find solutions to them. A rule-based manufacturability analysis system was developed to analyse the weldability of a component with respect to different welding techniques. The software "Howtomation" has been used to improve the transparency of this analysis system. User recognisability and traceability of details have been taken into account during the use of the rule-based analysis system. System attributes such as maintainability, upgradability and adaptability to modern welding methods have been addressed, and the system's suitability for large-scale analysis has been considered.
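
The rule-based analysis described can be pictured as a set of data-driven rules over part attributes; keeping each rule as a named, self-describing object is one way to get the traceability and maintainability the thesis is concerned with. The rules, attributes and thresholds below are invented placeholders, not GKN's actual weldability criteria:

    # A toy rule-based weldability check: each rule is a named object that
    # inspects part attributes, so every finding is traceable to a rule id
    # and new rules can be added without touching the analysis loop.
    # All attributes and thresholds are hypothetical.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Rule:
        rule_id: str                      # stable id, for traceability
        description: str
        violated: Callable[[dict], bool]  # True when the part breaks the rule

    RULES = [
        Rule("W-01", "sheet too thick for laser welding",
             lambda p: p["thickness_mm"] > 6.0),
        Rule("W-02", "dissimilar materials at the joint",
             lambda p: p["material_a"] != p["material_b"]),
        Rule("W-03", "joint not reachable by the torch",
             lambda p: not p["torch_access"]),
    ]

    def analyse(part: dict) -> list[str]:
        """Return id and description of every rule the part violates."""
        return [f"{r.rule_id}: {r.description}" for r in RULES if r.violated(part)]

    part = {"thickness_mm": 8.2, "material_a": "Ti-6Al-4V",
            "material_b": "Ti-6Al-4V", "torch_access": True}
    print(analyse(part))  # -> ['W-01: sheet too thick for laser welding']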
43

Ontology learning for Semantic Web Services

Alfaries, Auhood January 2010 (has links)
The expansion of Semantic Web Services is restricted by traditional ontology engineering methods. Manual ontology development is a time-consuming, expensive and resource-exhaustive task. Consequently, it is important to support ontology engineers by automating the ontology acquisition process to help deliver the Semantic Web vision. Existing Web Services offer a rich source of domain knowledge for ontology engineers. Ontology learning can be seen as a plug-in in the Web Service ontology development process, which ontology engineers can use to develop and maintain an ontology that evolves with current Web Services. Supporting the domain engineer with an automated tool while building an ontological domain model reduces the time and effort needed to acquire the domain concepts and relations from Web Service artefacts, and effectively speeds up the adoption of Semantic Web Services, thereby allowing current Web Services to reach their full potential. With that in mind, a Service Ontology Learning Framework (SOLF) is developed and applied to a real set of Web Services. The research contributes a rigorous method that effectively extracts domain concepts, and relations between these concepts, from Web Services and automatically builds the domain ontology. The method applies pattern-based information extraction techniques to automatically learn domain concepts and the relations between them. The framework is automated through a tool that implements these techniques. Applying SOLF and the tool to different sets of services yields an automatically built domain ontology model that represents semantic knowledge in the underlying domain. The framework's effectiveness in extracting domain concepts and relations is evaluated by applying it to varying sets of commercial Web Services, including services from the financial domain. The standard evaluation metrics, precision and recall, are employed to determine both the accuracy and the coverage of the learned ontology models. Both the lexical and the structural dimensions of the models are evaluated thoroughly. The evaluation results are encouraging, providing concrete outcomes in an area that has seen little research.
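
A common pattern-based trick in service ontology learning is to exploit the verb-noun convention of operation names. The sketch below, using made-up operation names, shows that idea in miniature; it is not SOLF's actual pattern set:

    # A toy pattern-based extractor: split camelCase operation names into a
    # leading verb (candidate relation) and a trailing noun phrase (candidate
    # domain concept). The operation names are invented examples.
    import re

    def split_camel(name: str) -> list[str]:
        return re.findall(r"[A-Z]?[a-z]+|[A-Z]+(?![a-z])", name)

    operations = ["getStockQuote", "cancelTradeOrder", "listAccountTransactions"]

    concepts, relations = set(), set()
    for op in operations:
        tokens = split_camel(op)
        relations.add(tokens[0].lower())    # leading verb -> relation
        concepts.add(" ".join(tokens[1:]))  # remaining tokens -> concept

    print("concepts:", concepts)    # e.g. {'Stock Quote', 'Trade Order', ...}
    print("relations:", relations)  # e.g. {'get', 'cancel', 'list'}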
44

A Timescale Estimating Model for Rule-Based Systems

Moseley, Charles Warren 12 1900 (has links)
The purpose of this study was to explore the subject of timescale estimating for rule-based systems. A model for estimating the timescale necessary to build rule-based systems was built and then tested in a controlled environment.
45

Multimedia and live performance

Willcock, Ian January 2012 (has links)
The use of interactive multimedia within live performance is now well established and a significant body of exciting and sophisticated work has been produced. However, almost all work in the field seems to start by creating at least some of the software and hardware systems that will provide the infrastructure for the project, an approach which might involve significant duplication of effort. The research described in this thesis sets out to discover if there are common features in the practice of artists from a range of performance backgrounds and, if so, whether the features of a system which might support these common aspects could be established. Based on evidence from a set of interviews, it is shown that there are indeed common factors in work in this field, especially the intensive linking of elements in performances and the use of triggering or cuing. A statement of requirements for a generic system to support work in digital performance is then established based on interview analysis and personal creative work. A general model of live performance, based on set theory, is described which provides a rationale for the integration of digital technology within live performance. A computational model outlining the formal requirements of a general system for use in live performance is then presented. The thesis then describes the creation of a domain specific language specifically for controlling live performance and the development of a prototype reference implementation of a generic system, the Live Interactive Multimedia Performance Toolkit (LIMPT). The system is then evaluated from a number of standpoints including a set of criteria established earlier in the study. It is concluded that, while there are many resources currently used by artists working in digital performance (a comprehensive survey of current resources is presented), none offer the combination of functionality, usability and scalability offered by the prototype LIMPT system. The thesis concludes with a discussion of possible future work and the potential for increased creative activity in multimedia and live performance.
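
The "intensive linking of elements" and "triggering or cuing" identified in the interviews can be pictured as a mapping from named cues to the actions bound to them. The dispatcher below is a hypothetical illustration of that pattern only, not LIMPT's actual DSL or implementation:

    # A toy cue-trigger dispatcher: actions register against named cues, and
    # firing a cue runs every action linked to it. Cue names and actions are
    # invented for illustration.
    from collections import defaultdict

    cue_bindings = defaultdict(list)

    def on_cue(name):
        """Decorator linking an action to a named cue."""
        def register(action):
            cue_bindings[name].append(action)
            return action
        return register

    @on_cue("scene2.start")
    def fade_house_lights():
        print("lights: fade to 20% over 3s")

    @on_cue("scene2.start")
    def start_video_loop():
        print("video: play loop 'waves.mov'")

    def fire(name):
        for action in cue_bindings[name]:
            action()

    fire("scene2.start")  # runs both linked actions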
46

Developing a data quality scorecard that measures data quality in a data warehouse

Grillo, Aderibigbe January 2018 (has links)
The main purpose of this thesis is to develop a data quality scorecard (DQS) that aligns the data quality needs of the data warehouse (DW) stakeholder group with selected data quality dimensions. To comprehend the research domain, a general and a systematic literature review (SLR) were carried out, after which the research scope was established. Using Design Science Research (DSR) as the methodology to structure the research, three iterations were carried out to achieve the research aim highlighted in this thesis. In the first iteration, the artefact was built from the results of the general and systematic literature reviews: a data quality scorecard (DQS) was conceptualised, with the SLR results and the recommendations for designing an effective scorecard providing the input for its development. Using the System Usability Scale (SUS) to validate the usability of the DQS, the results of the first iteration suggest that the DW stakeholders found the DQS useful. The second iteration evaluated the DQS further through a run-through in the FMCG domain followed by semi-structured interviews. Thematic analysis of the semi-structured interviews demonstrated that the stakeholder participants found the DQS to be transparent, a useful additional reporting tool, well integrated, easy to use and consistent, and found that it increases confidence in the data. However, the timeliness data dimension was found to be redundant, necessitating a modification to the DQS. The third iteration followed similar steps to the second, but with the modified DQS in the oil and gas domain. The results from the third iteration suggest that the DQS is a useful tool that is easy to use on a daily basis. The research contributes to theory by demonstrating a novel approach to DQS design, achieved by ensuring that the design of the DQS aligns with the data quality concern areas of the DW stakeholders and the data quality dimensions. Further, this research lays a good foundation for the future by establishing a DQS model that can be used as a base for further development.
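
For reference, the System Usability Scale mentioned above is scored from ten 1-5 Likert items: odd-numbered (positively worded) items contribute (response - 1), even-numbered (negatively worded) items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. The responses in the sketch below are invented; the scoring formula itself is the standard SUS procedure:

    # Standard SUS scoring: ten 1-5 Likert responses map to a 0-100 score.
    # Odd-numbered items are positively worded, even-numbered negatively.
    def sus_score(responses):
        assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
        total = sum(r - 1 if i % 2 == 0 else 5 - r    # i is 0-based, so an
                    for i, r in enumerate(responses)) # even i is an odd item
        return total * 2.5

    print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # invented responses -> 85.0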
47

Optimizing Cost and Data Entry for Assignment of Patients to Clinical Trials Using Analytical and Probabilistic Web-Based Agents

Goswami, Bhavesh Dineshbhai 05 November 2003 (has links)
A clinical trial is a study conducted on a group of patients to determine the effect of a treatment. Assignment of patients to clinical trials is a data- and labor-intensive task. Usually, medical personnel manually check the eligibility of a patient for a clinical trial based on the patient's medical history and current medical condition. According to studies, most clinical trials are under-enrolled, which negatively affects their effectiveness. We have developed web-based agents that can test the eligibility of patients for many clinical trials at once, and we have tested various heuristics for optimizing the cost and data entry needed in assigning patients to clinical trials. Testing the eligibility of a patient for many clinical trials is only feasible if it is efficient in both cost and data entry. Agents with different heuristics were tested on data from current breast cancer patients at the Moffitt Cancer Center. Results with the different heuristics are compared with each other and with those of the clinicians. It is shown that cost savings are possible in clinical trial assignment, and that less data entry is needed when probabilistic agents are used to reorder questions.
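
One simple heuristic in the spirit of the probabilistic agents described is to ask first the eligibility question most likely to disqualify the patient, so that ineligible patients are screened out after few answers. The questions and probabilities below are invented, and this is only a sketch of the general idea, not necessarily one of the thesis's own heuristics:

    # A greedy screening order: ask the question most likely to disqualify
    # the patient first, so ineligible patients usually exit after few
    # answers. Questions and probabilities are hypothetical.
    questions = [
        ("age within protocol range?",  0.10),  # (question, P(answer disqualifies))
        ("prior chemotherapy?",         0.45),
        ("adequate organ function?",    0.20),
    ]

    for question, p_fail in sorted(questions, key=lambda q: q[1], reverse=True):
        print(f"ask: {question}  (P(disqualify) = {p_fail})")
        # a real agent would stop here on a disqualifying answer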
48

An exploration of student performance, utilization, and attitude to the use of a controlled content sequencing web based learning environment.

Brown, Justin, j.brown@ecu.edu.au January 2005 (has links)
49

The intelligent placement of vegetation objects in 3D worlds

Jiang, Li January 2009 (has links)
In complex environments, increasing demand for exploring natural resources by both decision makers and the public is driving the search for sustainable planning initiatives. Among these is the use of virtual environments to support effective communication and informed decision-making. Central to the use of virtual environments is their development at low cost and with high realism.

This paper explores intelligent approaches to object placement, orientation and scaling in virtual environments such that the process is both accurate and cost-effective. The work involves: (1) determining the key rules to be applied for the classification of vegetation objects and the ways to build an object library according to ecological classes; (2) exploring rules for the placement of vegetation objects based on vegetation behaviours and the growth potential values collected for the research area; (3) developing GIS algorithms that implement these rules; and (4) integrating the GIS algorithms into the existing SIEVE Direct software in such a way that the rules find expression in the virtual environment.

This project is an extension of an integrated research project, SIEVE (Spatial Information Exploration and Visualization Environment), which converts 2D GIS data into 3D models used for visualization. The aims of this contribution are to develop rules for the classification and intelligent placement of objects, to build a normative object database for rural objects, and to output these as 2D billboards or 3D models using the developed intelligent placement algorithms.

Based on Visual Basic and ArcObjects tools (ESRI ArcGIS and a game engine), the outcomes of the intelligent placement process for vegetation objects are shown in the SIEVE environment with 2D images and 3D models. These GIS algorithms were tested in the integrated research project. As the Victorian case study shows, rule-based intelligent placement rests on the idea that certain decision-making processes can be codified into rules which, if followed automatically, yield results similar to those that occur in the natural environment. The final product produces virtual reality (VR) scenes similar to natural landscapes. Given the 2D images and 3D models represented in the SIEVE scenario and the rules (for natural and plantation vegetation) developed in conjunction with scientists in the Victorian Department of Primary Industries (DPI) and other agencies, the outcomes will contribute to the development of policies for better land and resource management and link to wide-ranging vegetation assessment projects.
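
The rule-based placement idea, placing a vegetation class in a cell only where its growth conditions hold, can be sketched over a toy raster. The classes, thresholds and grid values below are invented stand-ins for the DPI-derived rules the thesis used:

    # A toy rule-based placer: for each raster cell, place the first
    # vegetation class whose growth-potential rule accepts the cell.
    # Grid values, classes and thresholds are hypothetical.
    growth_potential = [
        [0.9, 0.4, 0.1],
        [0.7, 0.8, 0.3],
    ]

    rules = [
        ("eucalypt",  lambda g: g >= 0.7),  # needs high growth potential
        ("shrubland", lambda g: g >= 0.3),  # tolerates moderate potential
    ]

    def place(grid):
        placements = []
        for y, row in enumerate(grid):
            for x, g in enumerate(row):
                for species, suits in rules:
                    if suits(g):
                        placements.append((x, y, species))
                        break  # first matching class wins
        return placements

    for x, y, species in place(growth_potential):
        print(f"cell ({x},{y}): {species}")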
50

"'Jag tycker såhär och då är det såhär.' Det är inte så." : En kvalitativ intervjustudie om lärares interaktionsstrategier i bemötandet av rasistiska, avvikande och kontroversiella uppfattningar

Ädel, Rebecca January 2012 (has links)
The purpose of this study is to investigate, through a qualitative interview study, how six secondary school students describe interaction patterns among teachers when students express themselves in a way that the teacher perceives as racist and/or xenophobic. The interviews were analysed through two opposing models of values education: the traditional model and the constructive model. The results show that the students sort teachers into different categories based on five qualities: 1) they listen, 2) they accept students' opinions, 3) they allow discussion, 4) they are knowledgeable, and 5) they can express their own opinions. These qualities belong to the values-education model of deliberative conversation, whose interaction strategies aim to create an understanding of different rules and values, thereby building an understanding of rules through a democratic approach. In contrast to this model stands rule-based moral education, in which the teacher uses his or her authority and refers to rules without explaining why they exist.
