111

Data mining and database systems : integrating conceptual clustering with a relational database management system

Lepinioti, Konstantina January 2011 (has links)
Many clustering algorithms have been developed and improved over the years to cater for large-scale data clustering. However, much of this work has been in developing numeric-based algorithms that use efficient summarisations to scale to large data sets. There is a growing need for scalable categorical clustering algorithms as, although numeric-based algorithms can be adapted to categorical data, they do not always produce good results. This thesis presents a categorical conceptual clustering algorithm that can scale to large data sets using appropriate data summarisations. Data mining is distinguished from machine learning by the use of larger data sets that are often stored in database management systems (DBMSs). Many clustering algorithms require data to be extracted from the DBMS and reformatted for input to the algorithm. This thesis presents an approach that integrates conceptual clustering with a DBMS. The presented approach makes the algorithm main-memory independent and supports on-line data mining.
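The integration described above keeps the clustering algorithm independent of main memory by letting the DBMS serve the data. The sketch below is a minimal, hypothetical illustration of that general idea only: categorical rows are streamed from a (here, SQLite) table in batches and assigned incrementally to clusters using simple value-frequency matching. The table name, column names and threshold are invented, and the frequency heuristic stands in for the thesis's actual conceptual-clustering criterion and DBMS integration.

```python
import sqlite3
from collections import defaultdict

def incremental_categorical_clustering(db_path, table, columns,
                                       new_cluster_threshold=0.3,
                                       batch_size=1000):
    """Cluster categorical tuples read in batches from a relational DBMS.

    A minimal sketch only: rows are streamed through a cursor so the
    algorithm never needs the whole table in main memory. Table and
    column names are placeholders.
    """
    clusters = []  # each cluster: {"size": int, "counts": {col: {value: count}}}

    def score(row, cluster):
        # Fraction of attribute values matching the cluster's modal value.
        hits = 0
        for col, value in zip(columns, row):
            col_counts = cluster["counts"][col]
            if col_counts and value == max(col_counts, key=col_counts.get):
                hits += 1
        return hits / len(columns)

    def add(row, cluster):
        cluster["size"] += 1
        for col, value in zip(columns, row):
            cluster["counts"][col][value] += 1

    conn = sqlite3.connect(db_path)
    cur = conn.execute(f"SELECT {', '.join(columns)} FROM {table}")
    while True:
        batch = cur.fetchmany(batch_size)
        if not batch:
            break
        for row in batch:
            best = max(clusters, key=lambda c: score(row, c), default=None)
            if best is None or score(row, best) < new_cluster_threshold:
                best = {"size": 0,
                        "counts": defaultdict(lambda: defaultdict(int))}
                clusters.append(best)
            add(row, best)
    conn.close()
    return clusters
```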
112

Embedding requirements within the model driven architecture

Fouad, A. January 2011 (has links)
The Model Driven Architecture (MDA) is offered as one way forward in software systems modelling to connect software design with the business domain. The general focus of the MDA is the development of software systems by performing transformations between software design models, and the automatic generation of application code from those models. Software systems are provided by developers, whose experience and models are not always in line with those of other stakeholders, which presents a challenge for the community. From reviewing the available literature, it is found that whilst many models and notations are available, those that are significantly supported by the MDA may not be best suited for use by non-technical stakeholders. In addition, the MDA does not explicitly consider requirements and specification. This research begins by investigating the adequacy of the MDA requirements phase and examining the feasibility of incorporating a requirements definition, specifically focusing upon model transformations. MDA artefacts were found to serve the software community better than the business user, and requirements were not appropriately integrated within the MDA; significant upstream extension is required to accommodate the business user in terms of a requirements definition. Therefore, an extension to the MDA framework is offered that directly addresses Requirements Engineering (RE), including the distinction of analysis from design and highlighting the importance of specification. This extension is suggested to further the utility of the MDA by making it accessible to a wider audience upstream, enabling specification to be a direct output from business user involvement in the requirements phase of the MDA. To demonstrate applicability, this research illustrates the framework extension with the provision of a method and discusses the use of the approach in both academic and commercial settings. The results suggest that such an extension is academically viable in facilitating the move from analysis into the design of software systems, accessible for business use, and beneficial in industry by allowing for the involvement of the client in producing models sufficient for use in the development of software systems using MDA tools and techniques.
113

A society of mind approach to cognition and metacognition in a cognitive architecture

Venkatamuni, Vijayakumar Maragal January 2008 (has links)
This thesis investigates the concept of mind as a control system using the "Society of Agents" metaphor. "Society of Agents" describes the collective behaviours of simple and intelligent agents. "Society of Mind" is more than a collection of task-oriented and deliberative agents; it is a powerful concept for mind research and can benefit from the use of metacognition. The aim is to develop a self-configurable computational model using the concept of metacognition. A six-tiered SMCA (Society of Mind Cognitive Architecture) control model is designed that relies on a society of agents operating using metrics associated with the principles of artificial economics in animal cognition. This research investigates the concept of metacognition as a powerful catalyst for control, unification and self-reflection. Metacognition is used on BDI models with respect to planning, reasoning, decision making, self-reflection, problem solving, learning and the general process of cognition to improve performance. One perspective on how to develop metacognition in an SMCA model is based on the differentiation between metacognitive strategies and metacomponents or metacognitive aids. Metacognitive strategies denote activities such as metacomprehension (remedial action), metamanagement (self-management) and schema training (meaningful learning over cognitive structures). Metacomponents are aids for the representation of thoughts. To develop an efficient, intelligent and optimal agent through the use of metacognition requires the design of a multi-layered control model spanning simple to complex levels of agent action and behaviour. The SMCA model has been designed and implemented with six layers: reflexive, reactive, deliberative (BDI), learning (Q-learner), metacontrol and metacognition.
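As a loose illustration of a layered control model of this kind, the sketch below wires a few toy layers into a single control cycle with a metacognitive monitor that credits whichever layer produced the action. All class names, layer behaviours and the credit scheme are invented for illustration; this is not the SMCA design itself.

```python
class Layer:
    name = "base"
    def propose(self, percept):
        """Return an action proposal, or None to defer to a higher layer."""
        return None

class ReflexiveLayer(Layer):
    name = "reflexive"
    def propose(self, percept):
        return "withdraw" if percept.get("pain") else None

class ReactiveLayer(Layer):
    name = "reactive"
    def propose(self, percept):
        return "approach-food" if percept.get("food_visible") else None

class DeliberativeLayer(Layer):
    name = "deliberative (BDI)"
    def propose(self, percept):
        return "plan-route-to-food"  # stand-in for belief/desire/intention reasoning

class MetacognitionMonitor:
    """Tracks which layers produce useful actions so priorities could be adjusted."""
    def __init__(self, layers):
        self.fitness = {layer.name: 0.0 for layer in layers}
    def credit(self, layer_name, reward):
        self.fitness[layer_name] += reward

def control_cycle(layers, monitor, percept, reward_signal=1.0):
    # Lower layers get first refusal; the first non-None proposal is executed.
    for layer in layers:
        action = layer.propose(percept)
        if action is not None:
            monitor.credit(layer.name, reward_signal)
            return layer.name, action
    return None, "idle"

layers = [ReflexiveLayer(), ReactiveLayer(), DeliberativeLayer()]
monitor = MetacognitionMonitor(layers)
print(control_cycle(layers, monitor, {"food_visible": True}))
# ('reactive', 'approach-food')
```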
114

Combinations of time series forecasts : when and why are they beneficial?

Lemke, Christiane January 2010 (has links)
Time series forecasting has a long track record in many application areas. In forecasting research, it has been shown that finding an individual algorithm that works best for all possible scenarios is hopeless. Therefore, instead of striving to design a single superior algorithm, current research efforts have shifted towards gaining a deeper understanding of the reasons a forecasting method may perform well in some conditions whilst it may fail in others. This thesis provides a number of contributions to this matter. Traditional empirical evaluations are discussed from a novel point of view, questioning the benefit of using sophisticated forecasting methods without domain knowledge. A new empirical study focusing on relevant off-the-shelf forecasting and forecast combination methods underlines the competitiveness of relatively simple methods in practical applications. Furthermore, meta-features of time series are extracted to automatically find and exploit a link between application-specific data characteristics and forecasting performance using meta-learning. Finally, the approach of extending the set of input forecasts by diversifying the functional approaches, parameter sets and data aggregation levels used for learning is discussed, relating characteristics of the resulting forecasts to different error decompositions for both individual methods and combinations. Advanced combination structures are investigated in order to take advantage of knowledge about the forecast generation processes. Forecasting is a crucial factor in airline revenue management; forecasting of anticipated booking, cancellation and no-show numbers has a direct impact on the general planning of routes and schedules, capacity control for fare classes and overbooking limits. In a collaboration with Lufthansa Systems in Berlin, experiments in the thesis are conducted on an airline data set with the objective of improving the current net booking forecast by modifying one of its components, the cancellation forecast. To compare the results achieved by the methods investigated here with the current state of the art in forecasting research, some experiments also use data sets from two recent forecasting competitions, thus providing a link between academic research and industrial practice.
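A simple forecast combination of the kind used as a baseline in this literature can be sketched as follows. The example series and method names are invented, and this is only mean/median pooling, not one of the advanced combination structures investigated in the thesis.

```python
import numpy as np

def combine_forecasts(forecasts, method="mean"):
    """Combine individual forecasts into one.

    `forecasts` has shape (n_methods, horizon). A simple average or median
    is an illustrative baseline only.
    """
    forecasts = np.asarray(forecasts, dtype=float)
    if method == "mean":
        return forecasts.mean(axis=0)
    if method == "median":
        return np.median(forecasts, axis=0)
    raise ValueError(f"unknown combination method: {method}")

# Hypothetical example: three methods forecasting a 4-step horizon.
naive    = [102.0, 102.0, 102.0, 102.0]
smoothed = [101.5, 101.8, 102.1, 102.4]
arima    = [103.0, 103.5, 104.0, 104.5]
print(combine_forecasts([naive, smoothed, arima]))            # simple average
print(combine_forecasts([naive, smoothed, arima], "median"))  # median pooling
```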
115

The needs of the software industry and the content of undergraduate education in Ireland : a survey of the views of practitioners, managers and academics

Byrne, Declan Jerome January 1996 (has links)
No description available.
116

Computational logic : structure sharing and proof of program properties

Moore, J. Strother January 1973 (has links)
This thesis describes the results of two studies in computational logic. The first concerns a very efficient method of implementing resolution theorem provers. The second concerns a non-resolution program which automatically proves many theorems about LISP functions, using structural induction. In Part 1, a method of representing clauses, called 'structure sharing', is presented. In this representation, terms are instantiated by binding their variables on a stack, or in a dictionary, and derived clauses are represented in terms of their parents. This allows the structure representing a clause to be used in different contexts without renaming its variables or copying it in any way. The amount of space required for a clause is (2 + n) 36-bit words, where n is the number of components in the unifying substitution made for the resolution or factor. This is independent of the number of literals in the clause and the depth of function nesting. Several ways of making the unification algorithm more efficient are presented. These include a method of preprocessing the input terms so that the unifying substitution for derived terms can be discovered by a recursive look-up procedure. Techniques for naturally mixing computation and deduction are presented. The structure-sharing implementation of SL-resolution is described in detail. The relationship between structure sharing and programming language implementations is discussed. Part 1 concludes with the presentation of a programming language, based on predicate calculus, with structure sharing as the natural implementation. Part 2 of this thesis describes a program which automatically proves a wide variety of theorems about functions written in a subset of pure LISP. Features of this program include the following. The program is fully automatic, requiring no information from the user except the LISP definitions of the functions involved and the statement of the theorem to be proved. No inductive assertions are required of the user. The program uses structural induction when required, automatically generating its own induction formulas. All relationships in the theorem are expressed in terms of user-defined LISP functions, rather than a second logical language. The system employs no built-in information about any non-primitive function; all properties required of any function involved in a proof are derived and established automatically. The program is capable of generalizing some theorems in order to prove them, and in doing so it often generates interesting lemmas. The program can write new, recursive LISP functions automatically in attempting to generalize a theorem. Finally, the program is very fast by theorem-proving standards, requiring around 10 seconds per proof.
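A rough sketch of the structure-sharing storage idea, translated into modern terms: a derived clause is stored only as references to its parents plus the unifying substitution, so its stored size does not depend on how many literals it contains, and its literals are materialised on demand by walking the parents. The class names and string-based term handling are invented for illustration, and the toy does not perform real resolution (the resolved-upon literals are not removed).

```python
from dataclasses import dataclass

@dataclass
class InputClause:
    literals: tuple           # e.g. ("P(x)", "~Q(x)")

@dataclass
class DerivedClause:
    parent_a: object          # InputClause or DerivedClause
    parent_b: object
    substitution: dict        # variable -> term bindings made by unification
    # storage is proportional to len(substitution), not to the literal count

def apply_bindings(literal, bindings):
    # Crude textual substitution, good enough for this toy.
    for var, term in bindings.items():
        literal = literal.replace(var, term)
    return literal

def expand(clause, bindings=None):
    """Materialise a clause's literals on demand by walking its parents."""
    bindings = dict(bindings or {})
    if isinstance(clause, InputClause):
        return [apply_bindings(lit, bindings) for lit in clause.literals]
    bindings.update(clause.substitution)
    return expand(clause.parent_a, bindings) + expand(clause.parent_b, bindings)

c1 = InputClause(("P(x)", "~Q(x)"))
c2 = InputClause(("Q(a)",))
resolvent = DerivedClause(c1, c2, {"x": "a"})   # shares structure with c1 and c2
print(expand(resolvent))   # ['P(a)', '~Q(a)', 'Q(a)'] (toy keeps the resolved literals)
```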
117

Relaxation and its role in vision

Hinton, Geoffrey E. January 1977 (has links)
It is argued that a visual system, especially one which handles imperfect data, needs a way of selecting the best consistent combination from among the many interrelated, locally plausible hypotheses about how parts or aspects of the visual input may be interpreted. A method is presented in which each hypothesis is given a supposition value between 0 and 1. A parallel relaxation operator, based on the plausibilities of hypotheses and the logical relations between them, is then used to modify the supposition values, and the process is repeated until the hypotheses in the best consistent set have supposition values of approximately 1 and the rest have values of approximately 0. The method is incorporated in a program which can interpret configurations of overlapping rectangles as puppets. For this task it is possible to formulate all the potentially relevant hypotheses before using relaxation to select the best consistent set. For more complex tasks, it is necessary to use relaxation on the locally plausible interpretations to guide the search for locally less obvious ones. Ways of doing this are discussed. Finally, an implemented system is presented which allows the user to specify schemas and inference rules, and uses relaxation to control the building of a network of instances of the schemas when presented with data about some instances and relations between them.
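The relaxation scheme can be sketched roughly as follows: every hypothesis carries a supposition value in [0, 1], and each iteration nudges that value towards its plausibility minus the support currently given to incompatible hypotheses. The update rule, constants and the puppet-flavoured hypothesis names are invented for illustration and are not the operator defined in the thesis.

```python
def relax(plausibility, incompatible, steps=100, rate=0.2):
    """plausibility: {hypothesis: prior score in [0, 1]}
    incompatible: list of (h1, h2) pairs that cannot both be accepted."""
    s = dict(plausibility)                       # initial supposition values
    for _ in range(steps):
        new_s = {}
        for h, value in s.items():
            # Support from the hypothesis's own plausibility...
            support = plausibility[h]
            # ...pushed down by every currently-accepted rival.
            conflict = sum(s[other]
                           for a, b in incompatible
                           for other in ((b,) if a == h else (a,) if b == h else ()))
            target = max(0.0, min(1.0, support - conflict))
            new_s[h] = value + rate * (target - value)
        s = new_s
    return s

# Toy example: two compatible limb hypotheses versus one that contradicts both.
hypotheses = {"arm": 0.8, "leg": 0.6, "arm_and_leg_overlap": 0.5}
conflicts = [("arm", "arm_and_leg_overlap"), ("leg", "arm_and_leg_overlap")]
print(relax(hypotheses, conflicts))   # 'arm' and 'leg' stay high, the rival decays to ~0
```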
118

Computer modelling of English grammar

Ritchie, Graeme D. January 1977 (has links)
Recent work in artificial intelligence has developed a number of techniques which are particularly appropriate for constructing a model of the process of understanding English sentences. These methods are used here in the definition of a framework for linguistic description, called "computational grammar". This framework is employed to explore the details of the operations involved in transforming an English sentence into a general semantic representation. Computational grammar includes both "syntactic" and "semantic" constructs, in order to clarify the interactions between all the various kinds of information, and treats the sentence-analysis process as having a semantic goal which may require syntactic means to achieve it. The sentence-analyser is based on the concept of an "augmented transition network grammar", modified to minimise unwanted top-down processing and unnecessary embedding. The analyser does not build a purely syntactic structure for a sentence, but the semantic rules operate hierarchically in a way which reflects the traditional tree structure. The processing operations are simplified by using temporary storage to postpone premature decisions or to conflate different options. The computational grammar framework has been applied to a few areas of English, including relative clauses, referring expressions, verb phrases and tense. A computer program ("MCHINE") has been written which implements the constructs of computational grammar and some of the linguistic descriptions of English. A number of sentences have been successfully processed by the program, which can carry on a simple dialogue as well as building semantic representations for isolated sentences.
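To give a concrete flavour of an augmented-transition-network style analyser, the toy below parses a noun phrase by walking a small network whose arcs consume words of a given category and fill registers. The lexicon, network, state names and register names are all invented; the real analyser additionally mixes in semantic constructs and avoids the purely syntactic bias this toy has.

```python
LEXICON = {"the": "DET", "a": "DET", "cat": "NOUN", "dog": "NOUN",
           "big": "ADJ", "small": "ADJ"}

# Each state maps a word category to (next_state, register_to_fill).
NP_NETWORK = {
    "NP/start": {"DET": ("NP/det", "determiner")},
    "NP/det":   {"ADJ": ("NP/det", "adjective"),   # loop: any number of adjectives
                 "NOUN": ("NP/done", "head")},
    "NP/done":  {},                                 # accepting state (implicit POP)
}

def parse_np(words):
    """Consume a noun phrase from the front of `words`; return (registers, rest)."""
    state, registers, i = "NP/start", {}, 0
    while i < len(words):
        category = LEXICON.get(words[i])
        arcs = NP_NETWORK[state]
        if category not in arcs:
            break
        state, register = arcs[category]
        registers.setdefault(register, []).append(words[i])
        i += 1
    return (registers, words[i:]) if state == "NP/done" else None

print(parse_np("the big small cat sleeps".split()))
# ({'determiner': ['the'], 'adjective': ['big', 'small'], 'head': ['cat']}, ['sleeps'])
```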
119

Computer perception of curved objects using a television camera

Turner, Kenneth J. January 1974 (has links)
Various techniques are described for the computer perception of curved objects (cups, mugs, toruses, etc.). Research has been conducted in the areas of image processing, object recognition, 3-D analysis, and scene analysis. A representation of images in terms of lines is argued to be superior to one in terms of regions. Line-finding is accomplished by an edge-follower which can track round curved boundaries. A new method of segmentation, coupled to an improved procedure for fitting conic sections, is used to obtain a line-drawing from the object boundaries. A topological description of the image is built up in which junctions are classified according to a comprehensive scheme applicable to pictures with both curved and straight lines. The performance of Barrow & Popplestone's program for recognising irregular objects is evaluated, and it is demonstrated that considerable improvements in speed may be obtained by letting gross features of an object, such as its outline, direct the matching. Objects may also be identified with a new form of Waltz's techniques, based on labelling constraints derived from topological, 3-D object models. It is shown how the hierarchical synthesis method for object recognition may be implemented so as to facilitate flexible interaction. Good tolerance of imperfections and rapidity of matching are achieved with this technique. A qualitative measure of the shape of object surfaces is obtained from examination of the intensity contours created by the shading of reflected light. 3-D information is also acquired by matching image descriptions with "procedural" models of a set of prototype objects. Scene analysis is performed by generalisations of the methods devised by Waltz. Description and partition of a scene is carried out using either models of specific objects or models of object classes. Procedures are explained for generating labels for the junctions caused by curved objects; the labels for a variety of types of surface interactions are tabulated in an appendix. A complete scene analysis system is described which integrates the research efforts in these areas. Its performance on simple scenes containing both curved and polyhedral objects is assessed. The thesis concludes with an appraisal of the results achieved, with particular regard to future lines of development.
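The conic-section fitting mentioned above can be illustrated with a standard algebraic least-squares fit; this generic procedure and the sample points are only an illustration, not the improved fitting procedure developed in the thesis.

```python
import numpy as np

def fit_conic(points):
    """Fit a*x^2 + b*xy + c*y^2 + d*x + e*y + f = 0 to 2-D points.

    Returns the coefficient vector (a, b, c, d, e, f), up to scale, as the
    smallest right singular vector of the design matrix.
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    design = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(design)
    return vt[-1]

# Points on the unit circle: expect a ≈ c, b ≈ 0, f ≈ -a.
theta = np.linspace(0, 2 * np.pi, 12, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
print(np.round(fit_conic(circle), 3))
```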
120

The formalisation of discourse production

Davey, Anthony January 1974 (has links)
This paper describes a computer program which produces English discourse. The program is capable of describing, in a sequence of English sentences, any game of noughts-and-crosses (tic-tac-toe), whether given or actually played with the program. The object is to understand something of what a speaker is doing when he speaks, and the program therefore demonstrates the operation of rules for selecting information into sentences, for connecting sentences into a discourse, and for constructing clauses, groups, and words to convey the required information with the maximum possible economy. The program uses a systemic functional grammar to co-operate with semantic procedures in producing English. The grammar generates only a limited range of English, but one which is nonetheless sufficient to illustrate the advantages, both theoretical and practical, of such a grammar for a productive system. Many other computer programs have accepted more or less natural English input, usually in the form of questions requiring an answer, but few have been designed to produce natural English, particularly connected discourse. As a producing system, the present model offers a view of language use from a viewpoint slightly different from that of its predecessors. However, comprehension and production are dependent on each other, so that the study of one may be expected to illuminate the other.
