  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
401

Single Function Agents and their Negotiation Behavior in Expert Systems

Dunskus, Bertram V. 05 November 1999 (has links)
"A Single Function Agent (SiFA) is a software agent with only one function, one point of view, and one target object on which to act. For example, an agent might be a critic (function) of material (target) from the point of view of cost. This research investigates the possibilities and implications of the SiFA concept, and analyzes the definition language, negotiation language, and negotiation strategies of the agents. After defining a domain-independent set of agent types, we investigated negotiation, analyzing which pairs/groups of agents have reason to communicate and what the information passed between them should be, as well as what knowledge is needed to support the negotiation. A library for the CLIPS expert system shell was built, which allows development of SiFA-based expert systems from domain-independent templates. We present two such systems, one implemented for the domain of ceramic component material selection and the other (in development) for simple sailboat design. The effect of negotiation on the design process and the results are discussed, as well as directions for future research into SiFAs."
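The SiFA idea described above can be sketched in a few lines. This is an illustrative interpretation only, not code from the thesis; the class and attribute names (`SiFA`, `CostCritic`, `budget`) are invented for the example. The point is that the agent is fully characterized by one function (critic), one target (material), and one point of view (cost), and acts only on that single concern.

```python
# Hypothetical sketch of a Single Function Agent: one function, one target,
# one point of view. All names here are illustrative, not from the thesis.

class SiFA:
    def __init__(self, function, target, point_of_view):
        self.function = function
        self.target = target
        self.point_of_view = point_of_view

class CostCritic(SiFA):
    """Critiques a candidate material purely from the cost point of view."""
    def __init__(self, budget):
        super().__init__("critic", "material", "cost")
        self.budget = budget

    def critique(self, material):
        # Object only when the single concern (cost) is violated;
        # all other properties of the material are outside this agent's view.
        if material["cost"] > self.budget:
            return f"{material['name']}: cost {material['cost']} exceeds budget {self.budget}"
        return None

critic = CostCritic(budget=100)
print(critic.critique({"name": "alumina", "cost": 120}))
print(critic.critique({"name": "zirconia", "cost": 80}))
```

A full SiFA system would compose many such narrow agents (critics, selectors, suggestors) over shared design targets, which is what makes their negotiation behavior interesting.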
402

Probing for a Continual Validation Prototype

Gill, Peter W. 26 August 2001 (has links)
"Continual Validation of distributed software systems can facilitate their development and evolution and engender user trust. We present a monitoring architecture that is being developed collaboratively under DARPA's Dynamic Assembly for System Adaptability, Dependability, and Assurance program. The monitoring system includes a probing infrastructure that is injected into or wrapped around a target software system. Probes deliver events of interest to be processed by a monitoring infrastructure that consists of gauges for delivering information to system administrators. This thesis presents a classification of existing probing technologies and contains a full implementation of a probing infrastructure in Java."
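The probe/gauge split described in this abstract can be illustrated with a small sketch. This is an assumed analogy, not the thesis's Java implementation: a "probe" is wrapped around a target function and emits events of interest, and a "gauge" consumes those events to produce information for an administrator. The names `probe`, `gauge`, and `transfer` are invented for the example.

```python
# Minimal sketch (assumed, not from the thesis) of probes wrapped around a
# target system delivering events to a gauge for monitoring.

import functools

events = []  # the event channel between probes and gauges

def probe(event_sink):
    """Wrap a target function so that each call is reported as an event."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            event_sink.append({"call": fn.__name__, "args": args, "result": result})
            return result
        return wrapper
    return decorator

@probe(events)
def transfer(amount):
    # Stand-in for some operation in the target software system.
    return amount * 2

transfer(21)

def gauge(sink):
    """A trivial gauge: summarize observed events for a system administrator."""
    return [f"{e['call']} -> {e['result']}" for e in sink]

print(gauge(events))
```

The decorator models the "injected into or wrapped around" phrasing: the target system's own code path is unchanged except for the event emission.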
403

Conflicts and Negotiations in Single Function Agent Based Design Systems

Berker, Ilan 04 November 1999 (has links)
"Design is a very complicated and ill-defined problem solving activity. Routine parametric design is a more restricted and well-defined version of design problems. Even this restricted version requires many different kinds of expert knowledge and the ability to perform a variety of tasks. One approach to solving this restricted version is to use Single Function Agents (SiFAs), each of which can perform a very specialized task from a single point of view. The ability to represent expertise with different points of view is very important in design. These different points of view usually cause conflicts among agents, and these conflicts need to be resolved in order for the design process to be successful. Therefore, agents need to be capable of detecting and resolving these conflicts. This thesis presents a model of conflicts and negotiations in the SiFA framework. Some extensions to the present state of the SiFA paradigm are introduced. A hierarchy of possible conflicts is proposed and the steps of the negotiation process are discussed. The ability of agents to negotiate in order to resolve conflicts makes SiFA-based design systems more versatile, less brittle, and easier to construct and maintain. Also, the extended SiFA paradigm, where agents have negotiation capabilities, leads to many interesting directions for further research."
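One concrete way to picture conflict detection and negotiation between agents with different points of view is the sketch below. It is a speculative illustration, not the thesis's model: two agents each accept a range of values for the same design parameter, a conflict is detected when the ranges do not intersect, and a simple (invented) negotiation strategy has each side relax its bounds until agreement is possible.

```python
# Hedged illustration (names and strategy invented): conflict detection and a
# simple concession-based negotiation over one shared design parameter.

def detect_conflict(range_a, range_b):
    """Return the conflicting ranges if they do not overlap, else None."""
    lo = max(range_a[0], range_b[0])
    hi = min(range_a[1], range_b[1])
    return None if lo <= hi else (tuple(range_a), tuple(range_b))

def negotiate(range_a, range_b, relax=0.1):
    """Each agent relaxes its bounds by a fraction until the ranges meet."""
    a, b = list(range_a), list(range_b)
    while detect_conflict(a, b):
        a[1] += relax * (a[1] - a[0])  # agent A concedes upward
        b[0] -= relax * (b[1] - b[0])  # agent B concedes downward
    return (max(a[0], b[0]), min(a[1], b[1]))

cost_view = (10, 20)      # e.g. a cost critic accepts 10-20
strength_view = (25, 40)  # e.g. a strength critic demands 25-40
print(detect_conflict(cost_view, strength_view))  # ranges disjoint: conflict
print(negotiate(cost_view, strength_view))        # agreed interval
```

Real negotiation strategies would differ per conflict type in the hierarchy the thesis proposes; this only shows why detection must precede resolution.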
404

Patterns for web services standards

Unknown Date (has links)
Web services intend to provide an application integration technology that can be successfully used over the Internet in a secure, interoperable and trusted manner. Policies are high-level guidelines defining the way an institution conducts its activities. The WS-Policy standard describes how to apply policies of security definition, enforcement of access control, authentication and logging. WS-Trust defines a security token service and a trust engine which are used by web services to authenticate other web services. Using the functions defined in WS-Trust, applications can engage in secure communication after establishing trust. BPEL is a language for web service composition that intends to provide convenient and effective means for application integration over the Internet. We address security considerations in BPEL and how to enforce them, as well as its interactions with other web services standards such as WS-Security and WS-Policy. / by Ola Ajaj. / Thesis (M.S.C.S.)--Florida Atlantic University, 2010. / Includes bibliography. / Electronic reproduction. Boca Raton, Fla., 2010. Mode of access: World Wide Web.
405

Development of an expert system for irrigation and fertilization management in the Pacific Northwest

Cao, Weidong 23 October 1992 (has links)
Recent advances in computer technology have made possible the development of expert systems. Expert systems are computer programs that perform at the level of a human expert. Expert systems can help integrate and apply diverse sources of information and expertise to problems of integrated crop management. A prototype Crop Management eXpert (CMX) system has been developed. The primary goal of CMX is to provide recommendations on optimal irrigation and fertilization scheduling for wheat production in the Pacific Northwest. This system can be used by farmers and/or extension agents. OUS II Shell has been used as an implementation tool. To build this rule-based expert system, a development strategy commonly used in the construction of expert systems, consisting of 1) identification, 2) conceptualization, 3) formalization, 4) implementation, and 5) testing, was applied. CMX is composed of modules for irrigation and fertilization management. For irrigation management, CMX mainly addresses irrigation scheduling, which is the major part of irrigation management. Irrigation strategies have been applied in irrigation decision making. For each strategy, timing criteria, which generally consist of management allowed depletion, soil water potential, leaf water potential, and water stress indices, have been used. The system provides farmers with irrigation scenarios that determine when and how much water to apply. CMX represents an integration of conventional computing and expert systems technology designed to provide expert recommendations enabling farmers to obtain the best return on their water and fertilizer investment. For fertilization management, a variety of variables have been taken into consideration. Crop growth stages, soil moisture, nutrient analysis, protein requirement, and application methods are important factors for fertilizer decision making. Several constraints have been used in generating optimal fertilizer advice.
CMX can focus only on relevant information, thus reducing the problem space to a manageable size and significantly improving the efficiency of the system. The facility of the expert system to explain the decision-making process enables users to better understand the underlying assumptions, facts, and reasoning used to generate recommendations. The CMX prototype demonstrates the feasibility of employing expert systems technology in agricultural applications. CMX has been validated and evaluated. The survey results showed that this prototype was successful in capturing domain experts' knowledge as rules and providing advice on irrigation and fertilization management for wheat. / Graduation date: 1993
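One of the timing criteria the abstract names, management allowed depletion (MAD), can be sketched as a single rule. This is an assumed illustration of the general MAD concept, not CMX's actual rule base: irrigate when soil water depletion exceeds an allowed fraction of total available water, and apply enough to erase the current deficit.

```python
# Illustrative sketch (assumptions, not CMX's rules) of a management allowed
# depletion (MAD) irrigation trigger: when depletion passes the allowed
# fraction of total available water, refill the profile by the deficit.

def irrigation_advice(total_available_mm, depleted_mm, mad_fraction=0.5):
    """Return (irrigate?, depth to apply in mm) under a MAD strategy."""
    if depleted_mm >= mad_fraction * total_available_mm:
        return True, depleted_mm  # apply the current deficit
    return False, 0.0

print(irrigation_advice(total_available_mm=120, depleted_mm=70))  # trigger
print(irrigation_advice(total_available_mm=120, depleted_mm=40))  # wait
```

In a rule-based shell such as the one the abstract describes, each timing criterion (MAD, soil water potential, leaf water potential, stress indices) would be a separate rule contributing to the same "when and how much" recommendation.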
406

Case-based expert system using wavelet packet transform and kernel-based feature manipulation for engine spark ignition diagnosis / Case-based expert system using WPT and kernel-based feature manipulation for engine spark ignition diagnosis

Huang, He January 2010 (has links)
University of Macau / Faculty of Science and Technology / Department of Computer and Information Science
407

Knowledge guided processing of magnetic resonance images of the brain [electronic resource] / by Matthew C. Clark.

Clark, Matthew C. January 2001 (has links)
Includes vita. / Title from PDF of title page. / Document formatted into pages; contains 222 pages. / Includes bibliographical references. / Text (Electronic thesis) in PDF format. / ABSTRACT: This dissertation presents a knowledge-guided expert system that is capable of applying routines for multispectral analysis, (un)supervised clustering, and basic image processing to automatically detect and segment brain tissue abnormalities, and then label glioblastoma-multiforme brain tumors in magnetic resonance volumes of the human brain. The magnetic resonance images used here consist of three feature images (T1-weighted, proton density, T2-weighted) and the system is designed to be independent of a particular scanning protocol. Separate, but contiguous 2D slices in the transaxial plane form a brain volume. This allows complete tumor volumes to be measured and if repeat scans are taken over time, the system may be used to monitor tumor response to past treatments and aid in the planning of future treatment. Furthermore, once processing begins, the system is completely unsupervised, thus avoiding the problems of human variability found in supervised segmentation efforts. Each slice is initially segmented by an unsupervised fuzzy c-means algorithm. The segmented image, along with its respective cluster centers, is then analyzed by a rule-based expert system which iteratively locates tissues of interest based on the hierarchy of cluster centers in feature space. Model-based recognition techniques analyze tissues of interest by searching for expected characteristics and comparing those found with previously defined qualitative models. Normal/abnormal classification is performed through a default reasoning method: if a significant model deviation is found, the slice is considered abnormal. Otherwise, the slice is considered normal.
Tumor segmentation in abnormal slices begins with multispectral histogram analysis and thresholding to separate suspected tumor from the rest of the intra-cranial region. The tumor is then refined with a variant of seed growing, followed by spatial component analysis and a final thresholding step to remove non-tumor pixels. The knowledge used in this system was extracted from general principles of magnetic resonance imaging, the distributions of individual voxels and cluster centers in feature space, and anatomical information. Knowledge is used both for single slice processing and information propagation between slices. A standard rule-based expert system shell (CLIPS) was modified to include the multispectral analysis, clustering, and image processing tools. A total of sixty-three volume data sets from eight patients and seventeen volunteers (four with and thirteen without gadolinium enhancement) were acquired from a single magnetic resonance imaging system with slightly varying scanning protocols and were available for processing. All volumes were processed for normal/abnormal classification. Tumor segmentation was performed on the abnormal slices and the results were compared with a radiologist-labeled 'ground truth' tumor volume and tumor segmentations created by applying supervised k-nearest neighbors, a partially supervised variant of the fuzzy c-means clustering algorithm, and a commercially available seed growing package. The results of the developed automatic system generally correspond well to ground truth, both on a per slice basis and more importantly in tracking total tumor volume during treatment over time. / System requirements: World Wide Web browser and PDF reader. / Mode of access: World Wide Web.
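The unsupervised fuzzy c-means step that starts the pipeline can be sketched compactly. This is a generic 1-D version of the standard FCM algorithm for illustration only; the dissertation's system clusters multispectral (T1, PD, T2) feature vectors per voxel, not scalars, and its initialization and stopping criteria are not shown here.

```python
# Compact, assumed sketch of fuzzy c-means on 1-D intensities; the real
# system clusters 3-D multispectral feature vectors per voxel.

def fcm_1d(data, c=2, m=2.0, iters=50):
    """Fuzzy c-means on scalars: returns sorted cluster centers."""
    centers = [min(data), max(data)][:c]  # crude initialization for c=2
    for _ in range(iters):
        # Membership u[i][k] of point i in cluster k (fuzzifier m).
        u = []
        for x in data:
            row = []
            for ck in centers:
                d = abs(x - ck) or 1e-9
                row.append(1.0 / sum((d / (abs(x - cj) or 1e-9)) ** (2 / (m - 1))
                                     for cj in centers))
            u.append(row)
        # Update each center as the membership-weighted mean of the data.
        centers = [sum((u[i][k] ** m) * data[i] for i in range(len(data))) /
                   sum(u[i][k] ** m for i in range(len(data)))
                   for k in range(len(centers))]
    return sorted(centers)

intensities = [10, 11, 12, 50, 51, 52]
print(fcm_1d(intensities))  # two centers, one near each intensity group
```

The rule-based stage described in the abstract then reasons over the hierarchy of such cluster centers in feature space rather than over raw voxels.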
408

Business and the grid : economic and transparent utilization of virtual resources /

Weishäupl, Thomas. January 2006 (has links)
Univ., Diss.--Wien, 2006.
409

Conceptual design methodology of distributed intelligence large scale systems

Nairouz, Bassem R. 20 September 2013 (has links)
Distributed intelligence systems are starting to gain dominance in the field of large-scale complex systems. These systems are characterized by nonlinear behavior patterns that can only be predicted through simulation-based engineering. In addition, the autonomy, intelligence, and reconfiguration capabilities required by certain systems introduce obstacles adding another layer of complexity. However, there exists no standard process for the design of such systems. This research presents a design methodology focusing on distributed control architectures while concurrently considering the systems design process. The methodology has two major components. First, it introduces a hybrid design process, based on the infusion of the control architecture and conceptual system design processes. The second component is the development of a control architecture metamodel, placing a distinction between control configuration and control methods. This enables a standard representation of a wide spectrum of control architecture frameworks.
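The metamodel distinction the abstract draws can be pictured with a small data-structure sketch. This is speculative and all names are invented: the control *configuration* captures which controllers exist and how they connect, while the control *method* captures the algorithm each controller runs, so the two can vary independently.

```python
# Speculative sketch (invented names) of separating control configuration
# (controllers and their links) from control methods (per-controller algorithms).

from dataclasses import dataclass, field

@dataclass
class ControlMethod:
    name: str  # e.g. "PID", "rule-based", "consensus"

@dataclass
class Controller:
    node: str
    method: ControlMethod

@dataclass
class ControlConfiguration:
    controllers: list = field(default_factory=list)
    links: list = field(default_factory=list)  # (from_node, to_node) pairs

    def add(self, controller):
        self.controllers.append(controller)

config = ControlConfiguration()
config.add(Controller("plant-1", ControlMethod("PID")))
config.add(Controller("supervisor", ControlMethod("rule-based")))
config.links.append(("supervisor", "plant-1"))
print(len(config.controllers), config.links)
```

Because the method lives inside each controller while the topology lives in the configuration, the same configuration can be re-instantiated with different methods, which is one way a metamodel can span many control architecture frameworks.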
410

Case-driven collaborative classification

Vazey, Megan Margaret January 2007 (has links)
Thesis (PhD) -- Macquarie University, Division of Information and Communication Sciences, Department of Computing, 2007. / "Submitted January 27 2007, revised July 27 2007". / Bibliography: p. 281-304. / Mode of access: World Wide Web. / xiv, 487 p., bound ill. (some col.)
