321

Learning by Augmenting Rules and Accumulating Censors

Winston, Patrick H. 01 May 1982
This paper is a synthesis of several sets of ideas: ideas about learning from precedents and exercises, ideas about learning using near misses, ideas about generalizing if-then rules, and ideas about using censors to prevent procedure misapplication. The synthesis enables two extensions to an implemented system that solves problems involving precedents and exercises and that generates if-then rules as a byproduct. These extensions are as follows: If-then rules are augmented by unless conditions, creating augmented if-then rules. An augmented if-then rule is blocked whenever facts in hand directly demonstrate the truth of an unless condition; a rule that serves to block another rule in this way is called a censor. Like ordinary augmented if-then rules, censors can be learned. Definition rules are introduced that facilitate graceful refinement. The definition rules are also augmented if-then rules. They work by virtue of unless entries that capture certain nuances of meaning different from those expressible by necessary conditions. Like ordinary augmented if-then rules, definition rules can be learned. The strength of the ideas is illustrated by way of representative experiments. All of these experiments have been performed with an implemented system.
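The central mechanism described above, an if-then rule that is blocked when an unless condition is directly demonstrated, can be sketched in a few lines. The class name, the predicate strings, and the bird/penguin example below are hypothetical illustrations, not taken from the paper's implemented system.

```python
from dataclasses import dataclass, field

@dataclass
class AugmentedRule:
    """An if-then rule extended with unless conditions."""
    conditions: list                              # antecedent facts that must all hold
    conclusion: str                               # fact asserted when the rule fires
    unless: list = field(default_factory=list)    # blocking (censor) conditions

    def fires(self, facts: set) -> bool:
        # The rule applies only if every condition is satisfied and no unless
        # condition is directly demonstrated by the facts in hand.
        if not all(c in facts for c in self.conditions):
            return False
        return not any(u in facts for u in self.unless)

# Hypothetical example: "if it is a bird then it flies, unless it is a penguin".
rule = AugmentedRule(conditions=["bird"], conclusion="flies", unless=["penguin"])
print(rule.fires({"bird"}))              # True
print(rule.fires({"bird", "penguin"}))   # False: the unless condition blocks the rule
```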
322

The Delta Tree: An Object-Centered Approach to Image-Based Rendering

Dally, William J., McMillan, Leonard, Bishop, Gary, Fuchs, Henry 02 May 1997
This paper introduces the delta tree, a data structure that represents an object using a set of reference images. It also describes an algorithm for generating arbitrary re-projections of an object by traversing its delta tree. Delta trees are an efficient representation in terms of both storage and rendering performance. Each node of a delta tree stores an image taken from a point on a sampling sphere that encloses the object. Each image is compressed by discarding pixels that can be reconstructed by warping its ancestors' images to the node's viewpoint. The partial image stored at each node is divided into blocks and represented in the frequency domain. The rendering process generates an image at an arbitrary viewpoint by traversing the delta tree from a root node to one or more of its leaves. A subdivision algorithm selects only the required blocks from the nodes along the path. For each block, only the frequency components necessary to reconstruct the final image at an appropriate sampling density are used. This frequency selection mechanism handles both antialiasing and level-of-detail within a single framework. A complex scene is initially rendered by compositing images generated by traversing the delta trees of its components. Once the reference views of a scene have been rendered in this manner, the entire scene can be reprojected to an arbitrary viewpoint by traversing its own delta tree. Our approach is limited to generating views of an object from outside the object's convex hull. In practice we work around this problem by subdividing objects to render views from within the convex hull.
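A rough sketch of the node structure and root-to-leaf traversal described above. The field names, the angular-distance test, and the block contents are assumptions for illustration; the paper's actual subdivision criterion and frequency-component selection are not reproduced here.

```python
import math
from dataclasses import dataclass, field
from typing import List

@dataclass
class DeltaTreeNode:
    viewpoint: tuple                                   # direction on the sampling sphere
    blocks: dict                                       # block id -> stored frequency coefficients
    children: List["DeltaTreeNode"] = field(default_factory=list)

def angular_distance(a, b):
    """Angle between two (roughly unit) view directions."""
    dot = sum(x * y for x, y in zip(a, b))
    return math.acos(max(-1.0, min(1.0, dot)))

def collect_blocks(node, target_view, max_angle=0.5, selected=None):
    """Traverse from the root toward leaves whose viewpoints lie near the target
    view, gathering the partial-image blocks stored along the path. The angular
    test stands in for the paper's subdivision algorithm."""
    if selected is None:
        selected = []
    selected.extend((node.viewpoint, block_id) for block_id in node.blocks)
    for child in node.children:
        if angular_distance(child.viewpoint, target_view) < max_angle:
            collect_blocks(child, target_view, max_angle, selected)
    return selected

root = DeltaTreeNode((0.0, 0.0, 1.0), {0: "coarse coefficients"},
                     children=[DeltaTreeNode((0.0, 0.2, 0.98), {1: "detail coefficients"})])
print(collect_blocks(root, target_view=(0.0, 0.1, 0.995)))
```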
323

Edge and Mean Based Image Compression

Desai, Ujjaval Y., Mizuki, Marcelo M., Masaki, Ichiro, Horn, Berthold K.P. 01 November 1996
In this paper, we present a static image compression algorithm for very low bit rate applications. The algorithm reduces spatial redundancy present in images by extracting and encoding edge and mean information. Since the human visual system is highly sensitive to edges, an edge-based compression scheme can produce intelligible images at high compression ratios. We present good quality results for facial as well as textured, 256 x 256 color images at 0.1 to 0.3 bpp. The algorithm described in this paper was designed for high performance, keeping hardware implementation issues in mind. In the next phase of the project, which is currently underway, this algorithm will be implemented in hardware, and new edge-based color image sequence compression algorithms will be developed to achieve compression ratios of over 100, i.e., less than 0.12 bpp from 12 bpp. Potential applications include low power, portable video telephones.
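A toy sketch of the two ingredients named above, per-block means plus an edge map. The gradient-threshold edge detector, block size, and threshold below are placeholders for illustration, not the encoder described in the paper.

```python
import numpy as np

def edge_and_mean_encode(img, block=8, edge_thresh=30.0):
    """Illustrative stand-in for an edge-and-mean encoder: per-block mean
    intensities capture the smooth component, and a gradient-magnitude edge
    map captures the perceptually important edges."""
    h, w = img.shape
    means = img.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    gy, gx = np.gradient(img.astype(float))
    edges = np.hypot(gx, gy) > edge_thresh
    return means, edges

# Usage on a synthetic 256 x 256 grayscale ramp image:
img = np.tile(np.linspace(0, 255, 256), (256, 1))
means, edges = edge_and_mean_encode(img)
print(means.shape, edges.mean())   # (32, 32) block means and the fraction of edge pixels
```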
324

Recognizing 3D Object Using Photometric Invariant

Nagao, Kenji, Grimson, Eric 22 April 1995
In this paper we describe a new efficient algorithm for recognizing 3D objects by combining photometric and geometric invariants. Some photometric properties are derived that are invariant to changes of illumination and to relative object motion with respect to the camera and/or the lighting source in 3D space. We argue that conventional color constancy algorithms cannot be used in the recognition of 3D objects. Further, we show that recognition does not require full constancy of colors; rather, it only needs something that remains unchanged under the varying light conditions and poses of the objects. Combining the derived color invariants and the spatial constraints on the object surfaces, we identify corresponding positions in the model and the data space coordinates, using centroid invariance of corresponding groups of feature positions. Tests are given to show the stability and efficiency of our approach to 3D object recognition.
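As a generic illustration of the kind of quantity the abstract relies on, something that stays fixed while the illumination changes, the sketch below checks that ratios of neighbouring pixel values are unchanged under a uniform scaling of the light. This is not the specific invariant derived in the paper, only a simple example of the same idea.

```python
import numpy as np

def neighbor_color_ratios(img):
    """Ratios of RGB values between horizontally adjacent pixels; these are
    unchanged when every pixel is scaled by the same illumination factor."""
    return img[:, 1:, :] / img[:, :-1, :]

img = np.random.rand(4, 4, 3) + 0.1          # synthetic color patch, bounded away from zero
assert np.allclose(neighbor_color_ratios(img), neighbor_color_ratios(2.0 * img))
```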
325

Direct Object Recognition Using No Higher Than Second or Third Order Statistics of the Image

Nagao, Kenji, Horn, Berthold 01 December 1995
Novel algorithms for object recognition are described that directly recover the transformations relating the image to its model. Unlike methods fitting the typical conventional framework, these new methods do not require exhaustive search for each feature correspondence in order to solve for the transformation. Yet they allow simultaneous object identification and recovery of the transformation. Given hypothesized potentially corresponding regions in the model and data (2D views) --- which are from planar surfaces of the 3D objects --- these methods allow direct computation of the parameters of the transformation by which the data may be generated from the model. We propose two algorithms: one based on invariants derived from no higher than second and third order moments of the image, the other via a combination of the affine properties of geometrical and differential attributes of the image. Empirical results on natural images demonstrate the effectiveness of the proposed algorithms. A sensitivity analysis of the algorithm is presented. We demonstrate in particular that the differential method is quite stable against perturbations --- although not without some error --- when compared with conventional methods. We also demonstrate mathematically that even a single point correspondence suffices, theoretically at least, to recover affine parameters via the differential method.
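The moment-based route described above rests on a standard fact: if a data region is an affine image of a model region, their second-order central moments are related through that same affine map, with no point-to-point correspondences required. The sketch below verifies this relation on synthetic 2D points; it illustrates the constraint, not the paper's full recovery algorithm.

```python
import numpy as np

def central_second_moments(points):
    """2x2 matrix of second-order central moments of a 2D point set."""
    centered = points - points.mean(axis=0)
    return centered.T @ centered / len(points)

# If data = A * model + t for an affine map (A, t), then
# cov(data) = A @ cov(model) @ A.T, which constrains A directly.
rng = np.random.default_rng(0)
model = rng.random((500, 2))
A = np.array([[1.2, 0.3], [-0.4, 0.9]])
data = model @ A.T + np.array([5.0, -2.0])

assert np.allclose(central_second_moments(data),
                   A @ central_second_moments(model) @ A.T)
```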
326

Steps towards an empirically responsible AI: a methodological and theoretical framework

Svedberg, Peter O.S. January 2004
Initially we pursue a minimal model of a cognitive system. This in turn forms the basis for the development of a methodological and theoretical framework. Two methodological requirements of the model are that explanation be from the perspective of the phenomena, and that we have structural determination. The minimal model is derived from the explanatory side of a biologically based cognitive science. Francisco Varela is our principal source for this part. The model defines the relationship between a formally defined autonomous system and an environment, in such a way as to generate the world of the system, its actual environment. The minimal model is a modular explanation in that we find it on different levels in bio-cognitive systems, from the cell to small social groups. For the latter, and for the role played by artefactual systems, we bring in Edwin Hutchins' observational study of a cognitive system in action. This necessitates the introduction of a complementary form of explanation. A key aspect of Hutchins' findings is the social domain as environment for humans. Aspects of human cognitive abilities usually attributed to the person are more properly attributed to the social system, including artefactual systems. Developing the methodological and theoretical framework means making a transition from the bio-cognitive to the computational. The two complementary forms of explanation are important for the ability to develop a methodology that supports the construction of actual systems. This has to be able to handle the transition from external determination of a system in design to internal determination (autonomy) in operation. Once developed, the combined framework is evaluated in an application area. This is done by comparing the standard conception of the Semantic Web with how this notion looks from the perspective of the framework. This includes the development of the methodological framework as a metalevel external knowledge representation. A key difference between the two approaches is the directness by which the semantics is approached. Our perspective puts the focus on interaction and the structural regularities this engenders in the external representation, regularities which in turn form the basis for machine processing. In this regard we see the relationship between representation and inference as analogous to the relationship between environment and system. Accordingly we have the social domain as environment for artefactual agents. For human level cognitive abilities the social domain as environment is important. We argue that a reasonable shortcut to systems we can relate to, about that very domain, is for artefactual agents to have an external representation of the social domain as environment.
327

Complexity of probabilistic inference in belief nets--an experimental study

Li, Zhaoyu 16 November 1990
There are three families of exact methods used for probabilistic inference in belief nets. It is necessary to compare them, to analyze the advantages and disadvantages of each algorithm, and to know the time cost of making inferences in a given belief network. This paper discusses the factors that influence the computation time of each algorithm, presents a predictive model of the time complexity of each algorithm, and shows the statistical results of testing the algorithms with randomly generated belief networks. / Graduation date: 1991
328

Decomposition and Symmetry in Constraint Optimization Problems

Kitching, Matthew 14 November 2011
This thesis presents several techniques that advance search-based algorithms for solving Constraint Optimization Problems (COPs). These techniques exploit structural features common in such problems. In particular, the thesis presents a number of innovative algorithms, and associated data structures, designed to exploit decomposition and symmetry in COPs. First, a new technique called component templating is introduced. Component templates are data structures for efficiently representing the disjoint sub-problems that are encountered during search. Information about each disjoint sub-problem can then be reused during search, increasing efficiency. A new algorithm called OR-decomposition is introduced. This algorithm obtains many of the computational benefits of decomposition without the need to resort to separate recursions. That is, the algorithm explores a standard OR tree rather than an AND-OR tree. In this way, the search algorithm gains greater freedom in its variable ordering compared to previous decomposition algorithms. Although decomposition algorithms such as OR-decomposition are effective techniques for solving COPs with low tree-width, existing decomposition algorithms offer little advantage over branch and bound search on problems with high tree-width. A new method for exploiting decomposition on problems with high tree-width is presented. This technique involves detecting and exploiting decompositions on a selected subset of the problem’s objectives. Such decompositions can then be used to more efficiently compute additional bounds that can be used by branch and bound search. The second half of the thesis explores the use of symmetries in COPs. Using component templates, it is possible to exploit dynamic symmetries that appear during search when some of the variables of a problem have been assigned a value. Symmetries have not previously been combined with decomposition in COPs. An algorithm called Set Branching is presented, which exploits almost-symmetries in the values of a variable by clustering similar values together, then branching on sets of values rather than on each single value. The decomposition and symmetry algorithms presented in this thesis increase the efficiency of constraint optimization solvers. The thesis also presents experimental results that test these algorithms on a variety of real world problems, and demonstrate performance improvements over current state-of-the-art techniques.
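The decomposition that the thesis exploits rests on a simple structural fact: once some variables are assigned, the remaining constraint graph may fall apart into disjoint components that can be solved independently. The sketch below shows only that detection step; the component-template bookkeeping, OR-decomposition, and set branching of the thesis are not reproduced.

```python
from collections import defaultdict

def disjoint_components(variables, constraints, assigned):
    """Connected components of the constraint graph restricted to unassigned
    variables; each component is an independent sub-problem (illustrative sketch)."""
    graph = defaultdict(set)
    for scope in constraints:                     # each constraint's variable scope
        live = [v for v in scope if v not in assigned]
        for a in live:
            for b in live:
                if a != b:
                    graph[a].add(b)
    components, seen = [], set()
    for v in variables:
        if v in assigned or v in seen:
            continue
        stack, comp = [v], set()
        while stack:                              # depth-first flood fill
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(graph[u] - comp)
        seen |= comp
        components.append(comp)
    return components

# Assigning x2 splits {x1, x2, x3, x4} into independent parts {x1} and {x3, x4}.
print(disjoint_components(["x1", "x2", "x3", "x4"],
                          [("x1", "x2"), ("x2", "x3"), ("x3", "x4")],
                          assigned={"x2"}))
```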
330

Agent-based management of clinical guidelines

Isern Alarcón, David 05 February 2009
Clinical practice guidelines (CGs) contain a set of actions and data that help a physician make decisions about the diagnosis, treatment or any other procedure for a patient with a particular disease. It is known that adopting these guidelines in daily practice can improve the medical care given to patients, because care practices become standardised. Computerised systems that use CGs can form part of more complex decision-support systems whose aim is to provide the right knowledge to the right person, in the right format and at the right moment. Automating the execution of CGs is the first step towards their adoption in medical centres. To reach this final adoption, several steps must be addressed, such as the acquisition and representation of CGs, their formal verification, and finally their execution. This thesis focuses on the execution of CGs and proposes the implementation of a multi-agent system in which the different actors of a medical centre coordinate their activities following a global plan determined by a CG. One of the main problems of any system working in the medical domain is the handling of knowledge; in this case both medical and organisational terms had to be dealt with, which was solved by implementing several ontologies. The separation of knowledge representation from its use is intentional and allows the CG execution system to be easily adapted to the specific circumstances of each centre, where the available staff and resources vary. In parallel with the execution of CGs, the proposed system handles patient preferences in order to provide patient-adapted services. In this area, a) a set of criteria has been defined, b) this information forms part of the user's profile and is used to rank the proposals the system makes, and c) an unsupervised learning algorithm adapts the patient's preferences according to his or her choices. Finally, some ideas of this thesis are currently being applied in two research projects: on the one hand, the distributed execution of CGs, and on the other, the representation of medical and organisational knowledge using ontologies. / Clinical guidelines (CGs) contain a set of directions or principles to assist the health care practitioner with patient care decisions about appropriate diagnostic, therapeutic, or other clinical procedures for specific clinical circumstances. It is widely accepted that the adoption of guideline-execution engines in daily practice would improve patient care by standardising the care procedures. Guideline-based systems can constitute part of a knowledge-based decision support system in order to deliver the right knowledge to the right people in the right form at the right time. The automation of the guideline execution process is a basic step towards its widespread use in medical centres. To achieve this general goal, different topics should be tackled, such as the acquisition of clinical guidelines, their formal verification, and finally their execution. This dissertation focuses on the execution of CGs and proposes the implementation of an agent-based platform in which the actors involved in health care coordinate their activities to perform the complex task of guideline enactment.
The management of medical and organizational knowledge, and the formal representation of the CGs, are two knowledge-related topics addressed in this dissertation and tackled through the design of several application ontologies. The separation of the knowledge from its use is fully intentional, and allows the CG execution engine to be easily customisable to different medical centres with varying personnel and resources. In parallel with the execution of CGs, the system handles citizens' preferences and uses them to implement patient-centred services. With respect to this issue, the following tasks have been developed: a) definition of the user's criteria, b) use of the patient's profile to rank the alternatives presented to him, c) implementation of an unsupervised learning method to dynamically and automatically adapt the user's profile. Finally, several ideas of this dissertation are being directly applied in two ongoing funded research projects, including the agent-based execution of CGs and the ontological management of medical and organizational knowledge.
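As an illustration of the preference-based ranking mentioned above, the sketch below scores alternatives with a weighted sum over user criteria. The criteria names, weights, and data layout are hypothetical and do not reproduce the thesis's actual algorithm.

```python
def rank_alternatives(alternatives, profile):
    """Rank care alternatives by a weighted sum of the user's criteria
    (hypothetical illustration of profile-based ranking)."""
    def score(alt):
        return sum(profile.get(criterion, 0.0) * value
                   for criterion, value in alt["criteria"].items())
    return sorted(alternatives, key=score, reverse=True)

profile = {"proximity": 0.7, "waiting_time": 0.3}          # assumed user criteria and weights
alternatives = [
    {"name": "Hospital A", "criteria": {"proximity": 0.9, "waiting_time": 0.2}},
    {"name": "Hospital B", "criteria": {"proximity": 0.4, "waiting_time": 0.9}},
]
print([a["name"] for a in rank_alternatives(alternatives, profile)])   # Hospital A first
```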
