1 
Conditioning graphs: practical structures for inference in Bayesian networks
Grant, Kevin John, 16 January 2007
Probability is a useful tool for reasoning when faced with uncertainty. Bayesian networks offer a compact representation of a probabilistic problem, exploiting independence amongst variables to allow a factorization of the joint probability into much smaller local probability distributions.

The standard approach to probabilistic inference in Bayesian networks is to compile the graph into a jointree and perform computation over this secondary structure. While jointrees are among the most time-efficient methods of inference in Bayesian networks, they are not always appropriate for certain applications. The memory requirements of a jointree can be prohibitively large, and the algorithms for computing over jointrees are large and involved, making them difficult to port to other systems or to understand without Bayesian network expertise.

This thesis proposes a different method for probabilistic inference in Bayesian networks. We present a data structure called a conditioning graph, which is a runtime representation of Bayesian network inference. The structure mitigates many of the problems of jointree inference. For example, conditioning graphs require much less space to store and compute over, and the algorithm for calculating probabilities from a conditioning graph is small and simple, making it portable to virtually any architecture. The details of Bayesian network inference are compiled away during the construction of the conditioning graph, leaving an intuitive structure that is easy to understand and implement without any Bayesian network expertise.

In addition to the conditioning graph architecture, we present several improvements to the model that maintain its small and simple style while reducing the runtime required for computing over it. We present two heuristics for choosing variable orderings that result in shallower elimination trees, reducing the overall complexity of computing over conditioning graphs. We also demonstrate several compile-time and runtime extensions to the algorithm that can produce substantial speedups while adding only a small space constant to the implementation. We further show how to cache intermediate values in conditioning graphs during probabilistic computation, which allows conditioning graphs to perform at the same speed as standard methods by avoiding duplicate computation, at the price of more memory. The caching methods conform to the basic style of the original algorithm, and we demonstrate a novel technique for reducing the amount of memory required for caching.

We demonstrate empirically the compactness, portability, and ease of use of conditioning graphs. We also show that these optimizations allow competitive behaviour with standard methods in many circumstances, while still preserving the small and simple style of the algorithm. Finally, we show that the memory required under caching can be quite modest, meaning that conditioning graphs can be competitive with standard methods in terms of time while using a fraction of the memory.
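The recursive evaluation and caching described above can be illustrated with a small sketch. This is not the thesis's actual conditioning-graph format: the `Node`/`Leaf` classes, the tiny two-variable network, and the cache keyed on the full evidence context are illustrative assumptions. It shows only the core idea that internal nodes condition on (sum out) a variable, leaves look up conditional-probability-table entries, and memoizing subtree results avoids duplicate computation.

```python
class Leaf:
    """Leaf node: looks up one CPT entry under the current assignment."""
    def __init__(self, cpt, scope):
        self.cpt = cpt      # maps assignment tuples to probabilities
        self.scope = scope  # ordered variable names indexing the CPT

    def query(self, context):
        return self.cpt[tuple(context[v] for v in self.scope)]


class Node:
    """Internal node: conditions on (sums out) its variable, multiplies
    the results of its children, and caches subtree results."""
    def __init__(self, var, domain, children):
        self.var, self.domain, self.children = var, domain, children
        self.cache = {}

    def query(self, context):
        key = tuple(sorted(context.items()))
        if key in self.cache:  # avoid duplicate computation
            return self.cache[key]
        # If the variable is observed, use its value; otherwise sum it out.
        values = [context[self.var]] if self.var in context else self.domain
        total = 0.0
        for val in values:
            ctx = {**context, self.var: val}
            prod = 1.0
            for child in self.children:
                prod *= child.query(ctx)
            total += prod
        self.cache[key] = total
        return total


# Network A -> B with binary variables: P(A) and P(B | A).
P_A = {(0,): 0.6, (1,): 0.4}
P_B_given_A = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}

root = Node('A', (0, 1), [
    Leaf(P_A, ('A',)),
    Node('B', (0, 1), [Leaf(P_B_given_A, ('A', 'B'))]),
])

# P(B=0) = P(a0)P(b0|a0) + P(a1)P(b0|a1) = 0.6*0.9 + 0.4*0.2 = 0.62
print(root.query({'B': 0}))
```

Caching here is keyed on the entire evidence context for simplicity; restricting the key to only the ancestor variables a subtree actually depends on is the kind of refinement that reduces the memory required for caching.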

3 
New Heuristics for Planning with Action Costs
Keyder, Emil Ragip, 17 December 2010
Classical planning is the problem of finding a sequence of actions that take an agent from an initial state to a desired goal situation, assuming deterministic outcomes for actions and perfect information. Satisficing planning seeks to quickly find low-cost solutions with no guarantees of optimality. The most effective approach to satisficing planning has proved to be heuristic search using non-admissible heuristics. In this thesis, we introduce several such heuristics that are able to take into account costs on actions, and that therefore try to minimize the more general metric of the cost, rather than the length, of plans, and we investigate their properties and performance. In addition, we show how the problem of planning with soft goals can be compiled into a classical planning problem with costs, a setting in which cost-sensitive heuristics such as those presented here are essential.
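The soft-goal compilation mentioned in the abstract can be sketched roughly as follows. This is a simplified, hypothetical rendering, not the thesis's exact transformation: the problem representation (STRIPS-style actions as (preconditions, add effects, delete effects, cost) tuples) and the action names are assumptions, and the full compilation also introduces an explicit end action to sequence the collect/forgo phase, which this sketch omits. The idea is that each soft goal becomes a hard goal that can either be collected for free when achieved, or forgone at a cost equal to its utility.

```python
def compile_soft_goals(problem):
    """Return an equivalent problem whose goals are all hard: each soft
    goal is either collected at zero cost (when it holds) or forgone at
    a cost equal to its utility."""
    actions = dict(problem['actions'])
    goal = set(problem['goal'])
    for atom, utility in problem['soft_goals'].items():
        done = f'done-{atom}'
        # collect: zero cost, applicable only when the soft goal holds
        actions[f'collect-{atom}'] = ({atom}, {done}, set(), 0)
        # forgo: always applicable, pays the forgone utility as its cost
        actions[f'forgo-{atom}'] = (set(), {done}, set(), utility)
        goal.add(done)  # the marker becomes a hard goal
    return {'actions': actions, 'init': set(problem['init']),
            'goal': goal, 'soft_goals': {}}


# Tiny example: one ordinary action and one soft goal worth utility 5.
tiny = {
    'actions': {'move': ({'at-a'}, {'at-b'}, {'at-a'}, 1)},
    'init': {'at-a'},
    'goal': set(),
    'soft_goals': {'at-b': 5},
}
compiled = compile_soft_goals(tiny)
print(sorted(compiled['goal']))
```

After the compilation, any cost-optimal plan implicitly trades achieved utility against plan cost, which is why cost-sensitive heuristics are essential in this setting.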
