Conceptual modeling architecture and implementation of object-oriented simulation for automated guided vehicle (AGV) systems. Wang, I-Chien, 09 June 1995.
Traditional simulation languages and simulators do not fully support the design, modification, and extension of simulation models of manufacturing systems, especially material handling systems. Because AGV systems, one type of automated material handling system, require complicated control logic, flexible job routings, and frequent layout modifications and extensions to meet production requirements, accomplishing these tasks with traditional paradigms demands significant time and effort. Such difficulties can be overcome by the use of object-oriented simulation.
This research develops an object-oriented modeling architecture for the simulation of AGV systems by extending Beaumariage's (1990) object-oriented modeling environment, which was originally designed for the simulation of job-shop-type manufacturing systems. For this extension, several classes required to model an AGV system are added to the original environment, including AGV, limited-size queue, control point, track segment, machine cell, and AGV system control classes. The architecture provides a flexible environment for modeling both traditional and tandem AGV system layouts. A best-first search approach, an artificial intelligence search algorithm, is employed so that AGVs can determine the shortest path among all possible travel paths. The computerized modeling system built on this conceptual architecture is easy to use, especially compared with traditional simulation tools. In addition, the extended object-oriented architecture is independent of any particular programming language and may be implemented in any object-oriented language.
The prototype system implemented as part of this research is written in Smalltalk/V. Two case examples are presented for verification and validation.
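The thesis's Smalltalk/V implementation is not reproduced in the abstract; as a minimal sketch of the best-first shortest-path idea over a guide-path graph (the control-point names, segment lengths, and graph encoding are assumptions for illustration), one might write:

```python
import heapq

def best_first_shortest_path(graph, start, goal):
    """Best-first search over a guide-path graph.

    `graph` maps each control point to a list of (neighbor, segment_length)
    pairs.  The frontier is ordered by accumulated travel distance, so the
    first time the goal is popped, the path found is a shortest one.
    """
    frontier = [(0.0, start, [start])]          # (distance so far, node, path)
    visited = set()
    while frontier:
        dist, node, path = heapq.heappop(frontier)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, length in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(frontier, (dist + length, neighbor, path + [neighbor]))
    return None  # goal unreachable from start

# Hypothetical layout fragment: control points P1..P4 joined by track segments.
track_graph = {
    "P1": [("P2", 30.0), ("P3", 50.0)],
    "P2": [("P4", 25.0)],
    "P3": [("P4", 10.0)],
    "P4": [],
}
print(best_first_shortest_path(track_graph, "P1", "P4"))   # (55.0, ['P1', 'P2', 'P4'])
```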

Extracting and Representing Qualitative Behaviors of Complex Systems in Phase Spaces. Zhao, Feng, 01 March 1991.
We develop a qualitative method for understanding and representing the phase-space structures of complex systems and demonstrate the method with a program, MAPS (Modeler and Analyzer for Phase Spaces), which uses deep domain knowledge from dynamical systems theory. Given a dynamical system, the program generates a complete, high-level symbolic description of the phase-space structure that is intelligible to humans and manipulable by other programs. Using these phase-space descriptions, we are developing a novel control synthesis strategy for automatically synthesizing a controller for a nonlinear system in the phase space so that it achieves desired properties.
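MAPS is not shown at the code level in the abstract; as a small illustration of one ingredient of such qualitative descriptions, the sketch below labels an equilibrium point of a two-dimensional system from the eigenvalues of its Jacobian (the example system and tolerance are assumptions):

```python
import numpy as np

def classify_equilibrium(jacobian, tol=1e-9):
    """Return a qualitative label for an equilibrium of a 2-D system,
    judged from the eigenvalues of its Jacobian matrix."""
    eig = np.linalg.eigvals(jacobian)
    re = eig.real
    if np.all(np.abs(re) < tol):
        return "center (marginal)"
    if np.all(re < 0):
        return "stable " + ("spiral" if np.any(eig.imag != 0) else "node")
    if np.all(re > 0):
        return "unstable " + ("spiral" if np.any(eig.imag != 0) else "node")
    return "saddle"

# Damped pendulum linearized about its downward equilibrium (toy parameters).
J = np.array([[0.0, 1.0],
              [-1.0, -0.5]])
print(classify_equilibrium(J))  # -> "stable spiral"
```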

Automatic Qualitative Modeling of Dynamic Physical Systems. Amsterdam, Jonathan, 01 January 1993.
This report describes MM, a computer program that can model a variety of mechanical and fluid systems. Given a system's structure and qualitative behavior, MM searches for models using an energy-based modeling framework. MM uses general facts about physical systems to relate behavioral and model properties. These facts enable a more focussed search for models than would be obtained by mere comparison of desired and predicted behaviors. When these facts do not apply, MM uses behavior-constrained qualitative simulation to verify candidate models efficiently. MM can also design experiments to distinguish among multiple candidate models.
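MM's energy-based modeling framework is only summarized above; purely as a sketch of the generate-and-test structure described, the toy fragment below enumerates candidate models as sets of idealized components and keeps those whose crude qualitative prediction matches the desired behavior (the component vocabulary and behavior labels are assumptions, far simpler than MM's representations):

```python
from itertools import combinations

COMPONENTS = ["mass", "spring", "damper"]

def qualitative_behavior(model):
    """Very crude qualitative prediction for a 1-D mechanical model."""
    if "mass" in model and "spring" in model:
        return "decaying oscillation" if "damper" in model else "sustained oscillation"
    if "damper" in model:
        return "monotonic decay"
    return "no characteristic motion"

def candidate_models(desired_behavior):
    """Generate-and-test: return every component subset whose predicted
    qualitative behavior matches the desired one."""
    matches = []
    for k in range(1, len(COMPONENTS) + 1):
        for model in combinations(COMPONENTS, k):
            if qualitative_behavior(set(model)) == desired_behavior:
                matches.append(set(model))
    return matches

print(candidate_models("decaying oscillation"))  # one match: {'mass', 'spring', 'damper'}
```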

Using Controlled Natural Language for World Knowledge Reasoning. Dellis, Nelson Charles, 01 January 2010.
Search engines are the most popular tools for finding answers to questions, but unfortunately they do not always provide complete, direct answers; answers often need to be extracted by the user from the web pages the search engine returns. This research addresses this problem and shows how an automated theorem prover, combined with existing ontologies and the web, is able to reason about world knowledge and return direct answers to users' questions. The use of an automated theorem prover also allows more complex questions to be asked. Automated theorem provers that exhibit these capabilities are called World Knowledge Reasoning systems. This research discusses one such system, the CNL-WKR system. The CNL-WKR system uses the ACE controlled natural language as its user input language. During theorem proving it calls upon external sources on the web, as well as internal ontological sources, in order to find answers. The system uses the SPASS-XDB automated theorem prover. The result is a system capable of answering complex questions about the world.
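The abstract does not show the internal query format; as a loose illustration only (the predicate and constant names are assumptions, not the CNL-WKR vocabulary), a question such as "Which country borders France?" could be rendered as a TPTP first-order conjecture of roughly this shape before being handed to a prover such as SPASS-XDB:

```python
# Hypothetical rendering of a user question as a TPTP first-order formula.
# The predicate and constant names are illustrative assumptions, not the
# system's actual ontology.
question = "Which country borders France?"
tptp_conjecture = (
    "fof(user_question, conjecture,\n"
    "    ? [Country] : ( country(Country) & borders(Country, france) ) ).\n"
)
print(tptp_conjecture)  # this text would form the prover's input problem (plus axioms)
```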

Machine Learning for Automated Theorem Proving. Kakkad, Aman, 01 January 2009.
Developing logic in machines has long been a concern of scientists, and automated theorem proving (ATP) implements the concept of logical consequence to a certain degree. However, if the number of available axioms is very large, the probability of finding a proof for a conjecture within a reasonable time limit can be very small. This is where the ability to learn from previously proved theorems comes into play: whenever a new situation S(NEW) is encountered, we recall old scenarios S(OLD) similar to it and use the facts F(OLD) related to S(OLD) to find a solution for S(NEW). This research follows the same idea. The thesis develops, and implements in a tool, a solution that tries to prove a failed conjecture (a problem the ATP system failed to prove) by extracting a sufficient set of axioms, called the Refined Axiom Set (RAS), from a large pool of available axioms. This is done by measuring the similarity of the failed conjecture to solved theorems (already proved) of the same domain; we call this "process1", and it is based on syntactic selection of axioms. After process1 the RAS may still contain irrelevant axioms, so a semantic selection approach is applied to refine it further; we call this "process2". The failed conjecture is then attempted from the output of either process1 or process2, depending on which approach the user selects. For testing we picked all FOF problems from the TPTP domain SWC, which consists of 24 broken conjectures (problems for which the ATP system can show that a proof exists but cannot find it with limited resources), 124 failed conjectures, and 274 solved theorems. Results are reported for both the broken and the failed problems: as expected, the success rate is higher on the broken conjectures, with the tool solving 100% of the broken set and 19.5% of the failed ones.
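The thesis's process1 is described only in prose above; a minimal sketch of syntactic, symbol-overlap-based axiom selection in the same spirit (the formula strings, axiom names, and similarity measure are assumptions, not the thesis's actual implementation) might look like:

```python
import re

def symbols(formula: str) -> set:
    """Crude symbol extraction: lowercase identifiers in a first-order formula
    (uppercase variables are deliberately ignored)."""
    return set(re.findall(r"[a-z_][A-Za-z0-9_]*", formula))

def refined_axiom_set(failed_conjecture, solved_theorems, axiom_pool, top_k=2):
    """Rank solved theorems by symbol overlap with the failed conjecture and
    collect the axioms associated with the most similar ones."""
    conj_syms = symbols(failed_conjecture)
    def similarity(theorem):
        th_syms = symbols(theorem)
        return len(conj_syms & th_syms) / len(conj_syms | th_syms)
    ranked = sorted(solved_theorems, key=similarity, reverse=True)
    ras = set()
    for theorem in ranked[:top_k]:
        ras |= axiom_pool.get(theorem, set())
    return ras

# Toy data; formula strings and axiom names are illustrative only.
solved = ["subset(X,Y) => member(a,Y)", "length(app(X,Y)) = plus(length(X),length(Y))"]
pool = {solved[0]: {"subset_def", "member_def"}, solved[1]: {"app_def", "length_def"}}
print(refined_axiom_set("member(b,X) & subset(X,Z)", solved, pool, top_k=1))
```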

Impact of automated validation on software model quality. Tufvesson, Hampus, January 2013.
Model-driven development is gaining momentum, and thus larger and more complex systems are being represented and developed with the help of modeling. Complex systems often suffer from a number of problems, such as difficulty in keeping the model understandable, long compilation times, and high coupling. Modeling brings the possibility of validating models against constraints, which makes it possible to handle problems that traditional static analysis tools cannot solve. This thesis studies to what degree automatic model validation can be a useful tool in addressing some of the problems that appear in the development of complex systems. This is done by compiling a list of validation constraints based on existing problems, implementing and applying fixes for these, and measuring how a number of different aspects of the model are affected. After applying the fixes and measuring their impact, it could be seen that validation of dependencies can have a significant impact on the models by reducing build times of the generated code. Other types of validation constraints require further study to determine what impact they might have on model quality.
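The thesis's constraint list is not reproduced in the abstract; as a hedged sketch of what one dependency-oriented validation rule might look like, the fragment below flags model elements whose cross-package dependencies exceed a threshold (the model encoding and the threshold are assumptions):

```python
# Hypothetical flattened model: element -> (owning package, elements it depends on).
model = {
    "Sensor":  ("hw",  ["Bus", "Logger", "Config"]),
    "Bus":     ("hw",  []),
    "Logger":  ("svc", ["Config"]),
    "Config":  ("svc", []),
    "Planner": ("app", ["Sensor", "Logger", "Config", "Bus"]),
}

def validate_cross_package_dependencies(model, max_cross=2):
    """Flag elements that depend on more than `max_cross` elements outside
    their own package, a simple proxy for high coupling."""
    violations = {}
    for element, (package, deps) in model.items():
        cross = [d for d in deps if model[d][0] != package]
        if len(cross) > max_cross:
            violations[element] = cross
    return violations

print(validate_cross_package_dependencies(model))
# {'Planner': ['Sensor', 'Logger', 'Config', 'Bus']}
```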

Modified bargaining protocols for automated negotiation in open multi-agent systems. Winoto, Pinata, 29 March 2007.
Current research in multi-agent systems (MAS) has advanced to the development of open MAS, which are characterized by the heterogeneity of agents, free exit/entry, and decentralized control. Conflicts of interest among agents are inevitable, and automated negotiation to resolve them is one of the promising solutions. This thesis studies three modifications of alternating-offer bargaining protocols for automated negotiation in open MAS. The long-term goal of this research is to design negotiation protocols that intelligent agents can easily use to resolve their conflicts. In particular, we propose three modifications: allowing non-monotonic offers during the bargaining (non-monotonic-offers bargaining protocol), allowing strategic delay (delay-based bargaining protocol), and allowing strategic ignorance to augment argumentation when the bargaining comprises argumentation (ignorance-based argumentation-based negotiation protocol).

Utility theory and decision-theoretic approaches are used in the theoretical analysis, with the aim of proving the benefit of these three modifications in negotiation among myopic agents under uncertainty. Empirical studies by means of computer simulation are conducted to analyze the cost and benefit of the modifications. Social agents, who use common human bargaining strategies, are the subjects of the simulation.

In general, we assume that agents are bounded rational, with various degrees of belief and trust toward their opponents. In the study of the non-monotonic-offers bargaining protocol we assume that agents have diminishing surplus; in the study of the delay-based bargaining protocol we assume that they have increasing surplus; and in the study of the ignorance-based argumentation-based negotiation protocol we assume that agents may have different knowledge and use different ontologies and reasoning engines.

Through theoretical analysis under various settings, we show the benefit of these modifications in terms of the agents' expected surplus, and through simulation we show their benefit in terms of social welfare (total surplus). Several implementation issues are then discussed, and potential solutions in the form of additional policies are proposed. Finally, we suggest future work that could improve the reliability of these modifications.
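The protocols themselves are only described in prose above; the toy sketch below shows an alternating-offer loop in which offers need not be monotonic, merely to illustrate the kind of interaction being modified (the agents, reservation values, and random offer rule are assumptions, not the thesis's protocol):

```python
import random

def alternating_offer_bargaining(reserve_a, reserve_b, pie=100.0, max_rounds=10, seed=0):
    """Toy alternating-offer bargaining over splitting `pie`.

    Offers may move non-monotonically between rounds; an agent accepts any
    offer that meets its private reservation value."""
    rng = random.Random(seed)
    proposer, responder = ("A", reserve_a), ("B", reserve_b)
    for round_no in range(1, max_rounds + 1):
        name, _ = proposer
        # Non-monotonic offering: the share proposed to the responder may go
        # up or down from round to round instead of conceding steadily.
        offer_to_responder = rng.uniform(0.0, pie)
        if offer_to_responder >= responder[1]:
            return round_no, name, round(offer_to_responder, 2)
        proposer, responder = responder, proposer   # roles alternate each round
    return None  # no agreement within the deadline

print(alternating_offer_bargaining(reserve_a=40.0, reserve_b=55.0))
```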

Automated Microfluidic Sample Preparation for Laser Scanning Cytometry. Wu, Eric, 06 April 2010.
Laser scanning cytometry (LSC) is a slide-based method that is used clinically for Quantitative Imaging Cytometry (QIC). The “Clatch” slide, named after its inventor and used in conjunction with the LSC for immunophenotyping patient cell samples, has several drawbacks. The slide requires time-consuming and laborious pipette steps, making it prone to handling errors, and it uses a significant amount of cell sample, limiting the number of analyses possible for fine needle aspirate (FNA) samples.
This thesis details an automated microfluidic system, composed of an embedded circuit, a plastic and polymer microfluidic device, and an aluminum frame, that can perform the same immunophenotyping procedures. The new system reduces the labor from 36 pipette steps to 8, reduces the amount of cell sample required from 180 μL to 56 μL, and shortens the entire procedure from 75 minutes to 42 minutes.

Internally Contracted Multireference Coupled Cluster Method and Normal-Order-Based Automatic Code Generator. Kong, Liguo, January 2009.
Single-reference coupled cluster theory has been established as the method of choice for calculating electronic properties of small-to-medium size molecules. However, in typical multireference cases, such as bond-breaking processes, biradicals, and excited states, very high order excitations may be needed in the cluster operator to obtain reliable and accurate results, which is not practical due to the rapidly growing computational costs. Although there has been much effort to extend the applicability of single-reference methods, there is little doubt that genuine multireference methods are indispensable.
The method we are developing, the State-Specific Equation-of-Motion Coupled Cluster (SS-EOMCC) method, generalizes the state-universal Equation-of-Motion Coupled Cluster (EOMCC) methods to a state-specific version. SS-EOMCC works for both ground states and excited states and is rigorously spin-adapted. The cluster operator amplitudes are solved for with the complete-active-space self-consistent-field function as the reference function, and the differential relaxation effects are taken into account by diagonalizing the transformed Hamiltonian in the multireference configuration interaction singles (MRCIS) space. To implement the method, we developed an automatic program generator, the details of which are presented.
The strategy used to approximate the residual equations in SS-EOMCC is based on a novel normal-order theory, which generalizes the traditional particle-hole-formalism-based normal-order theory. We discuss normal-order theory in a general context, starting with the version developed by Mukherjee and Kutzelnigg, furnish an algebraic proof of the corresponding contraction rules, and then show how our normal-order theory works.
Finally, we present benchmark results to gauge the SS-EOMCC method. We calculate the triplet state of F₂ to examine the behavior of the method for single-reference systems, and study the singlet states of H₂O, CO, and N₂ to test its performance for multireference systems. In addition, we illustrate the effect of a perturbative correction that attempts to alleviate the redundancy issue. We also apply the method to study the energetics of end-on and side-on peroxide coordination in ligated Cu₂O₂ models, where SS-EOMCC[+2] employing a small active space achieves quite accurate results.
The final diagonalization of the transformed Hamiltonian in the MRCIS space is expensive and limits the applicability of the method. We attempt to develop a cheaper internally contracted multireference coupled cluster method by introducing semi-internal excitation operators in the cluster operator, so that the final diagonalization can be confined to the active space, but the results are not yet satisfactory.
The Jeziorski-Monkhorst (JM) ansatz has been studied extensively, and different ways to resolve the redundancy issue have been explored. We analyze these JM-ansatz-based methods, derive them in a simple way to expose their connections transparently, and point out some problems with these methods. Another issue of general interest examined in the thesis is orbital invariance. For single-reference methods the invariance property is usually clear, but this is not always the case for multireference methods. We analyze this problem from the tensor-theory point of view and propose a practical self-consistency-checking algorithm to determine whether a method is orbital invariant. We apply the algorithm to different methods, in particular demonstrating the lack of the invariance property for JM-ansatz-based methods.
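The abstract contains no equations; for orientation only, the Jeziorski-Monkhorst ansatz referred to above is commonly written (in standard notation, not taken from the thesis) as a sum of exponential cluster expansions, one per model-space determinant:

```latex
% Jeziorski-Monkhorst ansatz: a separate cluster operator T^{(mu)} acts on each
% model-space determinant Phi_mu, with mixing coefficients c_mu.
\Psi \;=\; \sum_{\mu} c_{\mu}\, e^{\hat{T}^{(\mu)}} \,\lvert \Phi_{\mu} \rangle
```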