261
On the moral agency for high frequency trading systems and their role in distributed morality. Romar, Daniel. January 2015.
No description available.
262
The agent-focus construction in Ixil Maya : a descriptive/formal analysis. Blunk, William B. 17 January 2013.
This Master's report describes the Agent Focus construction in Ixil Mayan discourse and proposes a bi-clausal analysis that is discussed within the framework of Lexical Functional Grammar (Bresnan, 2001; Dalrymple, 2001). Many previous analyses of the Agent Focus construction have proposed a monoclausal analysis of this construction in other Mayan languages (Aissen, 1992 [Mayan languages in general], 1999 [Tzotzil]; Broadwell, 2000 [Kaqchikel]; Duncan, 2003 [Tzutujil]; Norman & Campbell, 1978 [Proto-Maya]). This analysis differs from these in that I assume the Agent Focus construction is a complex (that is, bi-clausal) cleft construction. Evidence for this analysis comes from a discussion of the Agent Focus construction in other Mayan languages, from facts about Ixil syntax, and from the usage of the Agent Focus in Ixil discourse. I use Lambrecht's (2001) cross-linguistic typology of cleft constructions to establish the function of the Agent Focus in Ixil.
263
A computer model for learning to teach : proposed categorizations and demonstrated effects. Gaertner, Emily Katherine. 30 January 2014.
With the proliferation of new technological alternatives to the traditional classroom, it becomes increasingly important to understand the role that innovative technologies play in learning. Computer environments for learning to teach have the potential to be innovative tools that improve the skill and effectiveness of pre-service and in-service teachers. There is a tacit sense in such environments that “realism” is best created through, and associated with, a kind of pictorial literalism. I designed a computer model (the Direct Instruction tool) that, though simple, appears realistic to many users and thus contradicts that sense of literalism. I also propose a theoretical classification of computer representations based on the relationship (or lack thereof) between perceived usefulness or relevance and realism. In this study, I investigate two questions: 1) What kinds of claims or insights do respondents generate when using the DI tool to organize their experiences? 2) How do the functionalities of the DI tool fit with or support what respondents see as meaningful? Results indicate that a model can be seen as relevant and useful even if it is not internally consistent. Two major themes that were meaningful to study participants were the simultaneously positive and negative role of “difficulty” in the classroom, and the balance between past performance and future potential. The DI tool seems to promote a shared focus on these themes despite the diversity of past educational experiences among study participants. Responses to this model suggest that extremely abstracted representations of teaching are able to influence the claims and insights of users, affording a glimpse into the internal realities of pre-service teachers. This in turn creates an opportunity to articulate these alternative realities without judgment, describe them with respect, and make them an object of consideration rather than a hidden force. The results of this study contribute to a theory of computer environments for learning to teach that can shape the effective use of these tools in the present, as well as accommodate new models that may be developed as technologies change in the future.
264
The implementation of a heterogeneous multi-agent swarm with autonomous target tracking capabilities. Szmuk, Michael. 04 April 2014.
This thesis details the development of a custom autopilot system designed specifically for multi-agent robotic missions. The project was motivated by the need for a flexible autopilot system architecture that could be easily adapted to a variety of future multi-vehicle experiments. The development efforts can be split into three categories: algorithm and software development, hardware development, and testing and integration. Over 12,000 lines of C++ code were written in this project, resulting in custom flight and ground control software. The flight software was designed to run on a Gumstix Overo Fire (STORM) computer-on-module (COM) using the Angstrom Linux operating system, and was designed to support the onboard GN&C algorithms. The ground control station and its graphical user interface were developed in the Qt C++ framework. The ground control software has been proven to operate safely during multi-vehicle tests, and will be an asset in future work. Two TSH GAUI 500X quad-rotors and one Gears Educational Systems SMP rover were integrated into an autonomous swarm. Each vehicle used the Gumstix Overo COM. The C-DUS Pilot board was designed as a custom interface circuit board for the Overo COM and its expansion board, the Gumstix Pinto-TH. While the built-in WiFi capability of the Overo COM served as a communication link to a central wireless router, the C-DUS Pilot board allowed for the compact and reliable integration of sensors and actuators. The sensors used in this project were limited to accelerometers, gyroscopes, magnetometers, and GPS. All of the components underwent extensive testing. A series of ground and flight tests were conducted to safely and gradually prove system capabilities. The work presented in this thesis culminated in a successful three-vehicle autonomous demonstration comprising two quad-rotors executing a standoff tracking trajectory around a moving rover while simultaneously performing GPS-based collision avoidance.
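A minimal sketch of the standoff tracking idea described in this abstract, written in C++ under assumed names and a simple circular-orbit guidance law; it is an illustration only, not the thesis's actual flight code. Each quad-rotor is commanded to a point on a circle of fixed radius centered on the rover's position, and a per-vehicle phase offset keeps the two quad-rotors separated as they orbit.

```cpp
#include <cmath>

// Hypothetical standoff tracking waypoint generator (assumed structure).
struct Position {
    double north;  // meters
    double east;   // meters
};

Position standoffWaypoint(const Position& rover, double radius,
                          double orbitRate, double missionTime,
                          double phaseOffset) {
    // Angle along the standoff circle advances with mission time.
    const double angle = orbitRate * missionTime + phaseOffset;
    return {rover.north + radius * std::cos(angle),
            rover.east + radius * std::sin(angle)};
}

int main() {
    const double kPi = 3.14159265358979323846;
    Position rover{10.0, -5.0};  // latest rover position estimate (assumed values)
    // Two quad-rotors orbit the same rover half a revolution apart.
    Position wpA = standoffWaypoint(rover, 15.0, 0.1, 42.0, 0.0);
    Position wpB = standoffWaypoint(rover, 15.0, 0.1, 42.0, kPi);
    (void)wpA;
    (void)wpB;
    return 0;
}
```

The phase offset provides a simple form of separation between the two quad-rotors; any additional GPS-based collision avoidance, as described above, would act on top of such a guidance law.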
265
Novel potential-function based control schemes for nonholonomic multi-agent systems to prevent the local minimum problem. Okamoto, Makiko. 23 June 2014.
Research on multi-agent systems performing cooperative tasks has received considerable attention in recent years. Because multiple agents perform cooperative tasks in close proximity, coordinating the agents to avoid collisions is one of the critical keys to mission success. The potential function approach has been extensively employed for collision avoidance, but it has an inherent limitation: the local minimum problem. This dissertation proposes a new avoidance strategy for the local minimum issue. The primary objective of this research is to construct novel potential-function-based control schemes that drive agents from their initial configurations to their goal configurations while avoiding collisions with other agents and obstacles. The control schemes enable agents to avoid being trapped at a local minimum by forcing them to exit from regions that may contain a local minimum. This dissertation consists of three studies, each of which has different technical assumptions. In the first study, all-to-all communication ability among agents is assumed. In addition, each agent is assumed to know a priori the locations of all obstacles. In the second study, all-to-all communication ability is again assumed, but each agent is assumed to determine the locations of obstacles using a sensor with a limited sensing range. In the third study, limited communication ability is assumed (i.e., each agent exchanges information only with agents within its limited communication range), and each agent is assumed to determine the locations of obstacles using its sensor with a limited sensing range. Relative to existing solutions, the new control schemes presented here have three distinct advantages. First, our avoidance strategy can provide cost-efficient solutions in applications because agents will never be trapped at a local minimum. Second, our control signals are continuous, which allows agents to change their speed in a realistic manner that is consistent with their natural motion traits. Finally, our control scheme allows for setting an upper bound on the velocity of each agent, which guarantees that the speed of the agents will never exceed a maximum speed limit.
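For context, the classical potential-function controller whose local-minimum limitation this dissertation addresses can be sketched as below. This is a generic, textbook-style illustration in C++ (the gains, potential shapes, and saturation are assumptions, not the proposed schemes): an attractive gradient pulls the agent toward the goal, a repulsive gradient pushes it away from nearby obstacles, and the commanded speed is capped.

```cpp
#include <cmath>

// Generic potential-field velocity command (illustrative assumptions only).
struct Vec2 { double x, y; };

Vec2 potentialFieldVelocity(const Vec2& agent, const Vec2& goal,
                            const Vec2& obstacle,
                            double attractGain, double repelGain,
                            double influenceRadius, double maxSpeed) {
    // Attractive term: negative gradient of (1/2) * k * ||agent - goal||^2.
    Vec2 v{attractGain * (goal.x - agent.x),
           attractGain * (goal.y - agent.y)};

    // Repulsive term: active only inside the obstacle's influence radius.
    const double dx = agent.x - obstacle.x;
    const double dy = agent.y - obstacle.y;
    const double d = std::sqrt(dx * dx + dy * dy);
    if (d > 1e-6 && d < influenceRadius) {
        const double scale =
            repelGain * (1.0 / d - 1.0 / influenceRadius) / (d * d * d);
        v.x += scale * dx;
        v.y += scale * dy;
    }

    // Saturate so the commanded speed never exceeds the agent's limit.
    const double speed = std::sqrt(v.x * v.x + v.y * v.y);
    if (speed > maxSpeed) {
        v.x *= maxSpeed / speed;
        v.y *= maxSpeed / speed;
    }
    return v;  // if the two gradients cancel, the agent stalls: a local minimum
}
```

The final comment marks the failure mode motivating the dissertation: wherever the attractive and repulsive gradients cancel, the commanded velocity vanishes short of the goal, which is the trap the proposed control schemes are designed to force agents out of.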
266
MECHANISM, PURPOSE AND AGENCY: the metaphysics of mental causation and free will. Judisch, Neal Damian. 28 August 2008.
Not available.
267
Developing Responsive MRI Contrast Agents to Study Tumor Biology. Hingorani, Dina Vinoo. January 2014.
Enzymes are important biomarkers for determining tumor growth and progression. We have developed two molecules to image enzyme response by catalyCEST MRI. This technology allows for non-invasive detection of enzymes. Chapter 1 provides background on the importance of measuring enzyme activity and on the MRI agents developed for this purpose. We have synthesized a responsive paramagnetic Chemical Exchange Saturation Transfer (CEST) agent, called Tm-DO3A-cadaverine. This contrast agent has been successfully cross-linked to the protein albumin by the enzyme transglutaminase, leading to the appearance of CEST at -9.2 ppm. The enzyme catalysis has been validated by measuring chemical exchange rates. We have shown that the position of the CEST peak is influenced by the conformation of the molecule, depending on the amino acids neighboring glutamine. This is the first example to show the appearance of CEST due to formation of a covalent bond. We have also synthesized a diamagnetic CEST agent with a large chemical shift dispersion to detect cathepsin B activity. Upon enzyme-mediated cleavage of PheArgSal, the aryl amide CEST peak at 5.3 ppm disappears. Taking a ratio of the CEST effects from salicylic acid at 9.5 ppm and the aryl amide at 5.3 ppm, we can detect enzyme activity. The salicylic acid moiety also undergoes a slow response to enzyme action, as evident from the disappearance of CEST at 9.5 ppm. However, this proof-of-concept study is the first example of a DIACEST agent designed to measure enzyme activity using a ratio of two CEST effects from the same substrate. The last chapter suggests improvements to the catalyCEST research. The appendix shows the use of bulk magnetic susceptibility measurements by NMR to determine the bio-distribution of lanthanides in ex vivo tissue.
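A minimal sketch of the ratiometric readout described in this abstract, with invented example values (the numbers below are not measured data): the 5.3 ppm aryl amide signal is lost on enzymatic cleavage while the 9.5 ppm salicylic acid signal serves as the slowly changing reference, so the ratio of the two CEST effects falls toward zero when the enzyme is active, largely independently of agent concentration.

```cpp
#include <iostream>

// Ratiometric CEST readout (illustrative only; example values are assumptions).
double cestRatio(double arylAmideAt5p3ppm, double salicylicAcidAt9p5ppm) {
    // Enzyme cleavage removes the aryl amide peak, driving the ratio toward zero.
    return arylAmideAt5p3ppm / salicylicAcidAt9p5ppm;
}

int main() {
    double before = cestRatio(0.12, 0.15);  // intact substrate (assumed values)
    double after = cestRatio(0.01, 0.14);   // after cleavage (assumed values)
    std::cout << "ratio before: " << before << ", after: " << after << '\n';
    return 0;
}
```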
268
Enter Paranoia: Identity and "Makeshift Salvations" in Kon Satoshi's "Paranoia Agent". Hanson, Jeffrey Steven. January 2007.
Kon Satoshi's Paranoia Agent is a series that demonstrates how many types of identity are constructed. While some aspects of the series are based in fantasy, Paranoia Agent takes place in a Tokyo that closely resembles the Tokyo of the real world. In particular, a corporate icon named Maromi parallels the rise of icons such as Hello Kitty in Japan; the public's devotion to Maromi demonstrates how consumerism shapes one's personal identity. Consumerism can also be used to explain the existence of Lil' Slugger, a type of phantasm who initially appears to free the people of Tokyo from their problems, but is actually a "crutch" that society uses to run away from reality. The destination of this escape can be called "consumutopia," a virtual space of "perfect consumption" where reality can be ignored. Consumutopia is one example of the "spaces," real or metaphorical, that are examined in Paranoia Agent.
269
Exploring Complexity in the Past: The Hohokam Water Management Simulation. Murphy, John Todd. January 2009.
The Hohokam Water Management Simulation (HWM) is a computer simulation for exploring the operation of the Hohokam irrigation systems in southern Arizona. The simulation takes a middle road between two common kinds of archaeological simulation: large-scale, detailed landscape and environmental reconstructions, and highly abstract hypothesis-testing simulations. Given the apparent absence in the Hohokam context of a central authority, the specific aim of the HWM is to approach the Hohokam as a complex system, using principles such as resilience, robustness, and self-organization. The Hohokam case is reviewed, and general questions concerning how the irrigation systems operated are shown to subsume multiple crosscutting and unresolved issues. Existing proposals about the relevant aspects of Hohokam society and of its larger long-term trajectory are based on widely varying short- and long-term processes that invoke different elements, draw different boundaries, and operate at different spatial and temporal scales, and many rely on information that is only incompletely available. A framework for approaching problems of this kind is put forward. A definition of modeling is offered that specifies its epistemological foundations, permissible patterns of inference, and its role in our larger scientific process. Invoking Logical Positivism, a syntactic rather than semantic view of modeling is proposed: modeling is the construction of sets of assertions about the world and deductions that can be drawn from them. This permits a general model structure to be offered that admits hypothetical or provisional assertions and the flexible interchange of model components of varying scope and resolution. Novel goals for archaeological inquiry follow from this flexible approach; these move from specific reconstruction to a search for more universal and general dynamics. A software toolkit that embodies these principles is introduced: the Assertion-Based Computer Modeling toolkit (ABCM), which integrates simulation with the logical architecture of a relational database and further provides an easy means for linking models of natural and social processes (including agent-based modeling). The application of this to the Hohokam context is described, and an extended example is presented that demonstrates the flexibility, utility, and challenges of the approach. An attached file provides sample output.
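The assertion-based model structure described in this abstract can be sketched roughly as follows; the C++ types and the irrigation example below are invented for illustration and are not the ABCM toolkit's actual interface. The point of the structure is that each deduction records which assertions it rests on, so a provisional assertion can be swapped for an alternative without disturbing the rest of the model.

```cpp
#include <string>
#include <vector>

// Hypothetical sketch of an assertion-based model structure (invented names).
enum class Basis { Observed, Provisional };

struct Assertion {
    std::string statement;
    Basis basis;  // observed datum vs. provisional hypothesis
};

struct Deduction {
    std::string conclusion;
    std::vector<const Assertion*> support;  // assertions the conclusion rests on
};

int main() {
    Assertion flow{"A main canal carried water in a given year", Basis::Observed};
    Assertion rule{"Water allocation was negotiated locally", Basis::Provisional};
    // Replacing the provisional assertion invalidates only the deductions
    // that cite it in their support list; other components are untouched.
    Deduction d{"Downstream settlements received water that year", {&flow, &rule}};
    (void)d;
    return 0;
}
```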
270
A SIMULATION PLATFORM FOR EXPERIMENTATION AND EVALUATION OF DISTRIBUTED-COMPUTING SYSTEMS. Xu, Yijia. January 2005.
Distributed simulations have been widely applied as a method to study complex systems that are analytically intractable and numerically prohibitive to evaluate. However, it is not a trivial task to develop distributed simulations. Moreover, distributed simulations may introduce difficulties for analysis due to decentralized, heterogeneous data sources, and it is important to integrate these data sources seamlessly for analysis. In applications for system design, it is also necessary to explore alternatives for hardware components, algorithms, and simulation models, and enabling these operations conveniently is critical for the distributed system as well. All of these challenges raise the need for a workbench that facilitates rapid composition, evaluation, modification, and validation of components in a distributed system. This dissertation proposes a platform that addresses these challenges, which we refer to as the SPEED-CS platform. The architecture of the platform consists of multiple layers: a network layer, a component management layer, a components layer, and a modeling layer. It is a multi-agent system (MAS) containing static agents and mobile agents. The mobile agent, referred to as the Data Exchange Agent, is able to visit sub-simulations and has the intelligence to find the data useful for output analysis. Experiments show that the MAS requires much less network bandwidth than a "centralized" system, in which simulations report data to an output analyst. The application of the SPEED-CS platform is extended to handle systems with dynamic data sources. We demonstrate that the platform can be used for parallel reality applications in which simulation parameters are updated according to real-time sensor information. Data Exchange Agents manage the collection, dissemination, and analysis of data from dynamic data sources, including simulations and/or physical systems. The SPEED-CS platform is also implemented to integrate simulations and optimizations. The system provides services to facilitate distributed computing, event services, naming services, and component management. One of its important features is that the component set can be updated and enlarged as different models are added; this feature enables the platform to work as a testbed for exploring alternative system designs. Finally, we conclude this dissertation with several future research topics.
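The mobile Data Exchange Agent idea described in this abstract can be sketched as below; this is a hypothetical C++ interface under assumed names and methods, not the SPEED-CS platform's actual API. The agent conceptually travels to each sub-simulation, analyzes output locally, and carries back only reduced summaries, which is why such a design needs less bandwidth than a centralized one in which every sub-simulation streams raw data to a single analyst.

```cpp
#include <string>
#include <vector>

// Hypothetical mobile Data Exchange Agent sketch (invented interface).
struct Summary {
    std::string sourceSimulation;
    double statistic;  // e.g. a mean computed locally at the data source
};

class DataExchangeAgent {
public:
    // In a real deployment the agent would migrate to the sub-simulation's
    // host, analyze its output there, and keep only the reduced result; a
    // placeholder statistic is recorded here to keep the sketch self-contained.
    void visit(const std::string& simulationHost) {
        summaries_.push_back({simulationHost, 0.0});
    }

    // Carry the gathered summaries back to the output analyst.
    const std::vector<Summary>& report() const { return summaries_; }

private:
    std::vector<Summary> summaries_;
};

int main() {
    DataExchangeAgent agent;
    agent.visit("sub-simulation-A");  // itinerary of sub-simulations (assumed names)
    agent.visit("sub-simulation-B");
    // Only two small summaries cross the network, rather than every raw sample.
    return agent.report().size() == 2 ? 0 : 1;
}
```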