A terrestrial robotic electrophysiology platform has been developed that can hold a moth (<italic>Manduca sexta</italic>), record signals from its brain or muscles, and use these signals to control the rotation of the robot. All signal processing (electrophysiology, spike detection, and robotic control) was performed onboard the robot with custom-designed electronic circuits. Wireless telemetry allowed remote communication with the robot. In this study, we interfaced directionally sensitive visual neurons and pleurodorsal steering muscles of the mesothorax with the robot and used the spike rate of these signals to control its rotation, thereby emulating the classical optomotor response known from studies of the fly visual system. The interfacing of insect and machine can contribute to our understanding of the neurobiological processes underlying behavior and also suggests promising advancements in biosensors and human brain-machine interfaces.
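The abstract describes a pipeline of spike detection followed by a spike-rate-to-rotation mapping. As a purely illustrative sketch of that logic (the dissertation implemented it in custom onboard electronics, not in software), the following Python fragment detects spikes by threshold crossing, estimates spike rate in a short window, and converts the rate deviation from baseline into a turn command. All parameter values (sampling rate, threshold, gain, baseline rate) are hypothetical assumptions, not values from the study.

```python
import numpy as np

# Hypothetical parameters; the dissertation's actual values are not given here.
FS = 10_000          # sampling rate of the neural/muscle recording, Hz (assumed)
K_SIGMA = 3.5        # spike threshold, in multiples of estimated noise s.d. (assumed)
WINDOW_S = 0.1       # window over which spike rate is estimated, s (assumed)
GAIN = 0.5           # deg/s of robot rotation per spike/s above baseline (assumed)
BASELINE_HZ = 20.0   # assumed resting spike rate, spikes/s

def detect_spikes(trace: np.ndarray) -> np.ndarray:
    """Indices where the trace crosses a noise-scaled threshold (rising edges only)."""
    sigma = np.median(np.abs(trace)) / 0.6745          # robust noise estimate
    above = trace > K_SIGMA * sigma
    rising = above & ~np.concatenate(([False], above[:-1]))
    return np.flatnonzero(rising)

def spike_rate(trace: np.ndarray) -> float:
    """Spike rate (spikes/s) over the most recent WINDOW_S seconds of the trace."""
    n = int(WINDOW_S * FS)
    return detect_spikes(trace[-n:]).size / WINDOW_S

def rotation_command(rate_hz: float) -> float:
    """Proportional mapping from spike-rate deviation to a turn rate (deg/s)."""
    return GAIN * (rate_hz - BASELINE_HZ)

# Example: a synthetic trace with injected spikes drives a turn command.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, FS)        # 1 s of recording noise
trace[::250] += 8.0                     # inject spikes at roughly 40 spikes/s
print(rotation_command(spike_rate(trace)))
```

In this sketch a spike rate above the assumed baseline yields a positive turn rate and a rate below it a negative one, mirroring how a directionally sensitive signal could be mapped to robot rotation in an optomotor-style closed loop.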
Identifier | oai:union.ndltd.org:arizona.edu/oai:arizona.openrepository.com:10150/145388 |
Date | January 2011 |
Creators | Melano, Timothy |
Contributors | Higgins, Charles M., Frye, Mark A., Gronenberg, Wulfila, Hildebrand, John G., Strausfeld, Nicholas J. |
Publisher | The University of Arizona. |
Source Sets | University of Arizona |
Language | English |
Detected Language | English |
Type | Electronic Dissertation, text |
Rights | Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author. |