1 |
Trends in Space Shuttle Telemetry Applications / Muratore, John F. (October 1987)
International Telemetering Conference Proceedings / October 26-29, 1987 / Town and Country Hotel, San Diego, California / During early manned spacecraft operations, the primary role of ground telemetry systems was data display to flight controllers. As manned spaceflights have increased in complexity, greater demands have been placed on flight controllers to simultaneously monitor systems and replan systems operations. This has led to interest in automated telemetry monitoring systems to decrease the workload on flight controllers. The Mission Operations Directorate at the Lyndon B. Johnson Space Center has developed a five-layer model to integrate various monitoring and analysis technologies such as digital filtering, fault detection algorithms, and expert systems. The paper describes the five-layer model and explains how it has been used to guide prototyping efforts at Mission Control. Results from some initial expert systems are presented. The paper also describes the integrated prototype currently under development, which implements a real-time expert system to assist flight controllers in the Mission Control Center in monitoring Space Shuttle communications systems.
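To give a flavour of the rule-based monitoring that the upper layers of such a model describe, the following minimal Python sketch implements a tiny limit-check and rule engine. It is not the Mission Control software; the parameter names, limits, and advisories are invented for illustration.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    condition: Callable[[Dict[str, float]], bool]  # fires when True
    advisory: str                                  # message for the flight controller

# Hypothetical rules: a raw limit check and a rule combining two parameters,
# loosely corresponding to the lower and upper layers of a layered monitor.
RULES: List[Rule] = [
    Rule("sband_low_signal",
         lambda t: t["sband_agc_db"] < -110.0,
         "S-band AGC below expected level; verify antenna selection."),
    Rule("comm_loss_suspected",
         lambda t: t["sband_agc_db"] < -110.0 and t["frame_sync_lock"] == 0,
         "Possible communications loss: weak signal and no frame sync."),
]

def evaluate(telemetry: Dict[str, float]) -> List[str]:
    """Return advisories for every rule whose condition holds on this sample."""
    return [r.advisory for r in RULES if r.condition(telemetry)]

if __name__ == "__main__":
    sample = {"sband_agc_db": -115.2, "frame_sync_lock": 0}
    for msg in evaluate(sample):
        print(msg)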
|
2 |
MMTS: Multi-Vehicle Metric & Telemetry System / Aspnes, Richard K.; Yuma, Russell J. (October 1988)
International Telemetering Conference Proceedings / October 17-20, 1988 / Riviera Hotel, Las Vegas, Nevada / The Multi-Vehicle Metric & Telemetry System (MMTS) is a complete range system which performs real-time tracking, command destruct, and telemetry processing functions in support of range safety and the test and evaluation of airborne vehicles. As currently configured, the MMTS consists of five hardware and software subsystems with the capability to receive, process, and display tracking data from up to ten range sensors and telemetry data from two instrumented vehicles. During a range operation, the MMTS is employed to collect, process, and display tracking and telemetry data. The instrumentation sites designated for operational support acquire tracking and telemetered data and transmit these data to the MMTS. The raw data is then identified, formatted, time tagged, recorded, processed, and routed for display to the mission control and telemetry display areas. Additionally, processed tracking data is transmitted back to the instrumentation sites as an aid to acquiring or maintaining vehicle track. The mission control area consists of a control and status console, high-resolution color graphics stations, and large screen displays. As the mission controller observes mission progress on the graphics stations, operational decisions can be made and invoked by activation of the appropriate console controls. Visual alarms provided by the MMTS alert mission control personnel to hazardous conditions posed by any tracked vehicle. Manual action can then be taken to activate transmission of the MMTS vehicle destruct signal. The telemetry display area consists of ten fully functional, PC-compatible computers which are switchable to either of two telemetry front-end processors. Each PC can be independently set up by telemetry analysts to display data of interest. A total of thirty data pages per PC can be defined, and any defined data page can be activated during a mission. A unique feature of the MMTS is that telemetry data can be combined with tracking data for use by the range safety functions.
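As a rough illustration of the identify, format, time-tag, and route steps described above, the short Python sketch below processes a single incoming frame. The frame layout, type byte, and display-area names are assumptions made for the example, not the actual MMTS formats.

import struct
import time

DISPLAY_AREAS = {"tracking": [], "telemetry": []}

def process_frame(raw: bytes, source_id: int) -> dict:
    """Identify, format, time-tag, record, and route one incoming data frame."""
    # Identify: assume the first byte distinguishes tracking data from telemetry.
    kind = "tracking" if raw[0] == 0x01 else "telemetry"
    # Format: assume big-endian 32-bit floats follow the type byte.
    values = list(struct.unpack(f">{(len(raw) - 1) // 4}f", raw[1:]))
    record = {
        "source": source_id,
        "kind": kind,
        "time_tag": time.time(),  # time tag applied on receipt
        "values": values,
    }
    DISPLAY_AREAS[kind].append(record)  # route to the matching display area
    return record

if __name__ == "__main__":
    frame = bytes([0x01]) + struct.pack(">3f", 12.5, -3.2, 800.0)
    print(process_frame(frame, source_id=7))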
|
3 |
REAL-TIME TELEMETRY DATA SUPPORT FOR THE F-22 FLIGHT TEST PROGRAM / Kegel, Thomas; Lipe, Bruce; Swords, Jacquelyn (October 1999)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / This paper describes the recently developed F-22 real-time telemetry data processing system. The F-22 Combined Test Force (CTF) and the Range Division worked together to develop a real-time telemetry processing system able to support the F-22's fast-paced flight test program. This paper provides an overview of the Ridley Mission Control Center (RMCC) modernization effort for the F-22. The paper also describes how the F-22 program uses the Advanced Data Acquisition and Processing Systems (ADAPS) Real-Time/Post-Flight Processing (RT/PFP) system, the Integrated Analysis and Display System (IADS), and other mission control room systems for F-22 mission control support.
|
4 |
Integration of Mission Control System, On-board Computer Core and Spacecraft Simulator for a Satellite Test Bench / Chintalapati, Lakshmi Venkata Bharadwaj (4 November 2016)
The satellite avionics platform has been developed in cooperation with Airbus and is called "Future Low-cost Platform" (FLP). It is based on an Onboard Computer (OBC) with redundant processor boards built around SPARC V8 microchips of type Cobham Aeroflex UT699. At the University of Stuttgart a test bench with a real hardware OBC and a fully simulated satellite is available for testing real flight scenarios with the Onboard Software (OBSW) running on representative hardware. The test bench, like the real flying satellite "Flying Laptop" later on, is commanded from a real Ground Control Centre (GCC). The main challenges in the FLP project were:
- Onboard computer design,
- Software design, and
- Interfaces between platform and payloads.
In the course of industrializing this FLP platform technology for later use in satellite constellations, Airbus has started to set up an in-house test bench on which all the technologies shall be developed. The initial plan is to port the first core elements of the FLP OBSW to the new dual-core processor and the new SpaceWire (SpW) routing network. The plan also includes new Mission Control Software with which the OBC can be commanded. The new OBC has a dual-core Cobham Gaisler GR712 processor, and all payload-related functionality is to be implemented on the second core only, which involves a considerable amount of low-level task distribution. The resulting SpW router network application and the dual-core platform/payload OBSW sharing are entirely new in the field of satellite engineering.
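A toy Python sketch of the platform/payload split described above is given below: a task-to-core assignment table and a consistency check. The task names and the rule that payload tasks live on core 1 are assumptions for illustration; the real OBSW runs on the GR712 under an RTOS and is not reproduced here.

PLATFORM_CORE, PAYLOAD_CORE = 0, 1

# Hypothetical task table: platform services on core 0, payload handling on core 1.
TASKS = {
    "tm_tc_handling":  PLATFORM_CORE,
    "aocs_control":    PLATFORM_CORE,
    "payload_manager": PAYLOAD_CORE,
    "payload_spw_io":  PAYLOAD_CORE,
}

def validate_assignment(tasks: dict) -> list:
    """Flag any payload task that is not pinned to the payload core."""
    return [
        f"{name} must run on core {PAYLOAD_CORE}, got core {core}"
        for name, core in tasks.items()
        if name.startswith("payload") and core != PAYLOAD_CORE
    ]

if __name__ == "__main__":
    problems = validate_assignment(TASKS)
    print(problems if problems else "task distribution is consistent")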
|
5 |
Système robotisé semi-autonome pour l'observation des espèces marines / Semi-autonomous robotic system for marine species observation / Louis, Silvain (23 July 2018)
The goal of this thesis, carried out in collaboration with a team of biologists from Marbec, is to develop a semi-autonomous robotic system for marine species observation. To this end, the system must be able to carry out established biological observation protocols as well as new ones, while demonstrating its effectiveness compared to a diver. To execute the protocols correctly, we developed the associated control laws and a mission management system that supports the construction, formal validation, and execution of such missions. Finally, to address the feasibility of observation by a robot, we conducted experiments in Mayotte.
|
6 |
Expert System-based Autonomous Mission Control for Unmanned Aerial Vehicle / Ahmed, Salaheldin Ashraf Abdulrahiem (11 September 2018)
UAV applications have witnessed a great leap during the last decade, including aerial photography, surveillance, inspection, mapping, and many others. Using UAVs has many advantages over manned aerial vehicles; reducing costs and keeping human lives out of danger are two major benefits. Currently, most UAVs are remotely controlled by human operators, either with line of sight between the operator and the UAV or from a ground control station. This may be acceptable for short missions. However, manually executing long and tedious missions places a considerable burden on the human operators and consumes more human resources. In addition, there is always the risk of losing the connection between the UAV and the human operators, which leads to unpredictable, and possibly catastrophic, consequences. The objective of this work is to reduce this burden by moving the decision-making responsibility from the human operators to the mission control system mounted on the UAV. In other words, the target is to design an on-board autonomous mission control system capable of making decisions on-board and in real time. Expert system technology, a type of artificial intelligence, is used to achieve the autonomy of the target UAV. An expert system has the advantage of dealing with uncertainty during mission execution. It also makes the system easily adaptable to execute any mission that can be described in the form of rules. In this thesis, the design, implementation, and testing of the expert system-based autonomous mission controller (ESBAMC) are covered. The target mission used to prove the feasibility of the proposed approach is the inspection of power poles: a power pole insulator is autonomously inspected by capturing three pictures from three different points of view. The proposed system has been successfully tested in simulation. Results show the system's ability to make decisions in real time in any situation that may occur during execution of the considered mission. Testing the proposed system in the field is planned for the near future.
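The following Python sketch gives a flavour of a rule-driven inspection mission of the kind described above: three viewpoints around a pole and a tiny rule base that picks the next action. It is only an illustration under assumed names and thresholds, not the ESBAMC implementation.

import math

def viewpoints_around(pole_xy, radius_m=5.0, altitude_m=12.0, n=3):
    """Return n camera positions evenly spaced on a circle around the pole."""
    x0, y0 = pole_xy
    return [
        (x0 + radius_m * math.cos(2 * math.pi * k / n),
         y0 + radius_m * math.sin(2 * math.pi * k / n),
         altitude_m)
        for k in range(n)
    ]

def next_action(state):
    """Tiny rule base: choose the next mission action from the current state."""
    if state["battery_pct"] < 25:
        return "return_to_home"  # safety rule overrides the inspection
    if state["photos_taken"] < 3:
        return "goto_and_capture"
    return "mission_complete"

if __name__ == "__main__":
    waypoints = viewpoints_around((100.0, 40.0))
    state = {"battery_pct": 80, "photos_taken": 0}
    while True:
        action = next_action(state)
        if action != "goto_and_capture":
            print(action)
            break
        print("fly to", waypoints[state["photos_taken"]], "and capture photo")
        state["photos_taken"] += 1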
|
7 |
A framework for autonomous mission and guidance control of unmanned aerial vehicles based on computer vision techniques / Basso, Maik (January 2018)
Computer vision is an area of knowledge that studies the development of artificial systems capable of detecting and building a perception of the environment from image information or multidimensional data. Nowadays, vision systems are widely integrated into robotic systems. Visual perception and manipulation are combined in two steps, "look" and then "move", forming a visual feedback control loop. In this context, there is growing interest in using computer vision techniques in unmanned aerial vehicles (UAVs), also known as drones. These techniques are applied to position the drone in autonomous flight mode, or to detect regions for aerial surveillance or points of interest. Computer vision systems generally operate in three steps: data acquisition in numerical form, data processing, and data analysis. The data acquisition step is usually performed by cameras or proximity sensors. After data acquisition, the embedded computer performs data processing by running algorithms for measurement (variables, indices, and coefficients), detection (patterns, objects, or areas), or monitoring (people, vehicles, or animals). The resulting processed data is analyzed and then converted into decision commands that serve as control inputs for the autonomous robotic system. In order to integrate computer vision systems with different UAV platforms, this work proposes the development of a framework for mission control and guidance of UAVs based on computer vision. The framework is responsible for managing, encoding, decoding, and interpreting commands exchanged between flight controllers and computer vision algorithms. As a case study, two algorithms were developed to provide autonomy to UAVs intended for applications in precision agriculture. The first algorithm calculates a reflectance coefficient used to perform the localized, self-regulated, and efficient application of agrochemicals. The second algorithm identifies crop rows in order to guide the UAV over the plantation. The performance of the proposed framework and algorithms was evaluated and compared with the state of the art, obtaining satisfactory results when implemented on embedded hardware.
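As a loose illustration of the two case-study algorithms, the Python sketch below computes the Excess Green index as a stand-in vegetation measure and estimates a dominant crop-row direction with a Hough transform. The thesis' actual reflectance coefficient and row-detection method are not reproduced here; the thresholds, input file name, and use of OpenCV/NumPy are assumptions.

import cv2
import numpy as np

def excess_green(bgr):
    """ExG = 2g - r - b on channel-normalised values; higher means more vegetation."""
    b, g, r = cv2.split(bgr.astype(np.float32) / 255.0)
    total = b + g + r + 1e-6
    return 2 * (g / total) - (r / total) - (b / total)

def crop_row_angle(bgr):
    """Estimate the dominant crop-row angle in degrees, or None if nothing is found."""
    mask = (excess_green(bgr) > 0.1).astype(np.uint8) * 255  # crude vegetation mask
    edges = cv2.Canny(mask, 50, 150)
    lines = cv2.HoughLines(edges, rho=1, theta=np.pi / 180, threshold=150)
    if lines is None:
        return None
    return float(np.degrees(lines[0][0][1]))  # angle of the strongest line

if __name__ == "__main__":
    frame = cv2.imread("field.jpg")  # hypothetical aerial frame
    if frame is not None:
        print("mean ExG:", float(excess_green(frame).mean()))
        print("estimated row angle:", crop_row_angle(frame))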
|
8 |
A Mission Planning Expert System with Three-Dimensional Path Optimization for the NPS Model 2 Autonomous Underwater Vehicle / Ong, Seow Meng
Approved for public release; distribution is unlimited / Unmanned vehicle technology has matured significantly over the last two decades, as evidenced by its widespread use in industrial and military applications ranging from deep-ocean exploration to anti-submarine warfare. Indeed, the feasibility of short-range, special-purpose vehicles (whether autonomous or remotely operated) is no longer in question. Research efforts have now begun to shift their focus to the development of reliable, longer-range, high-endurance, and fully autonomous systems. One of the major underlying technologies required to realize this goal is Artificial Intelligence (AI), which offers great potential to endow vehicles with the intelligence needed for full autonomy and extended-range capability; this involves the increased application of AI technologies to support mission planning and execution, navigation, and contingency planning. This thesis addresses two issues associated with the above goal for Autonomous Underwater Vehicles (AUVs). Firstly, a new approach is proposed for path planning in underwater environments that is capable of dealing with uncharted obstacles and that requires significantly less planning time and computer memory. Secondly, the thesis explores the use of expert system technology in the planning of AUV missions.
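For context, the Python sketch below runs a generic A* search over a small 3-D occupancy grid, the kind of building block a three-dimensional path planner might use. It is not the thesis' algorithm; the grid size, obstacle, and unit-cost moves are invented for the example.

import heapq

def a_star_3d(grid, start, goal):
    """A* over a 3-D occupancy grid (True = blocked) with 6-connected unit-cost moves."""
    def h(p):  # Manhattan distance, admissible for unit-cost moves
        return sum(abs(a - b) for a, b in zip(p, goal))

    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    nx, ny, nz = len(grid), len(grid[0]), len(grid[0][0])
    frontier = [(h(start), 0, start, [start])]
    visited = set()
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for dx, dy, dz in moves:
            x, y, z = node[0] + dx, node[1] + dy, node[2] + dz
            if 0 <= x < nx and 0 <= y < ny and 0 <= z < nz and not grid[x][y][z]:
                heapq.heappush(frontier,
                               (g + 1 + h((x, y, z)), g + 1, (x, y, z), path + [(x, y, z)]))
    return None  # no collision-free path exists

if __name__ == "__main__":
    grid = [[[False] * 4 for _ in range(4)] for _ in range(4)]  # 4 x 4 x 4 free space
    grid[1][1][1] = True  # a single blocked cell
    print(a_star_3d(grid, (0, 0, 0), (3, 3, 3)))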
|