1 |
WINGS CONCEPT: PRESENT AND FUTURE / Harris, Jim; Downing, Bob / 10 1900
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / The Western Aeronautical Test Range (WATR) of NASA’s Dryden Flight Research Center (DFRC) is
facing a challenge in meeting the technology demands of future flight mission projects. Rapid growth in
technology for aircraft has resulted in complexity often surpassing the capabilities of the current WATR
real-time processing and display systems. These current legacy systems are based on an architecture
that is over a decade old. In response, the WATR has initiated the development of the WATR
Integrated Next Generation System (WINGS). The purpose of WINGS is to provide the capability to acquire data from a variety of sources and to process that data for analysis and display to Project Users in the WATR Mission Control Centers (MCCs) in real time and near real time, and for subsequent post-mission analysis. The WINGS system architecture will bridge the continuing gap between new research flight-test requirements and current capability by distributing current system architectures to provide incremental and iterative system upgrades.
|
2 |
A generic predictive information system for resource planning and optimisation / Tavakoli, Siamak / January 2010
The purpose of this research work is to demonstrate the feasibility of creating a quick-response decision platform for middle management in industry. It utilises the strengths of current Supervisory Control and Data Acquisition (SCADA) systems and Discrete Event Simulation and Modelling (DESM), and, more importantly, creates a leap forward in their theory and practice. The proposed research platform uses real-time data to provide automatic, real-time and predictive system analysis, giving current and ahead-of-time information on the performance of the system in an efficient manner. Data acquisition, as the back-end connection between the data integration system and the shop floor, faces both hardware and software challenges in coping with large-scale real-time data collection. The limited scope of SCADA systems makes them unsuitable candidates for this task, and the cost, complexity, and narrow efficiency focus of proprietary solutions leave further room for improvement. A Flexible Data Input Layer Architecture (FDILA) is therefore proposed as a generic data integration platform, so that a multitude of data sources can be connected to the data processing unit. The efficiency of the proposed integration architecture lies in decentralising and distributing services between different layers. A novel Sensitivity Analysis (SA) method called EvenTracker is proposed as an effective tool to measure the importance and priority of inputs to the system. EvenTracker is designed to handle complex systems in real time and takes advantage of an event-based definition of the data involved in the process flow. Its underpinning logic is to capture the cause-effect relationships between triggers (input variables) and events (output variables) over a time period specified by an expert. The approach requires no estimation of data distributions of any kind, nor does the performance model need to execute beyond real time. The proposed EvenTracker sensitivity analysis method has the lowest computational complexity of the popular sensitivity analysis methods compared. For proof of concept, a three-tier data integration system was designed and developed using National Instruments’ LabVIEW programming language, Rockwell Automation’s Arena simulation and modelling software, and OPC data communication software. A laboratory-based conveyor system with 29 sensors was installed to simulate a typical shop-floor production line. In addition, the EvenTracker SA method was applied to data extracted from 28 sensors on one manufacturing line in a real factory. The experiment found 14% of the input variables to be unimportant for evaluating the model outputs, and no longer sampling these unimportant variables yielded a time-efficiency gain of 52% in analysing the filtered system. Compared with the entropy-based SA technique, the only other method suitable for real-time purposes, EvenTracker is quicker, more accurate and less computationally burdensome. Theoretical estimation of the computational complexity of SA methods, based on both structural complexity and energy-time analysis, also favoured the proposed EvenTracker method. Both the laboratory and factory-based experiments demonstrated the flexibility and efficiency of the proposed solution.
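The abstract does not spell out the EvenTracker computation itself, so the following is a minimal Python sketch of an event-based sensitivity measure in that spirit: triggers and output events are time-stamped, and an input is ranked by how often its events are followed by an output event within an expert-chosen window. All names, timestamps and the window length are illustrative assumptions, not values from the thesis.

```python
def eventracker_sensitivity(trigger_log, event_log, window):
    """Rank input triggers by how often their events are followed by an
    output event within a fixed time window (event-based, no distribution
    estimation). Returns {trigger_name: score in [0, 1]}."""
    scores = {}
    for name, stamps in trigger_log.items():
        if not stamps:
            scores[name] = 0.0
            continue
        hits = sum(1 for t in stamps
                   if any(0 <= e - t <= window for e in event_log))
        scores[name] = hits / len(stamps)
    return scores

# Toy usage: sensor_A fires shortly before every output event, sensor_B never does,
# so sensor_B would be flagged as unimportant and could be dropped from sampling.
triggers = {"sensor_A": [1.0, 5.0, 9.0], "sensor_B": [2.5, 7.5]}
outputs = [1.4, 5.3, 9.6]
print(eventracker_sensitivity(triggers, outputs, window=1.0))
# -> {'sensor_A': 1.0, 'sensor_B': 0.0}
```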
|
3 |
Low power real-time data acquisition using compressive sensing / Powers, Linda S.; Zhang, Yiming; Chen, Kemeng; Pan, Huiqing; Wu, Wo-Tak; Hall, Peter W.; Fairbanks, Jerrie V.; Nasibulin, Radik; Roveda, Janet M. / 18 May 2017
New possibilities exist for the development of novel hardware/software platforms having fast data acquisition capability with low power requirements. One application is a high-speed Adaptive Design for Information (ADI) system that combines the advantages of feature-based data compression, low power nanometer CMOS technology, and stream computing [1]. We have developed a compressive sensing (CS) algorithm which linearly reduces the data at the analog front end, an approach which uses analog designs and computations, instead of smaller feature-size transistors, to achieve higher speed and lower power. A level-crossing sampling approach replaces Nyquist sampling. With an in-memory design, the new compressive-sensing-based instrumentation performs digitization only when there is enough variation in the input and when the random selection matrix chooses this input.
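As a rough illustration of the two ideas named above, level-crossing sampling and a random selection matrix, the sketch below shows a software analogue in Python with NumPy; the threshold, test signal and matrix sizes are assumptions for demonstration and do not represent the authors' analog front-end design.

```python
import numpy as np

rng = np.random.default_rng(0)

def level_crossing_sample(x, delta):
    """Keep a sample only when the signal has moved by at least `delta`
    since the last retained sample (event-driven alternative to Nyquist)."""
    kept = [0]
    last = x[0]
    for i in range(1, len(x)):
        if abs(x[i] - last) >= delta:
            kept.append(i)
            last = x[i]
    return np.array(kept)

# Toy signal: slow drift plus a short burst of activity around t = 0.6 s.
t = np.linspace(0.0, 1.0, 1000)
x = 0.1 * t + 0.5 * np.exp(-((t - 0.6) ** 2) / 0.001)

idx = level_crossing_sample(x, delta=0.05)                   # digitize only on change
phi = rng.choice([0, 1], size=(20, idx.size), p=[0.7, 0.3])  # random selection matrix
y = phi @ x[idx]                                             # compressed measurements

print(f"{idx.size} of {x.size} samples retained, {y.size} CS measurements")
```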
|
4 |
Smart offshore structure for reliability prediction process / Nwankwo, Cosmas Chidozie / 09 1900
A review of developments within the field of structural reliability theory shows that gaps still exist in the reliability prediction process, and hence there is a pressing need for improvements so that the estimated structural reliability can express a physical property of the given structure. The current reliability prediction process involves the continuous estimation and use of a reliability index as a way of assessing the safety of any given structure. The reliability index β depends on the probability density function (PDF) of the wave force and the corresponding PDF of resistance of the respective structural members of the given structure. The PDF of the applied wave force in turn depends on the PDFs of water depth, wave angular velocity and wave direction; hence the reliability index, as currently practiced, is a statistical way of managing uncertainties based on a general probabilistic model.
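For a single member with independent, normally distributed resistance R and wave load S, the reliability index described above reduces to the familiar first-order expression beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2); the short sketch below evaluates it with purely illustrative numbers, not values from the thesis.

```python
from math import sqrt
from statistics import NormalDist

# First-order reliability index for a single member, assuming independent
# normally distributed resistance R and wave load S (illustrative values).
mu_R, sigma_R = 1200.0, 120.0   # member resistance, kN (assumed)
mu_S, sigma_S = 800.0, 160.0    # wave load, kN (assumed)

beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)   # reliability index
pf = NormalDist().cdf(-beta)                           # failure probability P(R < S)

print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```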
This research on a smart offshore structure for reliability prediction proposes a measurement-based reliability prediction process as a way of closing this gap. Structural deflection and damping are among the measurable properties of an offshore structure, and this study suggests using these measurable properties to improve the structural reliability prediction process. A design case study has shown that a typical offshore structure may deflect by only a few fractions of a millimetre; this implies that if this level of deflection can be monitored, the measurements could be used to detect the failure of a structural member. The advocated concept is based on the hypothesis that, if the original dynamic characteristics of a structure are known, then measurement-based changes in its dynamic properties can be used to determine the onset of failure or failure propagation in the given structure.
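A minimal sketch of how this hypothesis could be applied in practice is given below, assuming that baseline natural frequencies of the healthy structure are available and using an arbitrary 2% relative-shift threshold; both the threshold and the numbers are assumptions for illustration only.

```python
def flag_frequency_shift(baseline_hz, measured_hz, tol=0.02):
    """Flag vibration modes whose measured natural frequency deviates from
    the healthy baseline by more than `tol` (relative); a sustained shift
    suggests a stiffness change such as member failure or crack growth."""
    flags = []
    for mode, (f0, f) in enumerate(zip(baseline_hz, measured_hz), start=1):
        shift = abs(f - f0) / f0
        if shift > tol:
            flags.append((mode, f0, f, shift))
    return flags

baseline = [0.52, 1.31, 2.05]   # Hz, assumed healthy-structure modes
measured = [0.52, 1.24, 2.04]   # Hz, mode 2 has softened
for mode, f0, f, s in flag_frequency_shift(baseline, measured):
    print(f"mode {mode}: {f0:.2f} Hz -> {f:.2f} Hz ({s:.1%} shift)")
```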
Such measurement-based technology could reveal the location and magnitude of internal cracks or corrosion effects on a given structure, which lies outside the scope of the current probability-based approach. A simple economic analysis shows that the recommended process has a positive net present value, and that the Value of Information is some $74 million for any life-extension technology that could reveal the possibility of extending the life of a given 10,000 bopd production platform from 2025 to 2028.
|
5 |
MIGRATING FROM A VAX/VMS TO AN INTEL/WINDOWS-NT BASED GROUND STATION / Penna, Sergio D.; Rios, Domingos B. / 10 1900
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Upgrading or replacing production systems is always a very resource-consuming task, particularly when the systems being replaced are quite specialized, such as those serving any Flight Test Ground Station. In the recent past a large number of Ground Station systems were based on Digital’s VAX/VMS architecture. The computer industry then expanded very fast, and by 1990 real-time PCM data processing systems built entirely on hardware and software designed for IBM-PC compatible micro-computers were becoming available. A complete system replacement in a typical Ground Station can take from one to several years to become a reality, depending on how complex the original system is, how complex the resulting system needs to be, how many resources are available to support the operation, how soon the organization needs it, and so on. This paper reviews the main concerns encountered during the replacement of a typical VAX/VMS-based Ground Station by an Intel/Windows NT-based one. It covers the transition from original requirements to totally new requirements, from mini-computers to micro-computers, and from DMA to high-speed LAN data transfers, while conserving some key architectural features. This 8-month development effort will expand EMBRAER’s capability in acquiring, processing and archiving PCM data in the next few years at a lower cost, while preserving compatibility with old legacy flight test data.
|
6 |
Desenvolvimento de uma célula de filtração com operação automática para monitoramento de dados on line / Development of a filtration cell with automatic operation for monitoring experimental data in real time / BARBOSA, Renan Fraga / 25 August 2016
Fundação de Apoio à Pesquisa Científica e Tecnológica da UFRRJ / Filtration loss is a phenomenon caused by the pressure gradient between the annular region and the rock formation during the drilling of an oil and gas well under overbalanced conditions. The invasion of fluids may cause irreversible damage to the integrity of the well due to the solids present in their composition. Drilling fluids must therefore be formulated in such a way that a low-permeability mudcake of controlled thickness is formed, in order to mitigate filtration and invasion. Filtration cells are experimental apparatus used in industry and at universities to study mudcake growth on the rock formation and to measure the filtrate volume; such devices operate off line and require a specialized operator. The objective of this work was to optimize, automate and validate a static filtration prototype cell that collects experimental data in real time. An experimental unit was built to prepare the fluids and feed them into the filtration prototype. This unit is composed of a positive displacement pump, mixing tanks and sensors to monitor the properties of the fluid; together, the unit and the prototype constitute the on-line filtration unit. In the filtration prototype, valves and sensors were installed for remote operation. A virtual interface was developed to manage the filtration unit. This interface monitors the data provided by the sensors and operates the equipment (pump, agitators and automatic valves, including those in the filtration cell). Comparative experiments were performed in an HTHP off-line filtration cell under similar operating conditions to validate the prototype. The filtrate volume and the filtration properties (porosity and permeability of the mudcake) obtained for both cells proved to be similar: the on-line filtration unit was capable of reproducing the data provided by the off-line filtration cell used as a reference in this work. Thus, one can conclude that the on-line filtration cell operates appropriately.
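The abstract mentions computing the mudcake's porosity and permeability from the logged filtrate data but does not give the procedure; a plausible post-processing sketch, based on the classical constant-pressure filtration model (filtrate volume linear in the square root of time) and Darcy's law across the cake, is shown below. All numbers, and the use of this particular model, are assumptions for illustration rather than the thesis's own method.

```python
import numpy as np

# Illustrative post-processing of a constant-pressure static filtration test;
# the values below are assumed, as is the classical V ~ sqrt(t) cake model.
t = np.array([60.0, 240.0, 540.0, 960.0, 1500.0])        # elapsed time, s
V = np.array([4.1, 7.9, 11.6, 15.3, 19.0]) * 1e-6        # cumulative filtrate, m^3

A      = 22.6e-4    # filtration area, m^2
dP     = 3.5e6      # differential pressure, Pa
mu     = 1.0e-3     # filtrate viscosity, Pa.s
L_cake = 3.0e-3     # measured mudcake thickness, m

# Constant-pressure filtration: V grows linearly with sqrt(t); the intercept
# approximates the spurt loss.
slope, spurt = np.polyfit(np.sqrt(t), V, 1)

# Filtrate rate at the end of the test, dV/dt = slope / (2*sqrt(t)),
# combined with Darcy's law across the cake gives an effective permeability.
q_end = slope / (2.0 * np.sqrt(t[-1]))                   # m^3/s
k_cake = q_end * mu * L_cake / (A * dP)                  # m^2

print(f"spurt loss ~ {spurt * 1e6:.1f} mL, "
      f"cake permeability ~ {k_cake / 9.869e-16:.4f} mD")
```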
|
7 |
The Design of a Synchronized Wireless Biomedical Measurement System / Konstruktion av ett synkroniserat trådlöst biomedicinskt mätsystem / Gulda, Max; Sjöberg, Jesper / January 2024
This thesis presents the design, implementation, and validation of a synchronized wireless biomedical measurement system, specifically developed for research in Cardiac-Locomotor Coupling (CLC). By integrating EMG and ECG sensors in combination with inertial measurements, the system enables the simultaneous recording and analysis of physiological and biomechanical data. The adoption of a 433 MHz radio communication protocol ensures reliable synchronization across multiple sensor nodes. Results demonstrate the system’s effectiveness in providing accurate, synchronized data essential for enhancing understanding of CLC and its implications for athletic performance and for the rehabilitation of patients with heart deficiencies. This project addresses the lack of an integrated, wireless system capable of efficiently synchronizing biomedical measurements, such as heart rate, body movement, and blood pressure, thereby facilitating advanced research into CLC.
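The abstract does not detail the synchronization scheme or the CLC analysis, so the following is a minimal Python sketch under simple assumptions: each node timestamps a shared sync marker received over the 433 MHz link, an offset-only correction aligns the clocks, and the phase of each foot strike within the cardiac cycle is then computed. All data, names and the offset-only clock model are illustrative, not taken from the thesis.

```python
import numpy as np

def align_to_sync(timestamps, sync_local, sync_reference):
    """Shift a node's local timestamps so that its recorded sync-marker time
    coincides with the reference node's sync-marker time (offset-only model)."""
    return np.asarray(timestamps) + (sync_reference - sync_local)

# Illustrative data: ECG R-peaks from one node, foot strikes from an IMU node,
# each on its own clock, plus the time at which each node saw the sync marker.
r_peaks_local = np.array([0.42, 1.08, 1.74, 2.40, 3.06])   # s, ECG node clock
steps_local   = np.array([0.55, 1.21, 1.87, 2.53, 3.19])   # s, IMU node clock
sync_ecg, sync_imu, sync_ref = 0.10, 0.35, 0.00

r_peaks = align_to_sync(r_peaks_local, sync_ecg, sync_ref)
steps   = align_to_sync(steps_local, sync_imu, sync_ref)

# Cardiac-locomotor coupling indicator: phase of each foot strike within the
# cardiac cycle (0 = at the preceding R-peak; clustered values suggest coupling).
rr = float(np.diff(r_peaks).mean())
phases = [((s - r_peaks[r_peaks <= s][-1]) % rr) / rr
          for s in steps if np.any(r_peaks <= s)]
print("step phase within cardiac cycle:", np.round(phases, 2))
```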
|