281 |
Dyretiva: um método para a verificação das restrições temporais em sistemas embarcados / Dyretiva: a method for the verification of time constraints in embedded systems
Cadamuro Junior, João 10 2011
Dyretiva is a method developed for the testing phase of embedded real-time systems, in particular for the verification of the system's time constraints. Because the testing phase comes at the end of the development process, when the hardware is available and the software has been coded, the timing verification is performed by monitoring the system under test. The method takes into account the resource-constrained nature of embedded systems and the time-bounded nature of real-time systems, and it is defined by a monitoring approach and a fault model. The monitoring approach defines the physical and logical interfaces needed to observe the system under test, as well as the strategies that optimize trace-data collection. The fault model identifies the relationships and components of the system under test that are most likely to contain the timing faults being sought. To demonstrate the concepts of Dyretiva, a set of six support tools called SoftScope was built: a source-code pre-instrumentation tool, a source-code instrumentation tool, a hybrid monitor, a program for controlling the monitor, programs for filtering and analyzing the captured trace data, and a tool for visualizing the results. Dyretiva and SoftScope are an integral part of the ongoing PERF project at the LIT (Laboratory of Embedded Systems Innovation and Technology) of UTFPR (Federal Technological University of Paraná), whose goal is to build a complete environment for the development of embedded real-time systems.
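The abstract does not detail how the captured trace data are analyzed, so the following Python sketch is only a hypothetical illustration of the kind of offline check such filtering and analysis tools perform: pairing timestamped start/end events from a trace and flagging activations whose response time exceeds a deadline. The trace format, task names, and deadlines are assumptions, not SoftScope's actual data model.

```python
# Hypothetical trace records: (timestamp_us, task, event), where event is
# "start" or "end"; deadlines_us maps each task to its allowed response time.
trace = [
    (1000, "read_sensor", "start"), (1400, "read_sensor", "end"),
    (2000, "control_loop", "start"), (4100, "control_loop", "end"),
    (5000, "read_sensor", "start"), (5300, "read_sensor", "end"),
]
deadlines_us = {"read_sensor": 500, "control_loop": 2000}

def find_deadline_violations(trace, deadlines_us):
    """Pair each task's start/end events and report response times over the deadline."""
    started = {}
    violations = []
    for timestamp, task, event in trace:
        if event == "start":
            started[task] = timestamp
        elif event == "end" and task in started:
            response = timestamp - started.pop(task)
            if response > deadlines_us[task]:
                violations.append((task, response))
    return violations

print(find_deadline_violations(trace, deadlines_us))
# [('control_loop', 2100)] -- the 2.1 ms activation misses its 2.0 ms deadline
```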
|
283 |
Política fiscal, previsões orçamentárias e os determinantes dos desvios de execução no Brasil / Fiscal policy, budget forecasts and the determinants of execution deviations in Brazil
Elaine Cristina de Piza 18 July 2016
This thesis analyzes the political and economic determinants of budget execution deviations in Brazil from 2002 to 2015. Execution deviations are defined as the difference between the observed value of a variable and the value forecast in the budget. Using real-time data, i.e., the data available to policy makers at the moment fiscal policy decisions are taken, the thesis seeks to understand fiscal policy execution deviations in light of the factors at work during the planning and execution phases, confronting the intentions declared by the policy maker in the budget plan with the final outcomes. To work around the lack of an organized real-time dataset for the Brazilian economy, the dataset used was built from the official documents generated throughout the phases of the budget process; its construction was limited, however, by the scarcity of publicly accessible real-time data. The fiscal execution deviations were first evaluated through an analysis of the events that conditioned budget planning and execution. This was complemented by an empirical analysis that identifies the determinants of the planning phase, based on the factors that influence the revenue forecast, and the determinants of the execution phase, based on the factors that contribute to execution deviations. The results indicate that fiscal forecasts drifted away from realized values most markedly from 2012 onward, and that deviations in the forecasts of macroeconomic factors, particularly the GDP growth forecast, help to explain this gap. A persistent optimism was also found in the GDP growth projections used to prepare the budget. In addition, the accounting subterfuges adopted by the Government to meet the primary surplus targets artificially were found to result in a budget execution disconnected from the planned budget, which hampers the analysis of execution deviations and turns budget planning into a mere legal requirement with no practical application.
|
284 |
Data acquisition system for optical frequency comb spectroscopy
Seton, Ragnar January 2017
The Optical Frequency Comb Spectroscopy (OFCS) Group at the Department of Physics at Umeå University develops new techniques for extremely high-sensitivity trace gas detection, non-invasive temperature measurements, and other applications of OFCS. The setup used primarily for trace gas detection contains several components that have been developed in-house, including a Fourier Transform Spectrometer (FTS) and an auto-balancing detector. This is the setup used in this thesis work; it includes a high-frequency data acquisition card (DAC) that records interferograms in excess of 10^7 double-precision floating-point samples per sweep of the FTS's retarder. For acquisition and analysis to be possible in both directions of the retarder, the interferograms need to be analyzed within a sub-second time frame, which is not possible with the present software. The aim of this thesis work has thus been to develop a system with optimized analysis implementations in MATLAB; using MATLAB was a prerequisite from the group to ensure maintainability, as all members are well acquainted with it.

Fulfilling its primary purpose, MATLAB performs vector and matrix computations quite efficiently, has mostly fully mutable datatypes, and, with recent just-in-time (JIT) compilation optimizations, its vector resizing performance has improved to what in many instances is perceived as equivalent to preallocated variables. This memory-management abstraction, however, also means that explicit control over when arguments are passed by value or by reference to a function is not officially supported. The resulting performance penalty naturally grows with the size of the data sets (N) passed as arguments and becomes quite noticeable even at moderate values of N when dealing with data visualization, a key function in the system. To circumvent these problems, explicit data references were implemented using some of the undocumented functions of MATLAB's libmx library, together with a custom data visualization function.

The main parts of the near-real-time interferogram analysis are resampling and a Fourier transformation, both of which had functionally complete but unoptimized implementations. The minimal requirement for their reimplementation was simply to improve efficiency while maintaining output precision.

On experimentally obtained data, the new system's (DAQS) resampling implementation increased sample throughput by a factor of 19, which in the setup used corresponds to 10^8 samples per second. Memory usage was decreased by 72%, or, in terms of the theoretical minimum, from a factor of 7.1 to 2.0. Due to structural changes in the sequence of execution, DAQS has no direct counterpart to the reference FFT function, as the computations it performed have been parallelized and/or are only executed on demand; in a worst-case scenario, however, their combined CPU time can reach 75% of that of the reference. The data visualization performance increase (compared to MATLAB's own, as the old system used LabVIEW) depends on the size in pixels of the surface the data is rendered on and on N, decreasing with the former and increasing with the latter. In the baseline case of a default surface size of 434x342 pixels and N corresponding to one full sweep of the FTS's retarder, DAQS offers a 100x speed-up over the plot function of MATLAB R2014b on Windows 7.

In addition to acquiring and analyzing interferograms, the primary objectives of the work included tools to configure the DAC and to control the FTS's retarder motor, both implemented in DAQS. Secondary objectives were the implementation of acquisition and analysis for both directions of the retarder, a HITRAN reference spectra generator, and functionality to improve the user experience (UX). The first, though computation time allows for it, has not been implemented due to a delay in the DAC driver. To provide a generic implementation of the second, the HITRAN database was converted from the text-based format it is distributed in to a MySQL database, and a wrapper class providing frequency-span selection and absorption-spectrum generation was developed together with a graphical front-end. Finally, the improved UX functionality mainly focused on providing easily accessible documentation of the properties of the DAC.

In summation, though the primary objectives of optimizing the data analysis functions were reached, the end product still requires a new driver for the DAC to provide the full functionality of the reference implementation, as the existing one is simply too slow. Many of DAQS's components can, however, be used as stand-alone classes and functions until a new driver is available. It is also worth mentioning that National Instruments (NI), the DAC vendor, has, according to their technical support, no plans to develop native MATLAB drivers, as MathWorks will not sell them licenses.
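The DAQS implementation described above is written in MATLAB and is not reproduced here; the following NumPy sketch only illustrates the analysis chain the abstract refers to, resampling an interferogram onto a uniform grid and Fourier transforming it into a spectrum. The grid size, units, and synthetic test signal are assumptions for illustration.

```python
import numpy as np

def interferogram_to_spectrum(opd, signal, n_uniform=2**14):
    """Resample an interferogram onto a uniform optical-path-difference grid
    and Fourier transform it to obtain a magnitude spectrum."""
    # Uniform OPD grid spanning the measured sweep
    uniform_opd = np.linspace(opd.min(), opd.max(), n_uniform)
    # Linear resampling; a real FTS pipeline would derive the sampling
    # positions from a reference laser fringe signal
    resampled = np.interp(uniform_opd, opd, signal)
    # Remove the DC offset so it does not dominate the spectrum
    resampled -= resampled.mean()
    spectrum = np.abs(np.fft.rfft(resampled))
    # Frequency axis in cycles per unit of OPD (wavenumber if OPD is in cm)
    wavenumbers = np.fft.rfftfreq(n_uniform, d=uniform_opd[1] - uniform_opd[0])
    return wavenumbers, spectrum

# Synthetic example: a cosine fringe pattern sampled on a slightly non-uniform grid
opd = np.sort(np.random.uniform(0.0, 1.0, 50_000))
signal = np.cos(2 * np.pi * 200.0 * opd)  # single spectral line at 200 cycles/unit
wn, spec = interferogram_to_spectrum(opd, signal)
print(wn[np.argmax(spec)])  # peak recovered at ~200
```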
|
285 |
Virtual design office: A collaborative unified modeling language tool
Totapally, Hara 01 January 2001
Real-time conferencing and collaborative computing are effective ways to make developers more productive. This project develops a collaborative framework comprising configurable client and server components.
|
286 |
Risks in Financial Markets
Pai, Yu-Jou 02 June 2020
No description available.
|
287 |
Evaluation of Load Scheduling Strategies for Real-Time Data Warehouse Environments
Thiele, Maik; Lehner, Wolfgang 13 January 2023
The demand for so-called living or real-time data warehouses is increasing in many application areas, including manufacturing, event monitoring and telecommunications. In fields like these, users normally expect short response times for their queries and high freshness for the requested data. However, it is truly challenging to meet both requirements at the same time because of the continuous flow of write-only updates and read-only queries as well as the latency caused by arbitrarily complex ETL processes. To optimize the update flow in terms of data freshness maximization and load minimization, we propose two algorithms - local and global scheduling - that operate on the basis of different system information. We want to discuss the benefits and drawbacks of both approaches in detail and derive recommendations regarding the optimal scheduling strategy for any given system setup and workload.
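The local and global scheduling algorithms themselves are not described in this abstract, so the sketch below is only a generic illustration of freshness-driven update ordering in the spirit of the problem statement: stale tables with many waiting queries are refreshed first. The table names and the scoring rule are assumptions, not the algorithms evaluated in the paper.

```python
import time

def order_updates(pending, last_refresh, waiting_queries, now=None):
    """Order pending table updates so that stale tables with many waiting
    queries are refreshed first (illustrative scoring only)."""
    now = now or time.time()

    def urgency(table):
        staleness = now - last_refresh[table]  # seconds since last refresh
        return staleness * (1 + waiting_queries.get(table, 0))

    return sorted(pending, key=urgency, reverse=True)

now = time.time()
pending = ["clicks", "orders", "inventory"]
last_refresh = {"clicks": now - 5, "orders": now - 120, "inventory": now - 60}
waiting_queries = {"orders": 3, "inventory": 0, "clicks": 1}
print(order_updates(pending, last_refresh, waiting_queries, now))
# ['orders', 'inventory', 'clicks'] -- the stalest, most-queried table comes first
```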
|
288 |
Multitasking for sensor based systems
Reddy, Srinivas T. January 1985
Multitasking systems are being used increasingly for real-time applications. Multitasking is well suited to real-time systems, since events in the real world do not occur in strict sequence but rather tend to overlap. Multitasking operating systems coordinate the activities of the different overlapping functions and give the user the appearance of concurrent activity. The coordination and scheduling are performed according to a user-defined order of importance, or priority. There are many multitasking operating systems available for all the popular microprocessors. One such multitasking executive is VRTX/86 for the 8086 microprocessor. This executive comes in a PROM and is independent of any specific hardware configuration. Using this executive, the IBM PC has been converted into a multitasking environment, and multitasking test programs have been executed on the PC.
A general methodology for defining tasks and assigning priorities to them has been established. Using this methodology, a typical real-time application called a Vehicle Instrumentation System was developed. / M.S.
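The task-definition methodology and the VRTX/86 system calls are not given in this abstract; the sketch below is only a language-neutral illustration (written in Python rather than 8086-era code) of the core idea of dispatching ready tasks strictly by a user-defined priority. The task names and priorities are hypothetical and do not come from the Vehicle Instrumentation System.

```python
import heapq
from dataclasses import dataclass, field
from typing import Callable

@dataclass(order=True)
class Task:
    priority: int  # lower number = more important, as in many RTOS kernels
    name: str = field(compare=False)
    action: Callable[[], None] = field(compare=False)

def run_ready_tasks(ready):
    """Dispatch ready tasks strictly by priority, mimicking how a multitasking
    executive picks the highest-priority ready task at each scheduling point."""
    heapq.heapify(ready)
    while ready:
        task = heapq.heappop(ready)
        task.action()

# Hypothetical instrumentation tasks; names and priorities are illustrative only
ready = [
    Task(3, "update_display", lambda: print("refresh dashboard")),
    Task(1, "read_speed_sensor", lambda: print("sample wheel speed")),
    Task(2, "log_data", lambda: print("append sample to log")),
]
run_ready_tasks(ready)  # runs read_speed_sensor, then log_data, then update_display
```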
|
289 |
A Client-Server Architecture for Collection of Game-based Learning Data
Jones, James R. 27 January 2015
Advances in information technology are driving massive improvements in the education industry. The ubiquity of mobile devices has triggered a shift in the delivery of educational content: lessons in a wide range of subjects are increasingly disseminated by letting students access digital materials through mobile devices. One key type of material is digital educational games, which merge education with gameplay to maximize engagement while making the learning process less conspicuous. Their effectiveness is generally measured by assessments, either after or during gameplay, in the form of quizzes, data dumps, and/or manual analyses. In the process, valuable gameplay information is lost during the student's play sessions; this gameplay data would show educators and researchers the specific actions students perform in order to arrive at a solution, not just the correctness of the solution.
This problem illustrates the need for a tool that enables educators and players to quickly analyze gameplay data in conjunction with correctness, in an unobtrusive manner, while the student is playing the game. This thesis describes a client-server software architecture that enables the collection of game-based data during gameplay. We created a collection of web services that enables games to transmit game data for analysis. Additionally, the web application provides players with a portal to log in and view various visualizations of the captured data. Lastly, we created a game called "Taffy Town", a mathematics-based game that requires the player to manipulate taffy pieces in order to solve various fraction problems. Taffy Town transmits students' taffy transformations, along with correctness, to the web application. Students are able to view several dynamically created visualizations of the data sent by Taffy Town. Researchers are able to log in to the web application and see the same visualizations, aggregated across all Taffy Town players. This end-to-end mapping of problems, actions, and results will enable researchers, pedagogists, and teachers to improve the effectiveness of educational games. / Master of Science
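The thesis's actual web services and data model are not specified in this abstract, so the following Flask sketch is a hypothetical illustration of a gameplay-event collection endpoint plus a per-player correctness summary. The route names, JSON fields, and in-memory storage are assumptions, not the Taffy Town API.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
events = []  # in-memory store; a real deployment would use a database

@app.route("/api/gameplay-events", methods=["POST"])
def record_event():
    """Accept one gameplay action from the game client."""
    event = request.get_json(force=True, silent=True) or {}
    # Expected shape (assumed): {"player": "p1", "level": 3,
    #                            "action": "split_taffy", "correct": true}
    required = {"player", "action", "correct"}
    if not required.issubset(event):
        return jsonify({"error": "missing fields"}), 400
    events.append(event)
    return jsonify({"stored": len(events)}), 201

@app.route("/api/gameplay-events/summary", methods=["GET"])
def summary():
    """Aggregate correctness across all recorded actions, per player."""
    per_player = {}
    for e in events:
        stats = per_player.setdefault(e["player"], {"actions": 0, "correct": 0})
        stats["actions"] += 1
        stats["correct"] += bool(e["correct"])
    return jsonify(per_player)

if __name__ == "__main__":
    app.run(port=5000)
```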
|
290 |
Partition-based workload scheduling in living data warehouse environments
Thiele, Maik; Fischer, Ulrike; Lehner, Wolfgang 04 July 2023
The demand for so-called living or real-time data warehouses is increasing in many application areas such as manufacturing, event monitoring and telecommunications. In these fields, users normally expect short response times for their queries and high freshness for the requested data. However, meeting these fundamental requirements is challenging due to the high loads and the continuous flow of write-only updates and read-only queries that might be in conflict with each other. Therefore, we present the concept of workload balancing by election (WINE), which allows users to express their individual demands on the quality of service and the quality of data. WINE exploits this information to balance and prioritize both types of transactions, queries and updates, according to the varying user needs. A simulation study shows that our proposed algorithm outperforms competing baseline algorithms over the entire spectrum of workloads and user requirements.
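The published WINE algorithm is not detailed in this abstract; the sketch below only illustrates the underlying idea of letting an aggregated user preference between quality of service and quality of data decide whether the next transaction taken from the workload is a query or an update. The decision rule and weights are assumptions for illustration.

```python
from collections import deque

def choose_next(queries, updates, qos_weight):
    """Pick the next transaction type to execute.

    qos_weight in [0, 1] encodes the averaged user preference:
    1.0 = only quick answers matter (favor queries),
    0.0 = only fresh data matters (favor updates).
    This is an illustrative decision rule, not the published WINE algorithm.
    """
    if not queries:
        return "update" if updates else "idle"
    if not updates:
        return "query"
    # Pressure of each queue, scaled by the preference expressed by users
    query_pressure = qos_weight * len(queries)
    update_pressure = (1.0 - qos_weight) * len(updates)
    return "query" if query_pressure >= update_pressure else "update"

queries = deque(["q1", "q2", "q3"])
updates = deque(["u1", "u2", "u3", "u4", "u5"])
print(choose_next(queries, updates, qos_weight=0.8))  # query: users value response time
print(choose_next(queries, updates, qos_weight=0.2))  # update: users value freshness
```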
|