1

Sampling Frequency for Semi-Arid Streams and Rivers: Implications for National Parks in the Sonoran Desert Network

Lindsey, Melanie January 2010
In developing a water quality monitoring program, the sampling frequency chosen should be able to reliably detect changes in water quality trends. Three datasets are evaluated for Minimal Detectable Change in surface water quality to examine the loss of trend detectability as sampling frequency decreases for sites within the National Park Service's Sonoran Desert Network. The records are re-sampled as quarterly and annual datasets, and step and linear trends are superimposed on the natural data to estimate the time it takes the Seasonal Kendall Test to detect trends of a specific threshold. Wilcoxon Rank Sum analyses found that monthly and quarterly sampling consistently draw from the same distribution of trend detection times; however, annual sampling can take significantly longer. Therefore, even with a loss in power from reduced sampling, quarterly sampling of Park waters adequately detects trends (70%, compared to monthly), whereas annual sampling is insufficient for trend detection (30%).
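A minimal sketch of that re-sampling experiment, with synthetic data and a bare-bones Seasonal Kendall test (no tie or serial-correlation corrections; the trend size, record length and noise level are invented, not the study's values), might look like this:

```python
import numpy as np
from scipy.stats import norm

def seasonal_kendall_p(x, period=12):
    """Two-sided p-value of a simplified Seasonal Kendall test:
    Mann-Kendall S statistics are computed within each season and
    summed (no correction for ties or serial correlation)."""
    S, var = 0.0, 0.0
    for s in range(period):
        xs = x[s::period]                 # one season's observations
        n = len(xs)
        for i in range(n - 1):
            S += np.sign(xs[i + 1:] - xs[i]).sum()
        var += n * (n - 1) * (2 * n + 5) / 18.0
    z = (S - np.sign(S)) / np.sqrt(var) if var > 0 else 0.0
    return 2 * norm.sf(abs(z))

# Superimpose a linear trend on a synthetic 20-year monthly record,
# then thin it to quarterly and annual sampling and compare detectability.
rng = np.random.default_rng(0)
monthly = rng.normal(8.0, 1.0, 240) + 0.01 * np.arange(240)
print(seasonal_kendall_p(monthly, period=12))       # monthly record
print(seasonal_kendall_p(monthly[::3], period=4))   # quarterly re-sample
print(seasonal_kendall_p(monthly[::12], period=1))  # annual re-sample
```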
2

Distributed and privacy-preserving algorithms for mobility information processing

Katsikouli, Panagiota January 2018
Smart-phones, wearables and mobile devices in general are the sensors of our modern world. Their sensing capabilities offer the means to analyze and interpret our behaviour and surroundings. When it comes to human behaviour, perhaps the most informative feature is our location and mobility habits. Insights from human mobility are useful in a number of everyday practical applications, such as the improvement of transportation and road network infrastructure, ride-sharing services, activity recognition, mobile data pre-fetching, and analysis of the social behaviour of humans. In this dissertation, we develop algorithms for processing mobility data. The analysis of mobility data is a non-trivial task, as it involves managing large quantities of location information, usually spread out spatially and temporally across many tracking sensors. An additional challenge in processing mobility information is to publish the data and the results of its analysis without jeopardizing the privacy of the involved individuals or the quality of the data. We look into a series of problems on processing mobility data from individuals and from a population. Our mission is to design algorithms with provable properties that allow for the fast and reliable extraction of insights. We present efficient solutions, in terms of storage and computation requirements, with a focus on distributed computation, online processing and privacy preservation.
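As one generic illustration of privacy-preserving publication for mobility data (a textbook technique, not the dissertation's own algorithm), a spatial histogram of location fixes can be released under differential privacy by adding Laplace noise:

```python
import numpy as np

def dp_location_histogram(points, bins, epsilon):
    """Release a 2-D histogram of location fixes under epsilon-differential
    privacy by adding Laplace noise (sensitivity 1 if each individual
    contributes a single point). Illustrative technique only."""
    hist, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=bins)
    noise = np.random.laplace(scale=1.0 / epsilon, size=hist.shape)
    return np.clip(hist + noise, 0, None)   # clamp negative counts

# 1,000 synthetic GPS fixes in the unit square, published on a 10x10 grid.
pts = np.random.rand(1000, 2)
noisy = dp_location_histogram(pts, bins=10, epsilon=0.5)
print(noisy.sum())
```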
3

Reliability of Accelerometer Based Performance Measurements during Countermovement Vertical Jumps and the Influence of Sampling Frequency

Haff, G. Gregory, Ruben, R., Saffel, H., McCory, J., Cormie, P., Sands, William A., Stone, Michael H. 01 July 2010
The assessment of vertical jump performance is widely undertaken by coaches and sports scientists because of its strong relationship with sports performances, including those in weightlifting, sprinting, and cycling. With the development of accelerometer-based testing devices, the traditional vertical jump field test may offer a more detailed evaluation of an athlete's performance capacity. However, little data are available on the reliability of this technology and the impact of sampling frequency on reliability. PURPOSE: To determine the reliability of accelerometer-based performance measurements during countermovement vertical jumps and the influence of sampling frequency on reliability. METHODS: Ten college-aged men (age = 23.6 ± 3.1 y; height = 180.1 ± 6.3 cm; mass = 85.0 ± 15.2 kg; body fat = 14.2 ± 6.5%) performed two series of five restricted (no arm swing) zero-load countermovement vertical jumps. During each jump, a triaxial accelerometer sampling at 500 Hz was used to assess acceleration, from which peak force (PF), rate of force development (RFD), peak power output (PP), peak velocity (PV), flight time (FT), and peak vertical displacement (VD) were derived and analyzed using a custom LabVIEW program. This program was used to re-sample the data collected at 500 Hz to 250 Hz, 125 Hz and 50 Hz, which were then analyzed. The reliability of the accelerometer system was assessed with intraclass correlations (ICC), precision was determined with the coefficient of variation (CV), and criterion validity was assessed via Pearson's correlation. RESULTS: At 500 Hz the accelerometer was reliable for PF (ICC = 0.94), RFD (ICC = 0.92), PP (ICC = 0.87), FT (ICC = 0.93), and VD (ICC = 0.93). Reliability was maintained at 250 Hz for PF (ICC = 0.95), RFD (ICC = 0.92), PP (ICC = 0.86), FT (ICC = 0.93) and VD (ICC = 0.92). Good precision was found for PF (CV = 7.3%), PV (CV = 7.6%), FT (CV = 2.3%), and VD (CV = 4.7%) at 500 Hz, and precision was maintained at 250 Hz for PF (CV = 6.8%), PV (CV = 7.7%), FT (CV = 2.4%), and VD (CV = 4.9%). Finally, criterion validity was high for PF (r = 0.96), RFD (r = 0.97), PP (r = 0.99), PV (r = 0.99), FT (r = 0.99) and VD (r = 0.99) when comparing the 250 Hz data to the 500 Hz data. When sampling frequency was decreased below 250 Hz, reliability, precision and criterion validity all decreased. CONCLUSIONS: The accelerometer used in this investigation produced reliable, precise and valid assessments of PF, PP, FT, and VD at sampling frequencies ≥ 250 Hz. PRACTICAL APPLICATIONS: For vertical jump applications, accelerometers appear to require a minimum sampling frequency of 250 Hz to maintain reliability, precision and validity. Therefore, when assessing athlete performance, it is essential that the strength and conditioning professional consider sampling rate when utilizing this technology. ACKNOWLEDGMENTS: This investigation was partially supported by MyoTest Inc., which donated the accelerometer system used in this investigation.
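The re-sampling step described in METHODS amounts to decimating the 500 Hz traces and re-deriving the jump metrics at each rate. A toy sketch with synthetic traces and a naive keep-every-k-th-sample decimation (the study's LabVIEW derivations of PF, RFD and the other metrics are not reproduced) is:

```python
import numpy as np

def decimate(signal, factor):
    """Naive down-sampling by keeping every k-th sample (500 Hz -> 500/k Hz);
    a production pipeline would low-pass filter first to limit aliasing."""
    return signal[::factor]

def cv_percent(values):
    """Coefficient of variation across repeated trials, in percent."""
    return 100.0 * np.std(values, ddof=1) / np.mean(values)

# Five synthetic 500 Hz 'acceleration' traces with a jump-like peak, and a
# hypothetical check of how peak-force precision changes with sample rate.
rng = np.random.default_rng(1)
mass = 85.0                                  # kg, as in the study group
t = np.arange(1000)
trials = [rng.normal(0, 2, 1000) + 20 * np.exp(-(t - 500) ** 2 / 2000)
          for _ in range(5)]
for factor, rate in [(1, 500), (2, 250), (4, 125), (10, 50)]:
    peaks = [mass * decimate(a, factor).max() for a in trials]
    print(rate, "Hz  CV =", round(cv_percent(peaks), 1), "%")
```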
4

Real-time estimation of travel time using low frequency GPS data from moving sensors

Sanaullah, Irum January 2013
Travel time is one of the most important inputs in many Intelligent Transport Systems (ITS). As a result, this information needs to be accurate and dynamic in both spatial and temporal dimensions. For the estimation of travel time, data from fixed sensors such as Inductive Loop Detectors (ILD) and cameras have been widely used since the 1960s. However, data from fixed sensors may not be sufficiently reliable to estimate travel time due to a combination of limited coverage and low quality data resulting from the high cost of implementing and operating these systems. Such issues are particularly critical in the context of Less Developed Countries, where traffic levels and associated problems are increasing even more rapidly than in Europe and North America, and where there are no pre-existing traffic monitoring systems in place. As a consequence, recent developments have focused on utilising moving sensors (i.e. probe vehicles and/or people equipped with GPS: for instance, navigation and route guidance devices, mobile phones and smartphones) to provide accurate speed, positioning and timing data to estimate travel time. However, data from GPS also have errors, especially for positioning fixes in urban areas. Therefore, map-matching techniques are generally applied to match raw positioning data onto the correct road segments so as to reliably estimate link travel time. This is challenging because most current map-matching methods are suitable for high frequency GPS positioning data (e.g. data at 1-second intervals) and may not be appropriate for low frequency data (e.g. data at 30- or 60-second intervals). Yet many moving sensors only retain low frequency data so as to reduce the cost of data storage and transmission. The accuracy of travel time estimation using data from moving sensors also depends on a range of other factors, for instance vehicle fleet sample size (i.e. proportion of vehicles equipped with GPS); coverage of links (i.e. proportion of links on which GPS-equipped vehicles travel); GPS data sampling frequency (e.g. 3, 6, 30, 60 seconds); and time window length (e.g. 5, 10 and 15 minutes). Existing methods of estimating travel time from GPS data are not capable of simultaneously taking into account the uncertainties associated with GPS and spatial road network data, low sampling frequency, low density vehicle coverage on some roads of the network, time window length, and vehicle fleet sample size. Accordingly, this research is based on the development and application of a methodology which uses GPS data to reliably estimate travel time in real-time while considering factors including vehicle fleet sample size, data sampling frequency and time window length in the estimation process. Specifically, the purpose of this thesis was first to determine the accurate location of a vehicle travelling on a road link by applying a map-matching algorithm at a range of sampling frequencies, so as to reduce the potential errors associated with GPS and digital road maps, for example where vehicles are sometimes assigned to the wrong road links. Secondly, four different methods have been developed to estimate link travel time based on map-matched GPS positions and speed data from low frequency data sets in three time window lengths (i.e. 5, 10 and 15 minutes). These are based on vehicle speeds, speed limits, link distances and average speeds; initially only within the given link but subsequently in the adjacent links too.
More specifically, the final method draws on weighted link travel times associated with the given and adjacent links in both spatial and temporal dimensions to estimate link travel time for the given link. GPS data from Interstate I-880 (California, USA) for a total of 73 vehicles over 6 hours were obtained from UC Berkeley's Mobile Century Project. The original GPS dataset, which was broadcast at a 3-second sampling frequency, was extracted at different sampling frequencies such as 6, 30, 60 and 120 seconds so as to evaluate the performance of each travel time estimation method at low sampling frequencies. The results were then validated against reference travel time data collected from 4,126 vehicles by high resolution video cameras, and these indicate that factors such as vehicle sample size, data sampling frequency, vehicle coverage on the links and time window length all influence the accuracy of link travel time estimation.
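A sketch of the final method's core idea, blending a link's own windowed travel time with its neighbours' estimates, follows; the harmonic-mean speed, the 60/40 weighting and all numbers are illustrative assumptions rather than the thesis's calibrated procedure.

```python
import numpy as np

def link_travel_time_s(speeds_kmh, link_length_km):
    """Travel time (s) for one link and time window from map-matched GPS
    speed fixes, using the harmonic mean (space-mean) speed."""
    v = len(speeds_kmh) / np.sum(1.0 / np.asarray(speeds_kmh, dtype=float))
    return 3600.0 * link_length_km / v

def weighted_estimate(given_s, adjacent_s, w_given=0.6):
    """Blend the given link's estimate with its adjacent links' estimates;
    the 60/40 weighting is an assumed placeholder."""
    if not adjacent_s:
        return given_s
    return w_given * given_s + (1.0 - w_given) * np.mean(adjacent_s)

# A 5-minute window: three probe fixes on a 0.8 km link, plus travel times
# already estimated for two adjacent links (all numbers invented).
t_link = link_travel_time_s([45.0, 52.0, 55.0], 0.8)
print(round(weighted_estimate(t_link, [60.0, 66.0]), 1), "seconds")
```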
5

Statistics for diffusion processes with low- and high-frequency observations

Chorowski, Jakub 11 November 2016
In this thesis, we consider the problem of nonparametric estimation of the diffusion coefficients of a scalar time-homogeneous Itô diffusion process from discrete observations under various sampling assumptions. In the first part, the low-frequency estimation method proposed by Gobet, Hoffmann and Reiß is modified to cover the case of random sampling times. The estimator is shown to be optimal in the minimax sense and adaptive to the sampling distribution. Moreover, Lepski's method is applied to adapt to the unknown Sobolev smoothness of the drift and volatility coefficients. In the second part, we address the problem of volatility estimation from equidistant observations without a predefined frequency regime. In the case of a stationary diffusion with compact state space and boundary reflection, we introduce a universal estimator that attains the minimax optimal convergence rates for both low- and high-frequency observations. Being based on the spectral method, the low-frequency analysis is similar to the study conducted by Gobet, Hoffmann and Reiß. The derivation of the convergence rates in the high-frequency regime, on the other hand, requires local averaging of the low-frequency estimator, which makes it mimic the behaviour of the classical high-frequency estimator introduced by Florens-Zmirou. The analysis of the universal estimator requires tight upper bounds on the estimation error of the occupation time functional for non-continuous functions. In the third part of the thesis, we thus consider the Riemann sum approximation of the occupation time functional of a stationary, time-reversible Markov process. Upper bounds on the mean squared estimation error are provided, and in the case of diffusion processes, convergence rates for Sobolev regular functions are obtained.
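The quantity approximated in the third part is easy to picture numerically. The sketch below assumes an Ornstein-Uhlenbeck path as the stationary, time-reversible process and an indicator as the non-continuous function; it illustrates the Riemann-sum approximation itself, not the thesis's error bounds.

```python
import numpy as np

def simulate_ou(n, dt, theta=1.0, sigma=1.0, x0=0.0, seed=0):
    """Euler scheme for an Ornstein-Uhlenbeck path, standing in for the
    stationary, time-reversible Markov processes considered."""
    rng = np.random.default_rng(seed)
    noise = sigma * np.sqrt(dt) * rng.standard_normal(n)
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = x[i - 1] * (1.0 - theta * dt) + noise[i]
    return x

# Riemann-sum approximation of the occupation time functional
# int_0^T f(X_t) dt for a non-continuous f (here an indicator).
f = lambda y: (y > 0).astype(float)
dt, n = 0.01, 100_000                 # T = 1000
path = simulate_ou(n, dt)
print(dt * f(path).sum(), "time units spent above zero, of", n * dt)
```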
6

Wireless Multi-Sensor Feedback Systems for Sports Performance Monitoring: Design and Development

Sturm, Dennis January 2012
Wireless applications have become a common part of daily life. Whether it is mobile phones, the Wi-Fi router at home, the keycard which has replaced the car key, a radio frequency identification access system to a building, or a Bluetooth headset for your computer or phone, modern wireless data exchange is an omnipresent technology. In sports, the market for wireless technical applications and gadgets is in its infancy. Only heart rate monitors and GPS watches are currently used by recreational athletes, and even though most of the larger sports equipment companies regularly launch new products related to sports performance monitoring and mobile phone technology, product innovation leaps are rare. In this work, the design of a wireless sports performance measurement platform is presented. Using the example of kayaking, this platform is configured as a paddle performance measuring system, the Kayak XL System, which can monitor propulsive paddle force, paddle kinematics and boat velocity, inter alia. A common mobile phone platform has been chosen as the user interface for this system. The design approach, focussing on user requests, demands and expectations, in combination with the process of iterative technical development, is presented in this thesis. An evaluation of the system is given, and the work concludes with an overview of further systems designed on the basis of the developed measurement platform. The Kayak XL System is a flexible system designed to be mounted onto any standard kayak paddle and installed in any competition kayak. Versatility, unobtrusiveness and usability were major design concerns. The developed system consists of four modules plus software designed for Android mobile phones. The phone communicates with each of the four modules through Bluetooth radio. These four modules, also referred to as nodes, have specific measurement purposes: two nodes measure paddle force and kinematics, one node measures foot stretcher force and boat motion data, and the fourth node enables a more convenient method of calibrating the paddle force measurement and is therefore only needed prior to performance data acquisition. Results show that paddle and foot stretcher force can be measured with a resolution below 1 N after calibration. Installing the paddle nodes on a previously configured paddle without repeated calibration is possible at the cost of a doubled error margin. The default sampling frequency is set to 100 Hz and can, like all system parameters, be configured on the mobile phone. Real-time computation of complex performance parameters is only limited by the phone CPU. The system adds two 109 g modules to the paddle and approximately 850 g to the kayak, excluding the mass of the mobile phone.
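To make the four-node layout concrete, a hypothetical configuration sketch follows; the node roles, packet size and bandwidth arithmetic are invented for illustration, with only the four-node structure and the 100 Hz default taken from the text.

```python
from dataclasses import dataclass

@dataclass
class NodeConfig:
    """Hypothetical per-node settings for a four-node Bluetooth system;
    roles and packet size are assumptions, while the 100 Hz default is
    the value named in the thesis."""
    node_id: str
    role: str                  # e.g. "paddle_left", "foot_stretcher", "calib"
    sampling_hz: int = 100     # configurable from the phone, per the text

def payload_bytes_per_s(cfg: NodeConfig, bytes_per_sample: int = 12) -> int:
    """Rough per-node Bluetooth payload (assumed 12-byte samples)."""
    return cfg.sampling_hz * bytes_per_sample

nodes = [NodeConfig("n1", "paddle_left"), NodeConfig("n2", "paddle_right"),
         NodeConfig("n3", "foot_stretcher"), NodeConfig("n4", "calib")]
print(sum(payload_bytes_per_s(n) for n in nodes), "bytes/s across all nodes")
```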
7

Monitoramento da qualidade da água em tempo quase-real: uma alternativa para a gestão de recursos hídricos / Near-Real-Time Water Quality Monitoring: An Alternative for Water Resources Management

Silva, Régis Leandro Lopes da January 2018
One of the causes of the difficulty in promoting adequate management of water quality is data scarcity, whether spatial or temporal. Conventional water quality monitoring programs have a low sampling rate, in many cases only a few samples per year. To make water quality series denser and more representative, automated monitoring techniques have been developed, with fixed field equipment, automatic data collection and real-time transmission. Although these techniques make the series far more representative and allow timely decision-making, they are not yet consolidated and face a number of obstacles, such as high cost, difficulties in installing the equipment in the field, and expensive maintenance and calibration. An alternative is near-real-time water quality monitoring (NRTWQM), in which an operator goes to the watercourse carrying the sensor, for example a multi-parameter probe, takes readings at a higher frequency than conventional monitoring, and then travels to a base from which the information can be sent. The same instrument can cover a large number of points, and its calibration can be controlled in the laboratory. In this context, the objective of this thesis is to evaluate the effectiveness and the economic impact of using an NRTWQM strategy as a tool for generating water quality data in a data-scarce scenario. For this purpose, series from real-time water quality monitoring points in Brazil, Canada and the USA were used. The series were submitted to spectral analysis to identify the densest frequencies and their representativeness within the series, and the frequency values obtained were related to physical and hydrological attributes of the monitoring points' river basins. The Nyquist-Shannon theorem was applied to obtain the sampling intervals. The economic viability of the strategy was evaluated in two case studies: application to the National Hydrometeorological Reference Network, and application in the scope of hydroelectric projects subject to joint ANA/ANEEL resolution nº 03/2010. A good relationship was observed between the sampling frequencies and the basin areas, making it possible to prescribe different types of NRTWQM for different types of basin. In general, the sampling intervals obtained from the characteristic frequencies proved feasible under the NRTWQM approach up to a cumulative frequency of 90%. For cumulative frequencies above 90% the intervals approach daily, and real-time strategies are more advisable. The NRTWQM strategy proved the most cost-effective for most monitoring programs when cumulative frequencies below 65% are used; above 65%, it proved economically viable for monitoring programs whose sampling points are close to the base of operation. NRTWQM is thus an effective alternative for increasing the temporal density of data for several types of monitoring programs, except those that require tracking sudden changes in water quality, such as water quality warning systems.
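The spectral-analysis-plus-Nyquist recipe can be sketched as follows. Reading "cumulative frequency" as a cutoff on retained spectral energy is an assumption, as are the synthetic series and all parameters:

```python
import numpy as np

def nyquist_interval_days(series, dt_days, energy_cutoff=0.90):
    """Find the highest frequency needed to retain `energy_cutoff` of the
    spectral energy (one possible reading of the thesis's cumulative
    frequency criterion), then return the Nyquist sampling interval
    1/(2 f) in days."""
    spec = np.abs(np.fft.rfft(series - np.mean(series))) ** 2
    freqs = np.fft.rfftfreq(len(series), d=dt_days)   # cycles per day
    order = np.argsort(spec)[::-1]                    # densest bins first
    cum = np.cumsum(spec[order]) / spec.sum()
    f_max = freqs[order][np.searchsorted(cum, energy_cutoff)]
    return 1.0 / (2.0 * f_max)

# Three years of hourly synthetic data with annual and weekly cycles.
t = np.arange(0.0, 3 * 365, 1.0 / 24.0)               # time in days
x = np.sin(2 * np.pi * t / 365) + 0.3 * np.sin(2 * np.pi * t / 7)
print(nyquist_interval_days(x, dt_days=1.0 / 24.0), "days between samples")
```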
8

Identifiering av variabler vid framtagning av optimerad stickprovsfrekvens / Identifying Variables for Developing Optimized Sampling Frequency

Gunnarsson Ljungblom, Joel, Larsson, Rikard January 2017
Work on measurement frequencies, that is, how often a produced part should be measured, does not currently follow a standardized approach within production at Volvo Cars. It is largely based on past experience and on the measurement frequencies used for similar equipment. Volvo Cars requests more knowledge in this area in order to achieve more cost-effective quality assurance. The main purpose of this work was to identify the variables that affect the measurement frequency, and to build a simple model in which those variables are applied. Interviews were also conducted at a number of companies, where some of the key conclusions are: measurement frequencies are handled retrospectively rather than proactively; capability is currently the most common basis for setting measurement frequencies; work on measurement frequencies is not standardized; and measurement frequencies are rarely improved, and when they are, the trigger is often a man-time analysis. The work has resulted in the identification of two main variables: capability and quality costs. Although reality is more complicated, these two variables can be seen as main categories with subcategories beneath them. Under capability fall tool-related properties such as wear and tool material; the material of the part and its thermodynamic properties also affect capability, as do failure intensity, vibrations and the stability of the process. Regarding quality costs, there are failure costs arising within the company's walls (internal failure costs) and failure costs that arise once the product has been delivered to the customer (external failure costs); in addition there are appraisal costs and prevention costs. The work has also resulted in a simple model that takes into account experience from the interviews and data from Volvo Cars. Several of the inputs to the model were derived by analysing three weeks of production data from Volvo Cars. The quality-related data used in the model are the capability and the percentage distribution of the current variant; the data affecting quality costs are how many operations the flow has and the current operation's position in relation to the total number, together with the cost of the raw material, the severity of the quality deficiency for the characteristic in question, and the scrap cost. The model was then applied to one of the machines covered by the work to check the outcome. With inputs based on production data from Volvo Cars, a sampling frequency of 62 was generated.
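A deliberately simple sketch of how the two identified variables could drive a sampling interval follows; the formula and every constant are invented placeholders, not the model built in the thesis.

```python
def suggested_interval(cpk, scrap_cost, control_cost,
                       base_interval=10, cpk_target=1.33):
    """Toy illustration of the two main variables identified (capability
    and quality costs): a more capable process and relatively cheap
    failure consequences allow sparser sampling. Every constant here is
    an invented placeholder, not Volvo Cars' actual model."""
    capability_factor = max(cpk / cpk_target, 0.1)
    cost_factor = max(control_cost / scrap_cost, 0.1)
    return round(base_interval * capability_factor * cost_factor ** 0.5)

# Capable process (Cpk = 1.8), but scrap is expensive relative to
# inspection, so parts are checked more often than the base interval.
print("check every", suggested_interval(1.8, scrap_cost=50.0,
                                        control_cost=5.0), "parts")
```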
