11.
OFDM for Underwater Acoustic Communication. Thottappilly, Arjun, 26 September 2011.
Communicating wirelessly underwater has been an area of interest for researchers, engineers, and practitioners alike. One of the main reasons for the slow rate of progress in this area is that the underwater acoustic (UWA) channel is in general much more hostile than the over-the-air radio frequency channel in terms of multipath, frequency selectivity, noise, and the Doppler effect. In this work, a time-warp-based technique that can be used to model time-varying wideband Doppler shifts (as seen in a UWA channel) in MATLAB is proposed. A corresponding procedure to estimate, from observed data, the parameters required for inverting the effect of the time warp is also proposed. Two different Doppler correction methods are compared; both can be used to undo the Doppler effect in measured data from an experiment subject to wideband Doppler.
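The wideband Doppler effect discussed above can be pictured as the receiver reading the transmitted waveform on a warped time axis. Below is a minimal sketch of that idea (illustrative only, not the thesis code; the function name `warp_signal` and the parameters are invented here), written in Python for brevity rather than the MATLAB used in the work:

```python
import math

def warp_signal(x, fs, warp):
    """Resample x as if time had been warped: y[n] = x(warp(n/fs)).

    x    : list of received samples
    fs   : sample rate in Hz
    warp : function mapping receiver time to transmitter time
    """
    y = []
    for n in range(len(x)):
        t = warp(n / fs)          # warped time instant
        k = t * fs                # fractional sample index into x
        i = int(math.floor(k))
        frac = k - i
        if 0 <= i < len(x) - 1:
            # linear interpolation between neighbouring samples
            y.append((1 - frac) * x[i] + frac * x[i + 1])
        else:
            y.append(0.0)         # warped time falls outside the signal
    return y

# Example: a 100 Hz tone subjected to a Doppler-like time warp.
fs = 8000
x = [math.sin(2 * math.pi * 100 * n / fs) for n in range(fs)]
# warp(t) = (1 + a) * t models a constant relative velocity; making a a
# function of t would give the time-varying warp discussed above.
a = 0.001
y = warp_signal(x, fs, lambda t: (1 + a) * t)
```

Inverting the channel then amounts to applying the same machinery with the estimated inverse warp.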
The techniques presented correct for the wideband Doppler effect as if it changed the time scale of the received signal. The first, resampling-based technique corrects for the average expansion or contraction over a packet, inherently assuming the relative velocity to be constant over the duration of the packet. The second, time-warp-based technique models a time-varying Doppler shift. Sinusoids added to the beginning and end of each packet are used to estimate the parameters required to invert the effect of the warp.
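As a rough sketch of the first idea (not the thesis implementation; `undo_constant_doppler` and the tone example are hypothetical), a constant relative velocity scales the received time axis by a factor (1 + a), which resampling can undo:

```python
import math

def undo_constant_doppler(x, a):
    """Resample x by (1 + a) to undo a constant expansion/contraction.

    a > 0 means the packet arrived stretched; reading x at times
    (1 + a) * t recovers the nominal time scale.  In practice a would be
    estimated as a_hat = measured_duration / known_duration - 1.
    """
    n_out = int(len(x) / (1 + a))
    y = []
    for n in range(n_out):
        k = n * (1 + a)                  # fractional index into x
        i = int(math.floor(k))
        frac = k - i
        j = min(i + 1, len(x) - 1)
        y.append((1 - frac) * x[i] + frac * x[j])  # linear interpolation
    return y

# Example: a 50 Hz tone that arrived stretched by (1 + a), then corrected.
fs = 8000
a = 0.25
rx = [math.sin(2 * math.pi * 50 * n / ((1 + a) * fs))
      for n in range(int((1 + a) * fs))]
fixed = undo_constant_doppler(rx, a)
```

After correction, `fixed[n]` closely tracks the nominal tone sin(2*pi*50*n/fs).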
The time-warp-based methods are demonstrated using Orthogonal Frequency Division Multiplexing (OFDM) signals, but will in principle also work for other kinds of wideband signals. The presented results, from MATLAB-based simulations and from over-the-air experiments performed so as to introduce the Doppler effect into the received signals, emphasize the improvements that can be attained by using the time-warp-based Doppler modeling and correction method. The thesis concludes with suggestions for future work. / Master of Science
12.
Nonlinear Doppler Warp Correction for Acoustic OFDM. Dayal, Avik, 21 September 2016.
The underwater acoustic (UWA) channel has been an area of interest for many researchers and engineers, but it is also a very challenging medium. Compared with the over-the-air radio frequency (RF) channel, the UWA channel introduces distortions due to multipath, frequency selectivity, noise, and the Doppler effect, some of which are much more severe. The increased distortion causes many techniques and assumptions made for typical RF communication systems to break down.
One assumption that breaks down is that the Doppler effect a signal undergoes can be modelled as a simple shift in frequency, since the signals used to communicate over a UWA channel are effectively wideband. In this work, the Doppler effect is instead modelled as a nonlinear time warp. A procedure is developed to estimate the parameters of the time warp from the observed signal; these parameters are then used to reverse the effect of the warp. Two different methods for estimating the time warp parameters and correcting the Doppler are compared.
The first technique uses sinusoids placed at the beginning and end of the signal to estimate the parameters of the warp the signal undergoes. The second technique uses sinusoids present throughout the signal to estimate and correct for the warp. The frequencies of these sinusoids lie outside the band used for the transmitted data signal, so they do not interfere with the information being sent.
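One simple way to picture how an out-of-band pilot sinusoid reveals the average Doppler scale (a hypothetical sketch, not the estimator used in the thesis): measure the pilot's apparent frequency, here crudely via zero crossings, and compare it with the nominal frequency:

```python
import math

def apparent_frequency(x, fs):
    """Estimate the frequency of a clean sinusoid from rising zero crossings."""
    crossings = sum(1 for p, q in zip(x, x[1:]) if p < 0 <= q)
    return crossings * fs / len(x)   # crossings per second = cycles per second

# A 1 kHz pilot received after a 0.5% time expansion appears near 995 Hz;
# the ratio of nominal to apparent frequency recovers the scale (1 + a).
fs, f_pilot, a = 48000, 1000.0, 0.005
rx = [math.sin(2 * math.pi * (f_pilot / (1 + a)) * n / fs) for n in range(fs)]
scale = f_pilot / apparent_frequency(rx, fs)   # close to 1 + a
```

A real receiver would use a frequency estimator robust to noise and multipath, but the principle, comparing a known pilot against its received version, is the same.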
The transmitted data signal uses Orthogonal Frequency Division Multiplexing (OFDM) to encode the data symbols, but the Doppler correction technique will in principle work for other kinds of wideband signals as well. The results, which include MATLAB-based simulations and over-the-air experiments, show that performance improvements can be realized using the time warp correction model. / Master of Science
13.
High Performance Simulation of DEVS-Based Large Scale Cellular Space Models. Sun, Yi, 16 July 2009.
Cellular space modeling is becoming an increasingly important paradigm for modeling complex systems with spatio-temporal behavior. The growing demand for cellular space models has led researchers to use different modeling formalisms, among which the Discrete Event System Specification (DEVS) is widely used due to its formal modeling and simulation framework. The increasing complexity of the systems to be modeled calls for cellular space models with large numbers of cells, so improving simulation performance becomes crucial for simulating large scale cellular space models. In this dissertation, we propose a framework for improving simulation performance of large scale DEVS-based cellular space models. The framework has a layered structure, with modeling, simulation, and network layers corresponding to the DEVS-based modeling and simulation architecture. Based on this framework, we developed methods at each layer to overcome performance issues in simulating large scale cellular space models. Specifically, to increase runtime and memory efficiency when simulating large numbers of cells, we applied Dynamic Structure DEVS (DSDEVS) to cellular space modeling and carried out comprehensive performance measurements. DSDEVS improves simulation performance by making the simulation focus only on the active models, and is thus more efficient than loading the entire cellular space. To reduce the number of simulation cycles caused by extensive message passing among cells, we developed a pre-schedule modeling approach that exploits model behavior to improve simulation performance. At the network layer, we developed a modified Time Warp algorithm that supports parallel simulation of DEVS-based cellular space models. The developed methods have been applied to large scale wildfire spread simulations based on the DEVS-FIRE simulation environment and have achieved significant performance gains.
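The dynamic-structure idea, instantiating cells only when they become active, can be sketched with a toy event-driven fire-spread loop (illustrative only, not DEVS-FIRE code; the grid, the fixed spread delay, and all names are invented here):

```python
import heapq

def spread(ignite_at, grid_size, delay=1.0, t_end=10.0):
    """Toy cellular fire spread: only ignited cells ever exist in memory.

    events : pending event set of (time, cell) ignitions, kept in a heap
    burned : cell -> ignition time; also serves as the active-cell set
    """
    events = [(0.0, ignite_at)]
    burned = {}
    while events:
        t, cell = heapq.heappop(events)
        if t > t_end or cell in burned:
            continue                      # past horizon, or already ignited
        burned[cell] = t
        x, y = cell
        # Igniting a neighbour creates it on demand; dormant cells are
        # never allocated, mirroring the dynamic-structure approach.
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < grid_size and 0 <= ny < grid_size and (nx, ny) not in burned:
                heapq.heappush(events, (t + delay, (nx, ny)))
    return burned

cells = spread((2, 2), grid_size=5)
```

With a uniform delay, each cell ignites at a time equal to its Manhattan distance from the origin, which makes the sketch easy to check.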
14.
Muse: a parallel agent-based simulation environment. Gebre, Meseret Redae, January 2009.
Thesis (M.C.S.)--Miami University, Dept. of Computer Science and Systems Analysis, 2009. / Title from first page of PDF document. Includes bibliographical references (p. 72-75).
15.
Pending Event Set Management in Parallel Discrete Event Simulation. Gupta, Sounak, 02 October 2018.
No description available.
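Although no abstract is available, the pending event set named in the title is the priority-ordered collection of future events that a discrete event simulator repeatedly inserts into and extracts the minimum from. A minimal sketch of one common realization (a binary heap with lazy cancellation; illustrative only, not from this thesis, which studies such structures in far more depth):

```python
import heapq
import itertools

class PendingEventSet:
    """Minimal pending event set: binary heap with lazy cancellation."""

    def __init__(self):
        self._heap = []                 # (time, id, event) triples
        self._cancelled = set()
        self._ids = itertools.count()   # tie-breaker keeps heap comparable

    def schedule(self, time, event):
        eid = next(self._ids)
        heapq.heappush(self._heap, (time, eid, event))
        return eid

    def cancel(self, eid):
        self._cancelled.add(eid)        # actually removed lazily on pop

    def next_event(self):
        while self._heap:
            time, eid, event = heapq.heappop(self._heap)
            if eid not in self._cancelled:
                return time, event
            self._cancelled.discard(eid)
        return None
```

Parallel simulators complicate this picture considerably (concurrent access, rollback support), which is what motivates dedicated study of the data structure.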
16.
The Needed Input Data for Accurate On-line Signature Verification: The Relevance of Pressure and Pen Inclination for On-line Signature Verification. Sjöholm, Thomas, January 2015.
Signatures have been used to authenticate documents and transactions for over 1500 years and are still being used today. In this project a method for verifying signatures written on a tablet has been developed and tested, in order to determine whether pressure information is vital for a well-performing on-line signature verification system. First a background study was conducted to learn about state-of-the-art methods and the features used by several research systems; then the method was developed. The method is a Dynamic Time Warp with 8 local features, 2 of which are pressure values or derived from pressure, and 1 global feature. The developed method was tested on the SUSig visual corpus, containing signatures from 94 persons. The Equal Error Rate (EER) when not using pressure was 5.39 % for random forgeries and 3.24 % for skilled forgeries. The EER when using pressure was 5.19 % for random forgeries and 2.80 % for skilled forgeries. The background study concluded that pen inclination is not required for a well-performing system. Considering the result of this project and the results of others, it seems that pressure information is not vital, but provides some valuable information that can be used to classify signatures more accurately. / Signatures have been used to authenticate documents and transactions for over 1500 years and are still used today. A method for verifying signatures written on a digital tablet was developed to test whether pressure sensitivity and pen angle are critical for a well-performing on-line signature verification system. First, a background study was carried out to see how other modern methods work and which features they use; then the method was developed. The method used is a Dynamic Time Warp with 8 local features, of which 2 are pressure values or derived from pressure, plus one global feature. The method was then tested on the SUSig visual corpus, which contains signatures from 94 people. The Equal Error Rate (EER) for the feature combinations without pressure was 5.39 % for random signatures and 3.24 % for forgeries. The EER for feature combinations including pressure was 5.19 % for random signatures and 2.80 % for forgeries. Given the results of this project and of the other projects examined in the background study, pressure sensitivity does not appear to be critical, but it provides some valuable information for classifying signatures more accurately. The background study found that pen angle is not critical for a well-performing system.
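The Dynamic Time Warp comparison at the core of the method above can be sketched with the classic dynamic-programming recurrence (a generic scalar DTW, not the thesis's 8-feature-per-point implementation; `dtw_distance` is an invented name):

```python
def dtw_distance(a, b, dist=lambda p, q: abs(p - q)):
    """Classic DTW: minimal cumulative cost of aligning sequence a to b,
    allowing each sample to be stretched (repeated) in time."""
    INF = float("inf")
    n, m = len(a), len(b)
    # D[i][j] = best cost of aligning a[:i] with b[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = dist(a[i - 1], b[j - 1])
            # step from a match, an insertion, or a deletion
            D[i][j] = c + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

For signature verification each sequence element would be a feature vector (position, pressure, and so on) with a vector distance in place of the absolute difference, and the resulting score is thresholded to accept or reject a signature.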
17.
Master/worker parallel discrete event simulation. Park, Alfred John, 16 December 2008.
The execution of parallel discrete event simulation across metacomputing infrastructures is examined. A master/worker architecture for parallel discrete event simulation is proposed, providing robust execution under a dynamic set of services with system-level support for fault tolerance, semi-automated client-directed load balancing, portability across heterogeneous machines, and the ability to run codes on idle or time-sharing clients without significant interaction by users. Research questions and challenges associated with the work distribution paradigm, the targeted computational domain, performance metrics, and the intended class of applications are analyzed and discussed. A portable web-services approach to master/worker parallel discrete event simulation is proposed and evaluated, with subsequent optimizations to increase the efficiency of large-scale simulation execution through distributed master service design and intrinsic overhead reduction. New techniques are proposed and examined for addressing challenges of optimistic parallel discrete event simulation across metacomputing infrastructures, such as rollbacks and message unsending, using an inherently different computation paradigm built on master services and time windows. Results indicate that a master/worker approach utilizing loosely coupled resources is a viable means for high-throughput parallel discrete event simulation, enhancing existing computational capacity or providing alternate execution capability for less time-critical codes.
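The time-window idea can be pictured with a toy coordinator loop (a much-simplified sketch, not the thesis's web-services system; `LP` and `run_windows` are invented names): the master only releases work within the current window, so no worker processes events beyond the shared horizon before the window advances:

```python
class LP:
    """A logical process holding its own time-stamped event list."""

    def __init__(self, events):
        self.events = sorted(events)   # (time, payload) pairs
        self.processed = []

    def advance_to(self, horizon):
        # Process only events strictly before the window horizon.
        while self.events and self.events[0][0] < horizon:
            self.processed.append(self.events.pop(0))

def run_windows(lps, window, t_end):
    """Master loop: hand each LP work in [t, t + window), then advance."""
    t = 0.0
    while t < t_end:
        for lp in lps:                 # in the real system these run on workers
            lp.advance_to(t + window)
        t += window                    # all LPs reported; advance the window

lps = [LP([(0.4, "a"), (2.1, "b")]), LP([(1.5, "c")])]
run_windows(lps, window=1.0, t_end=3.0)
```

Bounding execution to a window limits how far any process can stray ahead, which is what reduces the need for rollback and message unsending in the optimistic setting discussed above.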