471
A New Approach For The Assessment Of HF Channel Availability Under Ionospheric Disturbances. Sari, Murat Ozgur, 01 September 2006.
High Frequency (3-30 MHz) (HF) Ionospheric Channel is used for military, civilian and amateur communications. By using the Ionosphere, communication over distances beyond the line of sight is achieved. The main advantage of this type of communication is that it does not require a satellite to reach a point beyond the line of sight; the Ionosphere is used instead of a satellite. Using the Ionosphere rather than a satellite means independent communication for a country.
The disadvantage of HF Ionospheric Communication is that the characteristics of the reflecting medium (i.e. the channel's transfer function) depend on many variables, e.g. sunspot number, hour of the day, season, solar cycle etc., so that mathematically modeling the channel is very difficult.
Since military standards such as STANAG 4538, STANAG 4285, STANAG 4415, MIL-STD-188-110A and MIL-STD-188-141A define the required performance of an HF modem in terms of Signal to Noise Ratio (SNR), Doppler Spread and Delay Spread for given channel conditions, a new approach that characterizes the channel in terms of these three parameters is presented.
In this thesis, the HF Channel is considered as a system that involves various physical and chemical processes, and a new method to characterize the HF channel for use in modem performance evaluation is presented.
The study aims to relate modem/channel availability to the magnetic indices, which may be considered as disturbances to the system. For this purpose, data taken from an HF communication experiment are used to model the channel for the modem availability calculations.
The aim of the study is to assess HF Channel Availability under Ionospheric Disturbances.
This new technique will be a useful tool for HF Modem operators to select the optimum data rate or modulation method during HF Communication.
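As a rough illustration of the kind of availability test such a characterization enables, the sketch below checks a measured (or predicted) channel state against modem requirement thresholds and picks the fastest usable setting. The threshold values, setting names and function names are hypothetical placeholders for illustration, not figures taken from the STANAG or MIL-STD documents or from this thesis.

```python
from dataclasses import dataclass

@dataclass
class ModemRequirement:
    """Required channel conditions for one waveform/data rate (illustrative values only)."""
    name: str
    min_snr_db: float              # minimum usable SNR
    max_doppler_spread_hz: float   # maximum tolerable Doppler spread
    max_delay_spread_ms: float     # maximum tolerable delay spread

def is_available(req: ModemRequirement, snr_db: float,
                 doppler_hz: float, delay_ms: float) -> bool:
    """The channel is 'available' for this modem setting if all three limits are met."""
    return (snr_db >= req.min_snr_db
            and doppler_hz <= req.max_doppler_spread_hz
            and delay_ms <= req.max_delay_spread_ms)

# Hypothetical modem settings, ordered from fastest to most robust.
settings = [
    ModemRequirement("2400 bps", min_snr_db=18.0, max_doppler_spread_hz=2.0, max_delay_spread_ms=2.0),
    ModemRequirement("600 bps",  min_snr_db=11.0, max_doppler_spread_hz=5.0, max_delay_spread_ms=5.0),
    ModemRequirement("75 bps",   min_snr_db=2.0,  max_doppler_spread_hz=10.0, max_delay_spread_ms=8.0),
]

# Pick the fastest setting available for the measured channel state.
measured = dict(snr_db=12.5, doppler_hz=3.1, delay_ms=2.4)
best = next((s.name for s in settings if is_available(s, **measured)), None)
print(best or "channel unavailable")
```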
472
Optimizable Multiresolution Quadratic Variation Filter For High-frequency Financial Data. Sen, Aykut, 01 February 2009.
As tick-by-tick records of financial transactions become easier to obtain, processing this amount of information efficiently and correctly to estimate the integrated volatility gains importance. However, empirical findings show that such data may become unusable due to microstructure effects. The most common way to overcome this problem is to sample the data at equidistant intervals of calendar, tick or business time scales. Comparative research on the subject generally asserts that the most successful scheme is calendar time sampling at intervals of 5 to 20 minutes, but this typically means throwing out more than 99 percent of the data, so a more efficient sampling method is clearly needed. Although some research on alternative techniques exists, none of them is proven to be the best.
Our study is concerned with a sampling scheme that uses the information in different scales of frequency and is less prone to microstructure effects. We introduce a new concept of business intensity, the sampler of which is named Optimizable Multiresolution Quadratic Variation Filter. Our filter uses multiresolution analysis techniques to decompose the data into different scales and quadratic variation to build up the new business time scale. Our empirical findings show that our filter is clearly less prone to microstructure effects than any other common sampling method.
We use classified tick-by-tick data for the Turkish Interbank FX market. The market is closed for nearly 14 hours of the day, so big jumps occur between closing and opening prices. We also propose a new smoothing algorithm to reduce the effects of those jumps.
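A minimal sketch of the quadratic-variation idea behind such a business time scale: take a new observation each time the running sum of squared log returns grows by a fixed increment. This illustrates only the principle, not the thesis filter itself; the increment size and the synthetic noisy price path are assumptions made for the example.

```python
import numpy as np

def qv_time_sampler(prices: np.ndarray, qv_step: float) -> np.ndarray:
    """Return indices sampled in 'business time': a new tick is kept each time the
    cumulative quadratic variation (sum of squared log returns) grows by qv_step."""
    log_p = np.log(prices)
    cum_qv = np.cumsum(np.diff(log_p) ** 2)
    picks = [0]
    next_level = qv_step
    for i, qv in enumerate(cum_qv, start=1):
        if qv >= next_level:
            picks.append(i)
            next_level += qv_step
    return np.array(picks)

# Synthetic tick prices: a random walk plus i.i.d. microstructure noise (illustrative only).
rng = np.random.default_rng(0)
efficient = np.cumsum(rng.normal(0, 1e-4, 20_000))
observed = 100 * np.exp(efficient + rng.normal(0, 5e-5, efficient.size))

idx = qv_time_sampler(observed, qv_step=5e-7)
sampled_rv = np.sum(np.diff(np.log(observed[idx])) ** 2)   # realized variance on the business-time grid
print(len(idx), sampled_rv)
```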
473
Improvement And Development Of High-frequency Wireless Token-ring Protocol. Kurtulus, Taner, 01 January 2011.
STANAG 5066 Edition 2 is a node-to-node protocol developed by NATO for communication over HF media. IP integration was introduced to broaden the use of the STANAG 5066 protocol; however, it made the already slow communication even slower. To increase the speed and to support communication within a single-frequency multi-node network, HFTRP, a derivative of WTRP (Wireless Token Ring Protocol), was developed. The protocol has two parts: a message design for the management tokens exchanged by communicating nodes, and the algorithms used to create, maintain and repair the ring of nodes in the network. The scope of this thesis is to devise and implement a faster ring setup and ring growing procedure; finding optimum values of the HFTRP tuning parameters is also within its scope.
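As an illustration of what a management-token message for a ring protocol might carry, the sketch below defines a hypothetical token frame and packs it into bytes. The token types, field names and 6-byte layout are assumptions made for this example; they are not the HFTRP wire format.

```python
import struct
from enum import IntEnum

class TokenType(IntEnum):
    # Hypothetical management token types for a wireless token-ring protocol.
    RIGHT_TO_TRANSMIT = 0
    SOLICIT_SUCCESSOR = 1
    SET_SUCCESSOR = 2
    DELETE_SUCCESSOR = 3

HEADER = struct.Struct("!BBBBH")  # type, source address, destination address, ring generation, sequence

def pack_token(token_type: TokenType, src: int, dst: int, generation: int, seq: int) -> bytes:
    """Serialize a management token into a fixed 6-byte frame (illustrative format)."""
    return HEADER.pack(token_type, src, dst, generation, seq)

def unpack_token(frame: bytes) -> dict:
    """Parse a frame produced by pack_token back into its fields."""
    t, src, dst, gen, seq = HEADER.unpack(frame)
    return {"type": TokenType(t), "src": src, "dst": dst, "generation": gen, "seq": seq}

# A node inviting new stations into the ring (broadcast destination 255).
invite = pack_token(TokenType.SOLICIT_SUCCESSOR, src=1, dst=255, generation=0, seq=7)
print(unpack_token(invite))
```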
474
Pattern Matching for Financial Time Series Data. Liu, Ching-An, 29 July 2008.
In security markets, stock price movements are closely linked to market information. For example, the subprime mortgage crisis triggered a global financial crisis in 2007, and drops occurred in virtually every stock market in the world; after the Federal Reserve took several steps to address the crisis, the stock markets gradually stabilized. The reaction of traders to arriving information results in different patterns of stock price movements, so pattern matching is an important subject in future movement prediction, rule discovery and computer-aided diagnosis. In this research, we propose a pattern matching procedure to capture similar stock price movements of two listed companies during one day. First, the longest common subsequence algorithm is introduced to sieve out the time intervals in which the two listed companies have the same integrated volatility levels and price rise/drop trends. Next we transform the raw price data in the matched time periods to Bollinger Band Percent data, then use the power spectrum to extract the low frequency components. Adjusted Pearson chi-squared tests are performed to analyze the similarity of the price movement patterns in these periods. We first carry out a simulation study and then apply the procedure to an empirical analysis of high frequency transaction data from the NYSE.
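For concreteness, one step of such a pipeline, converting raw prices into Bollinger Band Percent (%B) values, could look like the sketch below; the 20-observation window and 2-standard-deviation band width are conventional defaults assumed here, not parameters quoted from the thesis.

```python
import numpy as np

def bollinger_percent_b(prices: np.ndarray, window: int = 20, num_std: float = 2.0) -> np.ndarray:
    """%B = (price - lower band) / (upper band - lower band), using a rolling mean and std."""
    prices = np.asarray(prices, dtype=float)
    out = np.full(prices.shape, np.nan)
    for i in range(window - 1, prices.size):
        w = prices[i - window + 1 : i + 1]
        mid, sd = w.mean(), w.std(ddof=0)
        upper, lower = mid + num_std * sd, mid - num_std * sd
        out[i] = 0.5 if upper == lower else (prices[i] - lower) / (upper - lower)
    return out

# Example: %B for a short synthetic price path.
rng = np.random.default_rng(1)
prices = 50 + np.cumsum(rng.normal(0, 0.1, 100))
pb = bollinger_percent_b(prices)
print(np.nanmin(pb), np.nanmax(pb))
```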
475
Implementing Automated Trading Systems in The Swedish Financial Industry: Establishing a Framework for Successful Diffusion. Salmela, Markus; Ström, Rickard, January 2010.
Purpose: Our main purpose is to explore, describe and analyze the organizational conduct when implementing automated trading systems (ATS) in companies, to investigate the organizational challenges arising from this, and the effects these have on a successful diffusion. As the extent of ATS implementation in the Swedish financial industry has not been explored to any greater extent, it is also imperative to explore this, which is seen as a secondary purpose of this article.
Background: The study is based on innovation and diffusion theories, as well as theories of power structures and organization. Further, an explanation of ATS and its dynamics is provided and discussed to facilitate a definition of the term.
Method: The research has been carried out as an exploratory, descriptive and analytical qualitative study. We have conducted case studies of 7 companies that are implementing, or evaluating the implementation of, ATS. The data was collected through interviews.
Conclusion: The majority of the case companies are in the clarifying and routinizing stages of the innovation process. What is found unique with ATS is that it can be implemented partly. The dimensions found central to a smooth diffusion in the companies are the required level of competence-sharing and the complexity of implementation.
476
Low-Frequency Noise in Si-Based High-Speed Bipolar Transistors. Sandén, Martin, January 2001.
No description available.
477
SiGeC Heterojunction Bipolar Transistors. Suvar, Erdal, January 2003.
Heterojunction bipolar transistors (HBT) based on SiGeC have been investigated. Two high-frequency architectures have been designed, fabricated and characterized. Different collector designs were applied either by using selective epitaxial growth doped with phosphorus or by non-selective epitaxial growth doped with arsenic. Both designs have a non-selectively deposited SiGeC base doped with boron and a poly-crystalline emitter doped with phosphorus.
Selective epitaxial growth of the collector layer has been developed using a reduced pressure chemical vapor deposition (RPCVD) technique. The incorporation of phosphorus and defect formation during selective deposition of these layers have been studied. A major problem of phosphorus doping during selective epitaxy is segregation. Different methods, e.g. chemical or thermal oxidation, are shown to efficiently remove the segregated dopants. Chemical-mechanical polishing (CMP) has also been used as an alternative to solve this problem. The CMP step was successfully integrated in the HBT process flow.
Epitaxial growth of Si1-x-yGexCy layers for base layer applications in bipolar transistors has been investigated in detail. The growth parameters have been optimized in order to incorporate carbon substitutionally in the SiGe matrix without increasing the defect density in the epitaxial layers.
The thermal stability of npn SiGe-based heterojunction structures has been investigated. The influence of the diffusion of dopants in SiGe or in adjacent layers on the thermal stability of the structure has also been discussed.
SiGeC-based transistors with both non-selectively deposited and selectively grown collectors have been fabricated and electrically characterized. The fabricated transistors exhibit electrostatic current gain values in the range of 1000-2000. The cut-off frequency and maximum oscillation frequency vary from 40-80 GHz and 15-30 GHz, respectively, depending on the lateral design. The leakage current was investigated using a selectively deposited collector design and possible causes of leakage have been discussed. Solutions for decreasing the junction leakage are proposed.
Key words: Silicon-Germanium-Carbon (SiGeC), Heterojunction bipolar transistor (HBT), chemical vapor deposition (CVD), selective epitaxy, non-selective epitaxy, collector design, high-frequency measurement, dopant segregation, thermal stability.
478
Ανάλυση αισθητηριακών και ολοκληρωτικών οπτικών διαδικασιών με εργαλεία πληροφορικής (Analysis of sensory and integrative visual processes by informatics tools). Τσαρούχας, Νικόλαος, 29 June 2007.
Spectrotemporal and spatiotemporal analysis of synchronous (phase-locked) high-frequency (gamma-band) oscillatory electroencephalographic activity in higher-order visual cognitive responses of the human brain, carried out with the continuous wavelet transform and implemented with advanced Biomedical Engineering informatics tools for digital EEG signal processing.
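A minimal sketch of the kind of analysis described, a continuous wavelet transform used to track gamma-band power over time in an EEG channel, implemented with a plain Morlet wavelet in NumPy. The sampling rate, the 30-80 Hz band limits, the wavelet width and the synthetic signal are illustrative assumptions, not parameters taken from this work.

```python
import numpy as np

def morlet_cwt_power(signal: np.ndarray, fs: float, freqs: np.ndarray, width: float = 6.0) -> np.ndarray:
    """Continuous wavelet transform power (frequencies x time) using complex Morlet wavelets."""
    power = np.empty((freqs.size, signal.size))
    for i, f in enumerate(freqs):
        sigma_t = width / (2 * np.pi * f)                      # temporal width of the wavelet
        t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))       # unit-energy normalization
        coef = np.convolve(signal, wavelet, mode="same")
        power[i] = np.abs(coef) ** 2
    return power

# Synthetic "EEG": background noise plus a brief 40 Hz (gamma) burst around 0.5 s.
fs = 500.0
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(2)
eeg = rng.normal(0, 1, t.size) + 3 * np.sin(2 * np.pi * 40 * t) * np.exp(-((t - 0.5) ** 2) / 0.005)

gamma_freqs = np.arange(30, 81, 5.0)
gamma_power = morlet_cwt_power(eeg, fs, gamma_freqs).mean(axis=0)
print("peak gamma power near t =", t[np.argmax(gamma_power)], "s")
```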
479
Möglichkeiten der prozessbegleitenden Charakterisierung von Fleisch auf Basis der dielektrischen Zeitbereichsreflektometrie (Possibilities for the in-process characterization of meat based on dielectric time domain reflectometry). Herm, Cornelia, 27 November 2013.
Because the dielectric properties of every material, including meat, are characteristic and depend on factors such as moisture content, chemical composition and physical structure, dielectric measurement methods have been the subject of various research approaches for several years. A newly developed measurement method for determining the quality parameters storage time and possible prior freezing of fish is based on dielectric time domain reflectometry. The aim of this dissertation is to clarify to what extent this method can also be applied to fresh meat.
In preliminary tests, the general applicability of the method in the meat sector was first examined in several storage and freezing experiments. Based on these results, the experimental investigations were divided into the following parts: determination of the storage time of retail-packaged pork loin minute steaks (modified atmosphere packaging, MAP) and of chicken breast fillets, detection of freezing processes in chicken breast fillets and pork loins, determination of the frozen storage time of different meat types (pork loin, pork belly and beef shoulder) at three different freezing temperatures, and detection of added water in chicken breast fillets.
The preliminary tests showed that the method is in principle suitable for investigating storage times and freezing processes in meat.
For storage time determination, an application also to retail-packaged products (under protective atmosphere) is conceivable, but the underlying linear evaluation models still need to be optimized (curve fitting is necessary). Good results were likewise obtained in determining frozen storage time; across all meat types, measurements on samples that had been stored for a few days before freezing gave noticeably better results. The measurements for detecting freezing processes (fresh, once-frozen and twice-frozen samples) also gave good results. An application of the method is particularly conceivable in the poultry sector and, in view of the legal requirements for poultry meat (no meat preparations from frozen and thawed material), also highly desirable. The best results were obtained for the detection of added water in chicken breast fillets, which is explainable by the strong dipole character of water in a measurement method based on dielectric properties. Even samples additionally treated with poultry protein hydrolysate powder could be clearly distinguished from untreated samples (owing to the structural changes in the sample material), which would not have been possible with a laboratory chemical analysis of the water-protein ratio. The method therefore appears particularly useful as a fast instrument for monitoring and quality assurance to detect added water at an unchanged water-protein ratio.
Before this measurement method can be used in practice, larger validation studies must be carried out in order to provide suitable calibration sets for application to unknown samples, especially in the poultry sector. The use of the method for quality determination in the meat sector is generally possible and, given the fast delivery of results and the easy handling of the (mobile) measurement setup, would be desirable for monitoring and quality control.
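As a hedged illustration of the kind of linear evaluation model mentioned above, the sketch below fits an ordinary least-squares calibration that maps features extracted from dielectric measurements to storage time in days. The feature definitions and the synthetic data are assumptions made for the example, not measurements or models from this dissertation.

```python
import numpy as np

# Hypothetical training data: each row holds features derived from a reflectometry
# curve (e.g. a relaxation-time and a dielectric-loss descriptor); the target is storage days.
rng = np.random.default_rng(4)
n_samples = 60
features = rng.normal(size=(n_samples, 2))
storage_days = 2.0 + 3.5 * features[:, 0] - 1.2 * features[:, 1] + rng.normal(0, 0.3, n_samples)

# Ordinary least-squares calibration: storage_days ~ intercept + linear combination of features.
X = np.column_stack([np.ones(n_samples), features])
coef, *_ = np.linalg.lstsq(X, storage_days, rcond=None)

def predict_days(new_features: np.ndarray) -> np.ndarray:
    """Predict storage time for new (unknown) samples from their dielectric features."""
    new_features = np.atleast_2d(new_features)
    return np.column_stack([np.ones(len(new_features)), new_features]) @ coef

print(predict_days(np.array([0.5, -0.2])))
```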
480
Mesure et Prévision de la Volatilité pour les Actifs Liquides (Measurement and Forecasting of Volatility for Liquid Assets). Chaker, Selma, 04 1900.
The observed high-frequency price series is contaminated with market microstructure frictions, or noise, so the efficient price is latent. We explore the measurement and forecasting of the fundamental volatility through novel approaches to the frictions problem, using high-frequency data.
In the first paper, while maintaining the standard framework of an additive noise-plus-frictionless-price model, we use the trading volume, quoted depths, the trade direction indicator and the bid-ask spread to absorb the noise. The econometric model is a price impact linear regression. We show that incorporating these liquidity cost variables delivers more precise volatility estimators. If the noise is only partially absorbed, the remaining noise is closer to a white noise than the original one, which lessens misspecification of the noise characteristics. Our approach is also robust to a specific form of endogeneity under which the common robust-to-noise measures are inconsistent.
In the second paper, starting from an empirical fact, we model the variance of the market microstructure noise that contaminates the frictionless price as an affine function of the fundamental volatility. Under our model, the noise is time-varying intradaily. Using the eigenfunction representation of the general stochastic volatility class of models, we quantify the forecasting performance of several volatility measures under our model assumptions.
In the third paper, instead of assuming the standard additive model for the observed price series, we derive new realized measures from prices and bid and ask volumes: we specify the conditional distribution of the frictionless price, assumed to be bounded by the bid and ask prices, given the available information, which includes quotes and volumes, and obtain new volatility measures by characterizing the conditional mean of the integrated variance.
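To make the frictions problem concrete, the sketch below simulates the standard additive model (observed log price = efficient price + i.i.d. noise) and compares realized variance computed at the highest frequency with a sparsely sampled version. The noise level, sampling gap and simulation sizes are illustrative assumptions, not estimates from the thesis.

```python
import numpy as np

def realized_variance(log_prices: np.ndarray, step: int = 1) -> float:
    """Sum of squared log returns over a grid that keeps every `step`-th observation."""
    sampled = log_prices[::step]
    return float(np.sum(np.diff(sampled) ** 2))

rng = np.random.default_rng(3)
n = 23_400                                   # one "trading day" of 1-second observations
sigma = 0.2 / np.sqrt(252)                   # assumed daily volatility of the efficient price
efficient = np.cumsum(rng.normal(0, sigma / np.sqrt(n), n))
noise = rng.normal(0, 5e-5, n)               # i.i.d. microstructure noise
observed = efficient + noise                 # observed log price under the additive model

iv_true = sigma ** 2                               # integrated variance implied by the simulation
rv_all = realized_variance(observed, step=1)       # biased upward by roughly 2 * n * var(noise)
rv_sparse = realized_variance(observed, step=300)  # 5-minute sampling discards most of the bias
print(f"true IV {iv_true:.3e}  RV(1s) {rv_all:.3e}  RV(5min) {rv_sparse:.3e}")
```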