141

RADIO FREQUENCY OVERVIEW OF THE HIGH EXPLOSIVE RADIO TELEMETRY PROJECT

Bracht, Roger, Dimsdle, Jeff, Rich, Dave, Smith, Frank 10 1900 (has links)
International Telemetering Conference Proceedings / October 26-29, 1998 / Town & Country Resort Hotel and Convention Center, San Diego, California / High explosive radio telemetry (HERT) is a project being developed jointly by Los Alamos National Laboratory and AlliedSignal FM&T. The ultimate goal is to develop a small, modular telemetry system capable of high-speed detection of explosive events, with an accuracy on the order of 10 nanoseconds. Reliable telemetry of this data from a high-speed missile trajectory is a significant challenge: all captured data must be transmitted within 20 microseconds. This requires a high bits/Hertz microwave telemetry modulation code to ensure transmission of the data within the limited time interval available.
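The 20-microsecond transmission window quoted above directly fixes the required data rate and spectral efficiency. A back-of-the-envelope sketch, using an assumed event-record size and RF bandwidth (neither figure is from the paper):

```python
# Illustrative spectral-efficiency estimate. The payload size and bandwidth
# below are assumptions for demonstration only; the abstract gives only the
# 20-microsecond transmission window.
payload_bits = 200            # assumed size of one event record, in bits
window_s = 20e-6              # transmission window from the abstract
required_rate = payload_bits / window_s     # bits per second
bandwidth_hz = 5e6            # assumed available RF bandwidth
efficiency = required_rate / bandwidth_hz   # required bits/s per Hz
print(f"required rate: {required_rate / 1e6:.0f} Mbit/s, "
      f"efficiency: {efficiency:.1f} bits/s/Hz")
```

Even this modest assumed payload demands a 10 Mbit/s burst; squeezing it into a narrow channel is what pushes the design toward a high bits/Hz modulation code.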
142

IFM EFFECTS ON PCM/FM TELEMETRY SYSTEMS

Law, Gene, Whiteman, Don 10 1900 (has links)
International Telemetering Conference Proceedings / October 26-29, 1998 / Town & Country Resort Hotel and Convention Center, San Diego, California / Incidental Frequency Modulation (IFM) products in telemetry transmitters can be a significant cause of bit errors in received Pulse Code Modulation/Frequency Modulation (PCM/FM) telemetry data. Range Commanders Council (RCC) and other documents give little or no guidance as to acceptable levels of IFM for telemetry applications. The higher vibration levels expected of future high-velocity missile systems mean that IFM levels are likely to be higher than previously encountered. This paper presents measured data on Bit Error Rate (BER) versus IFM levels at given Signal to Noise Ratios (SNRs) for PCM/FM telemetry systems. The information presented can be utilized with BER versus SNR plots in the Telemetry Applications Handbook, RCC Document 119, to determine the additional link margin required to minimize IFM effects on telemetry data quality.
143

EASTERN RANGE TITAN IV/CENTAUR-TDRSS OPERATIONAL COMPATIBILITY TESTING

Bocchino, Chris, Hamilton, William 10 1900 (has links)
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California / The future of range operations in the area of expendable launch vehicle (ELV) support is unquestionably headed in the direction of space-based rather than land- or air-based assets for such functions as metric tracking or telemetry data collection. To this end, an effort was recently completed by the Air Force’s Eastern Range (ER) to certify NASA’s Tracking and Data Relay Satellite System (TDRSS) as a viable and operational asset to be used for telemetry coverage during future Titan IV/Centaur launches. The test plan developed to demonstrate this capability consisted of three parts: 1) a bit error rate test; 2) a bit-by-bit compare of data recorded via conventional means vice the TDRSS network while the vehicle was radiating in a fixed position from the pad; and 3) an in-flight demonstration to ensure positive radio frequency (RF) link and usable data during critical periods of telemetry collection. The subsequent approval by the Air Force of this approach allows future launch vehicle contractors a relatively inexpensive and reliable means of telemetry data collection even when launch trajectories are out of sight of land-based assets or when land- or aircraft-based assets are not available for support.
144

ANTENNA PATTERN EVALUATION FOR LINK ANALYSIS

Pedroza, Moises 10 1900 (has links)
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California / The use of high bit rates in the missile testing environment requires that the receiving telemetry system(s) have the correct signal margin for no PCM bit errors. This requirement, plus the fact that "redundant systems" are no longer considered optimum support scenarios, has made it necessary to select the minimum number of tracking sites that will gather the data with the required signal margin. A very basic link analysis can be made by using the maximum and minimum gain values from the transmitting antenna pattern. Another way of evaluating the transmitting antenna gain is to base the gain on the highest-percentile appearance of the highest gain value. This paper discusses the mathematical analysis the WSMR Telemetry Branch uses to determine the signal margin resulting from a radiating source along a nominal trajectory. The analysis calculates the missile aspect angles (Theta, Phi, and Alpha) to the telemetry tracking system, which yield the transmitting antenna gain. The gain is obtained from the Antenna Radiation Distribution Table (ARDT) that is stored in a computer file. An entire trajectory can be evaluated for signal margin before an actual flight. The expected signal strength level can then be compared to the actual signal strength level from the flight, and this information can be used to evaluate any plume effects.
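The evaluation described above can be sketched as a gain lookup plus a link budget at each trajectory point. This is a minimal illustration, not the WSMR analysis itself: the gain table, powers, threshold, and frequency are all assumed numbers, and only one aspect angle (Theta) is used where the paper uses three.

```python
import math

# Coarse stand-in for an Antenna Radiation Distribution Table (ARDT):
# aspect angle theta (degrees, in 30-degree steps) -> transmit gain in dBi.
# All values are illustrative assumptions.
ardt = {0: 2.0, 30: 1.0, 60: -1.0, 90: -3.0, 120: -1.5, 150: 0.5, 180: 1.5}

def link_margin_db(theta_deg, range_m, tx_power_dbm=30.0,
                   rx_gain_dbi=35.0, required_dbm=-90.0, freq_hz=2.2e9):
    """Received power minus required power for one trajectory point."""
    tx_gain = ardt[round(theta_deg / 30) * 30 % 360]  # nearest table entry
    # Free-space path loss in dB (range in metres, frequency in Hz)
    fspl = 20 * math.log10(range_m) + 20 * math.log10(freq_hz) - 147.55
    received = tx_power_dbm + tx_gain + rx_gain_dbi - fspl
    return received - required_dbm

# Evaluate the margin at a few slant ranges before the flight
for rng in (10e3, 50e3, 100e3):
    print(f"{rng / 1e3:.0f} km: margin {link_margin_db(60.0, rng):+.1f} dB")
```

Running the lookup over a whole nominal trajectory, as the abstract describes, is then just a loop over (aspect angle, range) pairs.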
145

Test and Evaluation of Ultra High Spectral Efficient Feher Keying (FK)

Lin, Jin-Song, Feher, Kamilo 10 1900 (has links)
International Telemetering Conference Proceedings / October 22-25, 2001 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Performances of a subclass of a new spectral efficient modulation scheme, designated as Feher Keying [1], or FK, is evaluated. The Power Spectral Density (PSD) and Bit Error Rate (BER) characteristics of FK are presented. FK has ultra high spectral efficiency and satisfies the frequency mask for WLAN defined in FCC part 15, and it has a simple structure for high bit rate implementation.
146

Telemetry Data Processing: A Modular, Expandable Approach

Devlin, Steve 10 1900 (has links)
International Telemetering Conference Proceedings / October 17-20, 1988 / Riviera Hotel, Las Vegas, Nevada / The growing complexity of missile, aircraft, and space vehicle systems, along with the advent of fly-by-wire and ultra-high performance unstable airframe technology, has created an exploding demand for real-time processing power. Recent VLSI developments have made it possible to address these needs in the design of a multi-processor subsystem supplying 10 MIPS and 5 MFLOPS per processor. To provide up to 70 MIPS, a Digital Signal Processing subsystem may be configured with up to 7 processors. Multiple subsystems may be employed in a data processing system to give the user virtually unlimited processing power. Within the DSP module, communication between cards is over a high-speed, arbitrated Private Data bus. This prevents the saturation of the system bus with intermediate results, and allows a multiple-processor configuration to make full use of each processor. Design goals for a single processor included executing number system conversions, data compression algorithms, and 1st-order polynomials in under 2 microseconds, and 5th-order polynomials in under 4 microseconds. The processor design meets or exceeds all of these goals. Recently upgraded VLSI is available and makes possible a performance enhancement to 11 MIPS and 9 MFLOPS per processor with reduced power consumption. Design tradeoffs and example applications are presented.
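The polynomial benchmarks above (engineering-unit conversions are typically low-order polynomials applied to raw telemetry counts) are usually evaluated with Horner's rule, which needs only one multiply-add per coefficient. A small sketch of that standard technique; the abstract does not state which evaluation scheme the subsystem actually uses:

```python
def horner(coeffs, x):
    """Evaluate a polynomial with Horner's rule.
    coeffs are ordered highest power first: a_n, ..., a_1, a_0.
    One multiply-add per coefficient -- the per-sample cost a DSP
    budget like the 4-microsecond 5th-order goal is built around."""
    acc = 0.0
    for a in coeffs:
        acc = acc * x + a
    return acc

# 5th-order example: p(x) = 2x^5 - x^3 + 4x + 1, evaluated at x = 2
print(horner([2.0, 0.0, -1.0, 0.0, 4.0, 1.0], 2.0))  # -> 65.0
```

A naive power-by-power evaluation would cost roughly twice the multiplies, which matters when every sample of a high-rate PCM stream must be converted in real time.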
147

Digital Image Steganography

Μπαλκούρας, Σωτήριος 14 October 2013 (has links)
The development of the internet in recent years has brought changes in the size and quality of the available content. Users are literally flooded with information, which may take various forms such as text, audio, image, and video. The wide reach of the internet, the ease of searching large amounts of information, and the user-friendly presentation of content have driven an ever-growing demand for images, video, and music. With the digitization of most of the content users handle in both their personal and professional lives, new steganography techniques became necessary for exchanging hidden information, a concept well known since antiquity. This thesis implements two of the most popular steganography algorithms: LSB (Least Significant Bit) and LBP (Local Binary Pattern). The system is publicly available on the web and can be used by any user who wishes to hide information (text or an image) within an image. It implements the full steganography cycle, allowing the user not only to hide the desired information but also to perform the reverse process, i.e. to recover the hidden information. The procedure is simple: the sender (the person hiding the message) uploads the cover image, enters a secret key that must be known to retrieve the message, and supplies the message itself, i.e. the information to be hidden. The receiver then uploads the stego image together with the secret key agreed with the sender in order to recover the message. Finally, a number of usage scenarios are run and measured, showing the performance of each algorithm, and the corresponding comparisons are made.
The implemented system can be extended with additional steganography methods, as well as with an extension of the LBP algorithm that uses all three color components to hide the information. It would also be of particular interest to offer the process as a web service, so that it can be used independently and embedded as a standalone software component in any platform that supports web services.
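The LSB scheme implemented in the thesis can be illustrated in a few lines. This is a minimal grayscale sketch of the general technique, not the thesis's implementation: a real system would also embed the message length and use the secret key, e.g. to permute the embedding positions.

```python
def lsb_embed(pixels, message_bits):
    """Hide message_bits in the least significant bit of each pixel value.
    Changing only the LSB alters each pixel by at most 1, which is
    visually imperceptible in an 8-bit image."""
    out = list(pixels)
    for i, bit in enumerate(message_bits):
        out[i] = (out[i] & ~1) | bit   # clear LSB, then set it to the bit
    return out

def lsb_extract(pixels, n_bits):
    """Recover n_bits hidden by lsb_embed."""
    return [p & 1 for p in pixels[:n_bits]]

cover = [120, 53, 200, 77, 14, 90, 33, 250]   # grayscale pixel values
secret = [1, 0, 1, 1, 0, 1, 0, 0]
stego = lsb_embed(cover, secret)
assert lsb_extract(stego, 8) == secret
print(stego)   # each value differs from the cover by at most 1
```

The LBP-based method mentioned above embeds in texture descriptors rather than raw bit planes, trading capacity for robustness.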
148

Novel BICM HARQ Algorithm Based on Adaptive Modulations

Kumar, Kuldeep, Perez-Ramirez, Javier 10 1900 (has links)
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / A novel type-II hybrid automatic repeat request (HARQ) algorithm using adaptive modulations and bit-interleaved coded modulation (BICM) is presented. The algorithm uses different optimized puncturing patterns for different transmissions of the same data packet. The proposed approach exploits mapping diversity through BICM with iterative decoding. The modulation order is changed in each transmission to keep the number of symbols transmitted constant. We present new bit error rate and frame error rate analytical results for the proposed technique showing good agreement with simulation results. We compare the throughput performance of our proposed HARQ technique with a reference HARQ technique that uses different mapping arrangements but keeps the modulation order fixed. By using optimized puncturing patterns and adaptive modulations, our method provides significantly better throughput performance over the reference HARQ method in the whole signal-to-noise ratio (SNR) range, and achieves a gain of 12 dB in the medium SNR region.
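The puncturing idea at the heart of the scheme above can be shown with a toy example. This sketch only demonstrates generic rate matching with complementary patterns, as in type-II HARQ generally; the paper's optimized patterns and the adaptive modulation-order logic are not reproduced here.

```python
def puncture(coded_bits, pattern):
    """Keep only the positions where pattern == 1.
    The pattern tiles periodically across the codeword, so a short
    pattern defines the rate matching for any codeword length."""
    return [b for i, b in enumerate(coded_bits)
            if pattern[i % len(pattern)] == 1]

# One mother codeword, two transmissions with complementary puncturing:
# the retransmission supplies exactly the coded bits the first one dropped,
# so the receiver that combines both sees the full rate-1/1 codeword.
codeword = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
tx1 = puncture(codeword, [1, 1, 0])   # first transmission, rate 2/3 kept
tx2 = puncture(codeword, [0, 0, 1])   # retransmission fills the gaps
print(len(tx1), len(tx2))
```

In the paper's algorithm, each retransmission additionally changes the modulation order so the symbol count stays constant even as the number of punctured bits varies.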
149

No bad memories : a feminist, critical design approach to video game histories

Weil, Rachel Simone 07 October 2014 (has links)
Certain unique sights and sounds of video games from the 1980s and 1990s have been codified as a retro game style, celebrated by collectors, historians, and game developers alike. In this report, I argue that this nostalgic celebration has escaped critical scrutiny and in particular omits the diverse experiences of girls and women who may have been alienated by the tough, intimidating nature of a twentieth-century video-game culture that was primarily created by and for boys. Indeed, attempts to attract girls to gaming, such as the 1990s girls' game movement, are usually criticized in or absent from mainstream video-game histories, and girly video games are rarely viewed with the same nostalgic fondness as games like Super Mario Bros. This condition points to a larger cultural practice of trivializing media for girls and, by extension, girlhood and girls themselves. My critical design response to this condition has been twofold. First, I have recuperated and resituated twentieth-century girly games as collectible, valuable, and nostalgic, thereby subverting conventional historical narratives and suggesting that these games have inherent cultural value. Second, I have created new works that reimagine 8-bit style as an expression of nostalgia for twentieth-century girlhood rather than for twentieth-century boyhood. This report contains documentation of some relevant projects I have undertaken, such as the creation of a video-game museum and an 8-bit video game called Electronic Sweet-N Fun Fortune Teller. In these projects and in future works, I hope to disrupt dominant narratives about video game history and nostalgia that continue to marginalize and trivialize girls' and women's experiences and participation in contemporary game cultures.
150

Implementation of Non-Linear Rank Filters on General-Purpose and Reconfigurable Architectures

Milojevic, Dragomir 08 November 2004 (has links)
Non-linear rank filters are often used to enhance the quality of a digital image. Applying them makes visual interpretation and understanding of image content easier, whether for a human operator or for subsequent automatic processing. In a typical image-processing pipeline, these filters are generally applied in the pre-processing stage, just after acquisition and before the actual image processing and analysis. Rank filters are considered a major bottleneck in the processing chain, because of the sorting of pixels that must be performed within each neighborhood, for every pixel of the image. Computation time increases significantly with the size of the image to be processed, with the size of the neighborhood considered, and as the rank approaches the median. This thesis proposes two solutions for accelerating rank-filter processing. The first exploits the various levels of parallelism of today's personal computers, in particular data parallelism and inter-processor parallelism. This approach yields a speed-up factor of about 10 over a classical approach that abstracts away the hardware through high-level language compilers. While the resulting throughput of processed pixels, on the order of ten million pixels per second, permits real-time operation for video applications, little time remains for other processing in the chain. The second proposed solution is based on the concept of reconfigurable computing and is realized with FPGA (Field Programmable Gate Array) circuits. The system described combines bit-serial algorithms with the high density of current FPGAs.
The result is a highly parallel processing system, involving hundreds of processing units per FPGA, which achieves an additional speed-up factor of about 10 over the first solution. Such a system, inserted between a digital image source and a host system, computes rank filters at a throughput on the order of a hundred million pixels per second.
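The operation being accelerated above is easy to state in a few lines. A minimal 1-D reference version of a rank filter, shown in Python for clarity rather than in the bit-serial form the FPGA design uses; the 2-D image case applies the same idea over pixel neighborhoods:

```python
def rank_filter(signal, window, rank):
    """1-D rank filter: replace each sample by the rank-th smallest value
    in its neighborhood (rank = window // 2 gives the median filter).
    The per-pixel sort is exactly the bottleneck the thesis targets.
    Edges are handled by clamping the window to the signal bounds."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        neighborhood = sorted(signal[lo:hi])
        out.append(neighborhood[min(rank, len(neighborhood) - 1)])
    return out

noisy = [3, 3, 50, 3, 4, 4, 0, 5, 5, 5]   # impulse noise at positions 2 and 6
print(rank_filter(noisy, 3, 1))            # 3-wide median removes the spikes
```

With an n-pixel window this costs a sort per output pixel on a CPU; the bit-serial FPGA formulation instead resolves the rank one bit plane at a time across hundreds of parallel units, which is where the additional order-of-magnitude speed-up comes from.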
