281

AN AUTONOMOUS SATELLITE TRACKING STATION

Anderson, Mike, Militch, Peter, Pickens, Hugh, October 1999 (has links)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / In 1998, AlliedSignal Technical Services (ATSC) installed three fully autonomous 13-meter satellite tracking systems for the Integrated Program Office of the National Oceanic and Atmospheric Administration (NOAA) at the Command and Data Acquisition Station near Fairbanks, Alaska. These systems track and command NOAA Polar Orbiting Weather Satellites and Defense Meteorological Satellites. Each tracking system operates for extended periods of time with little intervention other than periodic scheduling contacts. Schedule execution initiates equipment configuration, including establishing the RF communications link to the satellite. Station autonomy is achieved through the use of a robust scheduler that permits remote users and the System Administrator to request pass activities for any of the supported missions. Spacecraft in the mission set are scheduled for normal operations according to the priority they have been assigned. Once the scheduler resolves conflicts, it builds a human-readable control script that executes all required support activities. Pass adds or deletes generate new schedule scripts and can be performed in seconds. The systems can be configured to support CCSDS and TDM telemetry processing, but the units installed at Fairbanks required only telemetry and command throughput capabilities. Received telemetry data is buffered on disk storage for immediate post-pass playback, and also on tape for long-term archiving purposes. The system can autonomously support up to 20 spacecraft with 5 different configuration setups each. L-Band, S-Band and X-Band frequencies are supported.
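The priority-based conflict resolution described above lends itself to a short illustration. The following Python sketch is a hypothetical, heavily simplified model — the spacecraft names, the priority convention and the data structure are assumptions made for illustration, not details of the ATSC scheduler — showing how overlapping pass requests might be resolved by mission priority before a human-readable control script is emitted.

    from dataclasses import dataclass

    @dataclass
    class PassRequest:
        spacecraft: str   # e.g. "NOAA-15" (illustrative name only)
        priority: int     # lower number = higher mission priority (assumed convention)
        start: float      # pass start time, seconds from schedule epoch
        end: float        # pass end time, seconds from schedule epoch

    def resolve_conflicts(requests):
        """Keep the higher-priority pass wherever two requests overlap in time."""
        accepted = []
        for req in sorted(requests, key=lambda r: r.priority):
            if all(req.end <= a.start or req.start >= a.end for a in accepted):
                accepted.append(req)
        return sorted(accepted, key=lambda r: r.start)

    def build_script(accepted):
        """Emit a human-readable control script, one line per scheduled support."""
        return "\n".join(f"{r.start:>10.0f}  CONFIGURE+TRACK {r.spacecraft}" for r in accepted)

    if __name__ == "__main__":
        requests = [PassRequest("NOAA-15", 1, 0, 900),
                    PassRequest("DMSP-F14", 2, 600, 1500),  # overlaps NOAA-15, loses on priority
                    PassRequest("NOAA-12", 1, 2000, 2900)]
        print(build_script(resolve_conflicts(requests)))

Adding or deleting a request and re-running the two functions rebuilds the schedule script in well under a second, consistent with the seconds-scale pass adds and deletes mentioned above.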
282

LOW-COST MISSION SUPPORT CONCEPT

Lam, Barbara, October 1996 (has links)
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California / This paper presents a new architecture of the end-to-end ground system to reduce overall mission support costs. The present ground system of the Jet Propulsion Laboratory (JPL) is costly to operate, maintain, deploy, reproduce, and document. In the present climate of shrinking NASA budgets, this proposed architecture takes on added importance as it will dramatically reduce all of the above costs. Currently, the ground support functions (i.e., receiver, tracking, ranging, telemetry, command, monitor and control) are distributed among several subsystems that are housed in individual rack-mounted chassis. These subsystems can be integrated into one portable laptop system using established MultiChip Module (MCM) packaging technology. The large scale integration of subsystems into a small portable system will greatly reduce operations, maintenance and reproduction costs. Several of the subsystems can be implemented using Commercial Off-The-Shelf (COTS) products further decreasing non-recurring engineering costs. The inherent portability of the system will open up new ways for using the ground system at the “point-of-use” site as opposed to maintaining several large centralized stations. This eliminates the propagation delay of the data to the Principal Investigator (PI), enabling the capture of data in real-time and performing multiple tasks concurrently from any location in the world. Sample applications are to use the portable ground system in remote areas or mobile vessels for real-time correlation of satellite data with earth-bound instruments; thus, allowing near real-time feedback and control of scientific instruments. This end-to-end portable ground system will undoubtedly create opportunities for better scientific observation and data acquisition.
283

THE EFFECT OF A NAVIGATIONAL AID ON TRAINING OF A MINIMALLY INVASIVE SURGERY CAMERA TASK

Vidwans, Ketan 30 July 2012 (has links)
Minimally Invasive Surgery (MIS) differs from open surgery in that surgeons view the surgical site indirectly on a monitor. The view shown is typically from an angled endoscope off to one side of the surgery (i.e., uncollocated with the view of the hands). This makes camera navigation a challenging skill to learn. MIS thus requires longer training periods, more practice and greater mental effort to achieve proficiency. Current training setups and Operating Room (OR) environments lack appropriate real-time visual cues for navigation and other perception-related information that could help with learning and performance in the OR. The purpose of this research was to design and develop graphical aids in a trainer box for improving understanding of camera navigation and depth perception, skills necessary for performing endoscopic surgery. For the former, two alternative training methods were compared: 1) using no graphics (control group) and 2) using three different types of graphics conveying different information. The effectiveness of the training was evaluated by a comparative analysis of different performance measures across all the groups. It was observed that training using graphics did improve the performance of participants in a minimally invasive surgery training task. For the latter, the use of a proximity sensor was explored.
284

Shareholder activism: performing for publicity or actual policy change? : The influence of social and environmental shareholder activism on CSR performance.

Zantinge, Robert January 2017 (has links)
No description available.
285

Über die Optimierung von Waveletalgorithmen für die verlustbehaftete Kompression digitaler Röntgenbilddaten / On the optimization of wavelet algorithms for the lossy compression of digital X-ray image data

Ricke, Jens 29 May 2001 (has links)
Aim: To optimize medical image data compression and to evaluate the influence of different filters on the image quality of wavelet-compressed radiographs. Materials and methods: In preliminary studies, digital image reading was optimized by means of ROC analyses, and suitable compression methods were selected through a methodological, ROC-supported comparison of wavelet compression with fractal and JPEG compression. In the main study, four wavelet filters of differing complexity were compared by ROC analysis and statistically, using test phantoms for low-frequency, mixed-frequency and high-frequency image information near the perception threshold. Results: The use of different filters produces significant differences in the reconstruction of the radiographs, particularly in the low-frequency range. Despite partially inconsistent results of the visual analysis, more complex filters proved advantageous. For details in the high-frequency range, hardly any significant differences were found. Conclusions: The results obtained by ROC analysis did not correlate in any way with the PSNR values computed at the same time. The reason is that the reduction of image noise achieved by wavelet compression enters the PSNR as a negative factor. In medical radiographs, however, minimizing image noise improves the detectability of details, particularly near the perception threshold. Accordingly, the results of the threshold-level ROC analysis improved in the opposite direction to the PSNR values. A detailed description of the influence of wavelet filter complexity on the reconstruction quality of medical images can be found in the discussion section of the study. / Aim: Optimisation of medical image compression. Evaluation of wavelet filters for wavelet compression. Materials and methods: Optimisation of image review applying ROC analysis. Analysis of medical image compression methods comparing wavelet compression, fractal compression and JPEG by ROC analysis. Evaluation of 4 different wavelet filters of different complexity applying phantoms for low-frequency, high-frequency and mixed-frequency information. Results: Application of filters with different complexity results in significant variations in the quality of image reconstruction after compression, specifically in low-frequency information. Filters of high complexity proved to be advantageous despite heterogeneous results during visual analysis. For high-frequency details, complexity of filters did not prove to be of significant impact on image quality after reconstruction. Conclusions: Results of ROC analysis did not correspond with PSNR values. Reduction of image noise in reconstructed images by wavelet filtering is expressed negatively in PSNR values. In medical images, reduction of image noise enhances detection, specifically of low-contrast details. A detailed discussion of the influence of filter complexity on the reconstruction quality of medical images can be found in the discussion section of the study.
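For reference, the PSNR referred to above is conventionally defined (this is the standard textbook definition, not a formula taken from the study) as

    PSNR = 10 \log_{10}\!\left(\frac{\mathrm{MAX}_I^{2}}{\mathrm{MSE}}\right),
    \qquad
    \mathrm{MSE} = \frac{1}{MN}\sum_{m=1}^{M}\sum_{n=1}^{N}\bigl(I(m,n)-\hat{I}(m,n)\bigr)^{2},

where I is the original image, \hat{I} the reconstructed image and \mathrm{MAX}_I the maximum possible pixel value. Because wavelet compression smooths image noise, the reconstruction deviates from the noisy original, the MSE rises and the PSNR falls, even though the smoother image may be easier to read diagnostically. This is the mechanism behind the divergence between the ROC results and the PSNR values reported above.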
286

Amélioration du processus de vérification des architectures générées à l'aide d'outils de synthèse de haut-niveau / Improvement of the verification process of architectures generated by high-level synthesis tools

Ribon, Aurélien 17 December 2012 (has links)
The increasing integration capacity of circuits has enabled the development of ever more complex systems. This complexity has created substantial needs in terms of design and verification methods. High-level synthesis (HLS) tools are one answer to these needs. The work presented in this thesis aims at improving the verification process of hardware architectures synthesized by HLS. In particular, it proposes a method for transforming the Boolean assertions specified in the algorithmic description of an application into hardware monitors for simulation. A second method is also proposed; it targets the automatic synthesis of a hardware error manager whose role is to archive the errors occurring in a circuit during real operation, together with their execution contexts. / The fast-growing complexity of hardware circuits over the last three decades has changed every step of their development cycle. Design methods evolved considerably, and this evolution was necessary to cope with an ever shorter time-to-market, mainly driven by international competition. Increased complexity also means more errors, harder-to-find corner cases, and longer, more expensive simulations. The verification of hardware systems requires more and more resources and is the main cost factor of the whole development of a circuit. As the complexity of any system increases, the cost of an error that remains undetected until the foundry step becomes prohibitive. Therefore, the verification process is divided among multiple steps involved at every moment of the design process: comparison of model behaviors, simulation of RTL descriptions, formal analysis of algorithms, assertion usage, etc. Verification methodologies have evolved considerably in order to follow the progress of design methods. Some methods, like Assertion-Based Verification, became so important that they are now widely adopted in the developer community, providing near-source error detection. Thus, the work described here aims at improving the assertion-based verification process in order to offer a substantial time saving to designers. Two contributions are detailed. The first one deals with the transformation of Boolean assertions found in algorithmic descriptions into equivalent temporal assertions in the RTL description generated by high-level synthesis (HLS) methodologies. The assertions are therefore usable during the simulation of the generated architectures. The second contribution targets the verification of hardware systems in real time. It details the synthesis process of a hardware error manager, which has to save and serialize the execution context when an error is detected, making it easier to understand the cause of an error and to find its source. The errors and their contexts are serialized as reports in a memory readable by the system or directly by the designer. The behavior of a circuit can thus be analyzed without requiring any probe or integrated logic analyzer.
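A conceptual sketch may help fix the idea of the first contribution. The Python fragment below (Python rather than an HDL, purely for illustration; the class name, the FIFO example and the report format are invented) mimics what an assertion-derived monitor does: it re-evaluates a Boolean condition from the algorithmic description at every clock cycle of a simulation and, like the error manager of the second contribution, archives each violation together with its execution context.

    class AssertionMonitor:
        """Re-checks one Boolean assertion every cycle and archives violations."""
        def __init__(self, name, predicate):
            self.name = name            # label of the original Boolean assertion
            self.predicate = predicate  # callable evaluated on the sampled design state
            self.reports = []           # archived violation reports with their context

        def sample(self, cycle, state):
            if not self.predicate(state):
                self.reports.append({"assertion": self.name,
                                     "cycle": cycle,
                                     "context": dict(state)})

    # Hypothetical example: assert that a FIFO never holds more items than its depth.
    monitor = AssertionMonitor("fifo_no_overflow",
                               lambda s: s["fifo_count"] <= s["fifo_depth"])
    trace = [{"fifo_count": 3, "fifo_depth": 8},
             {"fifo_count": 9, "fifo_depth": 8}]   # second cycle violates the assertion
    for cycle, state in enumerate(trace):
        monitor.sample(cycle, state)
    print(monitor.reports)   # one report: assertion name, cycle number, captured context

In the actual flow described in the thesis, the monitor and the error manager are synthesized hardware blocks and the reports land in a memory readable by the system or the designer; the sketch only conveys the check-then-archive behavior.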
287

Monitoring PC Hardware Sounds in Linux Systems Using the Daubechies D4 Wavelet.

Henry, Robert Karns 17 December 2005 (has links)
Users of high availability (HA) computing require systems that run continuously, with little or no downtime. Modern PCs address HA needs by monitoring operating system parameters such as voltage, temperature, and hard drive status in order to anticipate possible system failure. However, one modality for PC monitoring that has been underutilized is sound. The application described here uses wavelet theory to analyze sounds produced by PC hard drives during standard operation. When twenty-nine hard drives were tested with the application and the results compared with the drives' Self-Monitoring, Analysis, and Reporting Technology (S.M.A.R.T.) data, the binomial distribution's low p-value of 0.012 indicated better than chance agreement. While the concurrence between the two systems shows that sound is an effective tool in detecting hardware failures, the disagreements between the systems show that the application can complement S.M.A.R.T. in an HA system.
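The Daubechies D4 transform at the core of the application can be sketched compactly. The fragment below is a minimal, single-level decomposition in plain Python (periodic boundary handling and the toy signal are assumptions for illustration; the thesis's own implementation, thresholding and S.M.A.R.T. comparison are not reproduced here).

    import math

    # Daubechies D4 scaling (low-pass) coefficients; the wavelet (high-pass)
    # coefficients G are their quadrature-mirror counterparts.
    S3 = math.sqrt(3.0)
    H = [(1 + S3) / (4 * math.sqrt(2.0)),
         (3 + S3) / (4 * math.sqrt(2.0)),
         (3 - S3) / (4 * math.sqrt(2.0)),
         (1 - S3) / (4 * math.sqrt(2.0))]
    G = [H[3], -H[2], H[1], -H[0]]

    def d4_step(x):
        """One level of the D4 transform: returns (approximation, detail) bands.
        Assumes len(x) is even; uses periodic (wrap-around) boundary handling."""
        n = len(x)
        approx, detail = [], []
        for i in range(0, n, 2):
            window = [x[(i + k) % n] for k in range(4)]
            approx.append(sum(h * w for h, w in zip(H, window)))
            detail.append(sum(g * w for g, w in zip(G, window)))
        return approx, detail

    # Toy usage: the D4 detail band annihilates constant and linear trends, so for
    # a smooth acoustic signature the interior detail coefficients sit near zero,
    # and an abrupt change (a click or bearing whine) stands out in that band.
    signal = [float(i) for i in range(16)]           # a linear ramp as a stand-in
    approx, detail = d4_step(signal)
    print(max(abs(v) for v in detail[:-1]))          # ~0 except the wrap-around term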
288

Public Safety Impact of Electronic Monitoring of Texas High-Risk Offenders

Aliu, Paul Utu 01 January 2015 (has links)
The use of electronic monitoring (EM) as a tool to supervise high-risk offenders has increased in the field of criminal justice in the state of Texas. Although EM is now widely used to supervise high-risk offenders to prevent them from committing further crimes, it is unclear whether EM has achieved the purpose of reducing reoffenses during parole supervision. Hirschi's social bond theory, which was later developed into social control theory, was used as the framework for this general qualitative study to explore retired parole officers' perceptions concerning whether EM is successful in preventing high-risk offenders from committing additional crimes. Interview data were collected from 10 retired parole officers who supervised high-risk offenders on EM in Harris County, Texas. The findings revealed that the 10 officers perceived EM to be an effective tool, but they perceived the role of capitalizing on positive social bonds as equally important in controlling criminal behavior. Specifically, the officers perceived that their bond with the high-risk offenders on EM could diminish offenders' propensity to commit new crimes. Opportunities for positive social change stemming from this study include recommendations to the Texas Department of Criminal Justice to develop policies and training that are consistent with social bond theory, and to retrain parole officers to emphasize to offenders the importance of positive contacts and relationships with family and of continued employment during the term of parole release, in order to reduce opportunities for reoffense and further victimization of the community.
289

Development of a gamma-ray beam profile monitor for the high-intensity gamma-ray source

Regier, Thomas Zachary 29 October 2003
Beam profile monitors provide position and flux distribution information to facilitate the configuration of an experimental apparatus and are an important component of any accelerator facility's beam diagnostic system. Nuclear physics experiments typically involve the incidence of high-energy particles or gamma-rays on some target material and the detection of the products of the ensuing interactions. Therefore, knowing the profile of the incident radiation beam is desirable. To address the need for a profile monitor for the High-Intensity Gamma-Ray Source, development of a CCD-based gamma-ray beam profiler was undertaken. The profiler consisted of a plastic scintillator, a lens system and a Starlight Express MX5 CCD camera, all contained within a light-tight box. The scintillation pattern, created by the interaction between the incident gamma-rays and the scintillator, could be focused onto the CCD. Simulations were used to determine the amount of power that would be absorbed for different beam energies and scintillator thicknesses. The use of a converter material, placed directly against the scintillator to improve power deposition, was also investigated. The system was tested in order to find the camera noise characteristics, the optical resolution and magnification, and the system's responsivity to power absorption in the scintillator. Using a 137Cs source, preliminary beam profiles were obtained. By combining the results of the testing and simulation, predictions of the required length of exposure were made. It was determined that a beam with a flux of 10^6/s and a diameter of 2.5 cm could be profiled, using 6.0 mm of plastic scintillator and 0.6 mm of iron converter, to within 5% error per 0.64 mm x 0.91 mm resolving unit, in less than 1 minute.
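One way to read the 5% figure (an interpretation assumed here, not stated in the abstract) is through Poisson counting statistics: if the error per resolving unit is dominated by photon counting, the relative uncertainty of N detected counts is 1/\sqrt{N}, so

    \frac{\sigma_N}{N} = \frac{1}{\sqrt{N}} = 0.05 \quad\Rightarrow\quad N = \left(\frac{1}{0.05}\right)^{2} = 400,

i.e. on the order of 400 counts per 0.64 mm x 0.91 mm resolving unit would have to be accumulated within the sub-minute exposure; camera noise and light-collection efficiency would only raise that requirement.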
