11

Development of integrated graphic user interface for 2D/3D MR spectroscopic imaging with LCModel

Yu, Meng-Hsueh 05 July 2007 (has links)
Magnetic Resonance Spectroscopy (MRS) can be applied to probe noninvasively the concentrations and distribution of metabolites in human tissue in vivo. As hardware and localization techniques improve, MRS is becoming increasingly important in clinical applications. Furthermore, post-processing software such as LCModel provides a graphical user interface for efficient and convenient analysis of MR spectroscopic imaging and thus increases the value of MRS applications. Although LCModel provides efficient analysis and produces stable results, it cannot generate metabolite maps for observing the distribution of metabolite concentrations. For this reason, our study processes the output data of LCModel together with Digital Imaging and Communications in Medicine (DICOM) format MR images to display 2D/3D metabolite maps. Users can use this software to observe the metabolic distribution in AP, SI and RL slices of brain tissue. Meanwhile, as absolute quantification of MRS plays an increasingly important role in clinical applications, this study also provides LCModel end users with an easy way to interpret the results.
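The core post-processing step described here, reading per-voxel concentration estimates from LCModel output and rendering them as a metabolite map over the anatomical DICOM slice, can be sketched briefly. The CSV column names, file names and grid size in the sketch are assumptions made for illustration, not the thesis's actual file layout or code.

```python
# Illustrative sketch only: the CSV layout (row, col, concentration) and file
# names are assumptions, not the actual LCModel output format used in the thesis.
import csv
import numpy as np
import pydicom
import matplotlib.pyplot as plt

def load_metabolite_map(csv_path, grid_shape):
    """Fill a 2D grid with per-voxel concentrations from a (row, col, value) CSV."""
    conc = np.full(grid_shape, np.nan)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            conc[int(row["row"]), int(row["col"])] = float(row["naa_conc"])
    return conc

# Overlay the metabolite map on an anatomical DICOM slice.
anatomy = pydicom.dcmread("slice_axial.dcm").pixel_array
naa_map = load_metabolite_map("lcmodel_fit.csv", grid_shape=(16, 16))

plt.imshow(anatomy, cmap="gray")
plt.imshow(naa_map, cmap="hot", alpha=0.5,
           extent=(0, anatomy.shape[1], anatomy.shape[0], 0))  # stretch grid over the slice
plt.colorbar(label="NAA concentration (a.u.)")
plt.show()
```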
12

Methods for high volume mixed signal circuit testing in the presence of resource constraints

Dasnurkar, Sachin 05 April 2013 (has links)
Analog and mixed signal device testing is resource intensive due to the spectral and temporal specifications of the input/output interface signals. These devices and circuits are commonly validated by parametric specification tests to ensure compliance with the required performance criteria. Analog signal complexity increases resource requirements for the Automatic Test Equipment (ATE) systems used for commercial testing, making mixed signal testing resource inefficient as compared to digital structural testing. This dissertation proposes and implements a test ecosystem to address these constraints where Built In Self Test (BIST) modules are designed for internal stimulus generation. Data learning and processing algorithms are developed for output response shaping. This modified output response is then compared against the established performance matrices to maintain test quality with low cost receiver hardware. BIST modules reduce dependence on ATE resources for stimulus and output observation while improving capability to test multiple devices in parallel. Data analysis algorithms are used to predict specification parameters based on learning methods applied to measurable device parameters. Active hardware resources can be used in conjunction with post processing resources to implement complex specification based tests within the hardware limitations. This dissertation reviews the results obtained with the consolidated approach of using BIST, output response analysis and active hardware resources to reduce test cost while maintaining test quality.
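The prediction step, estimating a costly specification parameter from cheaper measurable device parameters with a learned model, can be illustrated with a minimal sketch. The feature names, the synthetic data and the choice of a linear model are assumptions for illustration, not the dissertation's actual algorithms.

```python
# Minimal sketch of the prediction idea: estimate a costly specification
# parameter from cheaper measurable parameters. The features, synthetic data
# and linear model are assumptions, not the dissertation's actual algorithm.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical per-device measurements: offset voltage, bias current, gain error.
X = rng.normal(size=(500, 3))
# Hypothetical correlated specification parameter with measurement noise.
y = 0.8 * X[:, 0] - 0.3 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(scale=0.05, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)

# Devices whose predicted spec falls outside limits would be routed to full ATE testing.
print("R^2 on held-out devices:", model.score(X_test, y_test))
```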
13

An Image and Processing Comparison Study of Antialiasing Methods

Grahn, Alexander January 2016 (has links)
Context. Aliasing is a long-standing problem in computer graphics. It occurs because the graphics card cannot sample with infinite accuracy when rendering the scene, which causes the application to lose colour information for the pixels. This gives objects and textures unwanted jagged edges. Post-processing antialiasing methods are one way to reduce or remove these issues in real-time applications. Objectives. This study compares two popular post-processing antialiasing methods used in modern games today, i.e., fast approximate antialiasing (FXAA) and subpixel morphological antialiasing (SMAA). The main aim is to understand how both methods work and how they perform compared to each other. Methods. The two methods are implemented in a real-time application using DirectX 11.0. Images and processing data are collected, where the processing data consist of the screen update frequency, known as frames per second (FPS), and the elapsed time on the graphics processing unit (GPU). Conclusions. FXAA has difficulty handling diagonal edges well but shows only minor graphical artefacts on vertical and horizontal edges. The method can produce unwanted blur along edges. The edge pattern detection in SMAA makes it able to handle all directions well. The performance results show that FXAA does not lose much FPS and is quick: FXAA is at least three times faster than SMAA on the GPU.
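A simplified illustration of the kind of per-pixel luminance-contrast test that FXAA-style filters use to decide which pixels to blend is sketched below; the threshold and the numpy formulation are assumptions for illustration and do not reproduce the thesis's DirectX 11 implementation.

```python
# Simplified illustration of the luminance-contrast edge test that FXAA-style
# filters apply per pixel before blending; threshold and formulation are
# illustrative, not the thesis's DirectX 11 implementation.
import numpy as np

def luminance(rgb):
    # Perceptual luma approximation from RGB.
    return rgb @ np.array([0.299, 0.587, 0.114])

def edge_mask(image, threshold=0.0625):
    """Mark pixels whose local luma contrast exceeds a threshold."""
    luma = luminance(image.astype(np.float32) / 255.0)
    north = np.roll(luma, -1, axis=0)
    south = np.roll(luma, 1, axis=0)
    west = np.roll(luma, 1, axis=1)
    east = np.roll(luma, -1, axis=1)
    local_max = np.maximum.reduce([luma, north, south, west, east])
    local_min = np.minimum.reduce([luma, north, south, west, east])
    return (local_max - local_min) > threshold  # only these pixels get blended

demo = (np.random.default_rng(1).random((64, 64, 3)) * 255).astype(np.uint8)
print("pixels flagged for antialiasing:", int(edge_mask(demo).sum()))
```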
14

Hardware Realization of Chaos Based Symmetric Image Encryption

Barakat, Mohamed L. 06 1900 (has links)
This thesis presents novel work on hardware realization of symmetric image encryption utilizing chaos-based continuous systems as pseudo random number generators. Digital implementation of chaotic systems results in serious degradations in the dynamics of the system. Such defects are eliminated through a new technique of generalized post-processing with very low hardware cost. The thesis further discusses two encryption algorithms designed and implemented as a block cipher and a stream cipher. The security of both systems is thoroughly analyzed and the performance is compared with other reported systems, showing superior results. Both systems are realized on a Xilinx Virtex-4 FPGA with hardware and throughput performance surpassing known encryption systems.
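The general idea, a continuous chaotic system driving a stream cipher, can be illustrated with a toy sketch: a Lorenz system is integrated with fixed-step Euler, its state is quantized to key bytes, and the keystream is XORed with the image. The parameters and quantization are assumptions for illustration; this is not the thesis's cipher and is not cryptographically secure.

```python
# Toy illustration of a continuous chaotic system driving a stream cipher.
# Parameters and quantization are assumptions; this is NOT the thesis's cipher
# and is not cryptographically secure.
import numpy as np

def lorenz_keystream(n_bytes, state=(0.1, 0.0, 0.0), dt=0.005):
    x, y, z = state
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
    out = np.empty(n_bytes, dtype=np.uint8)
    for i in range(n_bytes):
        # One Euler step of the Lorenz equations.
        dx, dy, dz = sigma * (y - x), x * (rho - z) - y, x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        # Quantize the x component to one key byte.
        out[i] = np.uint8(int(abs(x) * 1e6) % 256)
    return out

image = np.random.default_rng(2).integers(0, 256, size=(32, 32), dtype=np.uint8)
keystream = lorenz_keystream(image.size).reshape(image.shape)
cipher = image ^ keystream                        # encrypt
assert np.array_equal(cipher ^ keystream, image)  # decrypt restores the image
```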
15

Monitoring Tools File Specification

Vogelsang, Stefan 22 March 2016 (has links) (PDF)
This paper describes the format of monitoring data files that are collected for external measuring sites and at laboratory experiments at the Institute for Building Climatology (IBK). The Monitoring Data Files are containers for storing time series or event driven data collected as input for transient heat and moisture transport simulations. Further applications are the documentation of real world behaviour, laboratory experiments or the collection of validation data sets for simulation results (whole building / energy consumption / HAM). The article also discusses the application interface towards measurement data verification tools as well as data storage solutions that can be used to archive measurement data files conveniently and efficiently.
16

Jämförelse av korta temperaturprognoser från SMHI och Meteorologisk institutt med fokus på post-processingmetodikens betydelse för prognoskvaliteten / Comparison of Short-Range Temperature Forecasts from SMHI and the Norwegian Meteorological Institute - Focus on the Importance of Post-Processing Methods for the Quality of the Forecasts

Petersson, Sofie January 2019 (has links)
Temperature forecasts are of great importance to many in today's society, both private individuals and various sectors. Expectations that the forecasts maintain high accuracy are high, and good forecast quality matters in many respects. The numerical weather models used to produce forecasts have deficiencies that almost always lead to systematic errors in the forecasts, for example due to poor representation of the physical processes of the atmosphere. To correct and reduce these errors, the forecasts are post-processed with various methods. To minimize the systematic errors and increase forecast accuracy, both the models and the post-processing methods are continuously developed and improved. Follow-up and evaluation of forecasts is of great benefit to this development, which aims at minimizing forecast errors and optimizing the model and methodology. In this study, temperature forecast data with a forecast length of 0-12 hours from the Swedish Meteorological and Hydrological Institute (SMHI) and the Norwegian Meteorological Institute (met.no) were compared with measured 2 m temperature values. Observed temperature data from 22 synoptic weather stations scattered across Sweden were used, and the study covers the period from 20 February to 31 May 2018. Statistical measures, with the main focus on the correlation coefficient and the bias, were analyzed and compared to examine similarities and differences between the temperature forecasts from the two institutes. The results show that met.no's temperature forecasts generally have slightly higher accuracy than SMHI's for most of the 22 locations. For most of the stations in the mountains and in northern Sweden, forecasts from both institutes generally have lower accuracy for February than for March, April and May.
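The two verification measures highlighted in the study, bias and the correlation coefficient, can be computed as in the following sketch; the values are made-up placeholders, not data from SMHI or met.no.

```python
# Minimal sketch of the verification measures named in the abstract (bias and
# correlation coefficient) for one station; the arrays are made-up placeholder
# values, not data from SMHI or met.no.
import numpy as np

forecast_t2m = np.array([1.2, 0.5, -2.1, 3.4, 4.0, 2.2])   # forecast 2 m temperature (deg C)
observed_t2m = np.array([0.8, 0.9, -1.5, 3.0, 4.6, 2.0])   # observed 2 m temperature (deg C)

bias = np.mean(forecast_t2m - observed_t2m)                 # mean systematic error
corr = np.corrcoef(forecast_t2m, observed_t2m)[0, 1]        # Pearson correlation coefficient

print(f"bias = {bias:+.2f} deg C, correlation = {corr:.3f}")
```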
17

How post-processing effects imitating camera artifacts affect the perceived realism and aesthetics of digital game graphics / Hur post-processing effekter som imiterar kamera-artefakter påverkar den uppfattade realismen och estetiken hos digital spelgrafik

Raud, Charlie January 2018 (has links)
This study investigates how post-processing effects affect the realism and aesthetics of digital game graphics. Four focus groups explored a digital game environment and were exposed to various post-processing effects. During qualitative interviews the focus groups were asked about their experience and preferences, and the results were analysed. The results illustrate some of the pros and cons of these popular post-processing effects and could help graphical artists and game developers use this tool (post-processing effects) as effectively as possible in the future.
18

Zobrazování scény v moderních počítačových hrách / Scene Rendering in Modern Computer Games

Wilczák, Martin January 2011 (has links)
This thesis describes methods for lighting calculations in large scenes used in modern computer games. Forward shading and deferred shading methods are discussed and compared, and the capabilities of ray tracing are briefly described. Information is also given about various methods for casting shadows, simulating particle systems and applying post-processing effects. Finally, an architecture for rendering complex scenes with XNA is designed, and the implementation used in the resulting game is described.
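A back-of-envelope comparison hints at why deferred shading is attractive for scenes with many lights: forward shading evaluates the lights for every rasterized fragment of every object, while deferred shading evaluates them once per screen pixel from the G-buffer. The counts below are made-up example numbers, not measurements from the thesis.

```python
# Back-of-envelope illustration of forward vs. deferred shading cost.
# All counts are made-up example numbers, not measurements from the thesis.
screen_pixels = 1920 * 1080
shaded_fragments = 3 * screen_pixels   # assumed average overdraw of 3x
num_lights = 64

forward_cost = shaded_fragments * num_lights                 # light loop in every object's shader
deferred_cost = screen_pixels + screen_pixels * num_lights   # G-buffer fill + one lighting pass

print(f"forward  ~ {forward_cost:,} light evaluations")
print(f"deferred ~ {deferred_cost:,} light evaluations")
```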
19

Monitoring Tools File Specification: Version 1.0

Vogelsang, Stefan January 2016 (has links)
This paper describes the format of monitoring data files that are collected for external measuring sites and at laboratory experiments at the Institute for Building Climatology (IBK). The Monitoring Data Files are containers for storing time series or event driven data collected as input for transient heat and moisture transport simulations. Further applications are the documentation of real world behaviour, laboratory experiments or the collection of validation data sets for simulation results (whole building / energy consumption / HAM). The article also discusses the application interface towards measurement data verification tools as well as data storage solutions that can be used to archive measurement data files conveniently and efficiently. Contents: 1 Introduction; 2 File Name Conventions; 3 Headers; 3.1 Specifics on Time Series Header Files; 3.2 Specifics on Event Driven Header Files; 4 Data Section Format Description; 5 SI Unit Strings; 6 Competition Law Advice; 7 Liability for external Links
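As a rough illustration of how such a container might be read, the sketch below parses a file with a key/value header (including an SI unit string) followed by a time-series data section. The exact layout is an assumption for illustration, not the IBK specification itself.

```python
# Illustrative reader for a monitoring data file with a key/value header and a
# time-series data section. The layout (header keys, delimiter, unit string) is
# an assumption for illustration, not the IBK specification itself.
from io import StringIO

SAMPLE = StringIO(
    "QUANTITY = AirTemperature\n"
    "UNIT = C\n"                      # SI unit string for the series
    "DATA\n"
    "2016-03-22 00:00:00\t20.4\n"
    "2016-03-22 01:00:00\t20.1\n"
)

def read_monitoring_file(stream):
    header, rows, in_data = {}, [], False
    for line in stream:
        line = line.rstrip("\n")
        if line == "DATA":
            in_data = True                      # header ends, data section begins
        elif not in_data and "=" in line:
            key, value = (s.strip() for s in line.split("=", 1))
            header[key] = value
        elif in_data and line:
            timestamp, value = line.split("\t")
            rows.append((timestamp, float(value)))
    return header, rows

header, rows = read_monitoring_file(SAMPLE)
print(header["QUANTITY"], header["UNIT"], len(rows), "samples")
```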
20

AN UPDATE ON NETWORK-BASED SECURITY TECHNOLOGIES APPLICABLE TO TELEMETRY POST-PROCESSING AND ANALYSIS ACTIVITIES

Kalibjian, Jeff 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Network-based technologies (i.e. TCP/IP) have come to play an important role in the evolution of telemetry post-processing services. A paramount issue when using networking to access and move telemetry data is security. In past years, papers have focused on individual security technologies and how they could be used to secure telemetry data. This paper reviews currently available network-based security technologies, updates readers on enhancements, and discusses their appropriate uses in the various phases of telemetry post-processing and analysis activities.
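One example of the kind of network-based security technology the paper surveys is TLS for protecting telemetry data in transit; a minimal client sketch is shown below, with a placeholder host, port and request that are not taken from the paper.

```python
# Minimal sketch of moving telemetry data over a TLS-protected TCP connection.
# Host name, port and request are placeholders, not from the paper.
import socket
import ssl

HOST, PORT = "telemetry.example.org", 443  # hypothetical analysis server

context = ssl.create_default_context()     # verifies the server certificate
with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("negotiated", tls_sock.version())          # e.g. 'TLSv1.3'
        tls_sock.sendall(b"GET /telemetry/latest HTTP/1.1\r\n"
                         b"Host: telemetry.example.org\r\n\r\n")
        print(tls_sock.recv(256))
```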
