21

Att avsudda bilder / Image deblurring

Kienmayer, Mattis January 2019 (has links)
This thesis touches on the problem of blurred images and what can be done to try to recover the sought-after sharp image. The problem is tackled with linear-algebraic means, using both regularization and iterative methods. The results show that DFPM (dynamical functional particle method) is comparable to both the conjugate gradient method and the LSQR algorithm (least squares QR), although with parameters other than the theoretically optimal ones. In addition to previously known methods, an alternative minimization problem is also presented and tested.
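The approach the abstract describes can be sketched in a few lines: a known blur operator, Tikhonov-style regularization, and the iterative LSQR solver. The 1-D problem, blur width, noise level, and damping parameter below are illustrative choices, not values from the thesis.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import lsqr

n = 64
x_true = np.zeros(n)
x_true[20:40] = 1.0                              # a simple "sharp image"

# Gaussian blur matrix: each output pixel is a weighted local average.
i = np.arange(n)
A = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 2.0) ** 2)
A /= A.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
b = A @ x_true + 1e-3 * rng.standard_normal(n)   # blurred, noisy data

# LSQR's `damp` argument solves min ||A x - b||^2 + damp^2 ||x||^2,
# i.e. zeroth-order Tikhonov regularization.
x_rec = lsqr(csr_matrix(A), b, damp=1e-2)[0]

print(np.linalg.norm(b - x_true), np.linalg.norm(x_rec - x_true))
```

The damping parameter plays the role of the regularization parameter: too small and the inverted noise dominates, too large and the image stays blurred.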
22

Desenvolvimento de preditores para recomendação automática de produtos. / Development of predictors for automated products recommendation.

Fuks, Willian Jean 28 May 2013 (has links)
Com o avanço da internet, novos tipos de negócios surgiram. Por exemplo, o sistema de anúncios online: produtores de sites e diversos outros conteúdos podem dedicar em uma parte qualquer de sua página um espaço para a impressão de anúncios de diversas lojas em troca de um valor oferecido pelo anunciante. É neste contexto que este trabalho se insere. O objetivo principal é o desenvolvimento de algoritmos que preveem a probabilidade que um dado usuário tem de se interessar e clicar em um anúncio a que está sendo exposto. Este problema é conhecido como predição de CTR (do inglês, "Click-Through Rate") ou taxa de conversão. Utiliza-se para isto uma abordagem baseada em regressão logística integrada a técnicas de fatoração de matriz que preveem, através da obtenção de fatores latentes do problema, a probabilidade de conversão para um anúncio impresso em dado site. Além disto, testes considerando uma estratégia dinâmica (em função do tempo) são apresentados indicando que o desempenho previamente obtido pode melhorar ainda mais. De acordo com o conhecimento do autor, esta é a primeira vez que este procedimento é relatado na literatura. / With the popularization of the internet, new types of business are emerging. An example is the online advertising system: publishers can dedicate any given space on their websites to the display of banners from different stores in exchange for a fee paid by the advertiser. It is in this context that this work takes place. Its main goal is the development of algorithms that forecast the probability that a given user will be interested in the ad he or she is seeing and click on it. This problem is known as the CTR ("Click-Through Rate") prediction task. To do so, a logistic regression approach is combined with matrix factorization techniques that predict, through latent factor models, the probability that the click will occur. On top of that, several tests conducted with a dynamic (time-varying) approach indicate that the performance can improve even further. To the author's knowledge, this is the first time this procedure has been reported in the CTR prediction literature.
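The combination the abstract describes, logistic regression plus matrix-factorization latent factors, can be sketched with synthetic data. This is a generic factorized logistic model, not the author's exact formulation: p(click | user u, ad a) = sigmoid(b + w_u + w_a + <P[u], Q[a]>), trained by SGD on log loss.

```python
import numpy as np

rng = np.random.default_rng(42)
n_users, n_ads, k = 50, 20, 4

# Synthetic click log: (user, ad, clicked) triples with latent structure.
P_true = rng.normal(size=(n_users, k))
Q_true = rng.normal(size=(n_ads, k))
users = rng.integers(0, n_users, 5000)
ads = rng.integers(0, n_ads, 5000)
logits = np.einsum("ij,ij->i", P_true[users], Q_true[ads])
clicks = (rng.random(5000) < 1 / (1 + np.exp(-logits))).astype(float)

# SGD training of bias terms plus latent factors.
b, w_u, w_a = 0.0, np.zeros(n_users), np.zeros(n_ads)
P = 0.1 * rng.normal(size=(n_users, k))
Q = 0.1 * rng.normal(size=(n_ads, k))
lr = 0.05
for _ in range(10):                       # epochs
    for u, a, y in zip(users, ads, clicks):
        z = b + w_u[u] + w_a[a] + P[u] @ Q[a]
        g = 1 / (1 + np.exp(-z)) - y      # gradient of log loss w.r.t. z
        b -= lr * g
        w_u[u] -= lr * g
        w_a[a] -= lr * g
        P[u], Q[a] = P[u] - lr * g * Q[a], Q[a] - lr * g * P[u]

# Training log loss should beat the constant click-rate baseline.
z = b + w_u[users] + w_a[ads] + np.einsum("ij,ij->i", P[users], Q[ads])
p = np.clip(1 / (1 + np.exp(-z)), 1e-9, 1 - 1e-9)
loss = -np.mean(clicks * np.log(p) + (1 - clicks) * np.log(1 - p))
base = np.clip(clicks.mean(), 1e-9, 1 - 1e-9)
base_loss = -np.mean(clicks * np.log(base) + (1 - clicks) * np.log(1 - base))
print(loss, base_loss)
```

The latent inner product is what lets the model generalize across user-ad pairs never seen together, which plain logistic regression on one-hot IDs cannot do.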
23

FPGA-Based Co-processor for Singular Value Array Reconciliation Tomography

Coyne, Jack W 05 September 2007 (has links)
"This thesis describes a co-processor system that has been designed to accelerate computations associated with Singular Value Array Reconciliation Tomography (SART), a method for locating a wide-band RF source which may be positioned within an indoor environment, where RF propagation characteristics make source localization very challenging. The co-processor system is based on field programmable gate array (FPGA) technology, which offers a low-cost alternative to customized integrated circuits, while still providing the high performance, low power, and small size associated with a custom integrated solution. The system has been developed in VHDL, and implemented on a Virtex-4 SX55 FPGA development platform. The system is easy to use, and may be accessed through a C program or MATLAB script. Compared to a Pentium 4 CPU running at 3 GHz, use of the co-processor system provides a speed-up of about 6 times for the current signal matrix size of 128-by-16. Greater speed-ups may be obtained by using multiple devices in parallel. The system is capable of computing the SART metric to an accuracy of about -145 dB with respect to its true value. This level of accuracy, which is shown to be better than that obtained using single precision floating point arithmetic, allows even relatively weak signals to make a meaningful contribution to the final SART solution."
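The numerical kernel behind that accuracy figure can be illustrated in software. The sketch below (numpy, purely for illustration; SART's exact metric is not spelled out in the abstract) computes the dominant singular value of a 128-by-16 complex signal matrix, the matrix size quoted above, and expresses a reduced-precision recomputation's error in dB relative to the double-precision value, the scale on which the thesis reports -145 dB.

```python
import numpy as np

rng = np.random.default_rng(1)
# A stand-in 128-by-16 complex signal matrix of the size the thesis uses.
S = rng.normal(size=(128, 16)) + 1j * rng.normal(size=(128, 16))

sigma = np.linalg.svd(S, compute_uv=False)   # singular values, descending
metric = sigma[0]                            # dominant singular value

# Error of a single-precision recomputation, in dB relative to "truth".
metric32 = np.linalg.svd(S.astype(np.complex64), compute_uv=False)[0]
err_db = 20 * np.log10(abs(metric32 - metric) / metric)
print(err_db)
```

Since single-precision machine epsilon is about 1.2e-7, errors of this kind land in the rough vicinity of -130 dB, which is consistent with the thesis's claim that -145 dB accuracy beats single-precision floating point.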
24

The role of blood brain barrier failure in progression of cerebral small vessel disease : a detailed magnetic resonance imaging study

Wang, Xin January 2014 (has links)
Small vessel disease (SVD) is an important cause of stroke, cognitive decline, and age-related disability. The cause of SVD is unknown, but increasing evidence from neuropathology and neuroimaging suggests that failure of the blood-brain barrier (BBB) precipitates or worsens cerebral SVD progression, and that BBB failure is associated with SVD features such as white matter hyperintensities (WMH), perivascular spaces (PVS) and lacunar infarcts. The same BBB mechanism may also contribute to other common disorders of ageing such as Alzheimer's disease (AD). Magnetic resonance imaging (MRI) has revolutionised our understanding of SVD features and contributes to a better understanding of SVD pathophysiology and its clinical correlates. The purpose of this project was to better understand the pathogenesis of SVD, which requires improved understanding of BBB structure and pathophysiology and accurate measurement of cerebral SVD imaging characteristics on MRI scans. We aimed to assess (1) structures related to the BBB and factors that affect the BBB; (2) an efficient and consistent WMH measurement method; (3) the effect of stroke lesions on WMH and cerebral atrophy progression; (4) development and optimisation of a computational PVS measurement method; and (5) the relationships between PVS and SVD, blood markers, and BBB permeability.

Section one describes the structure and pathophysiology of the BBB. I reviewed the structural and functional components of the BBB from the perspective of the neurovascular unit, PVS, and junctional proteins; the PVS review was conducted as a systematic search. I also reviewed common stimuli for BBB permeability, including inflammation and ischaemia, and systematically summarised ischaemic triggers of BBB permeability. Based on this literature, I summarised changes in junctional proteins in ischaemia, inflammatory pain and AD models.

Section two describes accurate measurement of WMH progression and atrophy. I used data from 100 patients who participated in a stroke study of BBB permeability changes in lacunar versus cortical stroke. To find the most efficient and consistent WMH measurement method, we tested several computational methods and the effect of common processing steps, including bias field correction and intensity adjustment. To limit the effect of artefacts, I conducted a systematic search on artefacts and tested image segmentation methods to avoid WMH artefacts as far as possible. To investigate the effect of stroke lesions on WMH and atrophy progression, I performed WMH and atrophy segmentation and stroke lesion measurements in a subgroup of 46 patients with follow-up scans, and showed that stroke lesions distort the measurement of WMH and atrophy progression and should be excluded.

Section three describes the development and optimisation of a computational PVS measurement method, which measures PVS count and volume with a threshold-based method using Analyze™ software. We tested observer variability and validated the method by comparison with visual rating scores. We investigated the associations of the PVS results with other SVD features (WMH, atrophy), risk factors (hypertension, smoking and diabetes), blood markers, and BBB permeability.

In conclusion, MRI is a valuable tool for investigating cerebral SVD features and BBB permeability. Excluding artefacts and stroke lesions is important for accurate measurement of WMH. PVS are important features of BBB abnormality; they correlate and share risk factors with other SVD features and should be considered a marker of SVD and BBB permeability. Further systematic histological and ultrastructural studies of the BBB are desirable for understanding the BBB across the different parts of the cerebral vascular tree.
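The threshold-based count-and-volume measurement described for PVS can be sketched generically: segment hyperintense voxels by an intensity cutoff, then count and volume the connected components. The threshold, voxel size, and synthetic volume below are illustrative, not the study's values or software.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
img = rng.normal(100, 5, size=(40, 40, 40))   # synthetic MRI-like volume
img[5:8, 5:8, 5:8] = 160                      # two bright, PVS-like
img[20:22, 20:25, 20:22] = 170                # clusters planted in it

threshold = 140                               # illustrative intensity cutoff
mask = img > threshold
labels, count = ndimage.label(mask)           # connected-component labelling
voxel_volume = 1.0                            # mm^3 per voxel, illustrative
volumes = ndimage.sum_labels(mask, labels, index=range(1, count + 1))
print(count, volumes * voxel_volume)
```

Real pipelines add steps this sketch omits, e.g. restricting the mask to anatomical regions of interest and filtering components by shape, which is where artefact exclusion enters.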
25

Statistický model tvaru obličeje / Statistical model of the face shape

Boková, Kateřina January 2019 (has links)
The goal of this thesis is to apply machine learning methods to datasets of scanned faces and to create a program that allows faces, represented as triangle meshes, to be explored and edited with a number of controls. First we reduced the dimensionality of the triangle meshes with PCA; then we predicted mesh shape from physical properties such as weight, height, age and BMI. The modelled faces can be used in animation or games.
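The two-step pipeline in the abstract, PCA for dimension reduction followed by prediction from physical covariates, can be sketched with synthetic stand-in data (random vectors in place of real scans; the mesh size, component count, and linear regressor are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(3)
n_faces, n_vertices, k = 200, 500, 10

X = rng.normal(size=(n_faces, n_vertices * 3))   # flattened vertex coords
covariates = rng.normal(size=(n_faces, 4))       # weight, height, age, BMI

# PCA via SVD of the centred data matrix.
mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
coeffs = (X - mean) @ Vt[:k].T                   # per-face PCA coefficients

# Linear prediction of PCA coefficients from the covariates (+ intercept).
A = np.hstack([covariates, np.ones((n_faces, 1))])
W, *_ = np.linalg.lstsq(A, coeffs, rcond=None)

# Reconstruct a predicted mesh for a new person from their covariates.
new_person = np.array([[0.2, -1.0, 0.5, 0.1, 1.0]])
predicted_mesh = mean + (new_person @ W) @ Vt[:k]
print(predicted_mesh.shape)
```

The same PCA coefficients double as the "number of controls" for interactive editing: sliding one coefficient and reconstructing moves the mesh along one principal mode of face-shape variation.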
26

Att avsudda bilder: Filtrering och iterativa metoder / Image deblurring

Kienmayer, Mattis January 2019 (has links)
This thesis touches on the problem of blurred images and what can be done to try to recover the sought-after sharp image. The problem is tackled with linear-algebraic means, using both regularization and iterative methods. The results show that DFPM (dynamical functional particle method) is comparable to both the conjugate gradient method and the LSQR algorithm (least squares QR), although with parameters other than the theoretically optimal ones. In addition to previously known methods, an alternative minimization problem is also presented and tested.
27

Role of Stochastic Forcing in ENSO Variability in a Coupled GCM

Kapur, Atul 01 January 2008 (has links)
A systematic procedure is designed to evaluate the role of stochastic forcing (SF) in the El Niño-Southern Oscillation (ENSO) exhibited by coupled ocean-atmosphere general circulation models (CGCMs). The procedure is applied to a 163-year run of a CGCM that has a realistic representation of ENSO and SF. The realism of ENSO in the CGCM is determined by statistical comparison with a 29-year global reanalysis product. SF is extracted from both the CGCM and the reanalysis as residual atmospheric variability uncoupled from the ocean. Further, the Madden-Julian Oscillation (MJO) and non-MJO components are isolated from the SF. The CGCM stochastic components are compared to those from the reanalysis to validate their representation. A coupled ocean-atmosphere model of intermediate complexity is first forced with the stochastic components from the reanalysis. The resulting ENSO is examined for realism to evaluate the strengths and weaknesses of the intermediate coupled model, which is then forced with the stochastic components from the CGCM. Results are diagnosed to investigate the role of SF. It is found that SF can play an important role in ENSO in the CGCM, especially in its warm events. The role is similar to that of reanalysis SF in generating the ENSO period and the spring predictability barrier. However, unlike in the case of the reanalysis, the seasonal dependence of ENSO variance in the CGCM does not seem to originate from its SF. The contribution to the statistics appears to be higher from the MJO component of SF than from the non-MJO component. The intermediate model simulations also suggest that, both in the CGCM and in nature, SF operates on a weakly stable coupled system to produce ENSO variability.
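One step of the procedure, isolating the MJO component of a stochastic-forcing time series, is commonly done by band-pass filtering daily data in the 30-90-day band. The abstract does not specify the filter used, so the Butterworth design and band edges below are illustrative assumptions applied to a synthetic series.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(7)
days = np.arange(2000)
mjo_like = np.sin(2 * np.pi * days / 45)          # 45-day oscillation
forcing = mjo_like + rng.normal(0, 1, days.size)  # buried in white noise

# Zero-phase 30-90-day band-pass filter (fs = 1 sample per day).
sos = signal.butter(4, [1 / 90, 1 / 30], btype="bandpass",
                    fs=1.0, output="sos")
mjo_component = signal.sosfiltfilt(sos, forcing)
non_mjo = forcing - mjo_component                 # residual, non-MJO part

r = np.corrcoef(mjo_component, mjo_like)[0, 1]
print(r)
```

Splitting the forcing into `mjo_component` and `non_mjo` this way lets each part drive the intermediate coupled model separately, which is how their relative contributions to ENSO statistics can be compared.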
28

Oceanic-Atmospheric and Hydrologic Variability in Long Lead-Time Forecasting

Oubeidillah, Abdoul Aziz 01 August 2011 (has links)
Water managers throughout the world are challenged with managing scarce resources and therefore rely heavily on forecasts to allocate and meet various water demands. The need for improved streamflow and snowpack forecast models is of the utmost importance. In this research, the use of oceanic and atmospheric variables as predictors was investigated to improve the long lead-time (three- to nine-month) forecast of streamflow and snowpack. Singular Value Decomposition (SVD) analysis was used to identify a region of Pacific and Atlantic Ocean SSTs and a region of 500 mbar geopotential height (Z500mb) that were teleconnected with streamflow and snowpack. The resulting Pacific and Atlantic Ocean SST and Z500mb regions were used to create indices that were then used as predictors in a non-parametric forecasting model. The majority of forecasts resulted in positive statistical skill, which indicated an improvement of the forecast over the climatology or no-skill forecast. The results indicated that indices derived from SSTs were better suited for long lead-time (six- to nine-month) forecasts of streamflow and snowpack, while the indices derived from Z500mb improved short lead-time (three-month) forecasts. In all, the results of the forecast model indicated that incorporating oceanic-atmospheric climatic variability in forecast models can lead to improved forecasts for both streamflow and snowpack.
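The SVD teleconnection analysis described above can be sketched on synthetic data: take the SVD of the cross-covariance matrix between an SST field and a set of streamflow stations, and use the leading paired patterns' expansion coefficients as the forecast indices. Field sizes and the planted shared mode are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n_years, n_grid, n_stations = 60, 300, 12

# Plant one shared mode connecting an SST pattern to a flow pattern.
t = rng.normal(size=n_years)
sst_pattern = rng.normal(size=n_grid)
flow_pattern = rng.normal(size=n_stations)
sst = np.outer(t, sst_pattern) + rng.normal(size=(n_years, n_grid))
flow = np.outer(t, flow_pattern) + rng.normal(size=(n_years, n_stations))

# Anomalies, then SVD of the cross-covariance matrix.
sst -= sst.mean(axis=0)
flow -= flow.mean(axis=0)
C = sst.T @ flow / (n_years - 1)
U, s, Vt = np.linalg.svd(C, full_matrices=False)

# Expansion coefficients of the leading mode; their correlation measures
# the teleconnection strength that the derived index exploits.
sst_index = sst @ U[:, 0]
flow_index = flow @ Vt[0]
r = abs(np.corrcoef(sst_index, flow_index)[0, 1])
print(r)
```

In the forecasting application the SST index would be computed at a lead of several months ahead of the streamflow it predicts, turning the contemporaneous pattern here into a lagged predictor.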
29

Classification System for Impedance Spectra

Sapp, Carl Gordon 01 May 2011 (has links)
This thesis documents research, methods, and results to satisfy the requirements for the M.S. degree in Electrical Engineering at the University of Tennessee. It explores two primary steps for proper classification of impedance spectra: data dimension reduction, and the effectiveness of similarity/dissimilarity measures for classification. To understand the data characteristics and classification thresholds, a circuit-model analysis is studied for simulation and for determining unclassifiable spectra. The research uses previously collected complex-valued impedance measurements from 1844 similar devices. The results show a classification system capable of properly classifying 99% of data samples with well-separated data, and approximately 85% using the full range of data available to this research.
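The two steps named above can be sketched end to end: reduce complex-valued impedance spectra to a few dimensions, then classify by a dissimilarity measure against class prototypes. The parallel-RC circuit model, two synthetic device classes, and Euclidean measure below are illustrative assumptions standing in for the thesis's 1844 measured devices.

```python
import numpy as np

rng = np.random.default_rng(9)
freqs = np.logspace(1, 5, 100)

def rc_spectrum(R, C):
    """Impedance of a parallel RC circuit, a simple device model."""
    return R / (1 + 1j * 2 * np.pi * freqs * R * C)

# Two device classes with slightly different resistance distributions.
class_a = np.array([rc_spectrum(100 * (1 + 0.02 * rng.normal()), 1e-6)
                    for _ in range(50)])
class_b = np.array([rc_spectrum(120 * (1 + 0.02 * rng.normal()), 1e-6)
                    for _ in range(50)])

# Stack real and imaginary parts, then PCA for dimension reduction.
X = np.vstack([class_a, class_b])
F = np.hstack([X.real, X.imag])
F -= F.mean(axis=0)
_, _, Vt = np.linalg.svd(F, full_matrices=False)
Z = F @ Vt[:3].T                       # 3-D representation per spectrum

# Nearest-prototype classification with Euclidean dissimilarity.
proto_a, proto_b = Z[:50].mean(axis=0), Z[50:].mean(axis=0)
pred = np.where(np.linalg.norm(Z - proto_a, axis=1)
                < np.linalg.norm(Z - proto_b, axis=1), "a", "b")
truth = np.array(["a"] * 50 + ["b"] * 50)
accuracy = (pred == truth).mean()
print(accuracy)
```

An "unclassifiable" outcome of the kind the thesis studies would correspond to a spectrum whose dissimilarity to every prototype exceeds a threshold calibrated from the circuit-model analysis.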
30

Abrahams barn : En studie av religionsnyheternas förändring mellan 2002 och 2012 – med exempel från Svenska Dagbladet och Sydsvenskan

Atallah, Carol, P. Munther, Victor January 2012 (has links)
The aim of this study is to use a quantitative content analysis to determine which article types and news contexts occur most often for Judaism, Christianity and Islam (the three Abrahamic religions) in the newspapers Svenska Dagbladet (SvD) and Sydsvenskan (SDS) for the periods April 2002, October 2002 and the corresponding months in 2012. To gain a deeper understanding of the results of the quantitative content analysis, interviews were conducted with an editorial writer from each newspaper and with a representative of each religion. Among other things, the interviewees gave their views on how religion news looks today compared with ten years ago. The quantitative study comprises 283 analyzed articles, of which 118 deal with Islam, 109 with Christianity and 56 with Judaism. Christianity dominates the domestic news and Islam the foreign news. Among the domestic articles concerning Islam, the context of discrimination dominates, while culture and antisemitism were the most prominent contexts for Christianity and Judaism respectively. For all religions, the dominant contexts in the foreign articles are "other" and "war or conflict"; Islam dominates in both categories. The largest differences between the two years are the number of articles about the Israel-Palestine conflict and the occurrence of articles concerning antisemitism. Analyzed in the light of previous research and news-valuation theories, the results suggest that religious events are difficult to turn into news, which is why the religions, and Islam and Judaism in particular, often appear in conflict-centred news; many of the interviewees agreed with this.
