  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
391

Convex Large Margin Training - Unsupervised, Semi-supervised, and Robust Support Vector Machines

Xu, Linli January 2007 (has links)
Support vector machines (SVMs) have been a dominant machine learning technique for more than a decade. The intuitive principle behind SVM training is to find the maximum margin separating hyperplane for a given set of binary labeled training data. Previously, SVMs have been primarily applied to supervised learning problems, where target class labels are provided with the data. Developing unsupervised extensions to SVMs, where no class labels are given, turns out to be a challenging problem. In this dissertation, I propose a principled approach to unsupervised and semi-supervised SVM training by formulating a convex relaxation of the natural training criterion: find a (constrained) labeling that would yield an optimal SVM classifier on the resulting labeled training data. The relaxation yields a semidefinite program (SDP) that can be solved in polynomial time. The resulting training procedures can be applied to two-class and multi-class problems, and ultimately to the multivariate case, achieving high quality results in each case. In addition to unsupervised training, I also consider the problem of reducing the outlier sensitivity of standard supervised SVM training. Here I show that a similar convex relaxation can be applied to improve the robustness of SVMs by explicitly suppressing outliers in the training process. The proposed approach achieves results superior to those of standard SVMs in the presence of outliers.
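For context, a minimal sketch of the supervised soft-margin training problem that serves as the starting point here; in the unsupervised setting the labels become variables, and relaxing the label matrix M = yy^T to a positive semidefinite matrix with unit diagonal is what yields the SDP described above. The data, solver, and parameter values below are illustrative assumptions, not the thesis's experimental setup:

```python
import numpy as np
import cvxpy as cp

# Synthetic two-class data (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.5, 1, (20, 2)), rng.normal(1.5, 1, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])

n, d = X.shape
w = cp.Variable(d)
b = cp.Variable()
xi = cp.Variable(n, nonneg=True)   # slack variables for the soft margin
C = 1.0                            # regularization trade-off (assumed)

# Maximize the margin (minimize ||w||^2) subject to correct
# classification up to slack: y_i (w^T x_i + b) >= 1 - xi_i.
constraints = [cp.multiply(y, X @ w + b) >= 1 - xi]
prob = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi)),
                  constraints)
prob.solve()
print("margin width:", 2 / np.linalg.norm(w.value))
```

In the unsupervised variant, this inner problem is minimized over labelings as well, and the convex relaxation replaces the combinatorial search over y with a search over the relaxed matrix M.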
392

The Power Landmark Vector Learning Framework

Xiang, Shuo 07 May 2008 (has links)
Kernel methods have recently become popular in machine learning for bioinformatics. Kernel methods allow linear algorithms to be applied to non-linear learning situations. By using kernels, non-linear learning problems can benefit from the statistical and runtime stability traditionally enjoyed by linear learning problems. However, traditional kernel learning frameworks use implicit feature spaces whose mathematical properties are hard to characterize. To address this problem, recent research has proposed a vector learning framework that uses landmark vectors: unlabeled vectors belonging to the same distribution and the same input space as the training vectors. This thesis introduces an extension to the landmark vector learning framework that allows it to utilize two new classes of landmark vectors in the input space. The augmented learning framework is named the power landmark vector learning framework. A theoretical description of the power landmark vector learning framework is given, along with proofs of new theoretical results. Experimental results show that the performance of the power landmark vector learning framework is comparable to that of traditional kernel learning frameworks.
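To make the landmark idea concrete, here is a minimal sketch of the basic construction: each input is mapped to its kernel values against a set of landmark vectors, giving an explicit, easily characterized feature space for any linear algorithm. The RBF kernel and synthetic data are assumptions for illustration; the thesis's specific power landmark classes are not reproduced here:

```python
import numpy as np

def rbf_kernel(a, b, gamma=0.5):
    """RBF kernel matrix between the rows of a and the rows of b."""
    sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(1)
X_train = rng.normal(size=(100, 5))    # training vectors
landmarks = rng.normal(size=(10, 5))   # unlabeled landmarks from the same distribution

# Explicit feature map: row i holds k(x_i, l_1), ..., k(x_i, l_m),
# so a linear classifier on Phi acts non-linearly on the inputs.
Phi = rbf_kernel(X_train, landmarks)
print(Phi.shape)   # (100, 10)
```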
393

Vector Graphics for Real-time 3D Rendering

Qin, Zheng January 2009 (has links)
Algorithms are presented that enable the use of vector graphics representations of images in texture maps for real-time 3D rendering. Vector graphics images are resolution-independent and can be zoomed arbitrarily without losing detail or crispness. Many important types of images, including text and other symbolic information, are best represented in vector form. Vector graphics textures can also be used as transparency mattes to augment geometric detail in models via trim curves. Spline curves are used to represent boundaries around regions in standard vector graphics representations, such as PDF and SVG. Antialiased rendering of such content can be obtained by thresholding implicit representations of these curves. The distance function is an especially useful implicit representation, and accurate distance computations would also allow the implementation of special effects such as embossing. Unfortunately, computing the true distance to higher-order spline curves is too expensive for real-time rendering. Therefore, normally either the distance is approximated by normalizing some other implicit representation, or the spline curves are approximated with simpler primitives. In this thesis, three methods for rendering vector graphics textures in real time are introduced, based on different approximations of the distance computation. The first and simplest approach approximates curves with line segments. Unfortunately, approximation with line segments gives only C0 continuity. To improve smoothness, spline curves can also be approximated with circular arcs. This approximation has C1 continuity, and computing the distance to a circular arc is only slightly more expensive than computing the distance to a line segment. Finally, an iterative algorithm is discussed that has good performance in practice and can robustly compute the distance to any parametrically differentiable curve, including polynomial splines of any order. This algorithm is demonstrated in the context of a system capable of real-time rendering of SVG content in a texture map on a GPU. Data structures and acceleration algorithms for massively parallel GPU architectures are also discussed. These data structures and acceleration structures allow arbitrary vector content (with space-variant complexity and overlapping regions) to be represented in a random-access texture.
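As an illustration of the two simpler primitives discussed, here is a sketch of exact point-to-segment and point-to-arc distance computations. This is standard geometry, not the thesis's GPU implementation, and the arc parameterization is an assumed convention:

```python
import numpy as np

def dist_to_segment(p, a, b):
    """Unsigned distance from point p to segment ab (the C0 approximation)."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def dist_to_arc(p, c, r, theta0, theta1):
    """Unsigned distance from point p to a circular arc with centre c,
    radius r, spanning angles [theta0, theta1] with theta1 in
    (theta0, theta0 + 2*pi] (the C1 approximation)."""
    ang = np.arctan2(p[1] - c[1], p[0] - c[0])
    ang = theta0 + (ang - theta0) % (2 * np.pi)  # wrap into [theta0, theta0 + 2*pi)
    if ang <= theta1:                            # closest point lies on the arc
        return abs(np.linalg.norm(p - c) - r)
    # Otherwise the closest point is one of the arc endpoints.
    e0 = c + r * np.array([np.cos(theta0), np.sin(theta0)])
    e1 = c + r * np.array([np.cos(theta1), np.sin(theta1)])
    return min(np.linalg.norm(p - e0), np.linalg.norm(p - e1))
```

Thresholding such distances against the pixel footprint is what produces the antialiased edges described above.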
394

The Differential Geometry of Instantons

Smith, Benjamin January 2009 (has links)
The instanton solutions to the Yang-Mills equations have a vast range of practical applications in field theories, including gravitation and electromagnetism. Solutions to Maxwell's equations, for example, are abelian gauge instantons on Minkowski space. Since these discoveries, a generalised theory of instantons has been emerging for manifolds with special holonomy. Beginning with connections and curvature on complex vector bundles, this thesis provides some of the essential background for studying moduli spaces of instantons. Manifolds with exceptional holonomy are special types of seven- and eight-dimensional manifolds whose holonomy group is contained in G2 and Spin(7), respectively. Focusing on the G2 case, instantons on G2 manifolds are defined as solutions to an analogue of the four-dimensional anti-self-dual equations. These connections are known as Donaldson-Thomas connections, and a couple of examples are noted.
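For reference, the equations in question, in one standard convention (signs and normalizations vary across the literature): a connection with curvature F on an oriented Riemannian four-manifold is anti-self-dual when

\[ \star F = -F, \]

and on a seven-manifold with G2-structure 3-form \( \varphi \), the analogous G2-instanton condition is

\[ F \wedge \star\varphi = 0, \qquad \text{equivalently} \qquad \star(\varphi \wedge F) = -F. \]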
395

Detection and segmentation of moving objects in video using optical vector flow estimation

Malhotra, Rishabh 24 July 2008 (has links)
The objective of this thesis is to detect and identify moving objects in a video sequence. The currently available techniques for motion estimation can be broadly categorized into two main classes: block matching methods and optical flow methods.

This thesis investigates the different motion estimation algorithms used for video processing applications. Among the available motion estimation methods, the Lucas-Kanade optical flow algorithm has been used in this thesis for the detection of moving objects in a video sequence. Derivatives of image brightness with respect to the x-direction, the y-direction, and time t are calculated to solve the optical flow constraint equation. The algorithm produces results in the form of the horizontal and vertical components of the optical flow velocity, u and v respectively. This optical flow velocity is measured in the form of vectors and has been used to segment the moving objects from the video sequence. The algorithm has been applied to different sets of synthetic and real video sequences.

The method has been modified to include parameters such as neighborhood size and Gaussian pyramid filtering, which improve the motion estimation process. Gaussian pyramids have been used to simplify complex video sequences, and the optical flow algorithm has been applied at different pyramid levels. The estimated motion, derived from the difference between the optical flow vectors of moving objects and those of the stationary background, has been used to segment the moving objects in the video sequences. A combination of erosion and dilation techniques is then used to improve the quality of the segmented content.

The Lucas-Kanade optical flow algorithm, along with the other considered parameters, produces encouraging motion estimation and segmentation results. The consistency of the algorithm has been tested on different types of motion and video sequences. Other contributions of this thesis include a comparative analysis of the optical flow algorithm against other existing motion estimation and segmentation techniques. The comparison shows that a balance between accuracy and computational speed must be achieved when implementing any motion estimation algorithm in real time for video surveillance.
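A minimal sketch of the Lucas-Kanade estimate described above: a least-squares solution of the optical flow constraint equation Ix*u + Iy*v + It = 0 over a local window. The window size and the simple frame difference for It are illustrative choices, and the pyramid refinement discussed in the thesis is omitted:

```python
import numpy as np

def lucas_kanade(I0, I1, x, y, win=7):
    """Estimate optical flow (u, v) at pixel (x, y) between frames I0 and I1."""
    Iy_full, Ix_full = np.gradient(I0.astype(float))  # brightness derivatives (rows=y, cols=x)
    It_full = I1.astype(float) - I0.astype(float)     # temporal derivative
    h = win // 2
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    Ix, Iy, It = Ix_full[sl].ravel(), Iy_full[sl].ravel(), It_full[sl].ravel()
    A = np.stack([Ix, Iy], axis=1)
    # Solve A [u, v]^T = -It in the least-squares sense over the window.
    (u, v), *_ = np.linalg.lstsq(A, -It, rcond=None)
    return u, v
```

Pixels whose flow vectors differ markedly from the dominant background flow can then be grouped into moving-object masks, as the segmentation step above describes.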
396

Representing short sequences in the context of a model organism genome

Lewis, Christopher Thomas 25 May 2009 (has links)
In the post-genomics era, the sheer volume of data is overwhelming without appropriate tools for data integration and analysis. Studying genomic sequences in the context of other related genomic sequences, i.e. comparative genomics, is a powerful technique enabling the identification of functionally interesting sequence regions, based on the principle that similar sequences tend either to be homologous or to provide similar functionality.

The costs associated with full genome sequencing make it infeasible to sequence every genome of interest. Consequently, simpler, smaller genomes are used as model organisms for more complex organisms, for instance Mouse/Human. An annotated model organism provides a source of annotation for transcribed sequences and other gene regions of the more complex organism based on sequence homology. For example, gene annotations from the model organism aid the interpretation of expression studies in more complex organisms.

To assist with comparative genomics research in the Arabidopsis/Brassica (Thale-cress/Canola) model-crop pair, a web-based, graphical genome browser (BioViz) was developed to display short Brassica genomic sequences in the context of the Arabidopsis model organism genome. This involved the development of graphical representations to integrate data from multiple sources and tools, and a novel user interface to provide a more interactive web-based browsing experience. While BioViz was developed for the Arabidopsis/Brassica comparative genomics context, it could be applied to comparative browsing relative to other reference genomes.

BioViz proved to be a valuable research support tool for Brassica/Arabidopsis comparative genomics. It provided convenient access to the underlying Arabidopsis annotation and allowed the user to view specific EST sequences in the context of the Arabidopsis genome and other related EST sequences. In addition, the limits to which the project pushed the SVG specification proved influential in the SVG community: the work done for BioViz inspired the definition of an open-source project to define standards for SVG-based web applications and a standard framework for SVG-based widget sets.
397

Modeling polarized radiative transfer for improved atmospheric aerosol retrieval with OSIRIS limb scattered spectra

Bathgate, Anthony Franklin 25 February 2011 (has links)
Retrievals of atmospheric information from satellite observations permit the investigation of otherwise inaccessible atmospheric phenomena. Recovering this information from optical instrumentation in orbit requires both an inversion algorithm, such as the Saskatchewan Multiplicative Algebraic Reconstruction Technique, and a forward model, such as the SASKTRAN radiative transfer model. These are used together at the University of Saskatchewan to retrieve sulphate aerosol extinction profiles from the radiance measurements made by the Canadian-built OSIRIS instrument. Although these retrievals are highly successful, the process currently does not consider the polarization of light or OSIRIS's polarization sensitivities, because SASKTRAN is a scalar model. In this work, the development of a vector version of SASKTRAN that can perform polarized radiative transfer calculations is presented.

The vector SASKTRAN's results compare favorably with those of vector SCIATRAN, another polarized model under development at the University of Bremen. Comparisons of the stratospheric aerosol retrieval vectors generated from the scalar and vector SASKTRAN results indicate that polarized calculations are an important factor in future work to improve the aerosol retrievals and to recover particle size or composition information.
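To illustrate what a vector (polarized) model tracks that a scalar model does not, here is a sketch of a Stokes vector transformed by the Rayleigh scattering phase matrix. This follows one common normalization (sign conventions vary with the choice of reference plane) and is not SASKTRAN code:

```python
import numpy as np

def rayleigh_phase_matrix(theta):
    """Rayleigh scattering phase matrix for scattering angle theta, acting
    on Stokes vectors (I, Q, U, V) in the scattering-plane basis; one
    common normalization, without depolarization corrections."""
    c = np.cos(theta)
    s2 = np.sin(theta) ** 2
    return 0.75 * np.array([
        [1 + c**2, -s2,       0.0,   0.0],
        [-s2,      1 + c**2,  0.0,   0.0],
        [0.0,      0.0,       2 * c, 0.0],
        [0.0,      0.0,       0.0,   2 * c],
    ])

# Unpolarized incident light acquires polarization on scattering:
stokes_in = np.array([1.0, 0.0, 0.0, 0.0])          # pure intensity
stokes_out = rayleigh_phase_matrix(np.pi / 2) @ stokes_in
print(stokes_out)   # 90-degree Rayleigh scattering is strongly polarized
```

A scalar model propagates only the first component (I); a vector model propagates all four, which is what makes the instrument's polarization sensitivity representable.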
398

Financial Intermediation and the Macroeconomy of the United States: Quantitative Assessments

Chiu, Ching Wai January 2012 (has links)
This dissertation presents a quantitative study of the relationship between financial intermediation and the macroeconomy of the United States. It consists of two major chapters: the first studies adverse shocks to interbank market lending, and the second studies a theoretical model in which the aggregate balance sheets of the financial and non-financial sectors play a key role in financial intermediation frictions.

In the first chapter, I empirically investigate a novel macroeconomic shock: the funding liquidity shock. Funding liquidity is defined as the ability of a (financial) institution to raise cash at short notice, with interbank market loans being a very common source of short-term external funding. Using the "TED spread" as a proxy for aggregate funding liquidity over the period 1971M1 to 2009M9, I first show, using a vector-autoregression approach, that an unanticipated adverse TED shock has significant recessionary effects: industrial production and prices fall, and the unemployment rate rises. The contraction lasts for about twenty months. I also recover the conventional monetary policy shock, whose macro impact is in line with the results of Christiano et al. (1998) and Christiano et al. (2005). I then follow the factor model approach and find that the excess returns of small-firm portfolios are more negatively impacted by an adverse funding liquidity shock. I also present evidence that this shock is priced as a "risk factor" in the cross-section of equity returns. Moreover, a proposed factor model that includes the structural funding liquidity and monetary policy shocks as factors explains the cross-sectional returns of portfolios sorted on size and book-to-market ratio as well as the Fama and French (1993) three-factor model does. Lastly, I present empirical evidence that funding liquidity and market liquidity mutually affect each other.

I start the second chapter by showing that, in U.S. data, the balance sheet health of the financial sector, as measured by its equity capital and debt level, is a leading indicator of the balance sheet health of the non-financial sector. This fact, and the apparent role of the financial sector in the recent global financial crisis, motivate a general equilibrium macroeconomic model featuring the balance sheets of both sectors. I estimate and study a model within the "loanable funds" framework of Holmstrom and Tirole (1997), which introduces a double moral hazard problem into the financial intermediation process. I find that financial frictions modeled within this framework give rise to a shock transmission mechanism quantitatively different from the one that arises under the conventional modeling assumption, in New Keynesian business cycle models, of convex investment adjustment costs. Financial equity capital plays an important role in determining the depth and persistence of declines in output and investment due to negative shocks to the economy. Moreover, I find that shocks to the financial intermediation process cause persistent recessions, and that these shocks explain a significant portion of the variation in investment. The estimated model is also able to replicate some aspects of the cross-correlation structure of the balance sheet variables of the two sectors.
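A minimal sketch of the vector-autoregression step described in the first chapter, with recursively (Cholesky) identified impulse responses to a shock ordered first. The data below are synthetic placeholders rather than the TED spread and macro series used in the chapter, and the recursive ordering is an illustrative assumption:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Synthetic stand-ins for (TED spread, industrial production, prices, unemployment).
rng = np.random.default_rng(42)
T, k = 300, 4
A = 0.5 * np.eye(k)                      # simple persistence in the assumed DGP
Y = np.zeros((T, k))
eps = rng.normal(size=(T, k))
for t in range(1, T):
    Y[t] = Y[t - 1] @ A.T + eps[t]
data = pd.DataFrame(Y, columns=["ted", "ip", "cpi", "unemp"])

results = VAR(data).fit(4)               # reduced-form VAR with 4 lags (assumed)
irf = results.irf(20)
# orth_irfs[h, i, j]: horizon-h response of variable i to orthogonalized shock j.
# Ordering "ted" first lets the funding liquidity shock move all variables on impact.
ted_responses = irf.orth_irfs[:, :, 0]
print(ted_responses[:3])
```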
399

On the inverse shortest path length problem

Hung, Cheng-Huang 01 December 2003 (has links)
No description available.
400

The Analysis of Long-run Real Exchange Rate in Japan

Liu, Ya-chun 26 July 2010 (has links)
Purchasing Power Parity (PPP) has been regarded as the most important theory for explaining exchange rate movements based on the relative price levels of two countries. After 1973, more and more countries adopted floating exchange rate systems, and the real exchange rate has tested as a non-stationary time series, suggesting that real factors have an effect on the real exchange rate. In this thesis, we study how these possible factors change the real exchange rate, and we use the local projection method of Wu et al. (2008) and Lee (2010) to estimate impulse responses for non-stationary time series with cointegrating vectors; we then compare the impulse responses from a conventional VAR with those from local projection. The empirical model is the same as in Zhou (1995) and Wang and Dunne (2003), and the data follow the same rules as in Wang and Dunne (2003). Finally, our conclusions are consistent with Wu et al. (2008), Zhou (1995), and Wang and Dunne (2003).
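For reference, a minimal sketch of the local projection idea invoked above: the impulse response at horizon h is read off a separate OLS regression of y at t+h on the period-t shock plus lagged controls, without imposing a full VAR structure. The data-generating process and lag choices below are illustrative assumptions, not the thesis's specification:

```python
import numpy as np
import statsmodels.api as sm

def local_projection_irf(y, shock, horizons=12, lags=4):
    """Jorda-style local projection: for each horizon h, regress y[t+h]
    on shock[t] and lagged y; the shock coefficient traces the IRF."""
    irf, T = [], len(y)
    for h in range(horizons + 1):
        rows = range(lags, T - h)
        Y = np.array([y[t + h] for t in rows])
        X = np.column_stack(
            [np.array([shock[t] for t in rows])] +
            [np.array([y[t - j] for t in rows]) for j in range(1, lags + 1)]
        )
        beta = sm.OLS(Y, sm.add_constant(X)).fit().params[1]  # shock coefficient
        irf.append(beta)
    return np.array(irf)

# Synthetic example: an AR(1) series driven by an observed shock.
rng = np.random.default_rng(7)
shock = rng.normal(size=400)
y = np.zeros(400)
for t in range(1, 400):
    y[t] = 0.8 * y[t - 1] + 0.5 * shock[t] + 0.1 * rng.normal()
print(local_projection_irf(y, shock)[:5])  # decays roughly like 0.5 * 0.8**h
```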
