  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Examining spatial arbitrage: Effect of electronic commerce and arbitrageur strategies

Subramanian, Hemang C. 07 January 2016
Markets increase social welfare by matching willing buyers and sellers, so it is important to understand whether they are fulfilling this societal purpose and operating efficiently. The prevalence of spatial arbitrage in markets is an important indicator of market efficiency, and the two essays in my dissertation study spatial arbitrage and the behaviors of arbitrageurs. Electronic commerce can improve market efficiency by helping buyers and sellers find and transact with each other across geographic distance. In the first essay, we study the effect of two distinct forms of electronic commerce on market efficiency, which we measure via the prevalence of spatial arbitrage. Spatial arbitrage is a more precise measure than the typically used price dispersion because it accounts for the transaction costs of trading across distance and for unobserved product heterogeneity. Studying two forms of electronic commerce allows us to examine how the theoretical mechanisms of expanded reach and transaction immediacy affect market efficiency. We find that electronic commerce reduces the number of arbitrage opportunities but improves arbitrageurs' ability to identify and exploit those that remain. Overall, our results provide a novel and nuanced understanding of how electronic commerce improves market efficiency. Studying arbitrageur strategies helps us understand how arbitrageur behaviors impact markets by increasing or reducing spatial arbitrage. In the second essay, we study the specialization strategies of arbitrageurs, who specialize by asset type and sourcing location. We find that specialization affects both arbitrage profits and arbitrage intensity, that specialization strategies evolve over time, and that different groups of arbitrageurs adapt differently based on behavioral biases and environmental factors.
Overall, our findings support the predictions of the adaptive markets hypothesis and help us understand antecedents, such as capital and arbitrage intensity, that affect the evolution of arbitrageur strategies.
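The efficiency measure described above can be illustrated with a toy check (all prices and cost figures here are invented; the essay estimates these quantities from transaction data): a spatial arbitrage opportunity is exploitable only when the price gap between locations exceeds the total cost of trading across the distance.

```python
# Hypothetical illustration: spatial arbitrage exists when the price gap
# between two locations exceeds the costs of trading across the distance.

def arbitrage_profit(price_here, price_there, shipping_cost, fees):
    """Net profit from buying here and reselling there (negative = no opportunity)."""
    return price_there - price_here - shipping_cost - fees

# Toy example with made-up prices for the same (homogeneous) asset:
profit = arbitrage_profit(price_here=9500.0, price_there=10200.0,
                          shipping_cost=400.0, fees=150.0)
print(profit > 0)   # True: an exploitable opportunity remains after costs
```

This is why accounting for transaction costs matters: a raw price dispersion of 700 here overstates the exploitable gap, which is only 150 after shipping and fees.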

Developing a psychological model of end-users' experience with news Web sites

Aranyi, Gabor January 2012
The primary aim of the research project presented in this thesis was to develop and test a comprehensive psychological model of interaction experience with news Web sites. Although news media have been publishing on the Web increasingly since the second half of the 1990s and news sites have become a favoured source of news for many, there is a lack of knowledge about news sites in terms of interaction-experience constructs and their structural relationships. The project aimed to examine people’s use of news sites from the perspective of interaction-experience research by developing a model and, based on this model, to provide guidance for designers of news sites. The project comprises three research phases: (1) exploratory phase, (2) modelling phase and (3) experimental phase. In the exploratory phase, a review of literature and an exploratory study of interaction experience with news Web sites were conducted. The latter explored how users of a particular news site interact with the site and which aspects of their experience they report. Data for the exploratory study were collected with an online questionnaire and by recording participants’ use of a news site under think-aloud instructions. In the modelling phase, an online questionnaire was used to collect answers to psychometric scales that were selected based on the literature review and the exploratory study. A measurement model was formulated to test the relationship between measurement items and the measurement scales, and structural models were formulated to test hypotheses related to the structural relationships of variables. Following the test results, a model of interaction experience with news sites was formulated to predict outcome measures of interaction experience from variables measuring aspects of interaction experience. Components of interaction experience, in turn, were predicted from measures of perceived news-site characteristics. 
In the experimental phase, an experiment was conducted to test the model of interaction experience with news sites in a controlled setting. Additionally, measures of person- and context characteristics were included in the prediction of components of interaction experience. The model of interaction experience with news sites was supported and accounted for a medium to substantial amount of variance in outcome measures. Finally, design guidance was derived from the model to advance interaction-experience knowledge, and conclusions were drawn regarding the model, in relation to existing research.

Essays on Computational Problems in Insurance

Ha, Hongjun 31 July 2016
This dissertation consists of two chapters. The first chapter establishes an algorithm for calculating capital requirements. The calculation of capital requirements for financial institutions usually entails a reevaluation of the company's assets and liabilities at some future point in time for a (large) number of stochastic forecasts of economic and firm-specific variables. The complexity of this nested valuation problem leads many companies to struggle with the implementation. The current chapter proposes and analyzes a novel approach to this computational problem based on least-squares regression and Monte Carlo simulations. Our approach is motivated by a well-known method for pricing non-European derivatives. We study convergence of the algorithm and analyze the resulting estimate for practically important risk measures. Moreover, we address the problem of how to choose the regressors, and show that an optimal choice is given by the left singular functions of the corresponding valuation operator. Our numerical examples demonstrate that the algorithm can produce accurate results at relatively low computational costs, particularly when relying on the optimal basis functions. The second chapter discusses another application of regression-based methods, in the context of pricing variable annuities. Advanced life insurance products with exercise-dependent financial guarantees present challenging problems in view of pricing and risk management. In particular, due to the complexity of the guarantees and since practical valuation frameworks include a variety of stochastic risk factors, conventional methods that are based on the discretization of the underlying (Markov) state space may not be feasible. As a practical alternative, this chapter explores the applicability of Least-Squares Monte Carlo (LSM) methods familiar from American option pricing in this context. 
Unlike the previous literature, we consider optionality beyond surrendering the contract, focusing on popular withdrawal benefits (so-called GMWBs) within variable annuities. We introduce different LSM variants, particularly the regression-now and regression-later approaches, and explore their viability and potential pitfalls. We commence our numerical analysis in a basic Black-Scholes framework, where we compare the LSM results to those from a discretization approach. We then extend the model to include various relevant risk factors and compare the results to those from the basic framework.
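A minimal sketch of the regression-based idea both chapters rely on, not the dissertation's actual algorithm: the model, basis choice, and parameters below are invented for illustration. One noisy inner sample per outer scenario is regressed on a polynomial basis of the scenario variable, giving a smooth estimate of the conditional (inner) value without a full nested simulation.

```python
import math, random
random.seed(0)

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# Outer scenarios: the risk driver at the capital horizon (toy lognormal model).
S1 = [math.exp(random.gauss(0.0, 0.2)) for _ in range(5000)]
# One noisy inner sample per scenario: a put-style liability payoff at maturity.
payoff = [max(1.0 - s * math.exp(random.gauss(-0.02, 0.2)), 0.0) for s in S1]

# Regress the noisy payoffs on basis functions of the scenario -> smooth value proxy.
basis = lambda s: [1.0, s, s * s]
X = [basis(s) for s in S1]
XtX = [[sum(x[i] * x[j] for x in X) for j in range(3)] for i in range(3)]
Xty = [sum(x[i] * y for x, y in zip(X, payoff)) for i in range(3)]
beta = solve3(XtX, Xty)

value_at = lambda s: sum(c * f for c, f in zip(beta, basis(s)))
print(round(value_at(1.0), 3))   # estimated scenario value without nested simulation
```

The chapter's point about regressor choice matters here: a fixed polynomial basis is the naive option, whereas the left singular functions of the valuation operator would be the optimal one.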

The relationship between lean service, activity-based costing and business strategy and their impact on performance

Hadid, Wael January 2014
The lean system has drawn the attention of researchers and practitioners since its emergence in the 1950s, reflected in the increasing number of companies attempting to implement its practices and the large number of researchers investigating its effectiveness and identifying important contextual factors affecting its implementation. The rising level of interest in the lean system has led to the emergence of three distinctive streams of literature. The first stream has focused on the effectiveness of the lean system; however, it is limited in that it has mainly examined the additive impact of lean practices on operational performance in the manufacturing context. The second stream has focused on the role of the accounting system in the lean context. In this body of literature, researchers agree on the superiority of the activity-based costing (ABC) system over the traditional accounting system in supporting the implementation of lean practices; however, most studies in this strand are either conceptual or case-based. The third stream has focused on the fit between business strategy and the lean system, but has reported inconclusive results regarding the suitability of the lean system for firms adopting a differentiation strategy and for those adopting a cost leadership strategy. The aim of this study is to develop and empirically test a conceptual model which integrates the three streams of literature so as to extend their focus and overcome their limitations. More specifically, the model developed here highlights not only the additive impact of lean practices but also the possible synergy among those practices in improving both the operational and financial performance of service firms. In addition, the model brings to light the potential intervening role of ABC in the strategy-lean association.
After identifying and reviewing the relevant literature, socio-technical systems theory and contingency theory were used to develop the conceptual model and associated hypotheses. A questionnaire instrument was designed to collect empirical data, which were supplemented by objective data from the Financial Analysis Made Easy database in order to test the conceptual model empirically using partial least squares structural equation modelling (PLS-SEM). The findings indicated that while the technical practices of lean service improved only the operational performance of service firms, the social practices enhanced both operational and financial performance. In addition, the two sets of practices interacted positively to improve firm performance over and above the improvement achieved from each set separately. Moreover, ABC was found to have a positive association with lean practices, and consequently an indirect positive relation with firm operational performance. Finally, both the differentiation and cost leadership strategies had a direct positive relationship with lean practices. However, while ABC partially mediated the differentiation-lean association, it suppressed the cost leadership-lean association, leading to a case of inconsistent mediation. The current study contributes to the literature at several levels. First, at the theoretical level, it develops a conceptual framework which crosses different streams of literature, namely the lean system literature, the management accounting literature (with a focus on ABC), and the business strategy literature.
Unlike previous studies, by integrating the perspectives of socio-technical systems theory and contingency theory, the model (i) highlights not only the additive but also the synergistic effect of lean service practices on firm performance, (ii) brings to light the direct impact of ABC and business strategy on lean service practices and the intervening role of ABC, through which business strategy is assumed to have an indirect influence on lean practices as well, and (iii) offers an alternative view on how ABC can improve firm performance by enhancing other organisational capabilities (lean practices) which are expected to improve performance. Second, at the methodological level, unlike previous studies, this study includes a large number of lean service practices and contextual variables to report more precisely on the lean-performance association. In addition, the inclusion of the financial performance dimension (measured by secondary data) in the model, besides operational performance, is critical to understanding the full capability of lean service in improving firm performance. Further, employing a powerful statistical technique (PLS-SEM) lends more credibility to the reported results. Third, at the empirical level, this study is conducted in the UK service sector. As such, it is one of very few studies to report on lean service and examine how the adoption of ABC and a specific type of business strategy can affect its implementation using empirical survey data from this context.
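As a toy numerical illustration of the synergy the model tests (all scores are invented, not from the study's data): the interaction effect is the performance gain from adopting both sets of practices beyond the sum of their separate gains.

```python
# A 2x2 illustration with made-up performance scores:
scores = {
    ("no_social", "no_technical"): 50.0,
    ("social",    "no_technical"): 58.0,   # +8 from social practices alone
    ("no_social", "technical"):    56.0,   # +6 from technical practices alone
    ("social",    "technical"):    70.0,   # +20 together, not just 8 + 6
}

additive_gain = ((scores[("social", "no_technical")] - scores[("no_social", "no_technical")])
                 + (scores[("no_social", "technical")] - scores[("no_social", "no_technical")]))
joint_gain = scores[("social", "technical")] - scores[("no_social", "no_technical")]
synergy = joint_gain - additive_gain
print(synergy)   # 6.0: the interaction effect over and above the additive one
```

In the study itself this contrast corresponds to a product (interaction) term in the PLS-SEM structural model rather than a 2x2 design.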

Supervised Descent Method

Xiong, Xuehan 01 September 2015
In this dissertation, we focus on solving Nonlinear Least Squares problems using a supervised approach. In particular, we developed a Supervised Descent Method (SDM), performed a thorough theoretical analysis, and demonstrated its effectiveness on optimizing analytic functions and on four real-world applications: Inverse Kinematics, Rigid Tracking, Face Alignment (frontal and multi-view), and 3D Object Pose Estimation. In Rigid Tracking, SDM was able to take advantage of more robust features, such as HOG and SIFT; such non-differentiable image features were out of reach for previous work, which relied on gradient-based methods for optimization. In Inverse Kinematics, where we minimize a non-convex function, SDM achieved significantly better convergence than gradient-based approaches. In Face Alignment, SDM achieved state-of-the-art results; moreover, it is extremely computationally efficient, which makes it applicable to many mobile applications. In addition, we provided a unified view of several popular sequential-prediction methods, including SDM, reformulating them as a sequence of function compositions. Finally, we suggested some future research directions for SDM and sequential prediction.
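A minimal one-dimensional sketch of the SDM idea, assuming a simple scalar feature map (the thesis applies it to image features such as HOG and SIFT): a cascade of linear update maps is learned by least squares from training pairs, then applied at test time without ever differentiating the feature function.

```python
import math, random
random.seed(1)

h = lambda x: math.erf(x)   # nonlinear "feature" function; its gradient is never used

# Training set: target states and perturbed starting states.
targets = [random.uniform(-1.0, 1.0) for _ in range(200)]
xs = [t + random.uniform(-0.5, 0.5) for t in targets]

maps = []
for _ in range(4):                                      # cascade of learned descent maps
    phi = [h(x) - h(t) for x, t in zip(xs, targets)]    # feature residuals
    dx = [t - x for x, t in zip(xs, targets)]           # ideal parameter updates
    r = sum(p * d for p, d in zip(phi, dx)) / sum(p * p for p in phi)  # 1-D least squares
    maps.append(r)
    xs = [x + r * p for x, p in zip(xs, phi)]           # apply the step, then refit

# Test time: drive a new start toward a new target, gradient-free.
x, target = 0.9, 0.2
for r in maps:
    x += r * (h(x) - h(target))
print(abs(x - target) < 0.05)   # the learned cascade lands near the target
```

Each stage regresses the ideal update against the feature residual, so the cascade plays the role of the averaged "descent directions" that SDM learns in place of analytic gradients.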

Sparse Linear Modeling of Speech from EEG

Tiger, Mattias January 2014
For people with hearing impairments, attending to a single speaker against a multi-talker background can be very difficult, and current hearing aids can barely help with it. Recent studies have shown that the audio stream a listener focuses on can be identified among the surrounding audio streams using EEG and linear models. This raises the possibility of using EEG to control future hearing aids without conscious effort, so that the attended sounds are enhanced while the rest are damped. For such hearing aids to be practical for everyday use, they should rely on something other than a motion-sensitive, precisely placed EEG cap. This could possibly be achieved by placing the electrodes together with the hearing aid in the ear. Oticon, one of the leading hearing aid manufacturers, and its research lab Eriksholm Research Centre have recorded an EEG data set of people listening to sentences, with electrodes placed in and closely around the ears. We have analyzed the data set with a range of signal-processing approaches, mainly in the context of estimating audio from EEG. Two types of sparse linear models based on L1-regularized least squares are formulated and evaluated; they provide automatic dimensionality reduction in that they significantly reduce the number of channels needed. The first model is based on linear combinations of spectrograms and the second on linear temporal filtering. We investigated the usefulness of the in-ear electrodes and found some positive indications: all models explored rank the in-ear electrodes as the most important, or among the more important, of the 128 electrodes in the EEG cap. This is a positive indication for the future possibility of hearing aids that use only electrodes in the ears.
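A minimal sketch of the sparse modeling idea, assuming simulated data (the thesis uses real EEG and audio): L1-regularized least squares, solved here by coordinate descent with soft-thresholding, drives the weights of uninformative channels to exactly zero, which is the automatic channel reduction described above.

```python
import random
random.seed(2)

def lasso(X, y, lam, sweeps=50):
    """Coordinate descent for min_w 0.5*||y - X w||^2 + lam*||w||_1."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(sweeps):
        for j in range(p):
            # Partial residual that excludes channel j, then soft-threshold.
            r = [y[i] - sum(w[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            w[j] = max(abs(rho) - lam, 0.0) * (1.0 if rho > 0 else -1.0) / z
    return w

# Toy "EEG": 10 channels, but the audio signal depends on only two of them.
n, p = 120, 10
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [2.0 * row[1] - 1.5 * row[6] + random.gauss(0, 0.1) for row in X]

w = lasso(X, y, lam=5.0)
active = [j for j, wj in enumerate(w) if abs(wj) > 0.05]
print(active)   # only the two informative channels survive the L1 penalty
```

In the thesis the surviving coefficients would correspond to the few electrodes worth keeping, which is how the in-ear channels were found to rank among the most important.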

The role of the human nasal cavity in patterns of craniofacial covariation and integration

Lindal, Joshua 18 January 2016
Climate has a selective influence on nasal cavity morphology. Due to the constraints of cranial integration, naturally selected changes in one structure necessitate changes in others in order to maintain structural and functional cohesion. The relationships between climate and skull/nasal cavity morphology have been explored, but the integrative role of nasal variability within the skull as a whole has not. This thesis presents two hypotheses: 1) patterns of craniofacial integration observed in 2D can be reproduced using 3D geometric morphometric techniques; 2) the nasal cavity exhibits a higher level of covariation with the lateral cranial base than with other parts of the skull, since differences in nasal morphology and basicranial breadth have both been linked to climatic variables. The results support the former hypothesis, but not the latter; covariation observed between the nasal cavity and other cranial modules may suggest that these relationships are characterized by a unique integrative relationship.

Aspects of model development using regression quantiles and elemental regressions

Ranganai, Edmore 03 1900
Dissertation (PhD)--University of Stellenbosch, 2007.
It is well known that ordinary least squares (OLS) procedures are sensitive to deviations from the classical Gaussian assumptions (outliers) as well as to data aberrations in the design space. The two major data aberrations in the design space are collinearity and high leverage. Leverage points can also induce or hide collinearity in the design space; such leverage points are referred to as collinearity-influential points. Consequently, over the years, many diagnostic tools to detect these anomalies, as well as alternative procedures to counter them, were developed. To counter deviations from the classical Gaussian assumptions, many robust procedures have been proposed. One such class of procedures is the Koenker and Bassett (1978) regression quantiles (RQs), natural extensions of order statistics to the linear model. RQs can be found as solutions to linear programming problems (LPs). The basic optimal solutions to these LPs (which are RQs) correspond to elemental subset (ES) regressions, which consist of subsets of minimum size to estimate the necessary parameters of the model. On the one hand, some ESs correspond to RQs; on the other, the literature shows that many OLS statistics (estimators) are related to ES regression statistics (estimators). There is therefore an inherent relationship among the three sets of procedures. The relationship between the ES procedure and the RQ one has been noted almost "casually" in the literature, while the latter has been fairly widely explored. Using these existing relationships between the ES procedure and the OLS one, as well as new ones, collinearity, leverage and outlier problems in the RQ scenario were investigated. A lasso procedure was also proposed as a variable selection technique in the RQ scenario, and some tentative, promising results were given for it. Single-case diagnostics were considered, as well as their relationships to multiple-case ones. In particular, multiple cases of the minimum size needed to estimate the parameters of the model were considered, corresponding to an RQ (ES). In this way regression diagnostics were developed for both ESs and RQs. The main problems that affect RQs adversely are collinearity and leverage, due to the nature of the computational procedures and the fact that RQs' influence functions are unbounded in the design space but bounded in the response variable. As a consequence, RQs have a high affinity for leverage points and a high exclusion rate of outliers; the influential picture exhibited in the presence of both leverage points and outliers is the net result of these two antagonistic forces. Although RQs are bounded in the response variable (and therefore fairly robust to outliers), outlier diagnostics were also considered in order to obtain a more holistic picture. The investigations comprised analytic methods as well as simulation, with applications to artificial computer-generated data sets and to standard data sets from the literature. These revealed that the ES-based statistics can be used, with some degree of success, to address problems arising in the RQ scenario. However, due to the interdependence between the different aspects, viz. that between leverage and collinearity and that between leverage and outliers, "solutions" are often dependent on the particular situation. In spite of this complexity, the research did produce some fairly general guidelines that can be fruitfully used in practice.
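The RQs themselves are LP solutions; as a rough illustration only (not the LP/elemental-subset machinery studied in the thesis), the check-loss objective can be minimized by subgradient descent on a toy linear model, which exhibits the boundedness in the response variable, i.e. robustness to outliers, described above.

```python
import random
random.seed(3)

def fit_rq(xs, ys, tau, lr=0.01, epochs=2000):
    """Fit y ~ a + b*x for quantile tau by subgradient descent on the
    check (pinball) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    a = b = 0.0
    for _ in range(epochs):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            u = y - (a + b * x)
            g = -(tau if u > 0 else tau - 1.0)   # subgradient w.r.t. the prediction
            ga += g
            gb += g * x
        a -= lr * ga / len(xs)
        b -= lr * gb / len(xs)
    return a, b

# Line y = 1 + 2x with mild noise, plus gross outliers that would wreck OLS.
xs = [i / 50.0 for i in range(100)]
ys = [1.0 + 2.0 * x + random.gauss(0, 0.2) for x in xs]
for i in (5, 35, 70):
    ys[i] += 25.0

a, b = fit_rq(xs, ys, tau=0.5)   # the median RQ
print(round(a, 2), round(b, 2))  # close to the true intercept 1 and slope 2
```

Because the check loss depends only on the sign of the residual, the three huge outliers barely move the fit; a leverage point (an extreme x) would, in contrast, attract it, which is the asymmetry the thesis analyzes.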

Analysis of a Combined GLONASS/Compass-I Navigation Algorithm

Peng, Song, Xiao-yu, Chen, Jian-zhong, Qi 10 1900
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / Compass-I is the satellite navigation system that China has built: a regional positioning system based on the double-star positioning principle. Compass-I normally requires active positioning; in this paper, several passive positioning methods are put forward and a combined navigation mode based on GLONASS and Compass-I passive navigation is proposed. The differences between the coordinate and time systems of the two navigation systems are analyzed, and the user position is calculated by the least squares method. The combined navigation algorithm improves the visible satellite constellation structure and the positioning precision, so as to ensure the reliability and continuity of the positioning result.
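A hedged sketch of the least squares position solution (a toy 2-D, noise-free case with invented coordinates; a real combined GLONASS/Compass solver works in ECEF 3-D and also estimates receiver clock offsets for each system): Gauss-Newton iterations linearize the range equations and solve the normal equations for the position update.

```python
import math

# Known "satellite" positions and exact measured ranges to an unknown receiver.
sats = [(0.0, 20000.0), (15000.0, 18000.0), (-12000.0, 17000.0), (5000.0, 22000.0)]
truth = (1200.0, 300.0)
ranges = [math.hypot(sx - truth[0], sy - truth[1]) for sx, sy in sats]

x, y = 0.0, 0.0                           # initial position guess
for _ in range(10):                       # Gauss-Newton iterations
    H, r = [], []
    for (sx, sy), rho in zip(sats, ranges):
        d = math.hypot(sx - x, sy - y)
        H.append(((x - sx) / d, (y - sy) / d))   # unit line-of-sight Jacobian row
        r.append(rho - d)                        # range residual
    # Solve the 2x2 normal equations (H^T H) delta = H^T r by hand.
    a = sum(h[0] * h[0] for h in H)
    b = sum(h[0] * h[1] for h in H)
    c = sum(h[1] * h[1] for h in H)
    p = sum(h[0] * ri for h, ri in zip(H, r))
    q = sum(h[1] * ri for h, ri in zip(H, r))
    det = a * c - b * b
    x += (c * p - b * q) / det
    y += (a * q - b * p) / det

print(round(x), round(y))   # recovers the true receiver position (1200, 300)
```

The paper's point about constellation structure shows up in `det`: the more spread out the line-of-sight directions, the better conditioned the normal equations and the better the positioning precision.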

Analysis of archaeological data: estimation of the axis of rotation

Misiukevičius, Ramūnas 30 June 2014
Rapidly developing information technologies (IT) have not bypassed archaeology. Archaeologists increasingly use various computer programs not only to document, visualize, or reconstruct archaeological material, but also to reconstruct or model human activity, daily life, and living environments. This task requires a multi-stage analysis to establish the origin, type, originality, and purpose of the finds. With this information, we can learn a great deal about the people who used these objects: their knowledge, the tools they had, their customs, their migration, and much more. 
The amount of knowledge about the past depends on the finds and on our ability to analyze them. This work presents one of the methods for analyzing potsherds: estimation of the axis of rotation. This is the first and essential stage in the analysis of finds of this type, because its results determine the subsequent analyses: finding the profile line, checking symmetry, performing segmentation, object typology, reconstruction, and, finally, the analysis of people's lives. Errors at this stage are critical for the other stages of analysis, and the resulting knowledge can mislead research into the culture of ancient peoples and their distribution and migration. The work discusses methods for finding the axis of rotation, their advantages and disadvantages, and provides examples.
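As a rough 2-D analogue of the axis-finding problem (the thesis treats the 3-D sherd surface; the data here are synthetic), fitting a circle to points sampled from a wheel-thrown rim recovers the center through which the rotation axis passes. The Kåsa algebraic fit reduces this to linear least squares.

```python
import math, random
random.seed(4)

def fit_circle(pts):
    """Kasa algebraic circle fit: x^2 + y^2 + D x + E y + F = 0 solved by
    linear least squares; center = (-D/2, -E/2)."""
    # Normal equations A^T A u = A^T b for u = (D, E, F), b = -(x^2 + y^2).
    S = [[0.0] * 4 for _ in range(3)]
    for x, y in pts:
        row, rhs = (x, y, 1.0), -(x * x + y * y)
        for i in range(3):
            for j in range(3):
                S[i][j] += row[i] * row[j]
            S[i][3] += row[i] * rhs
    # Gaussian elimination with partial pivoting on the 3x4 augmented system.
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(S[r][c]))
        S[c], S[p] = S[p], S[c]
        for r in range(c + 1, 3):
            f = S[r][c] / S[c][c]
            for k in range(c, 4):
                S[r][k] -= f * S[c][k]
    u = [0.0] * 3
    for r in (2, 1, 0):
        u[r] = (S[r][3] - sum(S[r][k] * u[k] for k in range(r + 1, 3))) / S[r][r]
    D, E, F = u
    cx, cy = -D / 2.0, -E / 2.0
    return cx, cy, math.sqrt(cx * cx + cy * cy - F)

# Noisy points from a short arc of a rim (true center (3, -2), radius 5).
pts = [(3.0 + 5.0 * math.cos(t) + random.gauss(0, 0.02),
        -2.0 + 5.0 * math.sin(t) + random.gauss(0, 0.02))
       for t in [0.1 + 0.05 * k for k in range(30)]]
cx, cy, rad = fit_circle(pts)
print(round(cx, 1), round(cy, 1), round(rad, 1))   # near (3.0, -2.0) and 5.0
```

Only a short arc is used, mirroring the fact that a sherd preserves just a fragment of the vessel; the fit still localizes the center, though shorter and noisier arcs degrade it, which is why errors at this stage propagate into every later analysis.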
