131

Social Network Analysis of Researchers' Communication and Collaborative Networks Using Self-reported Data

Cimenler, Oguz 16 June 2014 (has links)
This research seeks an answer to the following question: what is the relationship between the structure of researchers' communication networks and the structure of their collaborative output networks (e.g., co-authored publications, joint grant proposals, and joint patent applications), and what is the impact of these structures on their citation performance and the volume of collaborative research outputs? Three complementary studies are performed to answer this main question, as discussed below. 1. Study I: A frequently used output for measuring scientific (or research) collaboration is co-authorship in scholarly publications. Less frequently used are joint grant proposals and patents. Many scholars believe that co-authorship as the sole measure of research collaboration is insufficient, because collaboration between researchers might not result in co-authorship. Collaborations involve informal communication (i.e., conversational exchange) between researchers. Using self-reports from 100 tenured/tenure-track faculty in the College of Engineering at the University of South Florida, researchers' networks are constructed from their communication relations and collaborations in three areas: joint publications, joint grant proposals, and joint patents. The data collection: 1) provides a rich data set of both researchers' in-progress and completed collaborative outputs, 2) yields a rating from the researchers on the importance of a tie to them, and 3) obtains multiple types of ties between researchers, allowing for the comparison of their multiple networks. Exponential Random Graph Model (ERGM) results show that the more researchers communicate, the more likely they are to produce collaborative outputs. Furthermore, the impact of four demographic attributes (gender, race, department affiliation, and spatial proximity) on collaborative output relations is tested. The results indicate that grant proposals are submitted by mixed-gender teams in the College of Engineering.
In addition, researchers of the same race are more likely to publish together. The demographic attributes have no additional leverage on joint patents. 2. Study II: Previous research shows that researchers' social network metrics obtained from a collaborative output network (e.g., a joint publications or co-authorship network) impact their performance as determined by the g-index. This study uses a richer dataset to show that a scholar's performance should be considered with respect to position in multiple networks. Previous research using only the network of researchers' joint publications showed that a researcher's distinct connections to other researchers (i.e., degree centrality), number of repeated collaborative outputs (i.e., average tie strength), and redundant connections to a group of researchers who are themselves well-connected (i.e., efficiency coefficient) have a positive impact on the researcher's performance, while a researcher's tendency to connect with other researchers who are themselves well-connected (i.e., eigenvector centrality) had a negative impact. The findings of this study are similar, except that eigenvector centrality has a positive impact on the performance of scholars. Moreover, the results demonstrate that a researcher's tendency towards dense local neighborhoods (as measured by the local clustering coefficient) and demographic attributes such as gender should also be considered when investigating the impact of social network metrics on researchers' performance. 3. Study III: This study investigates to what extent researchers' interactions in the early stage of their collaborative network activities impact the number of collaborative outputs produced (e.g., joint publications, joint grant proposals, and joint patents).
Path models using the Partial Least Squares (PLS) method are run to test the extent to which researchers' individual innovativeness, as determined by specific indicators obtained from their interactions in the early stage of their collaborative network activities, impacts the number of collaborative outputs they produce, taking into account the tie strength of a researcher to other conversational partners (TS). Within a college of engineering, it is found that researchers' individual innovativeness positively impacts the volume of their collaborative outputs. TS positively impacts researchers' individual innovativeness, whereas it negatively impacts the volume of their collaborative outputs. Furthermore, TS negatively moderates the relationship between researchers' individual innovativeness and the volume of their collaborative outputs, which is consistent with the 'Strength of Weak Ties' theory. The results of this study contribute to the literature regarding the transformation of tacit knowledge into explicit knowledge in a university context.
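The centrality and clustering measures named above can be computed directly from a network's adjacency matrix. The following is a minimal numpy sketch of three of them (degree centrality, eigenvector centrality, and the local clustering coefficient); the function and variable names are illustrative, not from the thesis:

```python
import numpy as np

def network_metrics(A):
    """Metrics from a symmetric 0/1 adjacency matrix A (no self-loops)."""
    n = A.shape[0]
    deg = A.sum(axis=1)
    degree_centrality = deg / (n - 1)

    # Eigenvector centrality by power iteration on A.
    v = np.ones(n) / np.sqrt(n)
    for _ in range(200):
        v = A @ v
        v /= np.linalg.norm(v)

    # Local clustering: closed triangles / possible triangles at each node.
    triangles = np.diag(A @ A @ A) / 2.0
    possible = deg * (deg - 1) / 2.0
    clustering = np.where(possible > 0, triangles / possible, 0.0)
    return degree_centrality, v, clustering
```

On a fully connected triad, for instance, every node has degree centrality 1 and clustering coefficient 1, and eigenvector centralities are equal by symmetry.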
132

Generalized Maximum Entropy, Convexity and Machine Learning

Sears, Timothy Dean, tim.sears@biogreenoil.com January 2008 (has links)
This thesis identifies and extends techniques that can be linked to the principle of maximum entropy (maxent) and applied to parameter estimation in machine learning and statistics. Entropy functions based on deformed logarithms are used to construct Bregman divergences, and together these represent a generalization of relative entropy. The framework is analyzed using convex analysis to characterize generalized forms of exponential family distributions. Various connections to the existing machine learning literature are discussed and the techniques are applied to the problem of non-negative matrix factorization (NMF).
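As a concrete illustration of the objects the abstract names, here is a small numpy sketch of a deformed (q-)logarithm and a Bregman divergence; choosing negative Shannon entropy as the convex generator recovers relative entropy (KL divergence). The names and the particular q-deformation shown are illustrative, not taken from the thesis:

```python
import numpy as np

def deformed_log(x, q):
    """q-deformed logarithm: log_q(x) = (x**(1-q) - 1)/(1-q), -> ln x as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.log(x)
    return (x ** (1 - q) - 1) / (1 - q)

def bregman_divergence(F, gradF, p, r):
    """Bregman divergence D_F(p, r) = F(p) - F(r) - <grad F(r), p - r>."""
    return F(p) - F(r) - np.dot(gradF(r), p - r)

# Negative Shannon entropy as the convex generator F:
F = lambda x: np.sum(x * np.log(x))
gradF = lambda x: np.log(x) + 1.0
```

For probability vectors p and r, `bregman_divergence(F, gradF, p, r)` equals the KL divergence sum(p * log(p / r)), which is the "relative entropy" special case the abstract generalizes.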
133

Estimating the inevitability of fast oscillations in model systems with two timescales

Choy, Vivian K.Y, 1971- January 2001 (has links)
Abstract not available
134

Contributions to Motion Planning and Orbital Stabilization : Case studies: Furuta Pendulum swing up, Inertia Wheel oscillations and Biped Robot walking

Miranda La Hera, Pedro Xavier January 2008 (has links)
Generating and stabilizing periodic motions in nonlinear systems is a challenging task. In the control systems community this topic is also known as limit cycle control. In recent years a framework known as Virtual Holonomic Constraints (VHC) has been developed as one solution to this problem. The aim of this thesis is to give insight into this approach and its practical application.

The contribution of this work is primarily the experimental validation of the theory. A step-by-step procedure of the methodology is given for motion planning as well as for controller design. Three particular setups were chosen for experiments: the inertia wheel pendulum, the Furuta pendulum, and the two-link planar pendulum. These under-actuated mechanical systems are well-known benchmark setups for testing advanced control design methods.

Further applications are intended for cases such as biped robot walking/running, human and animal locomotion analysis, etc.
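The central object here, a stable limit cycle, can be illustrated without any of the VHC machinery or the pendulum setups themselves: the classic Van der Pol oscillator has an attracting periodic orbit that trajectories converge to from almost any initial condition. A minimal RK4 simulation (parameters and names are illustrative):

```python
import numpy as np

def van_der_pol_step(state, dt, mu=1.0):
    """One RK4 step of the Van der Pol oscillator x'' = mu*(1 - x^2)*x' - x."""
    def f(s):
        x, v = s
        return np.array([v, mu * (1 - x**2) * v - x])
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def simulate(x0, v0, t_end=100.0, dt=0.01):
    """Integrate from (x0, v0); long horizons land on the limit cycle."""
    s = np.array([x0, v0])
    for _ in range(int(t_end / dt)):
        s = van_der_pol_step(s, dt)
    return s
```

Starting from a small perturbation of the origin or from far outside, both trajectories end up on the same periodic orbit of amplitude roughly 2, which is the qualitative behavior that limit cycle control aims to impose on mechanical systems.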
135

Exponential function of pseudo-differential operators

Galstian, Anahit, Yagdjian, Karen January 1997 (has links)
The paper is devoted to the construction of the exponential function of a matrix pseudo-differential operator which does not satisfy any of the known theorems (see Sec. 8, Ch. VIII and Sec. 2, Ch. XI of [17]). Applications to the construction of the fundamental solution of the Cauchy problem for hyperbolic operators with characteristics of variable multiplicity are also given.
136

Uniform compact attractors for a nonlinear non-autonomous equation of viscoelasticity

Schulze, Bert-Wolfgang, Qin, Yuming January 2005 (has links)
In this paper we establish the regularity and exponential stability of global (weak) solutions, and the existence of uniform compact attractors of semiprocesses generated by the global solutions, for a two-parameter family of operators arising from nonlinear one-dimensional non-autonomous viscoelasticity. We employ properties of the analytic semigroup to show compactness of the semiprocess generated by the global solutions.
137

Exponential Fitting, Finite Volume and Box Methods in Option Pricing.

Shcherbakov, Dmitry, Szwaczkiewicz, Sylwia January 2010 (has links)
In this thesis we focus mainly on special finite difference and finite volume methods and apply them to the pricing of barrier options. The structure of this work is the following: in Chapter 1 we introduce the definitions of options and illustrate some properties of vanilla European options and exotic options. Chapter 2 describes a classical model used in the financial world, the Black-Scholes model. We derive the Black-Scholes formula and show how stochastic differential equations model the prices of financial instruments. The aim of this chapter is also to present the initial boundary value problem and the maximum principle. We discuss boundary conditions such as the first boundary value problem, also called the Dirichlet problem, which occurs in the pricing of barrier options and European options. Some kinds of put options lead to the study of a second boundary value problem (Neumann, Robin problem), while the Cauchy problem is associated with one-factor European and American options. Chapter 3 is about finite difference methods such as the theta, explicit, implicit, and Crank-Nicolson methods, which are used for solving partial differential equations. The exponentially fitted scheme is presented in Chapter 4. It is one of a new class of robust difference schemes that is stable, has good convergence, and does not produce spurious oscillations. Stability is also an advantage of the box method, presented in Chapter 5. At the beginning of Chapter 6 we illustrate barrier options and then consider a novel finite volume discretization for pricing these options. Chapter 7 describes the discretization of the Black-Scholes equation by the fitted finite volume scheme. In Chapter 8 we present and describe numerical results obtained by using the finite difference methods illustrated in the previous chapters.
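The kind of problem the thesis treats can be sketched with a plain (unfitted) Crank-Nicolson discretization of the Black-Scholes PDE for a down-and-out barrier call, with a Dirichlet zero condition at the barrier. All parameter values are hypothetical, and the dense linear solve is for clarity, not efficiency:

```python
import numpy as np

def down_and_out_call_cn(S0=100.0, K=100.0, B=90.0, T=1.0,
                         r=0.05, sigma=0.2, M=200, N=200):
    """Crank-Nicolson price of a down-and-out call; grid runs from the
    barrier B (where V = 0) to Smax. Illustrative parameters only."""
    Smax = 4.0 * K
    S = np.linspace(B, Smax, M + 1)
    dS = S[1] - S[0]
    dt = T / N
    V = np.maximum(S - K, 0.0)            # terminal payoff
    V[0] = 0.0                            # knocked out at the barrier

    i = np.arange(1, M)                   # interior nodes
    sig2S2 = (sigma * S[i]) ** 2
    a = 0.25 * dt * (sig2S2 / dS**2 - r * S[i] / dS)
    b = -0.5 * dt * (sig2S2 / dS**2 + r)
    c = 0.25 * dt * (sig2S2 / dS**2 + r * S[i] / dS)

    # Crank-Nicolson: (I - A) V_new = (I + A) V_old on interior nodes.
    A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
    M1 = np.eye(M - 1) - A
    M2 = np.eye(M - 1) + A

    for n in range(N):
        tau = (n + 1) * dt                # time to expiry after this step
        rhs = M2 @ V[1:M]
        # Dirichlet boundaries: 0 at the barrier, S - K*exp(-r*tau) at Smax.
        upper_new = Smax - K * np.exp(-r * tau)
        rhs[-1] += c[-1] * (V[M] + upper_new)
        V[1:M] = np.linalg.solve(M1, rhs)
        V[0] = 0.0
        V[M] = upper_new

    return float(np.interp(S0, S, V))
```

Pushing the barrier toward zero recovers (approximately) the vanilla European call, so the barrier price should always come out below it; the fitted schemes of Chapters 4 and 7 address the spurious oscillations a plain scheme like this can produce for convection-dominated parameter regimes.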
138

Stretched Exponential Decline Model as a Probabilistic and Deterministic Tool for Production Forecasting and Reserve Estimation in Oil and Gas Shales

Akbarnejad Nesheli, Babak May 2012 (has links)
Today everyone seems to agree that ultra-low permeability and shale reservoirs have the potential to move North America's oil and gas industry into a new phase. Unfortunately, transient flow is of long duration (perhaps the life of the well) in ultra-low permeability reservoirs, and traditional decline curve analysis (DCA) models can lead to significantly over-optimistic production forecasts without additional safeguards. The stretched exponential decline model (SEDM) gives considerably more stable production forecasts than traditional DCA models, and in this work it is shown to produce unchanging EUR forecasts after only two to three years of production data are available in selected reservoirs, notably the Barnett Shale. For an individual well, the SEDM model parameters can be determined by the method of least squares in various ways, but the inherent nonlinear character of the least squares problem cannot be bypassed. To assure a unique solution to the parameter estimation problem, this work suggests a physics-based regularization approach based on the critical velocity concept. Applied to selected Barnett Shale gas wells, the suggested method leads to reliable and consistent EURs. To further understand the interaction of the different fracture properties on reservoir response and production decline behavior, a series of Discrete Fracture Network (DFN) simulations were performed. Results show that at least a 3-layer model is required to reproduce the decline behavior captured in the published SEDM parameters for the Barnett Shale. Further, DFN modeling implies that parameters such as fracture density and fracture length interact in such a way that the effect of one can be compensated by the other. The results of DFN modeling of several Barnett Shale horizontal wells, with numerous fracture stages, showed very good agreement with the estimated SEDM model for the same wells.
Estimation of P90 reserves that meet SEC criteria is required by law for all companies that raise capital in the United States. Estimation of P50 and P10 reserves that meet SPE/WPC/AAPG/SPEE Petroleum Resources Management System (PRMS) criteria is important for internal resource inventories for most companies. In this work a systematic methodology was developed to quantify the range of uncertainty in production forecast using SEDM. This methodology can be used as a probabilistic tool to quantify P90, P50, and P10 reserves and hence might provide one possible way to satisfy the various legal and technical-society-suggested criteria.
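The SEDM rate and cumulative forms can be sketched directly: q(t) = q0 * exp(-(t/tau)^n), with cumulative production given by an incomplete gamma function and EUR as its t -> infinity limit. Below they are fitted to synthetic (not Barnett) data with ordinary nonlinear least squares; the physics-based regularization the thesis proposes is not reproduced here, and all values are illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import gamma, gammainc

def sedm_rate(t, q0, tau, n):
    """Stretched-exponential decline: q(t) = q0 * exp(-(t/tau)**n)."""
    return q0 * np.exp(-(t / tau) ** n)

def sedm_cumulative(t, q0, tau, n):
    """Cumulative production Q(t) = (q0*tau/n) * gamma_lower(1/n, (t/tau)**n)."""
    return q0 * tau / n * gamma(1.0 / n) * gammainc(1.0 / n, (t / tau) ** n)

# Synthetic monthly rates for a hypothetical well with 5% noise.
t = np.arange(1, 37, dtype=float)
rng = np.random.default_rng(0)
q = sedm_rate(t, 1000.0, 12.0, 0.5) * (1 + 0.05 * rng.standard_normal(t.size))

popt, _ = curve_fit(sedm_rate, t, q, p0=(800.0, 10.0, 0.6),
                    bounds=([0.0, 0.1, 0.05], [1e5, 1e3, 1.0]))
q0, tau, n = popt
eur = q0 * tau / n * gamma(1.0 / n)   # EUR = limit of cumulative production
```

Because the regularized incomplete gamma function is strictly below 1 for finite t, cumulative production at any finite time is always below the EUR, which is the bounded-forecast property the abstract emphasizes over traditional hyperbolic DCA.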
139

Argmax over Continuous Indices of Random Variables - An Approach Using Random Fields

Malmberg, Hannes Unknown Date (has links)
Extreme value theory has traditionally treated optimization over a discrete number of random variables. In this paper we extend this theory from the discrete to the continuous case, and consider the limiting distribution of the location of the best offer as the number of offers tends to infinity. Given a set of possible offers in R^d, we seek a distribution over that set, the argmax measure of the best offer. It depends on the sampling distribution of offer locations and on a measure index, which assigns to each point x in the set a probability distribution of offers. This problem is closely related to the argmax theory of marked point processes, although we consider deterministic sequences of points in space to allow for greater generality. We first define a finite-sample argmax measure and then give conditions under which it converges as the number of offers tends to infinity. To this end, we introduce a max-field of best offers and use continuity properties of this field to calculate the argmax measure. We demonstrate the usefulness of the method by giving explicit formulas for the limiting argmax distribution for a large class of models, including exponential independent offers with a deterministic, additive disturbance term. Finally, we illustrate the theory by simulations.
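The finite-sample argmax measure can be explored by direct simulation. Below is a hedged numpy sketch for the exponential-offers-plus-deterministic-term model mentioned in the abstract, on the unit interval; the function f and all parameters are hypothetical choices, not the paper's:

```python
import numpy as np

def argmax_location_samples(f, n_offers=2000, n_trials=5000, rng=None):
    """Sample the location of the best offer.

    Offers sit at i.i.d. uniform locations on [0, 1]; each offer equals a
    deterministic term f(x) plus an independent Exp(1) disturbance.
    Returns one argmax location per trial.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    locs = np.empty(n_trials)
    for t in range(n_trials):
        x = rng.uniform(0.0, 1.0, n_offers)
        y = f(x) + rng.exponential(1.0, n_offers)
        locs[t] = x[np.argmax(y)]
    return locs

# With an increasing deterministic term, the argmax measure tilts toward 1.
samples = argmax_location_samples(lambda x: 2.0 * x)
```

A histogram of `samples` approximates the finite-sample argmax measure; as the number of offers grows, its shape stabilizes toward the limiting argmax distribution the paper characterizes.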
140

The Turkish Catastrophe Insurance Pool Claims Modeling 2000-2008 Data

Saribekir, Gozde 01 March 2013 (has links) (PDF)
After the 1999 Marmara Earthquake, social, economic and engineering studies on earthquakes became more intensive. The Turkish Catastrophe Insurance Pool (TCIP) was established after the Marmara Earthquake to share the deficit in the budget of the Government. The TCIP has become a data source for researchers, consisting of variables such as number of claims, claim amount and magnitude. In this thesis, the TCIP earthquake claims collected between 2000 and 2008 are studied. The number of claims and claim payments (aggregate claim amount) are modeled using Generalized Linear Models (GLM). Observed sudden jumps in the claim data are represented using an exponential kernel function. Model parameters are estimated by Maximum Likelihood Estimation (MLE). The results can serve as a recommendation for computing the expected value of aggregate claim amounts and the premiums of the TCIP.
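A GLM for claim counts of the kind described above can be sketched with a tiny numpy implementation of iteratively reweighted least squares (IRLS) for Poisson regression with a log link. The data below are synthetic, not the TCIP dataset, and the single covariate is an illustrative stand-in for magnitude:

```python
import numpy as np

def poisson_glm_irls(X, y, n_iter=25):
    """Fit a Poisson GLM with log link by IRLS (minimal sketch).

    Maximizes the Poisson log-likelihood for E[y] = exp(X @ beta).
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu       # working response
        W = mu                        # IRLS weights: Var(Y) = mu for Poisson
        XtW = X.T * W                 # equivalent to X.T @ diag(W)
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta

# Synthetic claim counts driven by a single covariate.
rng = np.random.default_rng(1)
mag = rng.uniform(0.0, 2.0, 2000)
X = np.column_stack([np.ones_like(mag), mag])
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))
beta_hat = poisson_glm_irls(X, y)
```

IRLS here is exactly Fisher scoring for the GLM, so it agrees with the maximum likelihood estimates mentioned in the abstract; with a couple thousand observations the fitted coefficients land close to the generating values.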
