  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Production externalities : cooperative and non-cooperative approaches

Trudeau, Christian January 2008 (has links)
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal
12

The Effect of Combined Bony Defects on the Anterior Stability of the Glenohumeral Joint and Implications for Surgical Repair

Walia, Piyush 24 August 2015 (has links)
No description available.
13

Essays on Consumption: Aggregation, Asymmetry and Asset Distributions

Bjellerup, Mårten January 2005 (has links)
The dissertation consists of four self-contained essays on consumption. Essays 1 and 2 consider different measures of aggregate consumption, and Essays 3 and 4 consider how the distributions of income and wealth affect consumption from a macro and micro perspective, respectively.

Essay 1 considers the empirical practice of seemingly interchangeable use of two measures of consumption: total consumption expenditure and consumption expenditure on nondurable goods and services. Using data from Sweden and the US in an error correction model, it is shown that consumption functions based on the two measures exhibit significant differences in several aspects of econometric modelling.

Essay 2, co-authored with Thomas Holgersson, derives a univariate and a multivariate version of a test for asymmetry based on the third central moment. The logic behind the test is that the dependent variable should correspond to the specification of the econometric model: symmetric with linear models and asymmetric with non-linear models. The main result of the empirical application is that orthodox theory seems to be supported for both nondurable and durable consumption. The consumption of durables shows little deviation from symmetry in the four-country sample, while the consumption of nondurables is shown to be asymmetric in two out of four cases, the UK and the US.

Essay 3 departs from the observation that introducing income uncertainty makes the consumption function concave, implying that the distributions of wealth and income are omitted variables in aggregate Euler equations. This implication is tested by estimating the distributions over time and augmenting consumption functions, using Swedish data for 1963-2000. The results show that only the dispersion of wealth is significant, which is explained by the marked changes in the group of households with negative wealth; a group that, according to a concave consumption function, has the highest marginal propensity to consume.

Essay 4 attempts to specify empirically the nature of the alleged concavity of the consumption function. Using grouped household-level Swedish data for 1999-2001, it is shown that the marginal propensity to consume out of current resources, i.e. current income and net wealth, is strictly decreasing in current resources and net wealth, but approximately constant in income. An empirical counterpart to the stylized theoretical consumption function is also estimated and shown to bear a close resemblance to the theoretical version.
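As a rough illustration of a third-central-moment asymmetry test of the kind described for Essay 2, the sketch below computes a generic large-sample z-statistic on simulated consumption growth; it is not the authors' exact statistic, and both the variance formula and the data are standard textbook choices rather than anything taken from the dissertation.

```python
import numpy as np

def third_moment_symmetry_test(x):
    """z-statistic for H0: the distribution of x is symmetric, built on the
    sample third central moment.  A generic large-sample test of this kind;
    the exact statistic used in the essay may differ."""
    x = np.asarray(x, dtype=float)
    n = x.size
    d = x - x.mean()
    m2, m3, m4, m6 = (d**2).mean(), (d**3).mean(), (d**4).mean(), (d**6).mean()
    # First-order asymptotic variance of the sample third central moment:
    # Var(m3) ~ (mu6 - mu3^2 - 6*mu2*mu4 + 9*mu2^3) / n
    var_m3 = (m6 - m3**2 - 6.0 * m2 * m4 + 9.0 * m2**3) / n
    return m3 / np.sqrt(var_m3)   # compare with N(0, 1) critical values

# Hypothetical example: simulated quarterly consumption growth (symmetric by construction)
rng = np.random.default_rng(0)
growth = rng.normal(0.004, 0.012, size=160)
print(round(third_moment_symmetry_test(growth), 3))
```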
14

Métodos para aproximação poligonal e o desenvolvimento de extratores de características de forma a partir da função tangencial / Methods for polygonal approximation and the development of shape feature extractors based on the turning angle function

Carvalho, Juliano Daloia de 12 September 2008 (has links)
Whereas manually drawn contours may contain artifacts related to hand tremor, automatically detected contours may contain noise and inaccuracies due to limitations or errors in the procedures for detecting and segmenting the related regions. To improve the subsequent description step, modeling procedures are needed that eliminate the artifacts in a given contour while preserving its important and significant details. This work presents two polygonal modeling methods: one applied directly to the original contour and another derived from the turning angle function. Both methods use the parameters Smin and µmax to decide whether a given segment is removed or kept, so they can be configured according to the application at hand. Both methods proved effective at reducing the influence of noise and artifacts while preserving characteristics relevant to further analysis.

Systems for computer-aided diagnosis (CAD) and content-based image retrieval (CBIR) use shape descriptor methods to infer properties of a given contour or as a basis for classifying groups with different patterns. A shape factor is a value affected by the shape of an object, making it possible to characterize the presence of a feature in a contour or to identify similarity among contours. Shape factors should be invariant to rotation, translation and scale. This work proposes the following shape features, all derived from the smoothed turning angle function: an index of the presence of convex regions (XRTAF), an index of the presence of concave regions (VRTAF), an index of convexity (CXTAF), two measures of fractal dimension (DFTAF and DF1TAF), and an index of spiculation (ISTAF). The smoothed turning angle function represents the contour in terms of its concave and convex regions.

The polygonal modeling and shape descriptor methods were applied to the classification of breast masses to evaluate their performance. The polygonal modeling procedure proposed in this work provided higher compression and a better polygonal fit. The best classification accuracies in discriminating between benign masses and malignant tumors, in terms of area under the receiver operating characteristic curve, were 0.92, 0.92, 0.93, 0.93, 0.92 and 0.94 for XRTAF, VRTAF, CXTAF, DFTAF, DF1TAF and ISTAF, respectively. / Master's in Computer Science
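To make the turning angle function and the Smin/µmax parametrization concrete, here is a minimal sketch; the simplification rule (drop a vertex when its incident edges are short and its turning angle is small) is one plausible reading of the text, not the thesis's exact algorithm, and the contour is a toy example.

```python
import numpy as np

def turning_angle_function(contour):
    """Turning (tangent) angle of a closed contour as a function of
    normalized arc length.  contour: (N, 2) array of (x, y) vertices."""
    pts = np.asarray(contour, dtype=float)
    edges = np.diff(np.vstack([pts, pts[:1]]), axis=0)        # edge vectors
    lengths = np.hypot(edges[:, 0], edges[:, 1])
    angles = np.unwrap(np.arctan2(edges[:, 1], edges[:, 0]))   # tangent angles
    arc = np.concatenate([[0.0], np.cumsum(lengths)[:-1]]) / lengths.sum()
    return arc, angles

def simplify(contour, s_min=0.01, mu_max=np.deg2rad(10.0)):
    """Drop a vertex when both incident edges are shorter than s_min
    (as a fraction of the perimeter) and the turning angle at the vertex
    is below mu_max.  A plausible reading of the Smin / µmax rule; the
    thesis's exact criteria may differ."""
    pts = np.asarray(contour, dtype=float)
    n = len(pts)
    perimeter = np.hypot(*np.diff(np.vstack([pts, pts[:1]]), axis=0).T).sum()
    keep = []
    for i in range(n):
        e1 = pts[i] - pts[i - 1]
        e2 = pts[(i + 1) % n] - pts[i]
        turn = abs(np.arctan2(e1[0] * e2[1] - e1[1] * e2[0], np.dot(e1, e2)))
        short = min(np.linalg.norm(e1), np.linalg.norm(e2)) < s_min * perimeter
        if not (short and turn < mu_max):
            keep.append(i)
    return pts[keep] if keep else pts

# Noisy square-like contour as a toy example
square = np.array([[0, 0], [5, 0.05], [10, 0], [10, 5], [10.05, 7],
                   [10, 10], [0, 10]], dtype=float)
arc, angles = turning_angle_function(square)
print(np.degrees(angles).round(1))                       # tangent angle per edge
print(simplify(square, s_min=0.15, mu_max=np.deg2rad(15.0)))
```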
15

Des équations de contrainte en gravité modifiée : des théories de Lovelock à un nouveau problème de σk-Yamabe / On the constraint equations in modified gravity: from Lovelock theories to a new σk-Yamabe problem

Lachaume, Xavier 15 December 2017 (has links)
This thesis is devoted to the evolution problem for modified gravity theories. After recalling the situation for General Relativity (GR), we present the n + 1 formalism for f(R), Brans-Dicke and scalar-tensor theories and rederive a known result: the Cauchy problem for these theories is well-posed, and the constraint equations reduce to those of GR with a matter field. We then carry out the same n + 1 decomposition for Lovelock and f(Lovelock) theories, the latter being an original result. We study the constraint equations of Lovelock theories and show that, in the locally conformally flat time-symmetric case, they can be written as the prescription of a sum of σk-curvatures. In order to solve this prescription equation, we introduce a new family of homogeneous semi-symmetric polynomials and prove concavity results for these polynomials. We state a conjecture which, if true, would allow us to solve the prescription equation in many cases: for all P, Q ∈ ℝ[X] with deg P = deg Q = p, if P and Q are real-rooted, then ∑_{k=0}^{p} P^(k) Q^(p−k) is real-rooted.
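A quick numerical sanity check of the conjecture can be scripted; the sketch below assumes P^(k) denotes the k-th derivative of P (an assumption about the notation in the abstract) and uses a heuristic numerical test of real-rootedness.

```python
import numpy as np

def random_real_rooted(p, rng):
    """Monic degree-p polynomial with p random real roots."""
    return np.poly1d(np.poly(rng.uniform(-3.0, 3.0, size=p)))

def conjecture_sum(P, Q):
    """sum_{k=0}^{p} P^(k) Q^(p-k), reading P^(k) as the k-th derivative
    of P (assumption made for this check)."""
    p = P.order
    total = np.poly1d([0.0])
    for k in range(p + 1):
        Pk = P.deriv(k) if k else P
        Qk = Q.deriv(p - k) if p - k else Q
        total = total + Pk * Qk
    return total

def looks_real_rooted(poly, tol=1e-6):
    """Heuristic numerical test: every computed root has a (relatively)
    negligible imaginary part."""
    r = poly.roots
    return bool(np.all(np.abs(r.imag) <= tol * (1.0 + np.abs(r))))

rng = np.random.default_rng(1)
p = 5
for _ in range(1000):
    P, Q = random_real_rooted(p, rng), random_real_rooted(p, rng)
    if not looks_real_rooted(conjecture_sum(P, Q)):
        print("possible counterexample (or numerical artefact):")
        print(P, "\n", Q)
        break
else:
    print(f"no counterexample among 1000 random degree-{p} pairs")
```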
16

Three Essays on the Measurement of Productivity

Hussain, Jakir January 2017 (has links)
This doctoral thesis consists of three essays.

In the first essay I investigate the presence of productivity convergence in eight regional pulp and paper industries of the U.S. and Canada over the period 1971-2005. The expectation of productivity convergence between the pulp and paper industries of the Canadian provinces and those of the states of their southern neighbour is high, since they are trading partners with a fairly high level of exchange in both pulp and paper products. Moreover, they share a common production technology that changed very little over the last century. I supplement the North American regional data with national data for two Nordic countries, Finland and Sweden, which makes it possible to compare the productivity performances of four leading players in the global pulp and paper industry. I find evidence in favour of the catch-up hypothesis among the regional pulp and paper industries of the U.S. and Canada in my sample. Growth performance favours the Canadian provinces relative to their U.S. counterparts, but it is not strong enough to surpass the growth rates of this industry in the two Nordic countries.

It is well known that econometric productivity estimation using flexible functional forms often encounters violations of curvature conditions. However, the productivity literature does not provide any guidance on the selection of appropriate functional forms once they satisfy the theoretical regularity conditions. The second chapter of my thesis provides empirical evidence that imposing local curvature conditions on flexible functional forms affects total factor productivity (TFP) estimates in addition to the elasticity estimates. Moreover, I use this as a criterion for evaluating the performance of three widely used locally flexible cost functional forms - the translog (TL), the Generalized Leontief (GL), and the Normalized Quadratic (NQ) - in providing TFP estimates. Results suggest that the NQ model performs better than the other two functional forms in providing TFP estimates.

The third essay capitalizes on newly available high-frequency energy consumption data from commercial buildings in the District of Columbia (DC) to provide novel insights on the realized energy use impacts of energy efficiency standards in commercial buildings. Combining these data with hourly weather data and information on tenancy contract structure, I evaluate the impacts of energy standards, the contractual structure of utility bill payments, and Energy Star labeling on account-level electricity consumption. Using this unique panel dataset, the analysis takes advantage of detailed building-level characteristics and the heterogeneity in the building age distribution, which yields buildings constructed both before and after mandatory energy standards came into effect. Estimation results suggest that in commercial buildings constructed under a code, electricity consumption is lower by about 0.48 kWh per cooling degree hour. When tenants pay for their own utilities, consumption is lower by 0.82 kWh per cooling degree hour. The Energy Star effect is a 0.31 kWh reduction per cooling degree hour. Finally, peak savings for all three variables of interest occur at 2 pm in the summer months, whereas peak summer marginal prices at DC's local electric utility occur at 5 pm.
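For concreteness, a generic translog cost function of the kind compared in the second essay can be written as follows; the notation is illustrative and not necessarily the author's specification.

```latex
\[
\ln C(w, y, t) = \alpha_0 + \sum_i \alpha_i \ln w_i
  + \tfrac{1}{2} \sum_i \sum_j \gamma_{ij} \ln w_i \ln w_j
  + \beta_y \ln y + \tfrac{1}{2} \beta_{yy} (\ln y)^2
  + \sum_i \delta_{iy} \ln w_i \ln y + \beta_t t
\]
```

Symmetry (γ_ij = γ_ji) and linear homogeneity in input prices (∑_i α_i = 1, ∑_j γ_ij = 0, ∑_i δ_iy = 0) are typically imposed; TFP growth is then read off as −∂ ln C/∂t, and the curvature condition discussed above is concavity of C in the input prices w.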
17

Rozpoznávání ručně psaného písma pomocí neuronových sítí / Handwritten Character Recognition Using Artificial Neural Networks

Horký, Vladimír January 2012 (has links)
This work presents neural networks trained with the back-propagation algorithm. The theoretical background of the algorithm is explained, and the problems that arise when training neural networks are discussed. The work also covers techniques for image preprocessing and feature extraction, which are central to classification, and presents a few experiments with neural networks using selected image features.
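A minimal back-propagation sketch for a one-hidden-layer sigmoid network is shown below; the architecture, data and hyperparameters are illustrative stand-ins, not the ones used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic stand-in for handwritten-character feature vectors:
# 200 samples, 64 features (e.g. 8x8 pixels), 10 character classes.
X = rng.normal(size=(200, 64))
labels = rng.integers(0, 10, size=200)
Y = np.eye(10)[labels]                      # one-hot targets

# One hidden layer with 32 units, small random initial weights.
W1 = rng.normal(scale=0.1, size=(64, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, 10)); b2 = np.zeros(10)
lr = 0.5

for epoch in range(500):
    # Forward pass
    H = sigmoid(X @ W1 + b1)                # hidden activations
    O = sigmoid(H @ W2 + b2)                # output activations
    # Backward pass (squared-error loss, sigmoid derivatives)
    dO = (O - Y) * O * (1 - O)              # gradient at output pre-activations
    dH = (dO @ W2.T) * H * (1 - H)          # gradient at hidden pre-activations
    # Gradient-descent updates, averaged over the batch
    W2 -= lr * (H.T @ dO) / len(X); b2 -= lr * dO.mean(axis=0)
    W1 -= lr * (X.T @ dH) / len(X); b1 -= lr * dH.mean(axis=0)

print("training accuracy on the synthetic data:", (O.argmax(axis=1) == labels).mean())
```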
18

Análisis y procesado tecnológico del modelo sólido de una pieza para determinar sus elementos característicos de mecanizado / Analysis and technological processing of the solid model of a part to determine its characteristic machining features

Gutiérrez Rubert, Santiago Carlos 07 May 2008 (has links)
One of the first stages of computer-aided process planning for material-removal machining processes consists of identifying the regions of material to be removed from the initial stock in order to produce the part. The result is a set of entities called machining features, which have a clear relationship to machining operations. The procedure for obtaining these entities automatically is known as Automatic Feature Recognition (AFR): starting from the 3D models of the stock and of the part, the appropriate working entities (machining features) are established. These entities contain the information needed to carry out automatic process planning, and this information is completed and extended as the planning stages progress. The thesis approaches the automatic recognition of machining features as one of the first stages of process planning, providing the link with computer-aided design. This recognition must take a dynamic approach, offering different options; its output should not be a static, predetermined input for the remaining planning stages. The recognition process is strongly influenced by technological concepts and decisions (tool types, characteristic movements of the processes, the influence of linked cutting, ...), which guide it and make it possible to obtain results that are valid for the target application: machining. Following this approach, the thesis offers a general and complete solution to the automatic recognition of machining features, taking into account the so-called conventional processes (turning, milling, shaping, grinding, etc.). The proposed solution is not restricted to parts / Gutiérrez Rubert, SC. (2007). Análisis y procesado tecnológico del modelo sólido de una pieza para determinar sus elementos característicos de mecanizado [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/1963 / Palancia
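As an illustration of the kind of entity such a recognizer produces, here is a minimal sketch of a machining-feature record; all field names are hypothetical and not taken from the thesis.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class MachiningFeature:
    """Illustrative container for a recognized machining feature.
    Field names are assumptions for this sketch, not the thesis's model."""
    feature_type: str                              # e.g. "pocket", "hole", "slot"
    faces: List[int]                               # ids of the B-rep faces forming the feature
    access_direction: Tuple[float, float, float]   # tool approach vector
    volume_to_remove: float                        # material to be removed, in mm^3
    candidate_operations: List[str] = field(default_factory=list)

# A hypothetical through-hole recognized on a part model
hole = MachiningFeature(
    feature_type="hole",
    faces=[12, 13],
    access_direction=(0.0, 0.0, -1.0),
    volume_to_remove=785.4,
    candidate_operations=["drilling", "boring"],
)
print(hole)
```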
