  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
241

Avaliação psicológica para seleção de pessoal: características de personalidade de candidatos a vagas de emprego / Psychological assessment for personnel selection: personality characteristics of job applicants

Guimarães, Carolina de Fátima 15 December 2015 (has links)
For an organization to achieve satisfactory results, it needs employees who are satisfied and perform well. Professionals in personnel selection have therefore sought to investigate aspects related to interpersonal relationships and personality, since these characteristics are associated with job performance and with approval in selection processes. Assessing personality remains a challenge for psychologists, given that there are different ways of conceiving and evaluating this construct. From this perspective, the present research, which involved 108 participants, aimed to discuss aspects of personality assessment applied to personnel selection. Two studies were conducted. The first evaluated the circular structure of the Checklist of Interpersonal Transactions – II (CLOIT-II) when applied in personnel selection: the fit of the CLOIT-II data to the quasi-circumplex model was investigated against the criteria of two-dimensionality and constant radius. The structure was tested with confirmatory multidimensional scaling (MDS with the Proxscal algorithm), the scale locations were established in Euclidean space, and Tucker's phi coefficient and the normalized raw stress were computed; Fisher's test was applied to assess the constant-radius criterion. The results indicated a two-dimensional model with an adequate variation of the radii of the variables; thus, when applied in personnel selection, the CLOIT-II replicates the quasi-circumplex structure, which points to the usefulness of the measure in people management, particularly in selection processes. The second study aimed to (1) map the personality characteristics and interpersonal interactions of approved and rejected candidates; (2) compare the approved and rejected groups with respect to their interpersonal and projected personality profiles; (3) evaluate the interviewer's perception of the candidates' interpersonal characteristics; and (4) relate projective and interpersonal self-report methods of personality assessment. The Checklist of Interpersonal Transactions – II and the Palográfico Test were used. Means and standard deviations of the variables were computed to estimate the interpersonal profiles reported by the candidates and perceived by the interviewer, as well as the projected personality profile. The Mann–Whitney U test was used to compare the approved and rejected groups, and the relationship between the projective and self-report measures of personality was estimated with Spearman's correlation coefficient, subsequently corrected for attenuation.
The results showed that the two groups of candidates had very similar characteristics, differing only with respect to the isolation position. In addition, the selectors were unable to assess the candidates' interpersonal characteristics during the job interview. Regarding the relationship between the two measures of personality, certain characteristics, such as aggressiveness and insecurity, tended to be projected by the candidates rather than acknowledged in self-report.
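The comparison and correlation steps described in this abstract map onto standard non-parametric routines; the sketch below, using scipy, is illustrative only, and the score vectors and reliability values are placeholders rather than data from the study.

```python
import numpy as np
from scipy.stats import mannwhitneyu, spearmanr

# Placeholder score vectors; in the study these would be CLOIT-II octant
# scores and Palografico indicators for approved vs. rejected candidates.
approved = np.array([12, 15, 9, 14, 11, 13])
rejected = np.array([10, 8, 11, 9, 7, 12])

# Group comparison with the two-sided Mann-Whitney U test.
u_stat, p_value = mannwhitneyu(approved, rejected, alternative="two-sided")

# Relationship between the self-report and projective measures:
# Spearman correlation, then correction for attenuation using the
# reliabilities of the two instruments (values assumed for illustration).
self_report = np.array([3.2, 2.8, 4.1, 3.9, 2.5, 3.7])
projective = np.array([2.9, 3.1, 3.8, 4.0, 2.2, 3.5])
rho, _ = spearmanr(self_report, projective)
rel_self, rel_proj = 0.85, 0.80          # assumed reliability coefficients
rho_corrected = rho / np.sqrt(rel_self * rel_proj)
```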
242

Essays on heteroskedasticity

da Glória Abage de Lima, Maria 31 January 2008 (has links)
This doctoral thesis addresses inference in the linear regression model under heteroskedasticity of unknown form. In the first chapter, we develop interval estimators that are robust to the presence of heteroskedasticity. These estimators are based on consistent covariance matrix estimators proposed in the literature, as well as on bootstrap schemes. The numerical evidence favors the HC4 interval estimator. Chapter 2 develops a bias-corrected sequence of covariance matrix estimators under heteroskedasticity of unknown form, starting from the estimator proposed by Qian and Wang (2001). We show that the Qian-Wang estimator can be generalized into a wider class of consistent covariance matrix estimators and that our results can easily be extended to this class. Finally, in Chapter 3 we use numerical integration methods to compute the exact null distributions of different quasi-t test statistics under the assumption that the errors are normally distributed. The results favor the HC4-based test.
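The abstract does not reproduce the estimator itself; as a point of reference, a minimal sketch of the HC4 heteroskedasticity-consistent covariance estimator for OLS (Cribari-Neto, 2004), with illustrative data, might look as follows.

```python
import numpy as np

def hc4_cov(X, y):
    """Minimal sketch of the HC4 sandwich covariance estimator for OLS."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n, p = X.shape
    xtx_inv = np.linalg.inv(X.T @ X)
    beta = xtx_inv @ X.T @ y                       # OLS coefficients
    resid = y - X @ beta                           # OLS residuals
    h = np.einsum("ij,jk,ik->i", X, xtx_inv, X)    # leverages (hat values)
    delta = np.minimum(4.0, n * h / p)             # HC4 discount exponents
    omega = resid**2 / (1.0 - h) ** delta          # adjusted squared residuals
    cov = xtx_inv @ (X.T * omega) @ X @ xtx_inv    # sandwich form
    return beta, cov

# Illustrative heteroskedastic data: error variance grows with the regressor.
rng = np.random.default_rng(0)
x = rng.uniform(0.5, 10.0, 200)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 0.5 * x + rng.normal(scale=0.2 * x)
beta, cov = hc4_cov(X, y)
se = np.sqrt(np.diag(cov))        # HC4 standard errors for interval estimates
```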
243

Analyse communicationnelle des stratégies d'intelligence économique et des pratiques de veille dans le cadre de l'innovation : le cas des petites entreprises de l’industrie aéronautique, en Nouvelle Aquitaine / Communication analysis of competitive intelligence strategies and monitoring practices in the context of innovation: the case of small companies in the aeronautics, space and defense sector in New Aquitaine

Hennezel, Claire d' 18 May 2017 (has links)
Competitive intelligence (CI) and business intelligence are strategies that are well established in large companies and large SMEs. Since the Carayon report in 2003, competitive intelligence has also become a public policy in France. These concepts attracted the interest of several scientific disciplines fairly early on, which gives them an interdisciplinary character. They are empirical and hybrid objects because they are derived from the practices of companies. The disciplines that have taken the greatest interest in them are the information and communication sciences (SIC) and the management sciences, but also economics. The subject has been studied extensively, especially in large companies and notably by the management sciences, because competitive intelligence and monitoring are primarily decision-support strategies. Today, in SIC, these objects are studied in particular from an informational point of view, because information lies at the heart of the processes examined. Recurring issues arise in the analysis of these rather young concepts: the difficulty of implementing these strategies in companies on the one hand, and the obstacles to establishing the French public policy on the matter on the other. Paradoxes also surround them. Small businesses, and especially very small businesses, have been studied very little by the scientific disciplines concerned, which is surprising given the importance of this type of company in the national economy. Moreover, public policies, which are fairly well received by large companies and large SMEs, fail to reach small businesses. This leads us to question the reasons for the obstacles these strategies and public policies encounter among small businesses. The postulates of this research, which adopts a comprehensive (interpretive) approach, rest on the central idea that small businesses have specific characteristics stemming from their structural constraints. These small companies implement disruptive CI strategies of their own, based on processes that are the inverse of those modeled for large enterprises. These processes are communicational in nature. Competitive intelligence and monitoring in small businesses rest on a collaborative informational culture of information sharing. These strategies are implemented within an innovative, reticular organizational structure, the quasi-organization, with organizing communication and built on a network strategy. Finally, competitive intelligence in small businesses relies on a sphere of mediation, the economic biocenosis, made up of an entanglement of relations between several actors, including semi-state institutions, which play a driving role. The results of a survey of small-business managers in the aeronautics, space and defense sector in New Aquitaine, a particularly innovative sector, illustrate a competitive intelligence model adapted to small businesses, which is put forward for discussion.
244

Shortest Length Geodesics on Closed Hyperbolic Surfaces

Sanki, Bidyut January 2014 (has links) (PDF)
Given a hyperbolic surface, the set of all closed geodesics of minimal length forms a graph on the surface, in fact a so-called fat graph, which we call the systolic graph. The central question that we study in this thesis is: which fat graphs are systolic graphs for some surface? We call such graphs admissible. This is motivated in part by the observation that we can naturally decompose the moduli space of hyperbolic surfaces based on the associated systolic graphs. A systolic graph carries a metric such that all cycles on the graph that correspond to geodesics have the same length and all other cycles have length greater than this. This can be formulated as a simple condition in terms of equations and inequalities for sums of lengths of edges; we call this combinatorial admissibility. Our first main result is that admissibility is equivalent to combinatorial admissibility. This is proved using properties of negative curvature, specifically that polygonal curves with long enough sides, in terms of a lower bound on the angles, are close to geodesics. Using the above result, it is easy to see that a subgraph of an admissible graph is admissible. Hence it suffices to characterize minimal non-admissible fat graphs. Another major result of this thesis is that there are infinitely many minimal non-admissible fat graphs (in contrast, for instance, to the classical result that there are only two minimal non-planar graphs).
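The abstract does not spell the condition out; one hedged reading of "equations and inequalities for sums of lengths of edges" is the following system, in which E(γ) denotes the edge set of a cycle γ and L the common systolic length (a reconstruction, not a quotation from the thesis).

```latex
\[
\begin{aligned}
  &l_e > 0 && \text{for every edge } e,\\
  &\sum_{e \in E(\gamma)} l_e = L && \text{for every cycle } \gamma \text{ required to be systolic},\\
  &\sum_{e \in E(\gamma')} l_e > L && \text{for every other cycle } \gamma' \text{ of the fat graph}.
\end{aligned}
\]
```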
245

Élastographie par résonance magnétique : contributions pour l’acquisition et la reconstruction du module de cisaillement : association avec l’élastographie ultrasonore quasi-statique pour l’étude de milieux pré-contraints / Magnetic resonance elastography : contributions to acquisition and reconstruction of the shear modulus : association with quasi-static ultrasound elastography to study the effect of pre-strain

Blanchard, Rémy 22 February 2013 (has links)
The term elastography refers to imaging techniques dedicated to the in vivo investigation of the mechanical properties of biological tissues. During this thesis we focused on two elastography techniques. The first is quasi-static ultrasound elastography, which locally estimates the tissue strain induced by a global deformation of the medium. The second is magnetic resonance elastography (MRE), which measures the local shear modulus. In MRE, a shear wave is generated within the medium and imaged using a specific MRI sequence; the resulting wave images are then processed to estimate the local shear modulus. A new acquisition scheme for the shear-wave images is proposed in this thesis, along with a reconstruction method for the local shear modulus based on local frequency estimation using a ratio of filters. Another research axis was the study of the effect of an applied pre-strain on the shear modulus measured by MRE. This effect was first studied on homogeneous media and then on heterogeneous test objects. In the latter case, quasi-static ultrasound elastography was needed to access the local strain of the medium. This information was then combined with the information obtained by MRE to extract, for each region of interest, a strain/shear modulus curve.
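The abstract does not give the reconstruction relation; under the usual MRE assumptions (a locally homogeneous, isotropic, quasi-incompressible medium), local frequency estimation recovers the shear modulus from the locally estimated wavenumber k of a shear wave excited at frequency f, with ρ the tissue density. This is the standard relation, not a formula quoted from the thesis.

```latex
\[
\mu \;=\; \rho\, v_s^{2} \;=\; \rho \left( \frac{2\pi f}{k} \right)^{2}
\]
```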
246

Convergence asymptotique des niveaux de temps quasi-concaves dans un espace temps à courbure constante / Asymptotic convergence of level sets of quasi-concave times in a space-time of constant curvature

Belraouti, Mehdi 20 June 2013 (has links)
In this thesis we are interested in globally hyperbolic, Cauchy-compact space-times. These are space-times that admit a proper function, called a Cauchy time function, which is strictly increasing along inextensible causal curves. The level sets of such functions are spacelike hypersurfaces called Cauchy hypersurfaces. A Cauchy time function naturally defines a 1-parameter family of metric spaces. Our goal is to study the asymptotic behaviour of this family with respect to time, in two situations: the asymptotic behaviour in the past, as time goes to 0, and the asymptotic behaviour in the future, as time goes to infinity. Additional geometric conditions on the space-time and on the time functions considered are of course necessary for a more appropriate study.
247

Analyse théorique et numérique de l'endommagement par micro-fissuration des composites à matrice quasi-fragile / Theoretical and numerical analysis of damage by micro-cracking in composites with a quasi-brittle matrix

Dib, Dayana 22 October 2015 (has links)
The initial problem treated in this thesis falls within the general framework of the modeling of deep tunnels. For this purpose, an approach based on linear fracture mechanics was adopted, relying on the mixed criterion of Leguillon. This study showed that it is not the mixed criterion that is insufficient, but rather the way the problem is posed; hence the move towards taking into account the heterogeneity of the constituent material and the possibility of crack initiation under a compressive stress. A first approach studied a periodic bilayer under vertical compressive stress. The stiffer layer turned out to be the seat of a transverse tension, so crack initiation is indeed quite likely, again through verification of the energy and stress criteria. A second approach looked more closely at the microstructure of the material by considering the problem of an elliptic inclusion in an infinite matrix. Using the complex-variable method and the technique of conformal mapping, we analyzed the stress field around the inclusion and revealed the presence of a tension that depends strongly on the chosen parameters. Using the extended finite element method, we computed the variation of potential energy involved in creating a crack. Following a procedure similar to the previous approach, namely the verification of the energy and stress criteria, we concluded that crack initiation is possible. Keywords: linear fracture mechanics, mixed criterion of Leguillon, potential energy, energy release rate, extended finite element method, periodic bilayer, method of complex variables
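The abstract refers throughout to "the energy and stress criteria". For reference, Leguillon's coupled (mixed) criterion is commonly stated as follows, with σ_c the tensile strength, G_c the fracture toughness, and ΔΠ(ℓ) the change in potential energy when a crack of length ℓ appears; this is the standard formulation, not a statement quoted from the thesis. A crack of finite length ℓ initiates as soon as both conditions hold.

```latex
\[
\begin{aligned}
  &\sigma(x) \ge \sigma_c \quad \text{for all } 0 \le x \le \ell
     && \text{(stress condition along the presumed crack path)},\\
  &G_{\mathrm{inc}}(\ell) = -\frac{\Delta\Pi(\ell)}{\ell} \ge G_c
     && \text{(incremental energy condition, per unit thickness)}.
\end{aligned}
\]
```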
248

Bayesian and Quasi-Monte Carlo spherical integration for global illumination / Intégration sphérique Bayésien et Quasi-Monte Carlo pour l'illumination globale

Marques, Ricardo 22 October 2013 (has links)
The spherical sampling of the incident radiance function entails a high computational cost; therefore the illumination integral must be evaluated using a limited set of samples. Such a restriction raises the question of how to obtain the most accurate approximation possible with such a limited set of samples. In this thesis, we show that existing Monte Carlo based approaches can be improved by fully exploiting the available information, which is then used for careful sample placement and weighting. The first contribution of this thesis is a strategy for producing high-quality Quasi-Monte Carlo (QMC) sampling patterns for spherical integration by resorting to spherical Fibonacci point sets. We show that these patterns, when applied to the rendering integral, are very simple to generate and consistently outperform existing approaches. Furthermore, we introduce theoretical aspects of QMC spherical integration that, to our knowledge, have never been used in the graphics community, such as the spherical cap discrepancy and the point set spherical energy. These metrics allow assessing the quality of a spherical point set for a QMC estimate of a spherical integral. In the next part of the thesis, we propose a new theoretical framework for computing the Bayesian Monte Carlo (BMC) quadrature rule. Our contribution includes a novel method of quadrature computation based on spherical Gaussian functions that can be generalized to a broad class of BRDFs (any BRDF which can be approximated by a sum of one or more spherical Gaussian functions) and potentially to other rendering applications. We account for the BRDF sharpness by using a new computation method for the prior mean function. Lastly, we propose a fast hyperparameter evaluation method that avoids the learning step. Our last contribution is the application of BMC with an adaptive approach for evaluating the illumination integral. The idea is to compute a first BMC estimate (using a first sample set) and, if the quality criterion is not met, directly inject the result as prior knowledge into a new estimate (using another sample set). The new estimate refines the previous one using a new set of samples, and the process is repeated until a satisfactory result is achieved.
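The abstract notes that spherical Fibonacci point sets are very simple to generate. As a point of reference, a minimal sketch of one common construction of such a point set, and of a plain QMC estimate of a spherical integral over it, is given below; this is an illustration, not necessarily the exact variant used in the thesis.

```python
import numpy as np

def spherical_fibonacci(n):
    """One common construction of a spherical Fibonacci point set:
    n nearly uniformly distributed unit vectors on the sphere."""
    golden = (1.0 + np.sqrt(5.0)) / 2.0
    i = np.arange(n)
    z = 1.0 - (2.0 * i + 1.0) / n          # equal-area strips in z
    phi = 2.0 * np.pi * i / golden         # golden-ratio rotation in azimuth
    r = np.sqrt(1.0 - z * z)
    return np.column_stack([r * np.cos(phi), r * np.sin(phi), z])

# Toy QMC estimate of a spherical integral: average over the point set times
# the sphere area 4*pi. The clamped cosine lobe integrates to pi over the sphere.
pts = spherical_fibonacci(1024)
f = np.maximum(pts[:, 2], 0.0)             # clamped cosine, a toy integrand
estimate = 4.0 * np.pi * f.mean()          # should be close to pi
```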
249

Comportement mécanique des matériaux quasi-fragiles sous sollicitations cycliques : de l’expérimentation numérique au calcul de structures. / Mechanical behavior of quasi-brittle materials under cyclic loadings : from virtual testing to structural simulations

Vassaux, Maxime 13 March 2015 (has links)
Macroscopic mechanical behavior models are developed both for their low computational cost, which allows the simulation of large structural elements, and for their fine description of the mechanical phenomena exhibited by the material at finer scales. Such constitutive models are developed here in the context of seismic loading, which implies alternating cyclic loads at the material scale, and are applied to civil-engineering structures made of quasi-brittle materials, more specifically of concrete. To date, macroscopic models that are applicable to structural computations while representing the cyclic behavior of concrete are still rare. Because of the intricacy of the fracture processes to be homogenized, existing macroscopic constitutive models either lack robustness or fail to reproduce all of the mechanical phenomena exhibited by the material. One of the obstacles to resolving these two issues is the lack of experimental data on the phenomena to be modeled: because such experiments are technically difficult to carry out, few results of alternating cyclic tests on concrete are available in the literature.
A virtual-testing approach has therefore been established on the basis of a fine-scale, so-called microscopic, model of the material, able to provide the results needed for the formulation and identification of a macroscopic model. In the microscopic model, the material is treated as a structure in its own right; the model was designed to require only a small set of results from controlled experimental tests in order to be used. The microscopic model, a lattice discrete-element model, was developed on the basis of an existing lattice model, enriched so as to simulate the behavior of quasi-brittle materials under multi-axial and cyclic loadings. It was then validated as a virtual-testing tool and used to establish the constitutive equations of the macroscopic model, based on damage and plasticity theories. The regularity of the proposed constitutive relation, which incorporates a progressive unilateral effect, is ensured in particular by the use of a non-linear elasticity model. The macroscopic model was finally calibrated entirely with the microscopic model and employed to simulate the response of a reinforced-concrete wall under alternating cyclic shear loading. This simulation showcases the numerical robustness of the proposed model, as well as the significant contribution of the alternating uni-axial cyclic behavior of concrete to the damping of such structures.
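To illustrate what a macroscopic damage-type constitutive relation of this kind looks like in its simplest form, the sketch below gives a 1D isotropic damage law with a crude (all-or-nothing) unilateral effect; it is a generic textbook-style example with assumed parameter values, not the constitutive model developed in the thesis, whose unilateral effect is progressive.

```python
import numpy as np

E = 30e9        # Young's modulus [Pa], an illustrative value for concrete
eps_0 = 1e-4    # strain at damage onset (assumed)
eps_f = 5e-4    # softening parameter (assumed)

def damage(kappa):
    """Exponential-softening damage as a function of the history variable."""
    if kappa <= eps_0:
        return 0.0
    return 1.0 - (eps_0 / kappa) * np.exp(-(kappa - eps_0) / eps_f)

def uniaxial_response(strain_path):
    """Stress along a strain path; damage is driven by tensile strain only."""
    kappa, stresses = 0.0, []
    for eps in strain_path:
        kappa = max(kappa, eps)                   # irreversible damage driver
        d = damage(kappa)
        # Crude unilateral effect: full stiffness recovery in compression.
        stiffness = E if eps < 0.0 else (1.0 - d) * E
        stresses.append(stiffness * eps)
    return np.array(stresses)

# Tension, unloading into compression, then reloading in tension.
path = np.concatenate([np.linspace(0.0, 3e-4, 50),
                       np.linspace(3e-4, -2e-4, 80),
                       np.linspace(-2e-4, 4e-4, 100)])
sigma = uniaxial_response(path)
```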
250

Jaký vliv mají změny peněžní zásoby na reálnou ekonomiku České republiky? / What is the Effect of Money Supply Changes to the Real Economy of the Czech Republic?

Trnková, Adéla January 2016 (has links)
The thesis analyses in detail the relationship between the money stock, defined by the monetary aggregates M1 and M2, and real GDP in the Czech Republic over the period 1996-2015. Using quarterly time-series data, no long-term relationship between real GDP and the monetary aggregates is found. This conclusion is in accordance with economic theory, which does not support the idea that money affects the level of real GDP in the long run. The short-term relationship between the variables is also analysed. The results indicate that the growth rate of the aggregate M1 has a statistically significant effect on the growth rate of real GDP in the same direction, which is in line with monetary theories of the business cycle. On the other hand, no statistically significant relationship is found for the aggregate M2, which speaks in favour of Real Business Cycle theory. The Policy Ineffectiveness Proposition accepted by New Classical macroeconomists is also tested in the thesis. The issue is investigated for the whole period and subsequently for the shorter period from 2000 to 2015, during which a uniform monetary policy was applied. The results for M1 imply that expected changes in its growth rate play an important role in the money-output relationship, which is consistent with New Keynesian macroeconomic theory; for the shorter period, Lucas' theory seems to be a more appropriate explanation. The outcomes for M2 are mixed and tend rather to support Real Business Cycle theory. The final section of the thesis is devoted to quasi-money (one of the components of M2) as a possible source of the mixed results.
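The abstract does not name the econometric procedures used. A typical workflow for questions of this kind tests the long-run relationship by a cointegration test on the (log) levels and the short-run relationship by Granger causality on growth rates; the sketch below, using statsmodels, assumes hypothetical series names and a placeholder data file.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import coint, grangercausalitytests

# Hypothetical quarterly series; column names and the CSV file are placeholders.
df = pd.read_csv("czech_quarterly.csv", parse_dates=["date"], index_col="date")
log_gdp = np.log(df["real_gdp"])
log_m1 = np.log(df["m1"])

# Long-run relationship: Engle-Granger cointegration test on the log levels.
t_stat, p_value, _ = coint(log_gdp, log_m1)
print(f"cointegration p-value: {p_value:.3f}")   # large p-value -> no long-run link

# Short-run relationship: does M1 growth Granger-cause GDP growth?
growth = pd.DataFrame({"d_gdp": log_gdp.diff(),
                       "d_m1": log_m1.diff()}).dropna()
grangercausalitytests(growth[["d_gdp", "d_m1"]], maxlag=4)
```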
