  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
251

Essays in banking

Albertazzi, Ugo 07 September 2011 (has links)
This thesis contains three studies on the functioning of banks. The first chapter empirically analyses how the capacity to supply long-term loans is influenced by the size of financial intermediaries. The second chapter analyses, with a theoretical model featuring a soft budget constraint, a ratchet effect and short-termism, how competitive pressure influences banks' capacity to finance firms with good-quality projects. The third chapter examines, with a moral-hazard common-agency model, the conflicts of interest of universal banks.

Financial intermediaries are recognized to promote the efficiency of resource allocation by mitigating problems of incentives, asymmetric information and contract incompleteness. The role played by financial intermediaries is considered so crucial that these institutions have received, all over the world, the greatest attention of regulators.

Across and within banking sectors it is possible to observe a wide variety of intermediaries. Banks may differ in their size, market power and degree of specialization. This variety raises interesting questions about the features of a well-functioning banking sector. These questions have inspired an important body of economic literature which, however, is still inconclusive in many respects. This dissertation includes three studies intended to contribute in this direction.

Chapter 1 empirically studies the willingness of smaller and larger lenders to grant long-term loans which, like credit to SMEs, constitute an opaque segment of the credit market. Chapter 2 analyzes, with a theoretical model, the effects of competition on the efficiency of the banking sector when it is characterized by dynamic commitment issues that lead to excessive refinancing of bad-quality investments (the so-called soft budget constraint) or excessive termination of good ones (ratchet effect and short-termism).
Chapter 3 presents a model to investigate to what extent the distortions posed by conflicts of interest in universal banks can be addressed through the provision of appropriate incentive schemes by the different categories of clients. / Doctorat en Sciences économiques et de gestion / info:eu-repo/semantics/nonPublished
252

Essays in dynamic macroeconometrics

Bańbura, Marta 26 June 2009 (has links)
The thesis contains four essays covering topics in the field of macroeconomic forecasting.

The first two chapters consider factor models in the context of real-time forecasting with many indicators. Using a large number of predictors offers an opportunity to exploit a rich information set and is also considered a more robust approach in the presence of instabilities. On the other hand, it poses the challenge of how to extract the relevant information in a parsimonious way. Recent research shows that factor models provide an answer to this problem. The fundamental assumption underlying these models is that most of the co-movement of the variables in a given dataset can be summarized by only a few latent variables, the factors. This assumption seems to be warranted in the case of macroeconomic and financial data. Important theoretical foundations for large factor models were laid by Forni, Hallin, Lippi and Reichlin (2000) and Stock and Watson (2002). Since then, different versions of factor models have been applied to forecasting, structural analysis and the construction of economic activity indicators. Recently, Giannone, Reichlin and Small (2008) have used a factor model to produce projections of U.S. GDP in the presence of a real-time data flow. They propose a framework that can cope with large datasets characterised by staggered and nonsynchronous data releases (sometimes referred to as a "ragged edge"). This is relevant because, in practice, important indicators like GDP are released with a substantial delay and, in the meantime, more timely variables can be used to assess the current state of the economy.

The first chapter of the thesis, entitled "A look into the factor model black box: publication lags and the role of hard and soft data in forecasting GDP", is based on joint work with Gerhard Rünstler and applies the framework of Giannone, Reichlin and Small (2008) to the euro area.
In particular, we are interested in the role of "soft" and "hard" data in the GDP forecast and how it is related to their timeliness. The soft data include surveys and financial indicators and reflect market expectations; they are usually promptly available. In contrast, the hard indicators of real activity directly measure certain components of GDP (e.g. industrial production) and are published with a significant delay. We propose several measures to assess the role of individual series, or groups of series, in the forecast while taking into account their respective publication lags. We find that surveys and financial data contain important information beyond the monthly real activity measures for the GDP forecasts, once their timeliness is properly accounted for.

The second chapter, entitled "Maximum likelihood estimation of large factor model on datasets with arbitrary pattern of missing data", is based on joint work with Michele Modugno. It proposes a methodology for the estimation of factor models on large cross-sections with a general pattern of missing data. In contrast to Giannone, Reichlin and Small (2008), we can handle datasets that are not only characterised by a "ragged edge" but can also include, for example, mixed-frequency or short-history indicators. The latter is particularly relevant for the euro area and other young economies, for which many series have been compiled only recently. We adopt the maximum likelihood approach which, apart from its flexibility with regard to the pattern of missing data, is also more efficient and allows imposing restrictions on the parameters. Applied to small factor models by e.g. Geweke (1977), Sargent and Sims (1977) and Watson and Engle (1983), it has been shown by Doz, Giannone and Reichlin (2006) to be consistent, robust and computationally feasible also in the case of large cross-sections.
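In practice, likelihood-based estimation of a factor model with missing observations is carried out iteratively: fill in the unobserved entries with the current common-component fit, then re-estimate factors and loadings, and repeat. The sketch below is a deliberately simplified, principal-components flavour of that idea, not the chapter's Kalman-filter-based EM estimator; the function name and one-factor setup are assumptions of this illustration.

```python
import numpy as np

def em_factor_missing(X, n_factors=1, n_iter=50):
    """EM-style estimation of a static factor model on a T x N panel X
    containing NaNs. At each iteration, loadings and factors come from
    principal components of the completed panel, and missing entries are
    replaced by their common-component fit (the E-step analogue)."""
    X = np.asarray(X, dtype=float)
    mask = np.isnan(X)
    # initialise missing entries with column means
    X_fill = np.where(mask, np.nanmean(X, axis=0), X)
    for _ in range(n_iter):
        # standardize, then extract principal components as factor estimates
        mu, sd = X_fill.mean(axis=0), X_fill.std(axis=0)
        Z = (X_fill - mu) / sd
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        F = U[:, :n_factors] * s[:n_factors]   # factors   (T x r)
        L = Vt[:n_factors, :]                  # loadings  (r x N)
        common = (F @ L) * sd + mu             # common-component fit
        # replace only the missing entries; observed data stay untouched
        X_fill = np.where(mask, common, X)
    return X_fill, F, L
```

On simulated one-factor data, the routine recovers the factor (up to sign) and leaves observed entries unchanged, which is the essential mechanics behind handling an arbitrary pattern of missing data.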
To circumvent the computational complexity of a direct likelihood maximisation in the case of large cross-sections, Doz, Giannone and Reichlin (2006) propose to use the iterative Expectation-Maximisation (EM) algorithm (used for the small model by Watson and Engle, 1983). Our contribution is to adapt the EM steps to the case of missing data and to show how to augment the model in order to account for the serial correlation of the idiosyncratic component. In addition, we derive the link between the unexpected part of a data release and the forecast revision, and illustrate how this can be used to understand the sources of the latter in the case of simultaneous releases. We use this methodology for short-term forecasting and backdating of euro area GDP on the basis of a large panel of monthly and quarterly data. In particular, we are able to examine the effect on the forecast of quarterly variables and short-history monthly series, such as the Purchasing Managers' surveys.

The third chapter, entitled "Large Bayesian VARs", is based on joint work with Domenico Giannone and Lucrezia Reichlin. It proposes an alternative approach to factor models for dealing with the curse of dimensionality, namely Bayesian shrinkage. We study Vector Autoregressions (VARs), which have the advantage over factor models of allowing structural analysis in a natural way. We consider systems including more than 100 variables; this is the first application in the literature to estimate a VAR of this size. Apart from the forecast considerations argued above, the size of the information set can also be relevant for structural analysis; see e.g. Bernanke, Boivin and Eliasz (2005), Giannone and Reichlin (2006) or Christiano, Eichenbaum and Evans (1999) for a discussion. In addition, many problems may require the study of the dynamics of many variables: many countries, sectors or regions.
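The device that makes such large systems estimable is shrinkage toward a simple prior model. As a rough illustration, a Minnesota-style prior centred on a random walk can be implemented as a ridge regression: the posterior mean pulls the coefficients toward the identity matrix, more strongly as the tightness grows. The sketch below assumes a VAR(1) with a single tightness parameter `lam`; it is a simplified stand-in, not the chapter's full prior specification.

```python
import numpy as np

def bvar_ridge(Y, lam):
    """Posterior-mean estimate of a VAR(1) y_t' = y_{t-1}' A + e_t' under
    a simple Minnesota-style prior: coefficients shrunk toward a random
    walk (A = I), implemented as a ridge regression. `lam` is the overall
    tightness; following the idea described above, it should be set
    larger as the number of variables grows."""
    T, N = Y.shape
    X, Yt = Y[:-1], Y[1:]            # lagged regressors and targets
    prior_mean = np.eye(N)           # random-walk prior centre
    # minimises ||Yt - X A||^2 + lam * ||A - prior_mean||^2
    return np.linalg.solve(X.T @ X + lam * np.eye(N),
                           X.T @ Yt + lam * prior_mean)
```

With a small `lam` the estimate is close to OLS; with a very large `lam` it collapses to the random-walk prior, which is exactly the overfitting-protection mechanism the chapter exploits at scale.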
While we use standard priors as proposed by Litterman (1986), an important novelty of the work is that we set the overall tightness of the prior in relation to the model size. In this we follow the recommendation of De Mol, Giannone and Reichlin (2008), who study the case of Bayesian regressions. They show that with increasing model size one should shrink more to avoid overfitting, but that when data are collinear one is still able to extract the relevant sample information. We apply this principle to the case of VARs. We compare the large model with smaller systems in terms of forecasting performance and of structural analysis of the effect of a monetary policy shock. The results show that a standard Bayesian VAR model is an appropriate tool for large panels of data once the degree of shrinkage is set in relation to the model size.

The fourth chapter, entitled "Forecasting euro area inflation with wavelets: extracting information from real activity and money at different scales", proposes a framework for exploiting relationships between variables at different frequency bands in the context of forecasting. This work is motivated by the ongoing debate on whether money provides a reliable signal for future price developments. The empirical evidence on the leading role of money for inflation in an out-of-sample forecast framework is not very strong; see e.g. Lenza (2006) or Fisher, Lenza, Pill and Reichlin (2008). At the same time, e.g. Gerlach (2003) and Assenmacher-Wesche and Gerlach (2007, 2008) argue that money and output could affect prices at different frequencies; however, their analysis is performed in-sample. This chapter investigates empirically which frequency bands, and for which variables, are most relevant for the out-of-sample forecast of inflation when the information from prices, money and real activity is considered. To extract the different frequency components of a series, a wavelet transform is applied.
It provides a simple and intuitive framework for band-pass filtering and allows a decomposition of series into different frequency bands. Its application to multivariate out-of-sample forecasting is novel in the literature. The results indicate that, indeed, different scales of money, prices and GDP can be relevant for the inflation forecast. / Doctorat en Sciences économiques et de gestion / info:eu-repo/semantics/nonPublished
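The scale-by-scale decomposition underlying this approach can be imitated with a very simple Haar-style filter bank: trailing moving averages at dyadic window lengths, whose successive differences isolate fluctuations between adjacent scales, plus a final smooth trend. This is a stylized stand-in for a proper wavelet transform (the function name and trailing-average alignment are assumptions of the sketch, not the chapter's method); its one useful exact property is that the bands sum back to the original series.

```python
import numpy as np

def haar_band_decomposition(x, max_level=3):
    """Decompose a series into frequency bands with a Haar-style filter
    bank: detail_j captures fluctuations between scales 2^(j-1) and 2^j
    observations, and the last row is the remaining smooth trend. The
    construction telescopes, so the rows sum exactly to x."""
    x = np.asarray(x, dtype=float)

    def trailing_ma(x, w):
        # trailing moving average of window w, with a corrected start-up
        # edge where fewer than w observations are available
        if w == 1:
            return x.copy()
        c = np.convolve(x, np.ones(w) / w, mode="full")[:len(x)]
        c[:w - 1] = [x[:i + 1].mean() for i in range(w - 1)]
        return c

    bands, prev = [], trailing_ma(x, 1)
    for j in range(1, max_level + 1):
        cur = trailing_ma(x, 2 ** j)
        bands.append(prev - cur)    # detail at scale 2^j
        prev = cur
    bands.append(prev)              # smooth component
    return np.vstack(bands)         # shape: (max_level + 1, T)
```

Each row can then serve as a separate regressor in a forecasting equation, which is the spirit of letting money, prices and real activity matter at different scales.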
253

Essays on aggregation and cointegration of econometric models

Silvestrini, Andrea 02 June 2009 (has links)
This dissertation can be broadly divided into two independent parts. The first three chapters analyse issues related to temporal and contemporaneous aggregation of econometric models. The fourth chapter contains an application of Bayesian techniques to investigate whether the post-transition fiscal policy of Poland is sustainable in the long run and consistent with an intertemporal budget constraint.

Chapter 1 surveys the econometric methodology of temporal aggregation for a wide range of univariate and multivariate time series models.

A unified overview of temporal aggregation techniques for this broad class of processes is presented in the first part of the chapter and the main results are summarized. In each case, assuming the underlying process at the disaggregate frequency is known, the aim is to find the appropriate model for the aggregated data. Additional topics concerning temporal aggregation of ARIMA-GARCH models (see Drost and Nijman, 1993) are discussed and several examples are presented. Systematic sampling schemes are also reviewed.

Multivariate models, which show interesting features under temporal aggregation (Breitung and Swanson, 2002; Marcellino, 1999; Hafner, 2008), are examined in the second part of the chapter. In particular, the focus is on temporal aggregation of VARMA models and on the related concept of spurious instantaneous causality, which is not a time series property invariant to temporal aggregation.
On the other hand, as pointed out by Marcellino (1999), other important time series features, such as cointegration and the presence of unit roots, are invariant to temporal aggregation and are not induced by it.

Some empirical applications based on macroeconomic and financial data illustrate the techniques surveyed and the main results.

Chapter 2 is an attempt to monitor fiscal variables in the euro area, building an early-warning indicator for assessing the development of public finances in the short run and exploiting the existence of monthly budgetary statistics from France, taken as an "example country".

The application focuses on the cash State deficit, looking at components from the revenue and expenditure sides. For each component, monthly ARIMA models are estimated and then temporally aggregated to the annual frequency, as policy makers are interested in yearly predictions.

The short-run forecasting exercises carried out for the years 2002, 2003 and 2004 highlight the fact that one-step-ahead predictions based on the temporally aggregated models generally outperform those delivered by standard monthly ARIMA modeling, as well as the official forecasts made available by the French government, for each of the eleven components and thus for the whole State deficit. More importantly, by the middle of the year, very accurate predictions for the current year are made available.

The proposed method could be extremely useful, providing policy makers with a valuable indicator when assessing the development of public finances in the short run (a one-year horizon or even less).

Chapter 3 deals with the issue of forecasting contemporaneous time series aggregates. The performance of "aggregate" and "disaggregate" predictors in forecasting contemporaneously aggregated vector ARMA (VARMA) processes is compared.
An aggregate predictor is built by directly forecasting the aggregate process, as it results from contemporaneous aggregation of the data-generating vector process. A disaggregate predictor is obtained by aggregating univariate forecasts for the individual components of the data-generating vector process.

The econometric framework is broadly based on Lütkepohl (1987). The necessary and sufficient condition for the equality of mean squared errors associated with the two competing methods in the bivariate VMA(1) case is provided. It is argued that the condition for the equality of predictors as stated in Lütkepohl (1987), although necessary and sufficient for the equality of the predictors, is sufficient (but not necessary) for the equality of mean squared errors.

Furthermore, it is shown that the same forecasting accuracy for the two predictors can be achieved under specific assumptions on the parameters of the VMA(1) structure.

Finally, an empirical application involving the problem of forecasting the Italian monetary aggregate M1, on the basis of annual time series ranging from 1948 to 1998, prior to the creation of the European Economic and Monetary Union (EMU), is presented to show the relevance of the topic. In the empirical application, the framework is further generalized to deal with heteroskedastic and cross-correlated innovations.

Chapter 4 presents a cointegration analysis applied to the empirical investigation of fiscal sustainability. The focus is on a particular country: Poland. The choice of Poland is not random. First, the motivation stems from the fact that fiscal sustainability is a central topic for most of the economies of Eastern Europe. Second, Poland is one of the first countries to have started the transition process to a market economy (in 1989), providing a relatively favorable institutional setting within which to study fiscal sustainability (see Green, Holmes and Kowalski, 2001).
The emphasis is on the feasibility of a permanent deficit in the long run, that is, on whether a government can continue to operate under its current fiscal policy indefinitely.

The empirical analysis of debt stabilization consists of two steps.

First, a Bayesian methodology is applied to conduct inference about the cointegrating relationship between budget revenues and (interest-inclusive) expenditures and to select the cointegrating rank. This task is complicated by the conceptual difficulty of choosing the prior distributions for the parameters relevant to the economic problem under study (Villani, 2005).

Second, Bayesian inference is applied to the estimation of the normalized cointegrating vector between budget revenues and expenditures. With a single cointegrating equation, some known results concerning the posterior density of the cointegrating vector may be used (see Bauwens, Lubrano and Richard, 1999).

The priors used in the paper lead to straightforward posterior calculations which can be easily performed. Moreover, the posterior analysis leads to a careful assessment of the magnitude of the cointegrating vector. Finally, it is shown to what extent the likelihood of the data is important in revising the available prior information, relying on numerical integration techniques based on deterministic methods. / Doctorat en Sciences économiques et de gestion / info:eu-repo/semantics/nonPublished
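Chapter 3's central question — forecast the aggregate directly, or forecast the components and then sum? — can be illustrated with a small Monte-Carlo experiment. The sketch below uses a bivariate VAR(1) instead of the chapter's VMA(1), and gives the multivariate predictor the true coefficient matrix, so it is only a simplified, assumed stand-in for the actual comparison; when the components have different dynamics, aggregating the vector forecast beats a univariate model fitted on the aggregate alone.

```python
import numpy as np

def compare_predictors(A, sigma=1.0, T=20000, seed=0):
    """Monte-Carlo comparison, for a bivariate VAR(1) y_t = A y_{t-1} + e_t,
    of one-step forecasts of the aggregate z_t = y1_t + y2_t:
    (i) a 'disaggregate' predictor that forecasts the full vector with the
        true A and then sums the components, and
    (ii) an 'aggregate' predictor: a univariate AR(1) fitted on z by OLS.
    Returns the two mean squared forecast errors."""
    rng = np.random.default_rng(seed)
    y = np.zeros((T, 2))
    for t in range(1, T):
        y[t] = A @ y[t - 1] + sigma * rng.standard_normal(2)
    z = y.sum(axis=1)
    # disaggregate: forecast the vector, then aggregate the forecast
    pred_dis = (y[:-1] @ A.T).sum(axis=1)
    # aggregate: AR(1) fitted on the aggregate series alone
    phi = (z[:-1] @ z[1:]) / (z[:-1] @ z[:-1])
    pred_agg = phi * z[:-1]
    err_dis = np.mean((z[1:] - pred_dis) ** 2)
    err_agg = np.mean((z[1:] - pred_agg) ** 2)
    return err_dis, err_agg
```

With `A = diag(0.8, 0.2)` the aggregate is not itself an AR(1), so the univariate predictor carries a misspecification penalty while the disaggregate predictor attains the noise floor (here 2·sigma²).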
254

Essays in game theory applied to political and market institutions

Bouton, Laurent 15 June 2009 (has links)
My thesis contains essays on voting theory, market structures and fiscal federalism: (i) One Person, Many Votes: Divided Majority and Information Aggregation; (ii) Runoff Elections and the Condorcet Loser; (iii) On the Influence of Rankings when Product Quality Depends on Buyer Characteristics; and (iv) Redistributing Income under Fiscal Vertical Imbalance.

(i) One Person, Many Votes: Divided Majority and Information Aggregation (joint with Micael Castanheira)

In elections, majority divisions pave the way to focal manipulations and coordination failures, which can lead to the victory of the wrong candidate. This paper shows how this flaw can be addressed if voter preferences over candidates are sensitive to information. We consider two potential sources of divisions: majority voters may have similar preferences but opposite information about the candidates, or opposite preferences. We show that when information is the source of majority divisions, Approval Voting features a unique equilibrium with full information and coordination equivalence. That is, it produces the same outcome as if both information and coordination problems could be resolved. Other electoral systems, such as Plurality and Two-Round elections, do not satisfy this equivalence. The second source of division is opposite preferences. Whenever the fraction of voters with such preferences is not too large, Approval Voting still satisfies full information and coordination equivalence.

(ii) Runoff Elections and the Condorcet Loser

A crucial component of Runoff electoral systems is the threshold fraction of votes above which a candidate wins outright in the first round. I analyze the influence of this threshold on the voting equilibria in three-candidate Runoff elections. I demonstrate the existence of an Ortega Effect which may unduly favor dominated candidates and thus lead to the election of the Condorcet Loser in equilibrium.
The reason is that, contrary to commonly held beliefs, lowering the threshold for first-round victory may actually induce voters to express their preferences excessively. I also extend Duverger's Law to Runoff elections with any threshold below, equal to or above 50%. Therefore, Runoff elections are plagued with inferior equilibria that induce either too high or too low an expression of preferences.

(iii) On the Influence of Rankings when Product Quality Depends on Buyer Characteristics

Information on product quality is crucial for buyers to make sound choices. For "experience products", this information is not available at the time of purchase: it is only acquired through consumption. For many experience products, there exist institutions that provide buyers with information about quality. It is commonly believed that such institutions help consumers make better choices and are thus welfare-improving. But the quality of various experience products depends on the characteristics of the buyers themselves. For instance, unlike the quality of cars, business school quality depends on buyer (i.e. student) characteristics: indeed, one of the main inputs of a business school is its enrolled students. The choice of buyers for such products then has some features of a coordination problem: ceteris paribus, a buyer prefers to buy a product consumed by buyers with "good" characteristics. This coordination dimension leads to inefficiencies when buyers coordinate on products of lower "intrinsic" quality. When the quality of products depends on buyer characteristics, information about product quality can reinforce such a coordination problem. Indeed, even though a high-quality signal need not mean high intrinsic quality, rational buyers pay attention to this information because they prefer high-quality products, whatever the reason for the high quality.
Information about product quality may then induce buyers to coordinate on products of low intrinsic quality. In this paper, I show that, for experience products whose quality depends on the characteristics of buyers, more information is not necessarily better. More precisely, I prove that more information about product quality may lead to a Pareto deterioration, i.e. all buyers may be worse off.

(iv) Redistributing Income under Fiscal Vertical Imbalance (joint with Marjorie Gassner and Vincenzo Verardi)

From the literature on decentralization, it appears that fiscal vertical imbalance (i.e. the dependence of subnational governments on national government revenues to support their expenditures) is somehow inherent to multi-level government. Using a stylized model, we show that this leads to a reduction in the extent of redistributive fiscal policies once the maximal size of government has been reached. To test this empirically, we use high-quality data on individual incomes from the LIS dataset. The results are highly significant and point in the direction of our theoretical predictions. / Doctorat en Sciences économiques et de gestion / info:eu-repo/semantics/nonPublished
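For concreteness, the Condorcet Loser invoked in essay (ii) — a candidate beaten by every rival in pairwise majority contests — can be checked mechanically from a preference profile. A small illustrative sketch (the profile encoding is an assumption of the example, not the essay's equilibrium model):

```python
def condorcet_loser(profile):
    """Return the candidate that loses every pairwise majority contest,
    or None if no such candidate exists. `profile` maps a strict ranking
    (tuple of candidates, best first) to the number of voters holding it."""
    candidates = next(iter(profile))  # any ranking lists all candidates
    total = sum(profile.values())

    def beats(a, b):
        # a beats b if a strict majority ranks a above b
        a_over_b = sum(n for ranking, n in profile.items()
                       if ranking.index(a) < ranking.index(b))
        return a_over_b > total - a_over_b

    for c in candidates:
        if all(beats(other, c) for other in candidates if other != c):
            return c
    return None
```

For example, with 40 voters ranking A>B>C, 35 ranking B>A>C and 25 ranking C>A>B, candidate C loses both pairwise contests and is the Condorcet Loser; in a Condorcet cycle no such candidate exists.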
255

Etudes des modèles d'implémentation des opérateurs de réseaux mobiles virtuels / Study of implementation models of mobile virtual network operator (MVNO)

Cuvelliez, Charles 09 March 2006 (has links)
This thesis addresses the phenomenon of Mobile Virtual Network Operators (MVNOs): MVNOs often do not come from the telecom sector, but they rent network capacity from mobile operators to become fully-fledged mobile operators in their own right. They thus compete directly not only with the other mobile operators but also with their hosting mobile operator.

Hosting MVNOs and supporting them from both operational and commercial perspectives are new challenges for mobile operators: they have to set up a specific internal organization, because they have historically been vertically integrated industrial structures. All of this represents a tremendous change for them.

This thesis first describes the strategic models for hosting MVNOs. The regulatory aspects, which are an essential driver of the liberalization of network economies, are then analyzed. The key question is whether MVNOs are able to increase the efficiency and competitiveness of mobile markets.

Another important aspect is the modelling of a mobile network: an MVNO will use some components, while others are of no use to it. Without such a model, it is impossible to derive the right costs, and the right price to charge the MVNO, in order to obtain a win-win industrial model for both the MVNO and its hosting operator.

A case study of a corporate MVNO focusing on the needs of multinationals is analyzed. The case study describes all the steps for an economic actor to become an MVNO from scratch.
Such MVNOs do not yet exist, although there is market demand. The last chapter presents a comparative study of the MVNO markets in the European Union.

This study addresses the phenomenon of mobile virtual network operators: economic actors, sometimes with no connection whatsoever to telecommunications, who rent the network of mobile operators and in turn become operators in their own right, to the point of competing not only with the other mobile operators but also with the mobile operator that rents them part of its network.

The technical hosting of the virtual operator, its operational and commercial support, and the organization that the mobile operator must put in place in order to host the virtual operator and share its network with it, are new challenges that mobile operators, accustomed to being fully vertically integrated industrial structures, face for the first time.

This study deals with the implementation models of mobile virtual network operators. It first addresses the strategic aspects that set the framework for mobile operators that want, or are forced, to host virtual operators. The regulatory aspects, an essential component of network economies undergoing liberalization, are then examined, and in particular how virtual operators can help improve the competitiveness and liberalization of these markets.

The study then models the network of a mobile operator so as to identify the shared elements that will be used by the virtual operator. This modelling is essential for determining the cost of hosting and network sharing, so as to create the conditions under which this virtual-operator model is profitable.

Finally, a case study is presented of a candidate mobile virtual network operator seeking to meet the transnational demand of large corporations.
This type of virtual operator does not yet exist today. Finally, we conclude with a comparative study of the situation of mobile virtual operators across the European Union. / Doctorat en sciences appliquées / info:eu-repo/semantics/nonPublished
256

Essays on Vietnam's financial markets: databases and empirics

Vuong, Quan-Hoang January 2004 (has links)
Doctorat en Sciences politiques et sociales / info:eu-repo/semantics/nonPublished
257

L' investissement direct à l'étranger : le cas de l'Algérie / The foreign direct investment : the case of Algeria

Boualam, Fatima 12 July 2010 (has links)
For more than three decades, one of the aspects through which globalization manifests itself has been the international mobility of firms and factors of production. The growing interdependence of economies, resulting from the expansion of trade and the development of capital movements, has become inescapable. Globalization has led to the promotion of economic liberalism and to the development of a globalized organization of activities, in which FDI takes a central place in the development policies of host countries. Developed and developing countries alike engage in fierce competition to attract FDI flows to their territory. Multinational firms (MNFs) have gained a central place in this new configuration: after arousing the mistrust and hostility of developing-country governments, they have become a key component of development strategies. Following their own concerns (productivity gains, sources of supply, etc.), MNFs devise their strategies from an international perspective and integrate social and environmental responsibility into their activities. FDI has been the subject of strong controversy over its potential to let host countries benefit from "spillovers". The unequal distribution of FDI flows leads developing countries to put in place ambitious attractiveness policies (tax exemptions, subsidies, etc.). Algeria, like other developing countries, has undertaken a series of reforms in order to fit into a new logic that enshrines market rules and the liberalization of the economy, affecting all sectors of activity. The objective of this thesis is to empirically assess the determinants of FDI attractiveness in the Algerian model, the conduct of its promotion policy and its evaluation.

The quality of institutions is indeed validated as a central determinant in the conduct of Algeria's attractiveness policy. The construction of the attractiveness matrix reveals that Algeria belongs to the circle of "potential countries", which could one day appear on the "short list", provided certain components of their attractiveness are improved. Coherent reforms remain to be carried out for effective integration into the international economy. / For over three decades, one aspect through which globalization manifests itself is the international mobility of firms and factors of production. The growing interdependence of economies resulting from the expansion of trade and the development of capital movements has become unavoidable. Globalization has given rise to the promotion of economic liberalism and to the development of a globalized organization of activities in which FDI is central to the development policies of host countries. Whether developed or developing, countries are engaged in bitter competition for the attraction of FDI flows to their territory. MNFs have taken a central place in this new scheme. After arousing the suspicion and hostility of the governments of developing countries, they have become a key component of development strategies. Following their own concerns (productivity gains, sourcing, etc.), MNFs plan from an international perspective and integrate social and environmental responsibility into their activities. FDI has been the subject of an important controversy over its potential to benefit the host countries through "spillovers". The unequal distribution of FDI leads developing countries to adopt ambitious attractiveness policies (tax holidays, subsidies, etc.). Algeria has undertaken, like other developing countries, a series of reforms to become part of a new logic that embodies the laws of the market and the liberalization of the economy, affecting all sectors of activity.
The objective of this thesis is to empirically assess the determinants of FDI attraction in the Algerian model, the conduct of its promotion policy and its evaluation. The quality of institutions is indeed validated as a central determinant in the conduct of Algeria's attractiveness policy. The construction of the matrix of attractiveness reveals that Algeria is located in the circle of "potential countries" that could one day appear on the "short list", provided certain components of their attractiveness are improved. Consistent reforms are still needed for effective integration into the international economy.
258

Analyse du risque en assurance automobile : nouvelles approches / Risk analysis on car insurance market : new approaches

Kouki-Zekri, Mériem 28 June 2011 (has links)
La recherche menée dans cette thèse propose une contribution à l’analyse du risque sur le marché de l’assurance automobile en France. Trois nouveaux axes sont présentés : le premier axe s’inscrit dans un cadre théorique de marché d’assurance automobile. Un modèle original de double asymétrie d’information est présenté. Le principal résultat qui en découle est l’existence de deux sortes de contrats d’équilibre : un contrat séparateur et un contrat mélangeant. Le deuxième point est lié à la prise en compte de la sinistralité passée dans l’étude de la relation risque - couverture. Des modèles bivariés et trivariés sont appliqués pour cette fin. Il en ressort que l’hypothèse de l’asymétrie d’information est vérifiée. Enfin, la troisième question soulevée dans cette thèse concerne l’application de la surprime aux jeunes conducteurs. Nous montrons par des modélisations économétriques de la sinistralité que la légitimité des assureurs à proposer quasi systématiquement des tarifs plus élevés aux jeunes conducteurs par rapport aux conducteurs expérimentés n'est pas toujours vérifiée. / This dissertation contributes to the analysis of risk on the French automobile insurance market. Its objective is threefold. The first aim relates to a theoretical framework for the automobile insurance market: an original model of double asymmetry of information is presented. The main result that emerges is the existence of two kinds of contracts at equilibrium: a separating contract and a pooling contract. The second point concerns past claims and the risk-coverage correlation. Bivariate and trivariate models are applied for this purpose; it follows that the assumption of asymmetric information is not rejected. The third issue is the over-premium that insurers apply quasi-systematically to young drivers. We show, using econometric models of claims, that this over-pricing relative to experienced drivers' premiums is not always justified and that its removal does not compromise the sustainability of the insurance company.
259

Essays on the propensity to patent: measurement and determinants

de Rassenfosse, Gaétan 28 May 2010 (has links)
Chapter 1 discusses the econometric pitfalls associated with the use of patent production functions to study the invention process. It then argues that a sound understanding of the invention process necessarily requires an understanding of the propensity to patent. The empirical analysis in Chapter 1 seeks to explain the proportion of inventions patented, a potential metric for the propensity to patent, using an international sample of manufacturing firms.

Chapter 2 proposes a methodology to filter out the noise induced by varying patent practices in the R&D-patent relationship. The methodology explicitly decomposes the patent-to-R&D ratio into its productivity and propensity components. It is then applied to a novel data set of priority patent applications in four countries and six industries.

Chapter 3 takes stock of the literature on the role of fees in patent systems, while Chapter 4 presents estimates of the price elasticity of demand for patents at the trilateral offices (that is, in the U.S., Japan and Europe). The estimation of dynamic panel data models of patent applications suggests that the long-term price elasticity is about -0.30. / Doctorat en Sciences économiques et de gestion / info:eu-repo/semantics/nonPublished
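The long-term elasticity of about -0.30 admits a simple back-of-the-envelope reading under a constant-elasticity demand curve; the functional form and the 10% fee change below are illustrative assumptions, not figures from the thesis:

```python
# Illustrative only: a constant-elasticity demand model for patent
# applications, applications proportional to fee**elasticity, using the
# long-run elasticity of -0.30 reported above. The 10% fee change is an
# assumption chosen for the example.

def demand_change(fee_change_pct: float, elasticity: float = -0.30) -> float:
    """Percentage change in applications implied by a percentage fee change."""
    return ((1 + fee_change_pct / 100) ** elasticity - 1) * 100

# A 10% across-the-board fee increase lowers applications by roughly 2.8%.
print(round(demand_change(10.0), 1))
```

Under this reading, demand for patents is price-inelastic: fee changes move application counts, but much less than proportionally.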
260

Les interactions sociales dans le milieu du travail : évidence du terrain / Social interactions in the workplace: evidence from the field

Diallo, Mamadou Yaya 24 April 2018 (has links)
The objective of this thesis is to evaluate the influence of friends on individual productivity using data from a tree-planting firm. Each employee's productivity is observed and corresponds to the number of trees planted per day. These data are matched with information on the employees' friendship network. We use a linear-in-means model to estimate peer effects and find a positive influence of friends: any shock to productivity is amplified by a factor between 1 and 2.
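The amplification mechanism behind a linear-in-means model can be sketched numerically; everything below (the three-worker network, the peer-effect parameter b = 0.4, the shock sizes) is an illustrative assumption, not an estimate from the thesis:

```python
import numpy as np

# Hedged sketch, not the thesis's actual estimation: simulate a
# linear-in-means peer-effects model y = b * G y + shock, where G is the
# row-normalized friendship matrix, and check that a shock common to all
# workers is amplified by the social multiplier 1 / (1 - b).

b = 0.4                                  # assumed endogenous peer effect
G = np.array([[0.0, 0.5, 0.5],           # row-normalized friendship network
              [0.5, 0.0, 0.5],           # (3 workers, everyone is friends
              [0.5, 0.5, 0.0]])          #  with everyone else)
I = np.eye(3)

def equilibrium(shock):
    """Trees planted per day at the network fixed point y = (I - bG)^-1 shock."""
    return np.linalg.solve(I - b * G, shock)

base  = equilibrium(np.full(3, 10.0))    # baseline: a shock of 10 to everyone
boost = equilibrium(np.full(3, 11.0))    # raise everyone's shock by 1
# The common shock is amplified by 1 / (1 - b) = 1 / 0.6, i.e. about 1.67.
print(round((boost - base).mean(), 2))
```

With G row-normalized, a uniform shock is amplified by exactly 1/(1 - b), which lies between 1 and 2 whenever 0 < b < 0.5, consistent with the range reported in the abstract.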
