1. Essays in Effects of Market Power. Burya, Anastasia, January 2023.
My dissertation in macroeconomics focuses on uncovering the effects of market power in product and labor markets. I conduct these studies using novel empirical techniques and detailed granular datasets at the firm and household levels.

In the first chapter, coauthored with Shruti Mishra, we consider how firms' price-setting decisions are affected by the properties of their markups. We start by designing a general oligopoly framework that accounts for firm heterogeneity, firm granularity, and the effects of the market share distribution. We use this structural model to decompose the effect of price on the quantity demanded into a direct price effect and an indirect effect coming from the impact of market-level aggregates, such as the market-level price. This decomposition allows us to handle all degrees of heterogeneity in a flexible manner.
Under plausible assumptions, the most crucial of which we test in the data, all the information about the distribution of shares within the market is accounted for by the variation of the market aggregates. Under these conditions, we can estimate structural parameters that do not depend on the distribution of shares within the market. We use the model to inform our empirical strategy and apply it to the ACNielsen Retail Scanner Data. We test the assumptions put forward by the theory, estimate the structural parameters, and then use the decomposition formulas to calculate the elasticity of the firm's demand and other parameters important for markup variation. We find that elasticity depends sharply on the firm's market share and decreases significantly as market share increases.
Demand elasticities depend positively on relative prices (superelasticity), in line with Marshall's second law of demand. Elasticity also depends on the level of competitiveness within the market: even if a firm's market share stays the same, its elasticity decreases if the market becomes less competitive. Lastly, we apply our estimates to calculate the optimal pass-through of marginal costs into prices and the degree of strategic complementarity. We find that an individual firm's pass-through lies between zero and one but depends sharply on the firm's market share. Strategic complementarity between two firms depends on both of their shares and is not symmetric: the degree of strategic complementarity between a small and a large firm, between two small firms, between two large firms, or between a large and a small firm would all differ. We then assess the non-linear effects of marginal cost shocks on prices and find that pass-through depends positively on the size of the shock. This means that the total effect of a marginal cost shock on prices is non-linear and that firm prices are more responsive to marginal cost increases than to decreases. For market leaders, the pass-through of a large negative marginal cost shock would be close to zero, while the pass-through of a large positive shock would approach that of small firms.
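For intuition on how elasticity can fall with market share, the sketch below uses the standard nested-CES Bertrand benchmark of Atkeson and Burstein (2008), in which the perceived elasticity is a share-weighted mix of within-market and cross-market elasticities. This is a textbook illustration of the qualitative pattern, not the chapter's more general oligopoly framework, and the parameter values are hypothetical.

```python
import numpy as np

def bertrand_elasticity(share, rho=6.0, eta=1.5):
    """Perceived demand elasticity under nested-CES Bertrand competition:
    a share-weighted average of the within-market elasticity rho and the
    cross-market elasticity eta, with rho > eta (Atkeson-Burstein 2008)."""
    return (1.0 - share) * rho + share * eta

for s in [0.01, 0.10, 0.50, 0.90]:
    eps = bertrand_elasticity(s)
    markup = eps / (eps - 1.0)  # Lerner markup implied by the elasticity
    print(f"share={s:.2f}  elasticity={eps:.2f}  markup={markup:.2f}")
```

Larger shares imply lower perceived elasticity and higher markups, matching the qualitative finding described above.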
In the second chapter, coauthored with Rui Mano, Yannick Timmer, and Anke Weber, we study the effect of firm granularity in the labor market on hiring decisions. We argue that the prevalence of firms controlling large vacancy shares plays an important role in the transmission of monetary policy to labor demand and wage growth and can partially explain the flattening of the wage Phillips curve after the Global Financial Crisis (GFC). Accommodative monetary policy raises the marginal product of labor, incentivizing all firms to hire more. However, since the wage elasticity of labor demand is lower for high-vacancy-share firms, they can hire more workers without raising wages disproportionately. We study this effect in the Burning Glass Technologies vacancy microdata and, consistent with this mechanism, show that accommodative monetary policy increases labor demand more for high-vacancy-share firms and that this comes without a disproportionate response in wages.
In the aggregate, this implies that, due to the presence of firms controlling large vacancy shares, accommodative monetary policy can lead to a decline in the unemployment rate that is decoupled from an increase in wage growth. Quantitatively, a firm at the 50th percentile of the vacancy share distribution increases its labor demand by ≈7% in response to a 10 basis point surprise monetary loosening, while a firm at the 95th percentile increases labor demand by ≈9%. Moreover, the effect of monetary policy shocks on firms with high vacancy shares is much more persistent, with effects that remain economically large and statistically significant for at least eight quarters. At the same time, there is no comparable differential response of wages: even though firms with high vacancy shares hire more, they do not have to raise their wages by more, so more hiring does not result in a comparable increase in wage inflation. This channel can partly explain the flattening of the wage Phillips curve and the "wage-less" recovery after the Global Financial Crisis.

In the third and last chapter, coauthored with Shruti Mishra, we study the impact of wealth heterogeneity on labor supply decisions. In the standard model, a positive wealth effect decreases the willingness to supply labor. In a macroeconomic setting, this means that the direction and magnitude of the wealth effect determine whether people search for jobs more actively after a monetary intervention. For example, if unemployed consumers are indebted, they experience a negative wealth effect after a monetary contraction, search for jobs more actively, and increase their probability of finding a job, thereby reducing the total unemployment response.
The sign and magnitude of the overall effect of monetary policy on unemployment therefore depend on whether unemployed consumers are indebted and on the magnitude of their debt. To study this mechanism, we develop a theoretical framework with heterogeneous consumers and job search effort and then decompose the effect of a monetary policy shock on aggregate unemployment. We test the model's predictions in both micro and aggregate data. In the aggregate, we estimate the coefficient on the interaction term between the debt-to-income ratio and Romer and Romer monetary policy shocks. In the microdata, we run a similar regression with unemployment and mortgage variables for individual consumers from the PSID panel dataset. Consistent with the proposed mechanism, we find that the intuitive negative effect of monetary contraction on employment is virtually non-existent, or even reversed, for indebted consumers.
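A minimal sketch of the interaction-term regression described above, run on synthetic data; all variable names ('unemployed', 'mp_shock', 'dti') are hypothetical stand-ins, not the chapter's actual specification:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5_000
# synthetic stand-in for a PSID-style panel
panel = pd.DataFrame({
    "mp_shock": rng.normal(size=n),      # Romer-Romer-style monetary shock
    "dti": rng.uniform(0, 2, size=n),    # debt-to-income ratio
})
# unemployment rises with contraction, less so for indebted consumers (by construction)
panel["unemployed"] = (
    0.1 + 0.02 * panel["mp_shock"] - 0.015 * panel["mp_shock"] * panel["dti"]
    + rng.normal(scale=0.05, size=n)
)

result = smf.ols("unemployed ~ mp_shock * dti", data=panel).fit()
print(result.params["mp_shock:dti"])  # the interaction coefficient of interest
```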
Together, the three chapters paint a complex picture of the impact of market power on macroeconomic variables. First, product market power shapes firms' price-setting decisions and affects the dynamics of prices and inflation, effectively leading less concentrated economies to behave as if they had more flexible prices. Second, firms that control a large share of vacancies in their labor market hire differently from their smaller counterparts, leading to more quantity expansion. Lastly, labor markets exhibit complex supply dynamics as well, with labor supply potentially intensifying during recessions, which may make firms' bargaining power countercyclical. All these effects are of first-order significance for macroeconomic dynamics and affect our ability to forecast the future or assess the effects of monetary policy.

2. Integrating Machine Learning and Optimization for Problems in Contextual Decision-Making and Dynamic Learning. Zhao, Yunfan, January 2023.
In this thesis, we study the intersection of optimization and machine learning, and in particular how to use machine learning and optimization tools together to make decisions. In Chapter 1, we propose a novel approach for accurate policy evaluation in personalized pricing: we solve an optimization problem that evaluates new pricing strategies while searching over worst-case revenue functions.
In Chapter 2, we consider problems where parameters predicted by a machine learning model feed into downstream optimization tasks. Recent works have proposed an integrated approach that accounts for how predictions are used in the downstream optimization problem, instead of just minimizing prediction error. We analyze the asymptotic performance of methods under the integrated and traditional approaches, in the sense of first-order stochastic dominance. We argue that when the model class is rich enough to cover the ground truth, the traditional predict-then-optimize approach outperforms the integrated approach, and that this ordering is reversed when the model is misspecified.
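To make the contrast concrete, here is a toy sketch (not taken from the chapter) of the two approaches on a newsvendor-style problem with symmetric costs, where the decision-optimal order is the conditional median of demand:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 2_000
x = rng.uniform(1.0, 3.0, size=n)
demand = 2.0 * x + rng.laplace(scale=0.5, size=n)  # true relationship is linear

def decision_cost(q, d):
    # newsvendor cost with equal underage and overage costs reduces to |q - d|
    return np.abs(q - d).mean()

# predict-then-optimize: least-squares demand fit, then order the prediction
# (with symmetric costs and symmetric noise, the predicted mean is the right order)
w_pto = np.sum(x * demand) / np.sum(x * x)

# integrated: pick the linear decision rule q = w * x minimizing the decision cost itself
w_int = minimize(lambda w: decision_cost(w[0] * x, demand),
                 x0=[1.0], method="Nelder-Mead").x[0]

print(f"predict-then-optimize slope: {w_pto:.3f}, integrated slope: {w_int:.3f}")
```

In this well-specified case both slopes converge to the ground truth of 2; the chapter's comparison concerns the approaches' relative efficiency and how the ranking flips under misspecification.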
In Chapter 3, we present a new class of architectures for reinforcement learning, Implicit Two-Tower (ITT) policies, in which actions are chosen based on the attention scores of their learnable latent representations with those of the input states. We show that ITT architectures are particularly well suited to evolutionary optimization and that the corresponding policy training algorithms outperform their vanilla unstructured implicit counterparts as well as commonly used explicit policies.
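A minimal sketch of the two-tower scoring mechanism as described, with illustrative shapes and random (untrained) parameters; in practice the flat parameter vector would be trained, e.g., by evolutionary strategies:

```python
import numpy as np

rng = np.random.default_rng(0)

class ITTPolicy:
    """Implicit Two-Tower policy sketch: one tower embeds the state, the other
    holds a learnable latent vector per action; the action with the highest
    attention (dot-product) score against the state embedding is selected."""
    def __init__(self, state_dim, n_actions, latent_dim=16):
        self.state_tower = rng.normal(scale=0.1, size=(latent_dim, state_dim))
        self.action_latents = rng.normal(scale=0.1, size=(n_actions, latent_dim))

    def act(self, state):
        z = self.state_tower @ state          # latent state representation
        scores = self.action_latents @ z      # score each action against the state
        return int(np.argmax(scores))

policy = ITTPolicy(state_dim=4, n_actions=3)
print(policy.act(np.ones(4)))
```

Because the policy is just a flat parameter vector, it slots naturally into evolutionary optimization, which perturbs and evaluates parameters without gradients.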
In Chapter 4, we consider an active learning problem in which the learner can sequentially select unlabeled samples for labeling. A typical active learning algorithm samples more points in "difficult" regions of the feature space to use the sampling budget more efficiently and reduce excess risk. For nonparametric classification with smooth regression functions, we show that nuances in notions of margin involving the uniqueness of the Bayes classifier, which have no apparent effect on rates in passive learning, determine whether any active learner can outperform passive learning rates.

3. Beyond Worst-Case Analysis for Sequential Decision Making. Perivier, Noemie, January 2023.
Traditionally, algorithms have been evaluated through worst-case analysis, where the input is presumed to take its worst possible configuration. In many real-world settings, however, the data is not adversarially constructed and, on the contrary, exhibits recognizable patterns. This often makes worst-case guarantees poor indicators of an algorithm's actual performance. To overcome this limitation, a growing body of work on beyond-worst-case analysis has recently emerged.
In this thesis, we are concerned with sequential decision-making problems, where an agent must take successive decisions over multiple time steps without knowing the forthcoming input in advance. Examples of such settings include ride-sharing, online retail, and job scheduling. Motivated by the unprecedented surge of data in these domains, which may help overcome worst-case barriers by making the future at least partially predictable, we explore three distinct frameworks for beyond-worst-case analysis of sequential decision-making: (i) semi-random models, (ii) parametric models, and (iii) algorithms with predictions. While they all pursue the same objective of using previously collected data to provide stronger theoretical guarantees, these frameworks differ mainly in how the data is utilized. We examine each of them separately and present novel results for five different online optimization problems: minimum-cost matching, assortment optimization (with and without inventory constraints), pricing, and scheduling.
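As one concrete instance of the algorithms-with-predictions framework, the sketch below implements the classic prediction-augmented ski-rental rule of Purohit, Svitkina, and Kumar (2018); this is a standard result from that literature, shown for illustration, not a contribution of this thesis:

```python
import math

def ski_rental_buy_day(buy_cost: int, predicted_days: int, lam: float) -> int:
    """Return the day on which to buy skis (renting every day before that).
    lam in (0, 1]: smaller lam trusts the prediction more (better consistency),
    larger lam hedges against a wrong prediction (better robustness)."""
    if predicted_days >= buy_cost:
        return math.ceil(lam * buy_cost)   # prediction says "long season": buy early
    return math.ceil(buy_cost / lam)       # prediction says "short season": buy late

# e.g. buy cost 100, a prediction of 150 ski days, moderate trust
print(ski_rental_buy_day(100, 150, lam=0.5))  # -> 50
```

This rule is roughly (1+λ)-competitive when the prediction is correct and (1+1/λ)-competitive against arbitrary inputs, illustrating the consistency-robustness trade-off this framework studies.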

4. Simplifying Revenue Management. Sheth, Harsh Tarak, January 2024.
In this thesis, we study three revenue management problems for which we propose simple algorithms with provable guarantees. While online marketplaces provide retailers with tremendous flexibility, they are often large and noisy, have multiple stakeholders, and can be challenging to characterize. These complexities give rise to a preference for simple, interpretable policies. Further, traditional marketplaces such as brick-and-mortar stores cannot always leverage tools designed for online environments, due to physical constraints, higher latency, and so on. With these motivations in mind, we develop algorithms for assortment optimization and pricing that are easy to implement in practice and have theoretical justification for their performance.
In Chapter 1, we consider a dynamic assortment optimization problem where the seller has a fixed inventory of multiple substitutable products to sell over a fixed time horizon. We consider two modifications to the traditional problem. First, we simplify the assortment planning by restricting assortment changes to "product retirements". When a product is retired, it becomes unavailable to all future customers. Second, we assume the seller has flexibility regarding which customers to approach. In each period, the seller chooses which subset of products to retire and selects a customer to visit. The selected customer then receives an option to purchase one of the available products, i.e., non-retired products with positive remaining inventory. We provide two policies for this problem. Our first policy guarantees a constant fraction of the best possible revenue. Our second policy is near-optimal but requires the problem to have a specific structure.
In Chapter 2, we study the fundamental joint pricing and inventory management problem. The optimal policy for the model we consider is known to be an (s, S, p) policy: when the inventory level drops to s units, the seller immediately places an order to replenish the inventory to S units, and the optimal pricing policy p sets a different price for every inventory state. We propose simple policies requiring no more than three prices and prove that they are near-optimal compared to optimal policies, which require more prices and are less robust. In particular, when orders cannot be backlogged, we show that a single price is sufficient for good performance.
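A minimal simulation sketch of an (s, S) replenishment rule paired with a three-price menu keyed to inventory bands, in the spirit of the simple policies described; the demand model and all parameter values are hypothetical and only illustrate the policy's mechanics:

```python
import numpy as np

def simulate_sS_few_prices(s, S, horizon=1_000, unit_cost=2.0, seed=0):
    rng = np.random.default_rng(seed)

    def price_of(inv):
        # a three-price menu keyed to inventory bands (the simple policy class)
        if inv > 2 * S // 3:
            return 4.0   # plentiful inventory: low price to move stock
        if inv > S // 3:
            return 5.0
        return 6.0       # scarce inventory: high price

    inv, profit = S, 0.0
    for _ in range(horizon):
        p = price_of(inv)
        demand = rng.poisson(10.0 - p)   # demand falls with price (hypothetical model)
        sales = min(demand, inv)         # lost sales if demand exceeds stock
        inv -= sales
        profit += p * sales
        if inv <= s:                     # (s, S) trigger: reorder up to S
            profit -= unit_cost * (S - inv)
            inv = S
    return profit

print(simulate_sS_few_prices(s=5, S=30))
```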
In Chapter 3, we analyze assortment optimization and pricing with opaque products. An opaque product is one for which only partial information is available to the buyer at the time of purchase; when a customer selects the opaque product, the seller can fulfill the purchase using any of the offered products. Opaque products can help sellers boost total sales. We propose simple policies for assortment optimization with provable constant-factor guarantees, which are near-optimal in numerical experiments. We also provide upper bounds on the advantage of selling opaque products.

5. Data-driven Decision-making: New Insights on Algorithm Performance and Data Value. Mouchtaki, Omar, January 2024.
With the rise of data-driven algorithms, both industrial practitioners and academics have aimed to understand how one can use past information to make better future decisions. This question is particularly challenging, as any answer necessarily depends on several parameters, such as the features of the data used (e.g., its quantity and relevance), the downstream problem being solved, and the type of algorithms deployed to leverage the data. Most of the current literature analyzes the value of data by anchoring its methods in the large-data regime, making the implicit assumption that data is widely available in practice.
In this work, we depart from this implicit assumption and posit that, in fact, relevant data is a scarce resource in many practical settings. For instance, data is usually aggregated across different times, product categories, and geographies, and therefore the effective size of datasets is orders of magnitude lower than it may appear to be. The goal of this thesis is to bridge the gap between the theoretical understanding of data-driven decisions and practical performance by developing a problem-centric theory of data-driven decision-making in which we assess the value of data by quantifying its impact on our downstream decisions. In particular, we design methodological tools tailored to the problem at hand and derive fine-grained and problem-specific guarantees for algorithms.
In the first chapter, we study the data-driven newsvendor problem under the modeling assumption that data is independently and identically distributed. We are interested in analyzing central policies in the literature, such as Sample Average Approximation (SAA), along with optimal ones, and in characterizing the performance achievable across data sizes, both small and large. Specifically, we exactly characterize the performance of SAA and uncover novel fundamental insights on the value of data. Indeed, our analysis reveals that tens of samples are sufficient to perform very efficiently, but also that more data can lead to worse out-of-sample performance for SAA. In turn, we derive an optimal algorithm in the minimax sense, enhancing decision quality with limited data.
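For reference, the SAA policy for the newsvendor has a closed form: order the empirical quantile of observed demand at the critical ratio. A minimal sketch with hypothetical cost parameters:

```python
import numpy as np

def saa_newsvendor(demand_samples, underage_cost=3.0, overage_cost=1.0):
    """SAA order quantity: the empirical quantile of past demand at the
    critical ratio c_u / (c_u + c_o)."""
    ratio = underage_cost / (underage_cost + overage_cost)
    return float(np.quantile(demand_samples, ratio, method="inverted_cdf"))

demands = np.random.default_rng(0).poisson(20, size=30)  # 30 past observations
print(saa_newsvendor(demands))
```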
The second chapter explores the impact of data relevance on decision quality, addressing the challenge of using historical data from varying sources that may not be fully indicative of the future. We quantify the performance of SAA in these heterogeneous environments and design rate-optimal policies in settings where SAA falters. We illustrate the versatility of our framework by analyzing several prototypical problems across various fields: the newsvendor, pricing, and ski rental problems. Our analysis shows that the type of achievable asymptotic performance varies significantly across different problem classes and heterogeneity notions.
Finally, the third chapter develops a framework for contextual decision-making, examining how the relevance and quantity of past data affect policy performance. Focusing on the contextual newsvendor problem, we analyze the wide class of Weighted Empirical Risk Minimization (WERM) policies, which weight past data according to their relevance. This class of policies includes the SAA policy (also referred to as ERM), k-Nearest Neighbors, and kernel-based methods. While past literature focuses on upper bounds via concentration inequalities, we instead take an optimization approach and isolate a structure in the newsvendor loss function that allows us to reduce the infinite-dimensional optimization problem over worst-case distributions to a simple line search. In addition to this methodological contribution, our exact analysis offers new granular insights into the learning curves of algorithms in contextual settings. Through these contributions, the thesis advances our understanding of data-driven decision-making, offering both theoretical foundations and practical insights for diverse operational applications.
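A minimal sketch of one WERM instance, the k-nearest-neighbor policy, for the contextual newsvendor: with uniform weights on the k nearest past contexts, the weighted ERM solution is the critical-ratio quantile of the neighbors' demands (the data and parameters below are synthetic):

```python
import numpy as np

def knn_werm_newsvendor(X_past, d_past, x_new, k=10,
                        underage_cost=3.0, overage_cost=1.0):
    """WERM with uniform weights on the k nearest past contexts: the order
    quantity is the critical-ratio quantile of the neighbors' demands."""
    dist = np.linalg.norm(X_past - x_new, axis=1)
    neighbors = np.argsort(dist)[:k]
    ratio = underage_cost / (underage_cost + overage_cost)
    return float(np.quantile(d_past[neighbors], ratio, method="inverted_cdf"))

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))       # past contexts
d = rng.poisson(10 + 20 * X[:, 0])         # demand depends on the context
print(knn_werm_newsvendor(X, d, x_new=np.array([0.8, 0.5])))
```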

6. Essays in Macroeconomics. Duarte Mascarenhas, Rui, January 2023.
This dissertation consists of three chapters, each containing a distinct research paper in the field of macroeconomics. In the first chapter, I estimate the impact of mutual fund flows on corporate bond prices, issuance, and firm investment. I leverage variation caused by the COVID-19-induced financial panic of March 2020 and find that safer firms suffered a larger impact in the component of bond spreads that does not compensate for expected default risk. However, I do not detect an impact of fund flows on issuance or investment.
A simple model predicts that liquidation decisions and price responses are driven by demand and liquidation elasticities, which depend on the characteristics of the bond return processes.

In the second chapter, we ask: what is the importance of firm and bank credit factors in determining investment responses to monetary policy? We decompose variation in corporate loan growth rates into purely firm-level and bank-level variation. The estimated factors are correlated with a set of variables that proxy for the firm's and the bank's financial health. Firms with a higher borrowing factor experience relatively larger investment responses to an unexpected interest rate shock; the effect is muted when the shock is the revelation of central bank information. The bank factor does not induce similar heterogeneity in investment responses.
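A minimal sketch of this kind of two-way decomposition on synthetic matched firm-bank data, in the spirit of fixed-effects decompositions of loan growth; the variable names and data-generating process are hypothetical, not the chapter's actual estimator:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3_000
df = pd.DataFrame({
    "firm_id": rng.integers(0, 100, size=n),
    "bank_id": rng.integers(0, 20, size=n),
})
# loan growth = firm factor + bank factor + noise (synthetic ground truth)
firm_fx = rng.normal(size=100)
bank_fx = rng.normal(size=20)
df["loan_growth"] = (firm_fx[df["firm_id"]] + bank_fx[df["bank_id"]]
                     + rng.normal(scale=0.5, size=n))

# two-way fixed effects recover the firm-level and bank-level components,
# which could then be correlated with balance-sheet health proxies
res = smf.ols("loan_growth ~ C(firm_id) + C(bank_id)", data=df).fit()
print(res.params.filter(like="C(bank_id)").head())  # estimated bank factors
```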
In the third chapter, we ask: what is the nature of optimal monetary policy and central bank disclosure when the monetary authority is uncertain about the economic state? We consider a model in which firms make nominal pricing decisions and the central bank sets the nominal interest rate under incomplete information. We find that implementing flexible-price allocations is both feasible and optimal despite the existence of numerous measurability constraints; we explore a series of different implementations. When monetary policy is sub-optimal, public information disclosure by the central bank is welfare-improving as long as either firm or central bank information is sufficiently precise.

7. The Pricing Decision Process in Software-as-a-Service Companies. Wilczkowski, Susanna, January 2015.
This study examines various approaches used by companies providing software-as-a-service (SaaS) in a business-to-business (B2B) environment to find a pricing strategy. To meet competition in a global market, a good pricing strategy is vital. Pricing is an important part of marketing and must be congruent with the company's overall objectives. Strategic pricing is made up of different factors represented in the strategic pricing pyramid, which is based on a value-based approach. It is paramount to know your customers and their preferences when designing a pricing strategy and selecting pricing models, price metrics, market segmentation, bundling, and price levels. After estimating how much value a product or service creates for a customer, this value must be communicated to potential customers in order to convince them to purchase the offering. Choosing the right pricing strategy is not a one-time occurrence but an ongoing process. In this qualitative study, three case studies are performed to tie theory to real-world practice.