211

Creating Systems and Applying Large-Scale Methods to Improve Student Remediation in Online Tutoring Systems in Real-time and at Scale

Selent, Douglas A 08 June 2017 (has links)
"A common problem shared amongst online tutoring systems is the time-consuming nature of content creation. It has been estimated that an hour of online instruction can take up to 100-300 hours to create. Several systems have created tools to expedite content creation, such as the Cognitive Tutors Authoring Tool (CTAT) and the ASSISTments builder. Although these tools make content creation more efficient, they all still depend on the efforts of a content creator and/or past historical. These tools do not take full advantage of the power of the crowd. These issues and challenges faced by online tutoring systems provide an ideal environment to implement a solution using crowdsourcing. I created the PeerASSIST system to provide a solution to the challenges faced with tutoring content creation. PeerASSIST crowdsources the work students have done on problems inside the ASSISTments online tutoring system and redistributes that work as a form of tutoring to their peers, who are in need of assistance. Multi-objective multi-armed bandit algorithms are used to distribute student work, which balance exploring which work is good and exploiting the best currently known work. These policies are customized to run in a real-world environment with multiple asynchronous reward functions and an infinite number of actions. Inspired by major companies such as Google, Facebook, and Bing, PeerASSIST is also designed as a platform for simultaneous online experimentation in real-time and at scale. Currently over 600 teachers (grades K-12) are requiring students to show their work. Over 300,000 instances of student work have been collected from over 18,000 students across 28,000 problems. From the student work collected, 2,000 instances have been redistributed to over 550 students who needed help over the past few months. I conducted a randomized controlled experiment to evaluate the effectiveness of PeerASSIST on student performance. Other contributions include representing learning maps as Bayesian networks to model student performance, creating a machine-learning algorithm to derive student incorrect processes from their incorrect answer and the inputs of the problem, and applying Bayesian hypothesis testing to A/B experiments. We showed that learning maps can be simplified without practical loss of accuracy and that time series data is necessary to simplify learning maps if the static data is highly correlated. I also created several interventions to evaluate the effectiveness of the buggy messages generated from the machine-learned incorrect processes. The null results of these experiments demonstrate the difficulty of creating a successful tutoring and suggest that other methods of tutoring content creation (i.e. PeerASSIST) should be explored."
212

A PROFICIÊNCIA MATEMÁTICA DOS ALUNOS DO NÚCLEO REGIONAL DE EDUCAÇÃO DE PONTA GROSSA NO SAEP 2012: UMA ANÁLISE DOS DESCRITORES DO TRATAMENTO DA INFORMAÇÃO / The mathematical proficiency of students of the Ponta Grossa Regional Education Center in the 2012 SAEP: an analysis of the Data Handling descriptors

Anjos, Luiz Fabiano dos 19 March 2015 (has links)
External assessments, at both the national and state levels, collect data that give managers and the school community information about the performance of educational institutions and of each student. This work analyzes the Mathematics performance of 3rd-year high-school students from public schools under the jurisdiction of the Ponta Grossa Regional Education Center on the External Evaluation of the Paraná State Education Assessment System, with respect to the structuring content Data Handling (Tratamento da Informação). The recommendations for teaching this content were identified in the official documents, and the textbooks used by the schools were examined for how and when the Data Handling content is presented. The study is a bibliographic investigation with a qualitative and quantitative approach; a hypothesis test was used as the statistical tool to analyze the data and answer the initial question. The analysis of the data and the test corroborated the hypothesis that the adopted textbook does not significantly affect the students' average performance in the structuring content analyzed.
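The abstract does not say which hypothesis test was used; purely as an illustration, a Welch two-sample t-test comparing mean proficiency between schools grouped by adopted textbook could be sketched as follows (the scores and grouping are invented).

```python
from scipy import stats

# Hypothetical SAEP proficiency means for schools grouped by adopted textbook.
textbook_a = [265.4, 271.2, 258.9, 280.1, 262.3, 274.8]
textbook_b = [260.7, 268.5, 255.1, 277.9, 259.4, 270.2]

# Welch's t-test: does the adopted textbook shift average proficiency?
t_stat, p_value = stats.ttest_ind(textbook_a, textbook_b, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# A large p-value would be consistent with the thesis's conclusion that the
# adopted textbook does not significantly affect average performance.
```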
213

Faktorer som kan ha samband med företags lönsamhet : En empirisk studie på de 20 största bolagen på Stockholmsbörsen / Factors that can correlate with corporate profitability : An empirical study of the 20 largest public corporations on the Stockholm stock market

Karell-Holmgren, Kasper, Mirza, Pauline January 2009 (has links)
<p><strong>Syfte: </strong>Syftet med uppsatsen är att undersöka och analysera om det finns något samband mellan företags lönsamhet och dess kapitalstruktur, storlek eller branschtillhörigheten. Detta innebär att en empirisk studie kommer att ske på soliditet, omsättning samt branschtillhörigheten för att se hur och om det finns något samband mellan lönsamheten och dessa tre faktorer. Vidare är syftet även att undersöka om det kan finnas något samband mellan företags lönsamhet och företags standardavvikelse på räntabilitet.</p><p><strong>Metod: </strong>Undersökningen är en empirisk studie med en deduktiv kvantitativ och kvalitativ ansats. Empirin undersöks med olika statistiska metoder såsom regressionsanalys och korrelationsberäkning.</p><p><strong>Teori: </strong>Uppsatsen har utgått från teorier gällande kapitalstruktur och lönsamhet. Nyckeltalen som används från dessa teorier är soliditet respektive räntabilitet på eget kapital.</p><p><strong>Empiri: </strong>Data från de 20 största börsnoterade företagen på Stockholmsbörsen har samlats in från företagens årsredovisningar 2003-2007. Den data som tagits fram är data på företagens lönsamhet (räntabilitet på eget kapital), kapitalstruktur (soliditet), storlek (omsättning), samt lönsamhet för företag utöver de 20 valda företagens, detta för att användas i analysen av branschtillhörigheten.</p><p><strong>Resultat: </strong>Resultatet av undersökningen visar att det inte finns något signifikant samband mellan vare sig företags kapitalstruktur och lönsamheten, företagsstorlek och lönsamheten, branschtillhörigheten för ett företag och lönsamheten eller standardavvikelse på räntabilitet och lönsamhet.</p> / <p><strong>Purpose: </strong>The purpose of the essay is to analyze the potential correlation between corporate profitability and corporate capital structure, corporate size, and corporate line of business. An empirical study will be done on solidity, turnover and on the line of business to determine if a correlation exists between profitability and these factors. The purpose is also to examine if there is a correlation between the corporate profitabilty and standard deviation of the corporations return on equity.</p><p><strong>Method: </strong>The survey is an empirical study employing a deductive quantitative and qualitative approach. The empirics are examined with statistical methods such as regression analysis and calculation of correlation.</p><p><strong>Theory: </strong>The essay uses theories about capital structure and profitability. The key numbers that have been used from these theories are solidity and return on equity.</p><p><strong>Empirics: </strong>Data from the 20 largest public corporations on the Stockholm stock market collected from their respective annual 2003-2007 reports. This includes data about corporate profitability (return on equity), capital structure (solidity), size (turnover), and the profitability of corporations beyond the 20 chosen ones, this to be used in the analyze of corporations within the specific line of business.</p><p><strong>Result: </strong>This survey shows that there is no significant correlation between corporate capital structure and the profitability, corporate size and the profitability, line of business of a corporation and the profitability or standard deviation of the corporations return on equity and the profitability.</p>
215

Likvida tillgångars påverkan på lönsamhet och aktievärde : En studie av svenska företag på Nasdaq OMX Nordic Stockholm mellan 2008-2011 / The impact of liquid assets on profitability and share value : A study of Swedish companies on Nasdaq OMX Nordic Stockholm, 2008-2011

Nitschmann, Johanna, Norén, William January 2013 (has links)
Objective: The study investigates whether cash liquidity has a negative effect on the profitability and share value of companies listed on Nasdaq OMX Nordic Stockholm between 2008 and 2011. A secondary aim is to examine whether industry risk is positively related to the cash liquidity of these companies. Method: The method is key-ratio analysis using hypothesis testing and regression analysis. Conclusion: The cash liquidity ratio affects profitability negatively across the full sample. No other conclusions can be drawn.
216

Some questions in risk management and high-dimensional data analysis

Wang, Ruodu 04 May 2012 (has links)
This thesis addresses three topics in statistics and probability, with applications in risk management. First, for testing problems in high-dimensional (HD) data analysis, we present a novel method to formulate empirical likelihood tests and jackknife empirical likelihood tests by splitting the sample into subgroups. New tests are constructed to test the equality of two HD means, the coefficients in HD linear models, and HD covariance matrices. Second, we propose jackknife empirical likelihood methods to formulate interval estimates for important quantities in actuarial science and risk management, such as risk-distortion measures, Spearman's rho and parametric copulas. Lastly, we introduce the theory of completely mixable (CM) distributions. We give properties of CM distributions, show that a few classes of distributions are CM, and use the new technique to find bounds for the sum of individual risks with given marginal distributions but unspecified dependence structure. The result partially solves a problem that had been a challenge for decades, and directly leads to bounds on quantities of interest in risk management, such as the variance, the stop-loss premium, the price of European options, and the Value-at-Risk associated with a joint portfolio.
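The thesis develops jackknife empirical likelihood intervals; the sketch below uses the plain jackknife instead (leave-one-out pseudo-values with a normal approximation, not the empirical likelihood construction), purely to illustrate the resampling idea for Spearman's rho on invented data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical paired losses from two lines of business (invented data).
x = rng.lognormal(mean=0.0, sigma=0.5, size=200)
y = 0.6 * x + rng.lognormal(mean=0.0, sigma=0.5, size=200)

def spearman(a, b):
    return stats.spearmanr(a, b)[0]

n = len(x)
full = spearman(x, y)

# Leave-one-out jackknife pseudo-values for Spearman's rho.
loo = np.array([spearman(np.delete(x, i), np.delete(y, i)) for i in range(n)])
pseudo = n * full - (n - 1) * loo

est = pseudo.mean()
se = pseudo.std(ddof=1) / np.sqrt(n)
z = stats.norm.ppf(0.975)
print(f"rho ~ {est:.3f}, 95% CI ~ ({est - z*se:.3f}, {est + z*se:.3f})")
```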
217

Evaluation of Ultra-Wideband Sensing Technology for Position Location in Indoor Construction Environments

Aryan, Afrooz January 2011 (has links)
Effective construction management involves real-time decisions regarding the progress of specific activities, the location of materials and equipment, and construction site safety. The decision-making process can be improved using real-time positioning technologies such as Radio Frequency Identification (RFID) systems, the Global Positioning System (GPS), and Ultra-Wideband (UWB) sensors. While GPS is not applicable to indoor positioning and RFID tags cannot provide a fully automated system for position location, the characteristics of UWB systems make this technology a strong candidate for a fully automated positioning system in an indoor construction environment. This thesis presents a comprehensive study of the performance of UWB systems in a controlled laboratory environment and on an institutional construction site in Waterloo, Canada, as well as for a particular safety application. A primary objective of the research was to establish the accuracy of real-time position location under various conditions, including the effect of different construction materials (e.g., wood and metal), and to analyze changes in the accuracy of position location as construction progresses and the indoor environment physically evolves. The challenges faced in implementing such a system in an active construction environment are addressed. Based on a statistical analysis of laboratory data, and considering the construction site experience, the reliability of the UWB positioning system for the aforementioned environments is discussed. Furthermore, an automated safety system is proposed using the real-time UWB positioning technology. Based on error modeling of the UWB position location, an optimum alarming algorithm is designed for the proposed safety system, and the reliability of such a system is evaluated through statistical analysis.
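The positioning algorithms and error models studied are specific to the commercial UWB system used; as a generic illustration of how a tag position can be recovered from range measurements to fixed anchors, here is a minimal nonlinear least-squares multilateration sketch with an invented anchor layout and noise level:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical fixed UWB anchor positions (metres) in a room.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 8.0], [0.0, 8.0]])
true_tag = np.array([4.0, 3.0])

rng = np.random.default_rng(1)
# Simulated range measurements with 10 cm ranging noise.
ranges = np.linalg.norm(anchors - true_tag, axis=1) + rng.normal(0, 0.10, len(anchors))

def residuals(p):
    # Difference between measured ranges and ranges implied by candidate position p.
    return np.linalg.norm(anchors - p, axis=1) - ranges

# Solve for the tag position starting from the room centre.
sol = least_squares(residuals, x0=np.array([5.0, 4.0]))
print("estimated position:", np.round(sol.x, 3))
print("position error (m):", round(np.linalg.norm(sol.x - true_tag), 3))
```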
218

Model Validation and Discovery for Complex Stochastic Systems

Jha, Sumit Kumar 02 July 2010 (has links)
In this thesis, we study two fundamental problems that arise in the modeling of stochastic systems: (i) Validation of stochastic models against behavioral specifications such as temporal logics, and (ii) Discovery of kinetic parameters of stochastic biochemical models from behavioral specifications. We present a new Bayesian algorithm for Statistical Model Checking of stochastic systems based on a sequential version of Jeffreys' Bayes Factor test. We argue that the Bayesian approach is more suited for application domains like systems biology modeling, where distributions on nuisance parameters and priors may be known. We prove that our Bayesian Statistical Model Checking algorithm terminates for a large subclass of prior probabilities. We also characterize the Type I/II errors associated with our algorithm. We experimentally demonstrate that this algorithm is suitable for the analysis of complex biochemical models like those written in the BioNetGen language. We then argue that i.i.d.-sampling-based Statistical Model Checking algorithms are not an effective way to study rare behaviors of stochastic models and present another Bayesian Statistical Model Checking algorithm that can incorporate non-i.i.d. sampling strategies. We also present algorithms for synthesis of chemical kinetic parameters of stochastic biochemical models from high-level behavioral specifications. We consider the setting where a modeler knows facts that must hold on the stochastic model but is not confident about some of the kinetic parameters in her model. We suggest algorithms for discovering these kinetic parameters from facts stated in appropriate formal probabilistic specification languages. Our algorithms are based on our theoretical results characterizing the probability of a specification being true on a stochastic biochemical model. We have applied this algorithm to discover kinetic parameters for biochemical models with as many as six unknown parameters.
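The algorithm, its priors and its termination analysis are developed in the thesis itself; the sketch below only illustrates the core idea of a sequential Bayes-factor test of whether a property holds with probability above a threshold, with a Beta prior and a stand-in simulator in place of BioNetGen:

```python
import random
from scipy.stats import beta

def simulate_property():
    # Stand-in for simulating one stochastic trace and checking whether the
    # property holds; here the (unknown-to-the-tester) probability is 0.85.
    return random.random() < 0.85

def bayes_factor(successes, n, theta, a=1.0, b=1.0):
    # Bayes factor of H0: p >= theta vs H1: p < theta under a Beta(a, b) prior:
    # posterior odds divided by prior odds.
    post_h0 = 1.0 - beta.cdf(theta, a + successes, b + n - successes)
    prior_h0 = 1.0 - beta.cdf(theta, a, b)
    return (post_h0 / (1.0 - post_h0)) / (prior_h0 / (1.0 - prior_h0))

theta, threshold = 0.8, 100.0   # test "P(property) >= 0.8", stop at Bayes factor 100
successes = n = 0
while True:
    successes += simulate_property()
    n += 1
    bf = bayes_factor(successes, n, theta)
    if bf > threshold:
        print(f"accept H0 after {n} samples (Bayes factor = {bf:.1f})")
        break
    if bf < 1.0 / threshold:
        print(f"reject H0 after {n} samples (Bayes factor = {bf:.1f})")
        break
```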
220

Theoretical and empirical essays on microeconometrics

Possebom, Vitor Augusto 17 February 2016 (has links)
This Master's thesis consists of one theoretical article and one empirical article in the field of microeconometrics. The first chapter, Synthetic Control Estimator: A Generalized Inference Procedure and Confidence Sets, contributes to the literature on inference techniques for the Synthetic Control Method. (We thank Marinho Bertanha, Gabriel Cepaluni, Brigham Frandsen, Dalia Ghanem, Ricardo Masini, Marcela Mello, Áureo de Paula, Cristine Pinto, Edson Severnini and seminar participants at the São Paulo School of Economics, the California Econometrics Conference 2015 and the 37th Brazilian Meeting of Econometrics for useful suggestions.) This methodology was proposed to answer questions involving counterfactuals when only one treated unit and a few control units are observed. Although the method has been applied in many empirical works, the formal theory behind its inference procedure is still an open question.
To fill this gap, we make explicit the sufficient hypotheses that guarantee the adequacy of Fisher's Exact Hypothesis Testing Procedure for panel data, allowing us to test any sharp null hypothesis and, consequently, to propose a new way to estimate confidence sets for the Synthetic Control Estimator by inverting a test statistic, the first confidence set available when only finite-sample, aggregate-level data are observed and the cross-sectional dimension may be larger than the time dimension. Moreover, we analyze the size and power of the proposed test with a Monte Carlo experiment and find that test statistics based on the synthetic control method outperform test statistics commonly used in the evaluation literature. We also extend our framework to cases with more than one outcome of interest (simultaneous hypothesis testing) or more than one treated unit (pooled intervention effect), and to the presence of heteroskedasticity. The second chapter, Free Economic Area of Manaus: An Impact Evaluation using the Synthetic Control Method, is an empirical article. We apply the synthetic control method to Brazilian city-level data over the 20th century to evaluate the economic impact of the Free Economic Area of Manaus (FEAM). We find that this enterprise zone had significant positive effects on real GDP per capita and total services production per capita, but also a significant negative effect on total agricultural production per capita. Our results suggest that this subsidy policy achieved its goal of promoting regional economic growth, even though it may have provoked misallocation of resources among economic sectors.
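As a generic illustration of the placebo-style permutation inference the first chapter formalizes, and not the thesis's own code or data, the following sketch fits an approximate synthetic control by nonnegative least squares (the sum-to-one weight constraint is omitted for brevity) on a simulated panel and computes a permutation p-value from in-space placebos:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(42)

# Simulated panel: 1 treated unit + 19 controls, 30 pre- and 10 post-treatment periods.
n_controls, t_pre, t_post = 19, 30, 10
controls = rng.normal(0, 1, (n_controls, t_pre + t_post)).cumsum(axis=1)
treated = controls[:5].mean(axis=0) + rng.normal(0, 0.2, t_pre + t_post)
treated[t_pre:] += 2.0  # a true post-treatment effect on the treated unit

def rmspe_ratio(y_treated, y_controls):
    """Fit nonnegative weights on pre-treatment outcomes and return the
    post/pre RMSPE ratio, a common synthetic-control test statistic."""
    w, _ = nnls(y_controls[:, :t_pre].T, y_treated[:t_pre])
    gap = y_treated - w @ y_controls          # treated minus synthetic control
    pre = np.sqrt(np.mean(gap[:t_pre] ** 2))
    post = np.sqrt(np.mean(gap[t_pre:] ** 2))
    return post / pre

observed = rmspe_ratio(treated, controls)

# In-space placebos: pretend each control unit was treated and recompute the statistic.
placebos = [rmspe_ratio(controls[i], np.delete(controls, i, axis=0))
            for i in range(n_controls)]

# Permutation p-value: share of units whose statistic is at least as extreme.
p_value = (1 + sum(p >= observed for p in placebos)) / (1 + n_controls)
print(f"RMSPE ratio = {observed:.2f}, permutation p-value = {p_value:.3f}")
```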
