11

Video Distribution Over IP Networks

Ozdem, Mehmet 01 February 2007 (has links) (PDF)
As applications like IPTV and VoD (Video on Demand) gain popularity, it is becoming more important to study the behavior of video signals in Internet access infrastructures such as ADSL and cable networks. Average delay, average jitter and packet loss in these networks affect the quality of service, hence transmission and access speeds need to be chosen so that these parameters are minimized. In this study the behavior of the above-mentioned IP networks under variable bit rate (VBR) video traffic is investigated. The ns-2 simulator is used for this purpose, and actual as well as artificially generated signals are applied to the networks under test. VBR traffic is generated synthetically using ON/OFF sources, with ON and OFF times drawn from exponential or Pareto distributions. As VBR video shows long-range dependence with a Hurst parameter between 0.5 and 1, this parameter is used as a metric to measure the accuracy of the synthetic sources. Two different topologies are simulated: one similar to ADSL access networks and the other behaving like a cable distribution network. The performance of the networks (delay, jitter and packet loss) under VBR video traffic at different access speeds is measured, and based on the results, minimum access speeds required to achieve acceptable-quality video delivery to customers are suggested.
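
The core technique in this abstract, synthesising long-range-dependent VBR traffic from ON/OFF sources with heavy-tailed sojourn times and checking it via the Hurst parameter, can be sketched outside ns-2. The sketch below is a minimal illustration under assumed parameters (Pareto shape 1.4, aggregated-variance Hurst estimator); it is not the thesis's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def onoff_pareto_trace(n_periods=20000, shape=1.4, dt=0.1):
    """Alternate ON/OFF periods with Pareto-distributed durations and
    bin the 0/1 activity of the source into a rate series of step dt."""
    durations = rng.pareto(shape, n_periods) + 1.0   # classical Pareto, min 1
    states = np.tile([1.0, 0.0], n_periods // 2)     # ON, OFF, ON, ...
    t_end = np.cumsum(durations)
    grid = np.arange(0.0, t_end[-1], dt)
    return states[np.searchsorted(t_end, grid, side="right")]

def hurst_aggvar(x, sizes=(10, 20, 50, 100, 200, 500)):
    """Aggregated-variance Hurst estimator: Var(X^(m)) ~ m^(2H - 2)."""
    logm, logv = [], []
    for m in sizes:
        k = len(x) // m
        agg = x[: k * m].reshape(k, m).mean(axis=1)
        logm.append(np.log(m))
        logv.append(np.log(agg.var()))
    slope, _ = np.polyfit(logm, logv, 1)
    return 1.0 + slope / 2.0

trace = onoff_pareto_trace()
print(f"estimated Hurst parameter: {hurst_aggvar(trace):.2f}")
```

For ON/OFF Pareto sources with shape a in (1, 2) the theoretical Hurst parameter is (3 - a)/2, so shape 1.4 should yield an estimate near 0.8.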
12

Asymptotic Analysis of Two-Dimensional Pareto Maxima

Savulytė, Vaida 16 August 2007 (has links)
The aim of this work is to construct a two-dimensional distribution with given one-dimensional (marginal) distributions, to carry out an asymptotic analysis of the maxima, and to study the speed of convergence. The two-dimensional distribution is constructed for two cases, dependent and independent vector components, with the more detailed analysis of the convergence rate carried out for the dependent case. As the Pareto distribution has become popular in financial models in recent years, it was chosen for the analysis. In the first part, the two-dimensional distribution is constructed, its main characteristics are computed, and it is examined whether they exist for all parameter values; random variables whose distributions are the marginals of the constructed distribution function are also generated to support the results experimentally. In the second part, the asymptotic analysis is performed: two-dimensional maxima are defined and their limit distribution is found, an approximate estimate of the convergence rate is derived, and a computer analysis of this estimate and of the errors is carried out to determine the conditions under which they are smallest. The numerical characteristics of the constructed distribution are studied with MathCAD, while the computer analysis of the convergence-rate estimates is done in Matlab, in which a user program was written to plot the convergence-rate estimate and the errors. It is proved that in both cases, independent and dependent vector components, the limit distribution is the same; this means that although the components of the vector are dependent, the maxima are asymptotically independent. Moreover, the errors are smaller than the approximate estimate, and although the approximate estimate is smaller in the independent case than in the dependent one, the errors behave the other way round: they are smaller when the components are dependent.
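
For orientation, the univariate limit underlying such an analysis can be written out. Assuming the standard Pareto form F(x) = 1 - x^(-alpha) on x >= 1 (the abstract does not state the exact parametrization used in the thesis), the normalized maxima converge to a Fréchet law:

```latex
% Maxima M_n of n i.i.d. Pareto variables, F(x) = 1 - x^{-\alpha}, x \ge 1:
P\left(\frac{M_n}{n^{1/\alpha}} \le x\right)
  = F\left(n^{1/\alpha} x\right)^{n}
  = \left(1 - \frac{x^{-\alpha}}{n}\right)^{n}
  \;\longrightarrow\; \exp\left(-x^{-\alpha}\right)
  \qquad (n \to \infty,\; x > 0).
```

The gap between the two sides is of order 1/n, and quantifying such gaps, here in the bivariate setting, is exactly the convergence-rate question the work studies.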
13

The generalized Pareto distribution and mixtures of Gumbel distributions in the study of flow and maximum wind speed in Piracicaba, SP

Renato Rodrigues Silva 10 October 2008 (has links)
Extreme value theory is a branch of probability that describes the asymptotic distribution of order statistics, such as maxima or minima, of a sequence of random variables following a usually unknown distribution function F, as well as the asymptotic distribution of the excesses of that sequence above a threshold. The standard methodologies in this context are therefore the fit of the generalized extreme value distribution to a series of annual maxima, or the fit of the generalized Pareto distribution to a series consisting only of observations exceeding a threshold. However, according to Coles et al. (2003), there is growing dissatisfaction with the performance of these standard models for predicting extreme events, possibly caused by violated assumptions, such as independence of the observations, or by the fact that the models are not recommended in certain specific situations, for example when the annual maxima are drawn from two or more independent populations of extreme events, the first describing less frequent events of greater magnitude and the second more frequent events of smaller magnitude. The two articles that make up this work therefore present alternative extreme value analyses for situations in which the standard models are not adequate. In the first, the generalized Pareto distribution and the exponential distribution, a particular case of the GP, were fitted, together with the declustering technique, to the daily mean flow data of the Artemis station, Piracicaba, SP, Brazil, and the estimated return levels for periods of 5, 10, 50 and 100 years were compared. It is concluded that the interval estimates of the return levels obtained from the exponential fit are more precise than those obtained from the generalized Pareto fit. In the second article, a methodology is presented for fitting the Gumbel distribution and mixtures of two Gumbel distributions to the monthly maximum wind speed data of Piracicaba, SP. The best-fitting distribution was selected by parametric bootstrap hypothesis tests and the AIC and BIC selection criteria. It is concluded that the mixture of two Gumbel distributions best fits the maximum wind speeds of April and May, while the single Gumbel distribution gives the best fit for August and September.
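
A minimal sketch of the peaks-over-threshold workflow described in the first article, fitting a generalized Pareto distribution to threshold exceedances and converting the fit into return levels, is given below. The synthetic series, the threshold choice and the observations-per-year count are placeholders, not values from the thesis; the return-level formula is the standard GPD one, with the exponential special case recovered as the shape parameter tends to zero.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
flow = rng.gamma(2.0, 50.0, size=365 * 30)   # placeholder "daily flow" series

u = np.quantile(flow, 0.95)                  # threshold choice (assumption)
exceed = flow[flow > u] - u

# Fit the GPD to the exceedances; location fixed at 0 after subtracting u.
xi, _, sigma = stats.genpareto.fit(exceed, floc=0)

# m-year return level: x_m = u + (sigma/xi) * ((m * n_y * zeta_u)^xi - 1),
# where zeta_u is the exceedance probability and n_y observations per year.
zeta_u = exceed.size / flow.size
n_y = 365
for m in (5, 10, 50, 100):
    if abs(xi) > 1e-6:
        x_m = u + sigma / xi * ((m * n_y * zeta_u) ** xi - 1.0)
    else:                                    # xi -> 0 is the exponential case
        x_m = u + sigma * np.log(m * n_y * zeta_u)
    print(f"{m:>3}-year return level: {x_m:7.1f}")
```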
14

Adaptive Asymmetric Slot Allocation for Heterogeneous Traffic in WCDMA/TDD Systems

Park, JinSoo 29 November 2004 (has links)
Even though 3rd- and 4th-generation wireless systems aim to provide multimedia services at high speed, full-fledged multimedia services remain difficult to realize because the capacity of the systems is insufficient. Many technical challenges must be met before real multimedia services can be delivered; one of them is how to allocate resources to traffic efficiently as wireless systems evolve. The literature shows that strategic manipulation of traffic can lead to efficient use of resources in both wire-line and wireless networks. This draws our attention to the role of link-layer protocols, which is to orchestrate the transmission of packets efficiently using the given resources; the Media Access Control (MAC) layer therefore plays a very important role in this context. In this research, we investigate technical challenges involving resource control and management in the design of MAC protocols based on the characteristics of traffic, and provide strategies to address them. The first and foremost matter in wireless MAC protocol research is the choice of multiple access scheme, each of which has advantages and disadvantages. We choose Wideband Code Division Multiple Access/Time Division Duplexing (WCDMA/TDD) systems, since they are known to be efficient for bursty traffic. Most existing MAC protocols developed for WCDMA/TDD systems concentrate on the performance of a unidirectional link, in particular the uplink, assuming that the number of slots for each link is fixed a priori; this ignores the dynamic aspect of TDD systems. We believe that adaptive dynamic slot allocation can bring further benefits in terms of efficient resource management. This adaptive slot allocation issue has also been dealt with from a completely different angle: related research focuses on adaptive slot allocation to minimize inter-cell interference in multi-cell environments. We believe these two issues need to be handled together to enhance the performance of MAC protocols, and we therefore embark upon a study of adaptive dynamic slot allocation for the MAC protocol. This research starts from an examination of the key factors that affect the adaptive allocation strategy. Through the review of the literature, we conclude that traffic characterization is an essential component of efficient resource control and management, so we identify appropriate traffic characteristics and metrics: the volume and burstiness of traffic are chosen as the basis for our adaptive dynamic slot allocation. Based on this examination, we propose four adaptive dynamic slot allocation strategies: (i) a strategy based on the estimated burstiness of traffic, (ii) a strategy based on the estimated volume and burstiness of traffic, (iii) a strategy based on the parameter estimation of a traffic distribution, and (iv) a strategy based on the exploitation of physical-layer information. The first method estimates the burstiness on both links and assigns the number of slots for each link according to the ratio of these two estimates. The second method estimates both the burstiness and the volume of traffic on each link and assigns slots according to a ratio of weighted volumes, where the weights are driven by the estimated burstiness of each link.
For the estimation of burstiness, we propose a new burstiness measure based on the ratio between the peak and the median volume of traffic. This measure requires the choice of an observation window within which the median and the peak are measured; we propose a dynamic method for selecting the observation window that makes use of statistical characteristics of the traffic, namely the Autocorrelation Function (ACF) and the Partial ACF (PACF). For the third method, we develop several estimators of the parameters of a traffic distribution and suggest two new slot allocation methods based on the estimated parameters. The last method exploits physical-layer information as another means of allocating slots to enhance the performance of the system. The performance of the proposed strategies is evaluated in various scenarios, with the major simulations covering data traffic, combined voice and data traffic, and real trace data. The performance of each strategy is evaluated in terms of throughput and packet drop ratio; in addition, we consider the frequency of slot changes to assess performance in terms of control overhead. We expect this work to add to the state of knowledge in the field of link-layer protocol research for WCDMA/TDD systems. / Ph. D.
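
The burstiness measure proposed in the abstract, a peak-to-median volume ratio over an observation window, and strategy (i), splitting slots between the links in proportion to the two burstiness estimates, can be sketched directly from those definitions. The window length, the 15-slot frame and the synthetic traffic below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def burstiness(traffic, window):
    """Peak-to-median volume ratio over fixed observation windows,
    following the measure proposed in the abstract."""
    k = len(traffic) // window
    vols = traffic[: k * window].reshape(k, window).sum(axis=1)
    return vols.max() / np.median(vols)

def split_slots(uplink, downlink, total_slots=15, window=50):
    """Strategy (i): slots proportional to each link's estimated
    burstiness, keeping at least one slot per link."""
    b_up = burstiness(uplink, window)
    b_down = burstiness(downlink, window)
    up = round(total_slots * b_up / (b_up + b_down))
    up = max(1, min(total_slots - 1, up))
    return up, total_slots - up

# Illustrative traffic: heavy-tailed (bursty) uplink, smooth downlink.
up_traffic = rng.pareto(1.5, 5000)
down_traffic = rng.normal(10.0, 1.0, 5000).clip(min=0)
print(split_slots(up_traffic, down_traffic))   # uplink receives more slots
```

The thesis's dynamic ACF/PACF-based window selection would replace the fixed window argument used here.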
15

On the calibration of Lévy option pricing models / Izak Jacobus Henning Visagie

Visagie, Izak Jacobus Henning January 2015 (has links)
In this thesis we consider the calibration of models based on Lévy processes to option prices observed in some market. This means that we choose the parameters of the option pricing models such that the prices calculated using the models correspond as closely as possible to these option prices. We demonstrate the ability of relatively simple Lévy option pricing models to nearly perfectly replicate option prices observed in financial markets. We specifically consider calibrating option pricing models to barrier option prices and we demonstrate that the option prices obtained under one model can be very accurately replicated using another. Various types of calibration are considered in the thesis. We calibrate a wide range of Lévy option pricing models to option price data. We consider exponential Lévy models, under which the log-return process of the stock is assumed to follow a Lévy process. We also consider linear Lévy models; under these models the stock price itself follows a Lévy process. Further, we consider time-changed models, under which time does not pass at a constant rate but follows some non-decreasing Lévy process. We model the passage of time using the lognormal, Pareto and gamma processes. In the context of time-changed models we consider linear as well as exponential models. The normal inverse Gaussian (NIG) model plays an important role in the thesis. The numerical problems associated with the NIG distribution are explored and we propose ways of circumventing these problems. Parameter estimation for this distribution is discussed in detail. Changes of measure play a central role in option pricing. We discuss two well-known changes of measure: the Esscher transform and the mean-correcting martingale measure. We also propose a generalisation of the latter and consider the use of the resulting measure in the calculation of arbitrage-free option prices under exponential Lévy models. / PhD (Risk Analysis), North-West University, Potchefstroom Campus, 2015
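
One self-contained piece of this abstract, parameter estimation for the normal inverse Gaussian distribution, can be illustrated with SciPy's built-in NIG law. This is only a maximum-likelihood fit to synthetic log-returns; it does not reproduce the thesis's calibration to option prices or its treatment of the NIG's numerical problems, and all parameter values are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic daily log-returns from a "true" NIG law (parameters invented).
true = dict(a=2.0, b=-0.3, loc=0.0005, scale=0.01)
returns = stats.norminvgauss.rvs(**true, size=5000, random_state=rng)

# Maximum-likelihood fit; SciPy parametrizes the NIG by tail heaviness a
# and asymmetry b (with |b| < a), plus location and scale.
a, b, loc, scale = stats.norminvgauss.fit(returns)
print(f"fitted: a={a:.2f}, b={b:.2f}, loc={loc:.5f}, scale={scale:.4f}")

# Compare log-likelihoods of the fitted and the generating parameters.
ll_fit = stats.norminvgauss.logpdf(returns, a, b, loc, scale).sum()
ll_true = stats.norminvgauss.logpdf(returns, **true).sum()
print(f"log-likelihood: fitted {ll_fit:.1f} vs. true {ll_true:.1f}")
```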
16

Estimation of Pareto distribution functions from samples contaminated by measurement errors

Lwando Orbet Kondlo January 2010 (has links)
The intention is to draw more specific connections between certain deconvolution methods and also to demonstrate the application of the statistical theory of estimation in the presence of measurement error. A parametric methodology for deconvolution when the underlying distribution is of the Pareto form is developed. Maximum likelihood estimation (MLE) of the parameters of the convolved distributions is considered. Standard errors of the estimated parameters are calculated from the inverse Fisher's information matrix and a jackknife method. Probability-probability (P-P) plots and Kolmogorov-Smirnov (K-S) goodness-of-fit tests are used to evaluate the fit of the posited distribution. A bootstrapping method is used to calculate the critical values of the K-S test statistic, which are not available.
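
A stripped-down version of the estimation problem, maximum likelihood for a Pareto sample observed through additive measurement error, is sketched below. The Gaussian error model with known standard deviation, the known scale parameter and all numerical values are assumptions made for illustration; the thesis's actual error model and estimators may differ.

```python
import numpy as np
from scipy import integrate, optimize, stats

rng = np.random.default_rng(4)

XM, SIGMA = 1.0, 0.3          # known Pareto scale and error s.d. (assumed)
ALPHA_TRUE = 2.5

# Observed sample: Pareto variable plus additive Gaussian measurement error.
x = XM * (1.0 + rng.pareto(ALPHA_TRUE, 200))
w = x + rng.normal(0.0, SIGMA, x.size)

def f_w(wi, alpha):
    """Density of W = X + U: numerical convolution of the Pareto density
    with the Gaussian error density."""
    integrand = lambda t: (alpha * XM**alpha / t**(alpha + 1.0)
                           * stats.norm.pdf(wi - t, scale=SIGMA))
    val, _ = integrate.quad(integrand, XM, np.inf)
    return val

def nll(alpha):
    """Negative log-likelihood of the contaminated sample."""
    return -sum(np.log(max(f_w(wi, alpha), 1e-300)) for wi in w)

res = optimize.minimize_scalar(nll, bounds=(0.5, 10.0), method="bounded")
print(f"deconvolution MLE of alpha: {res.x:.2f} (true {ALPHA_TRUE})")
```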
17

Geometric Max Stability of Pareto Random Variables

Juozulynaitė, Gintarė 30 August 2010 (has links)
In this work I analyse the geometric max stability of univariate and bivariate Pareto random variables. I prove that the univariate Pareto distribution is geometrically max stable when alpha = 1, but not when alpha ≠ 1. Using the criterion of geometric max stability for bivariate Pareto random variables, I prove that the bivariate Pareto distribution function is not geometrically max stable when the vector components are independent (both for alpha = 1, beta = 1 and for alpha ≠ 1, beta ≠ 1), and likewise not when the components are dependent (for the same parameter values). The study of bivariate Pareto distributions produced an unexpected result: the bivariate Pareto distribution function is not geometrically max stable when alpha = 1, beta = 1, even though its univariate marginal Pareto distribution functions are geometrically max stable for those parameter values.
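
The claimed alpha = 1 case can be checked in closed form if one assumes the Pareto (Lomax) parametrization F(x) = x/(1+x) for x >= 0; the abstract does not give the exact form used, so this is an assumption. For M the maximum of a Geometric(p) number of i.i.d. copies of X:

```latex
% Maximum M of N i.i.d. copies of X ~ F, with N ~ Geometric(p):
P(M \le x) = \sum_{n=1}^{\infty} p (1-p)^{n-1} F(x)^n
           = \frac{p\,F(x)}{1 - (1-p)\,F(x)} .
% With F(x) = x/(1+x) for x \ge 0 and the normalization x \mapsto x/p:
P(pM \le x) = \frac{p\,F(x/p)}{1 - (1-p)\,F(x/p)}
            = \frac{px/(p+x)}{1 - (1-p)\,x/(p+x)}
            = \frac{px}{p + px} = \frac{x}{1+x} = F(x) .
```

Thus pM has the same distribution as each X_i, which is precisely geometric max stability for this alpha = 1 form.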
18

Zipf's law and the effects of a free trade agreement: the case of Guatemala

Orellana Aragón, Jorge Alberto January 2009 (has links)
Over the last 50 years, Central America has hosted one of the oldest processes of economic and regional integration in the Americas. Intra-regional trade increased significantly and became more dynamic after the formation of the Central American Common Market (CACM) in 1960, as well as through bilateral, regional and multilateral free trade agreements. These agreements open a new perspective for studying the effects of international trade, that of the New Economic Geography (NEG), which seeks to explain how the evolution of the city-size distribution can be represented by a Pareto distribution, from which the empirical regularity known as Zipf's law is derived. This law offers an explanation of how agglomeration forces interact in urban centers, favoring economic activity and international trade in general. This dissertation investigates how changes in trade policy affected the rank-size order of cities and thereby influenced the economic growth of Guatemala. To this end, the Pareto coefficient was estimated for the period 1921-2002 and, extending the original proposal, two non-linearities were introduced into the distribution, together with a supporting measure of the degree of urban concentration, the Hirschman-Herfindahl index. In addition, a model of variation rates was used to measure the impact of trade openness on the resulting economic growth over the period 1960-2002. It should be emphasized that changes in the sample size can lead to different interpretations. The results point to a slight increase in inequality and divergence, even though the index of urban concentration shows a gradual fall from 1964, during the CACM period, until 2002. For the period 1973-2002, Gibrat's law is verified, indicating that the growth of cities is independent of their size. The hypothesis that urban concentration is inversely related to trade openness, and positively correlated with economic growth over 1921-1964, is also confirmed. These results suggest the future path of urban growth: the largest cities reduce their growth while medium and small cities grow at a faster pace than the large centers, driven by the growth of international trade.
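
The two quantitative tools named in the abstract, the Pareto (Zipf) coefficient from a rank-size regression and the Hirschman-Herfindahl index of urban concentration, are simple enough to sketch. The city populations below are invented, and the rank shift of 1/2 is the standard Gabaix-Ibragimov small-sample correction, not necessarily the estimator used in the dissertation.

```python
import numpy as np

# Invented city populations (thousands), sorted largest first.
pop = np.sort(np.array([950, 420, 250, 180, 150, 120, 95, 80,
                        70, 60, 55, 48, 41, 36, 30], float))[::-1]

# Rank-size (Zipf) regression: ln(rank - 1/2) = c - q * ln(size).
# Zipf's law corresponds to q close to 1; the 1/2 shift reduces
# small-sample bias in the OLS estimate of q.
rank = np.arange(1, pop.size + 1)
slope, _ = np.polyfit(np.log(pop), np.log(rank - 0.5), 1)
print(f"Pareto/Zipf coefficient: {-slope:.2f}")

# Hirschman-Herfindahl index of urban concentration: sum of squared
# population shares, from 1/n (evenly spread) to 1 (one primate city).
shares = pop / pop.sum()
print(f"HHI: {np.square(shares).sum():.3f}")
```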
