1 |
Telecommunications reform and efficiency performance: do good institutions matter? (Mohamad, Noorihsan, January 2010)
Until recently, most studies investigating the performance of telecommunications reforms failed to incorporate institutions into the empirical analysis. This study highlights the importance of institutional governance for telecommunications efficiency and provides empirical results on the impact of institutions on reform outcomes. It provides significant evidence that the institutional environment in which reform takes place is an important determinant of successful reform. This study uses a stochastic distance function approach to capture the role of institutions in explaining efficiency differences across 70 countries. The empirical analysis reveals that policy stability, in the form of substantive checks and balances on executive power, is the most important condition for successful reform. Independently, legal integrity improves telecommunications efficiency through privatization, while greater freedom from corruption influences the effectiveness of a regulatory body.
|
2 |
Properties of Distance Functions and Minisum Location Models (Brimberg, Jack, 03 1900)
This study is divided into two main parts. The first section deals with mathematical properties of distance functions. The ℓp norm is analyzed as a function of its parameter p, leading to useful insights for fitting this distance measure to a transportation network. Properties of round norms are derived, which allow us later to generalize some well-known results. The properties of a norm raised to a power are also investigated, and these prove useful in our subsequent analysis of location problems with economies or diseconomies of scale. A positive linear combination of the Euclidean and rectangular distance measures, which we term the weighted one-two norm, is introduced. This distance function provides a linear regression model with interesting implications for the characterization of transportation networks. A directional bias function is defined, and examined in detail for the ℓp and weighted one-two norms.
In the second part of this study, several properties are derived for various forms of the continuous minisum location model. The Weiszfeld iterative solution procedure for the standard Weber problem with ℓp distances is also examined, and global and local convergence results are obtained. These results are extended to the mixed-norm problem. In addition, optimality criteria are derived at non-differentiable points of the objective function. / Thesis / Doctor of Philosophy (PhD)
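As an illustration of the iterative procedure examined above, the following sketch implements the classical Weiszfeld update for the Weber problem with Euclidean (ℓ2) distances, the best-known special case. The demand points and weights are invented, and the simple restart when an iterate lands on a demand point stands in for the more careful optimality criteria at non-differentiable points derived in the thesis.

```python
import numpy as np

def weiszfeld(points, weights, x0=None, tol=1e-8, max_iter=1000):
    """Classical Weiszfeld iteration for the Weber (minisum) problem with
    Euclidean distances: minimize sum_i w_i * ||x - a_i||_2."""
    points = np.asarray(points, dtype=float)
    weights = np.asarray(weights, dtype=float)
    x = points.mean(axis=0) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d = np.linalg.norm(points - x, axis=1)
        if np.any(d < 1e-12):            # iterate coincides with a demand point
            return points[np.argmin(d)]  # simplification of the non-differentiable case
        coef = weights / d
        x_new = (coef[:, None] * points).sum(axis=0) / coef.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Invented demand points and weights
demand = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]
print(weiszfeld(demand, weights=[1.0, 1.0, 2.0]))
```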
|
3 |
An investigation of assignment rules for fitting new subjects into clusters established by hierarchical pattern analysis (Frary, Jewel McDow, 02 March 2010)
Cluster analysis has been used fairly extensively as a means of grouping objects or subjects on the basis of their similarity over a number of variables. Almost all of the work to this point has been for the purpose of classifying an extant collection of similar objects into clusters or types. However, there often arises a need for methods of identifying additional objects as members of clusters that have already been established. Discriminant function analysis has been used for this purpose even though its underlying assumptions often cannot be met.
This study explored a different approach to the problem, namely, the use of distance functions as a means of identifying subjects as members of types which had been established by hierarchical pattern analysis. A sample of subjects was drawn randomly from a population; these subjects were assigned to the types that appeared in other samples that were drawn from the same population. Each type was defined by the vector of mean scores on selected variables for the subjects in that cluster. A new subject was identified as a member of a type if the distance function described by the assignment rule was a minimum for that type. Various criteria were established for judging the adequacy of the assignments.
Five distance functions were identified as potential ways of assigning new subjects to types. Recommendations were not made for immediate practical application. However, the results were generally positive, and successful applications should be possible with the suggested methodological refinements. / Ph. D.
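A minimal sketch of the kind of minimum-distance assignment rule described above: each established type is represented by its vector of mean scores, and a new subject is assigned to the type for which the distance is smallest. Plain Euclidean distance is used as a stand-in; the five distance functions actually studied in the dissertation are not reproduced here, and the centroids below are invented.

```python
import numpy as np

def assign_to_type(subject, type_centroids):
    """Assign a new subject to the type whose mean-score vector is closest
    (Euclidean distance as a stand-in for the studied distance functions)."""
    subject = np.asarray(subject, dtype=float)
    dists = {name: float(np.linalg.norm(subject - np.asarray(c, dtype=float)))
             for name, c in type_centroids.items()}
    return min(dists, key=dists.get), dists

# Hypothetical type centroids from a previous hierarchical pattern analysis
centroids = {"type_A": [1.0, 2.0, 0.5], "type_B": [3.0, 0.0, 1.5]}
label, distances = assign_to_type([2.5, 0.5, 1.0], centroids)
print(label, distances)
```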
|
4 |
Hardware-Accelerated Ray Tracing of Implicit Surfaces: A study of real-time editing and rendering of implicit surfaces (Hansson Söderlund, Herman, January 2021)
Background. Rasterization of triangle geometry has been the dominating rendering technique in the real-time rendering industry for many years. However, triangles are not always easy to work with for content creators. With the introduction of hardware-accelerated ray tracing, rasterization-based lighting techniques have been steadily replaced by ray tracing techniques. This shift may signify the opportunity of exploring other, more easily manipulated, geometry-type alternatives compared to triangle geometry. One such geometry type is implicit surfaces. Objectives. This thesis investigates the rendering speed, editing speed, and image quality of different implicit surface rendering techniques using a state-of-the-art, hardware-accelerated, path tracing implementation. Furthermore, it investigates how implicit surfaces may be edited in real time and how editing affects rendering. Methods. A baseline direct sphere tracing algorithm is implemented to render implicit surfaces. Additionally, dense and narrow band discretization algorithms that sphere trace a discretization of the implicit surface are implemented. For each technique, two variations that provide potential benefits in rendering speed are also tested. Additionally, a real-time implicit surface editor that can utilize all the mentioned rendering techniques is created. Rendering speed, editing speed, and image quality metrics are captured for all techniques using different scenes created with the editor and an existing hardware-accelerated path tracing solution. Image quality differences are measured using mean squared error and the image difference evaluator FLIP. Results. Direct sphere tracing achieves the best image quality results but has the slowest rendering speed. Dense discretization achieves the best rendering speed in most tests and achieves better image quality results compared to narrow band discretization. Narrow band discretization achieves significantly better editing speed than both direct sphere tracing and dense discretization. All variations of each algorithm achieve better or equal rendering and editing speed compared to their standard implementation. All algorithms achieve real-time rendering and editing performance. However, only discretized methods display real-time rendering performance for all scenes, and only narrow band discretization displays real-time editing performance for a larger number of primitives. Conclusions. Implicit surfaces can be rendered and edited in real time while using a state-of-the-art, hardware-accelerated, path tracing algorithm. Direct sphere tracing degrades in performance when the implicit surface has an increased number of primitives, whereas discretization techniques perform independently of this. Furthermore, narrow band discretization is fast enough so that editing can be performed in real time even for implicit surfaces with a large number of primitives, which is not the case for direct sphere tracing or dense discretization.
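The direct sphere tracing baseline described above can be summarized in a few lines: the ray is advanced by the value of the signed distance function (SDF) at the current point, which is safe because that value bounds the distance to the nearest surface. Below is a minimal CPU-side sketch with a single sphere primitive standing in for an edited implicit surface; the thesis's hardware-accelerated path tracing pipeline and its discretized variants are not reproduced.

```python
import numpy as np

def sdf_sphere(p, center, radius):
    """Signed distance from point p to a sphere; negative inside the surface."""
    return np.linalg.norm(p - center) - radius

def sphere_trace(origin, direction, sdf, max_steps=128, eps=1e-4, max_dist=100.0):
    """March along the ray by the SDF value until a hit (distance < eps) or escape."""
    direction = direction / np.linalg.norm(direction)
    t = 0.0
    for _ in range(max_steps):
        d = sdf(origin + t * direction)
        if d < eps:
            return t          # hit: parametric distance along the ray
        t += d                # the SDF value is a safe step size
        if t > max_dist:
            break
    return None               # miss

# Invented scene: one sphere primitive five units in front of the origin
scene = lambda p: sdf_sphere(p, center=np.array([0.0, 0.0, 5.0]), radius=1.0)
print(sphere_trace(np.zeros(3), np.array([0.0, 0.0, 1.0]), scene))  # about 4.0
```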
|
5 |
Distance Functions and Image Processing on Point-Lattices: with focus on the 3D face- and body-centered cubic grids (Strand, Robin, January 2008)
There are many imaging techniques that generate three-dimensional volume images today. With higher precision in the image acquisition equipment, storing and processing these images require an increasing amount of data processing capacity. Traditionally, three-dimensional images are represented by cubic (or cuboid) picture elements on a cubic grid. The two-dimensional hexagonal grid has some advantages over the traditionally used square grid. For example, fewer samples are needed to get the same reconstruction quality, it is less rotationally dependent, and each picture element has only one type of neighbor, which simplifies many algorithms. The corresponding three-dimensional grids are the face-centered cubic (fcc) grid and the body-centered cubic (bcc) grid. In this thesis, image representations using non-standard grids are examined. The focus is on the fcc and bcc grids and tools for processing images on these grids, but distance functions and related algorithms (distance transforms and various representations of objects) are defined in a general framework allowing any point-lattice in any dimension. Formulas for point-to-point distance and conditions for metricity are given in the general case, and parameter optimization is presented for the fcc and bcc grids. Some image acquisition and visualization techniques for the fcc and bcc grids are also presented. More theoretical results define distance functions for grids of arbitrary dimensions. Fewer samples are needed to represent images on non-standard grids. Thus, the huge amount of data generated by, for example, computerized tomography can be reduced by representing the images on non-standard grids such as the fcc or bcc grids. The thesis provides a tool-box that can be used to acquire, process, and visualize images on high-dimensional, non-standard grids.
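For readers unfamiliar with the two lattices, this sketch uses their standard integer characterizations (fcc: points of Z^3 with even coordinate sum; bcc: points of Z^3 whose coordinates are all even or all odd) and lists the nearest-neighbor offsets on each grid. These are textbook facts used to set the stage; the distance transforms and parameter optimization of the thesis are not reproduced.

```python
import itertools

def fcc_points(n):
    """Integer points with even coordinate sum: the face-centered cubic grid."""
    return [p for p in itertools.product(range(n), repeat=3) if sum(p) % 2 == 0]

def bcc_points(n):
    """Integer points whose coordinates are all even or all odd: the body-centered cubic grid."""
    return [p for p in itertools.product(range(n), repeat=3)
            if all(c % 2 == p[0] % 2 for c in p)]

# Nearest-neighbor offsets: every fcc point has 12 neighbors at distance sqrt(2),
# every bcc point has 8 neighbors at distance sqrt(3).
fcc_neighbors = [d for d in itertools.product((-1, 0, 1), repeat=3)
                 if sum(abs(c) for c in d) == 2]
bcc_neighbors = list(itertools.product((-1, 1), repeat=3))

print(len(fcc_points(4)), len(bcc_points(4)))    # 32 16
print(len(fcc_neighbors), len(bcc_neighbors))    # 12 8
```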
|
6 |
Efficiency and Ratio Analysis in Assessing Firms' Corporate Performance. A Closer Look to the Case of Romania (Filimon, Nela, 09 July 2004)
The conceptual framework of this dissertation is defined by the use of Data Envelopment Analysis (DEA) tools for assessing corporate performance of firms and industrial sectors from countries acting long ago under the laws of the market mechanism, and from transition economies. DEA relates to the economic notion of a production function and an efficiency frontier in a non-parametric setting. The analysis performed here focuses on the treatment of issues related to: the performance of the firm, analysing the main contributing factors in the output growth rate; an assessment of the effects of economies of scale; benchmarking of a firm's profit performance; and an assessment of the capacity utilisation degree, only to mention some of the applications to be found in this dissertation. From a methodological point of view, as we go over the chapters, we switch from a less restrictive framework of analysis, i.e. technical frontiers, to gradually more restrictive settings, that is, cost and profit frontiers. We work in turn with distance functions and directional distance functions, VRS and CRS, and with both input and/or output orientations. Chapter 1 is dedicated to the database, which consists of 1379 firms grouped in six industrial sectors from the manufacturing industry: textile weaving; other textiles; pulp, paper and paperboard; basic chemicals; rubber products, and plastic products. We work with seven European countries, five of them belonging to the advanced market economies - Belgium, France, Italy, Netherlands, and Spain - and two from transition economies, Bulgaria and Romania respectively. The database consists of accounting information, end-year observations, and covers a time period of four years, from 1995 to 1998. In Chapter 2, the objective is to quantify the main contributing factors in explaining the growth in output, and hence firms' performance in productivity. The traditional literature on this topic gives as main explanatory factors for the observed changes in productivity: the technical efficiency change, technical change, and the increase in inputs' usage. The novelty about the non-parametric methodology (DEA) we use in Chapter 2 comes first, from the fact that it allows us to measure technical change using three different settings: (a) with final year data; (b) with initial year data; and (c) averaging the results previously obtained in (a) and (b). Second, we capture the scale effect (usually isolated from the technical efficiency) from the decomposition of the input usage factor. In Chapter 3, we take up the issue of assessing firms' performance from the perspective of cost efficiency analysis, maintaining the non-parametric framework. The objective of the chapter is to present a method for estimating the inefficiency due to the existence of fixed input factors in the production process.
The difficulty of adjusting them in the short run could generate variations in the degree of utilisation of the productive capacity. In Chapter 4, our profit efficiency measure is constructed based on the directional distance function concept rather than the usual distance function commonly used in most applications. We define our profit efficiency measure as the normalised deviation between maximal and observed profits and we call it the Nerlovian Profit Efficiency (NPE). The normalisation is given by the value of the direction of the input and output variables. We maximise profits also considering the impact of additional constraints on debts, interest paid and fixed assets, and link with the literature on soft/hard budget constraints. Overall, our findings show that the transition economies perform well behind the advanced market economies, in all countries firms exhibit cost inefficiency due to adjustment problems, and the budget constraints are binding especially for the transition countries.
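To make the DEA machinery concrete, here is a minimal sketch of the textbook output-oriented, constant-returns-to-scale envelopment program for a single firm, solved with scipy.optimize.linprog. The data are invented, and this plain Farrell-type formulation stands in for, rather than reproduces, the directional distance and profit-frontier specifications estimated in the dissertation.

```python
import numpy as np
from scipy.optimize import linprog

def output_efficiency(X, Y, o):
    """Output-oriented CRS DEA score for firm o (columns of X, Y are firms).
    Solves: max phi  s.t.  X @ lam <= x_o,  Y @ lam >= phi * y_o,  lam >= 0.
    Returns phi >= 1; phi == 1 means firm o lies on the efficient frontier."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[-1.0, np.zeros(n)]                  # minimize -phi, i.e. maximize phi
    A_in = np.hstack([np.zeros((m, 1)), X])       # input constraints: X @ lam <= x_o
    A_out = np.hstack([Y[:, [o]], -Y])            # output constraints: phi*y_o - Y @ lam <= 0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[X[:, o], np.zeros(s)],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

# Invented data: 2 inputs, 1 output, 4 firms
X = np.array([[2.0, 4.0, 6.0, 3.0],
              [3.0, 2.0, 6.0, 5.0]])
Y = np.array([[1.0, 1.0, 2.0, 1.0]])
print([round(output_efficiency(X, Y, o), 3) for o in range(4)])
```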
|
7 |
Distance Functions and Their Use in Adaptive Mathematical Morphology (Ćurić, Vladimir, January 2014)
One of the main problems in image analysis is the comparison of different shapes in images. It is often desirable to determine the extent to which one shape differs from another. This is usually a difficult task because shapes vary in size, length, contrast, texture, orientation, etc. Shapes can be described using sets of points, crisp or fuzzy. Hence, distance functions between sets have been used for comparing different shapes. Mathematical morphology is a non-linear theory related to the shape or morphology of features in the image, and morphological operators are defined by the interaction between an image and a small set called a structuring element. Although morphological operators have been extensively used to differentiate shapes by their size, it is not an easy task to differentiate shapes with respect to other features such as contrast or orientation. One approach for differentiation on these types of features is to use data-dependent structuring elements. In this thesis, we investigate the usefulness of various distance functions for: (i) shape registration and recognition; and (ii) construction of adaptive structuring elements and functions. We examine existing distance functions between sets, and propose a new one, called the Complement weighted sum of minimal distances, where the contribution of each point to the distance function is determined by the position of the point within the set. The usefulness of the new distance function is shown for different image registration and shape recognition problems. Furthermore, we extend the new distance function to fuzzy sets and show its applicability to classification of fuzzy objects. We propose two different types of adaptive structuring elements from the salience map of the edge strength: (i) the shape of a structuring element is predefined, and its size is determined from the salience map; (ii) the shape and size of a structuring element are dependent on the salience map. Using this salience map, we also define adaptive structuring functions. We also present the applicability of adaptive mathematical morphology to image regularization. The connection between adaptive mathematical morphology and Lasry-Lions regularization of non-smooth functions provides an elegant tool for image regularization.
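The unweighted set distance that the proposed Complement weighted sum of minimal distances builds on can be sketched as follows: each point contributes its distance to the closest point of the other set, and the two directed averages are symmetrized. The position-dependent weighting that gives the new distance its name is deliberately omitted here, and the two point sets are invented.

```python
import numpy as np
from scipy.spatial import cKDTree

def sum_of_minimal_distances(A, B):
    """Symmetric, averaged sum of minimal distances between two point sets.
    Each point contributes its Euclidean distance to the nearest point of the
    other set; the thesis additionally weights each contribution by the
    point's position within its set, which is omitted in this sketch."""
    A, B = np.asarray(A, dtype=float), np.asarray(B, dtype=float)
    d_ab, _ = cKDTree(B).query(A)    # nearest neighbour in B for each point of A
    d_ba, _ = cKDTree(A).query(B)    # nearest neighbour in A for each point of B
    return 0.5 * (d_ab.mean() + d_ba.mean())

# Two small invented shapes given as point sets
square = [(0, 0), (0, 1), (1, 0), (1, 1)]
shifted = [(0.2, 0), (0.2, 1), (1.2, 0), (1.2, 1)]
print(sum_of_minimal_distances(square, shifted))   # 0.2
```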
|
8 |
Incluindo funções de distância e extratores de características para suporte a consultas por similaridade / Including distance functions and features extractors to support similarity queries (Bêdo, Marcos Vinícius Naves, 20 September 2013)
Database Management Systems (DBMS) can deal with large amounts of data. The queries on those systems obey the total order relation (TOR), the domain where simple data such as numbers or strings are defined. In the case of complex data (e.g., medical images, audio or financial time-series), which do not obey the TOR properties, a new approach is needed that can retrieve complex data by content in reasonable time and with proper semantics. To do so, the literature presents similarity queries as a consolidated paradigm. This paradigm is the base of many computer-aided applications (e.g., Content-Based Medical Image Retrieval (CBMIR) and Content-Based Audio Retrieval (CBAR)) and includes several research areas such as feature extraction, distance functions and metric access methods (MAM). Developing new feature extractor methods and new distance functions (and combining them) is crucial to reduce the semantic gap between content-based applications and users. The MAM are responsible for providing fast and scalable answers to the systems. Integrating all those functionalities into one framework that can support similarity queries inside a DBMS remains a huge challenge. The main objective of this work is to extend the initial resources of the SIREN system, inserting new feature extractor methods and distance functions for medical images, audio and financial time-series, turning it into a framework. All components may be used through extended Structured Query Language (SQL) commands, which can be used directly by computer-aided applications.
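As an illustration of the feature extractor plus distance function pattern that the framework exposes (the extended SQL interface of SIREN itself is not reproduced here), the following sketch extracts a normalized gray-level histogram as a feature vector and answers a k-nearest-neighbour similarity query with a Euclidean distance function; the image arrays and the tiny in-memory "database" are invented.

```python
import numpy as np

def histogram_extractor(image, bins=16):
    """Feature extractor: normalized gray-level histogram of a 2-D image array."""
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    return hist / hist.sum()

def euclidean(u, v):
    """Distance function used to compare feature vectors."""
    return float(np.linalg.norm(np.asarray(u) - np.asarray(v)))

def knn_query(query_features, database, k=3, dist=euclidean):
    """k-nearest-neighbour similarity query over (id, feature-vector) pairs."""
    return sorted(database, key=lambda item: dist(query_features, item[1]))[:k]

# Invented gray-level 'images' standing in for the stored complex data
rng = np.random.default_rng(0)
db = [(i, histogram_extractor(rng.integers(0, 256, size=(64, 64)))) for i in range(10)]
query = histogram_extractor(rng.integers(0, 256, size=(64, 64)))
print([image_id for image_id, _ in knn_query(query, db, k=3)])
```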
|
9 |
A Multiple Criteria Sorting Approach Based On Distance Functions (Celik, Bilge, 01 July 2011)
Sorting is the problem of assigning alternatives into predefined ordinal classes according to multiple criteria. A new distance function based solution approach is developed for sorting problems in this study. The distance to the ideal point is used as the criteria disaggregation function to determine the values of alternatives. These values are used to sort them into the predefined classes. The distance function is expressed in a general distance norm. The criteria disaggregation function is determined according to the sample preference set provided by the decision maker. Two mathematical models are used in order to determine the optimal values and assign classes. The method also proposes an approach for handling alternative optimal solutions, which are widely seen in sorting problems. Probabilities of belonging to each class for an alternative are calculated using the alternative optimal solutions and provided as the outputs of the model. The decision maker assigns the alternatives into classes according to these probabilities. The method is applied to five data sets and results are provided for different performance measures. Different distance norms are tried for each data set and their performances are evaluated. The probabilistic approach is also applied to UTADIS. The performance of the distance-based model and the modified UTADIS is compared with previous sorting methods such as UTADIS and classification trees. The developed method has new aspects, such as using distances to the ideal point for sorting purposes and providing probabilities of belonging to classes. The handling of alternative optimal solutions within the method, instead of in a post-optimality analysis, is another new and critical aspect of the study.
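A minimal sketch of the core idea, distance to the ideal point as the value used for sorting: criteria are normalized, the ideal point is taken coordinate-wise as the best attainable level, each alternative is scored by a weighted distance (here in a general p-norm) to that point, and score thresholds assign the ordinal classes. The weights, thresholds, and norm in practice come from the mathematical programs fitted to the decision maker's preference set, which is not reproduced here; everything below is invented for illustration.

```python
import numpy as np

def ideal_point_scores(X, weights, p=2):
    """Score alternatives by a weighted p-norm distance to the ideal point.
    X: (n_alternatives, n_criteria); all criteria assumed larger-is-better."""
    X = np.asarray(X, dtype=float)
    lo, hi = X.min(axis=0), X.max(axis=0)
    Z = (X - lo) / (hi - lo)                 # normalize each criterion to [0, 1]
    ideal = Z.max(axis=0)                    # coordinate-wise best attainable levels
    diffs = weights * np.abs(ideal - Z)
    return np.power(np.power(diffs, p).sum(axis=1), 1.0 / p)

def sort_into_classes(scores, thresholds):
    """Smaller distance to the ideal point is better; class 0 is the best class."""
    return np.digitize(scores, thresholds)

# Invented data: 5 alternatives, 3 criteria, equal weights, 3 ordinal classes
X = [[7, 2, 9], [5, 5, 5], [9, 1, 4], [2, 8, 6], [1, 1, 1]]
scores = ideal_point_scores(X, weights=np.ones(3) / 3, p=2)
print(scores.round(3), sort_into_classes(scores, thresholds=[0.25, 0.5]))
```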
|