391 |
“A Show about Language”: A Linguistic Investigation of the Creation of Humor in Seinfeld. King, Lindsey N. 01 May 2017
This study investigates the creation of humor in the dialogue of the television sitcom Seinfeld to gain a deeper understanding of humor techniques in a long format. Analysis of six episodes of the series shows that the Incongruity Theory of Humor, violations of Grice’s maxims of the Cooperative Principle, and perspective clashes (such as miscommunications) are essential to the humor throughout each episode.
|
392 |
Siguiendo Las Huellas De La Chola En Bolivia: Levantamiento De Una Cartografía Cultural Alteña. January 2019
abstract: The emergence of the chola alteña in Bolivia as a woman who, after being historically discriminated against, has achieved empowerment through her practices of resistance and agency is a very particular and recent phenomenon that has scarcely been studied. The contribution of this research is, in principle, to describe and uncover the complexity of this phenomenon, and at the same time to open a field for understanding the work of the chola as a preliminary input for alternative feminisms, in accordance with the particularity of each context. As a result, an eclectic perspective drawing on different non-canonical theories stemming from the Americas has been adopted. For example, intersectionality stemming from various social, cultural, racial, and gender contexts is addressed by Kimberlé Crenshaw, Dora Inés Munévar, Ann Phoenix, Breny Mendoza, and Sonia Montecinos. Research from Aníbal Quijano, Walter Mignolo and María Lugones proposes the decolonization of knowledge. From a Bolivian perspective, the study draws on the proposal of communitarian feminism by Julieta Paredes and the ch’ixi approach of Silvia Rivera Cusicanqui. At the same time, documentation of the chola’s practices has been obtained from non-conventional digital and oral sources. Thus, this research becomes a point of reference for future feminist research about the chola, but also for understanding other movements and practices of subaltern and discriminated women in similar or different contexts.
The chola is characterized by her distinctive garment, which was imposed by the colonizer in the eighteenth century, nullifying her indigenous identity. However, this woman has continued to wear it to the present day, as much a tactic of resistance as one of empowerment and agency, and has transformed it into a current fashion for the valorization of her identity. She is a ch’ixi subject who complements or antagonizes opposites without subsuming them. Finally, what guides her practices and strategies are her native cultural values, such as the principle of Living Well, cooperation, reciprocity, and godparenthood.
Dissertation/Thesis / Doctoral Dissertation, Spanish, 2019
|
393 |
COMBINATORIAL SCREENING APPROACH IN DEVELOPING NON-EQUIATOMIC HIGH ENTROPY ALLOYS. Akbari, Azin. 01 January 2018
High entropy alloys (HEAs) are a relatively new group of alloys first introduced in 2004. They usually contain 5 to 6 different principal elements, each comprising 5-35 at.% of the chemical composition of the alloy. There is growing interest in the research community in the development of these alloys as well as their engineering applications. Some HEAs have attractive properties that make them well suited for high-temperature applications, particularly refractory uses, while others have been shown to maintain their mechanical properties even at cryogenic temperatures.
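As an editorial aside (a standard definition from the HEA literature, not a result reported in this abstract), the "high entropy" label refers to the ideal configurational entropy of mixing,

\Delta S_{conf} = -R \sum_{i=1}^{n} c_i \ln c_i ,

where c_i is the atomic fraction of element i and R is the gas constant. For a fixed number of elements n, this quantity is maximized by the equiatomic composition c_i = 1/n, giving \Delta S_{conf} = R \ln n (about 1.61R for five elements), which is the usual rationale for the equiatomic compositions discussed next.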
Initially, HEA research focused on developing alloys with equiatomic compositions, as it was believed that single-phase HEAs would only form at such composition ratios. However, further research has found multiple HEAs with non-equiatomic chemical compositions. A major question that needs to be answered at this point is how to identify these non-equiatomic single-phase alloy systems. Unlike conventional alloys, HEAs do not have a base element acting as a solvent, which complicates the identification of new alloy systems via conventional development techniques. To find potential HEAs, alloy development efforts of both exploratory and computational nature are being pursued within the community. Even though multiple HEAs have been successfully identified and fabricated by these techniques, in most cases they require extensive experimental data and are relatively time consuming and expensive. This study proposes a thin-film combinatorial approach as a more efficient experimental method for developing new HEA alloy systems.
In order to study HEA systems with different crystal structures, nominal HEA compositions were selected: CoFeMnNiCu to achieve a face-centered cubic (FCC) HEA, OsRuWMoRe to obtain a hexagonal close-packed (HCP) structure, and VNbMoTaW in an attempt to form a body-centered cubic (BCC) crystal structure. Thin-film samples were fabricated by simultaneous magnetron sputtering of the elements onto silicon wafer substrates. The arrangement of the sputtering targets yielded a chemical composition gradient in the films, which ultimately resulted in the formation of various phases. Some of these phases exhibited the desired single-phase HEA structure, albeit with non-equiatomic chemical compositions. Bulk samples of the identified HEA compositions were prepared by arc melting mixtures of the metals. The microstructures of both thin-film and bulk samples were characterized via scanning electron microscopy (SEM), focused ion beam (FIB), and energy dispersive X-ray spectroscopy (EDX). The crystal structures of the samples were studied by X-ray diffraction (XRD) and electron backscatter diffraction (EBSD). Nanoindentation was also used to screen the mechanical properties of some of the samples across the composition gradient.
By applying this combinatorial thin-film approach, single-phase FCC, HCP and BCC HEAs were detected and successfully produced in bulk form. Additionally, screening the properties of the compositionally graded thin films, as well as their chemical composition and crystal structure, provided a thorough understanding of the phase space. This experimental approach proved to be more efficient in identifying new alloy systems than conventional exploratory development methods.
|
394 |
The law relating to double jeopardy in labour law. Tshikovhi, Rotondwa Happy. January 2014
Thesis (LLM (Labour Law)) -- University of Limpopo, 2014 / This research focuses on the application of the double jeopardy principle in labour law, in particular section 188(1)(a)-(b) of the Labour Relations Act 66 of 1995 (herein the LRA), which provides that a dismissal is unfair if the employer fails to prove that the reason for the dismissal is fair and that the dismissal was effected in accordance with a fair procedure.
The first point to be explained is the meaning of double jeopardy and whether it is applicable in labour law. The research argues that the double jeopardy principle applies to labour law and enumerates the ways in which it can be applied. The South African courts, in particular the Labour Court and the Labour Appeal Court, have delivered several judgements on the double jeopardy principle. These cases will be critically discussed in detail.
Comparison will be made with foreign labour law jurisprudence on the double jeopardy principle, particularly in Australia and the United States of America.
|
395 |
Puissance et nuisance de l’expression : les conceptions de la liberté d'expression à l'épreuve de la pornographie / Power and harm of expression : the theories of freedom of expression to the test of pornography. Ramond, Denis. 14 December 2015
Partant du postulat selon lequel la principale justification de la répression de formes d’expressions réside dans leur nocivité supposée, nous tentons de répondre à la question suivante : comment définir des limites claires et cohérentes à la liberté d’expression ? L’analyse des controverses relatives à la pornographie, et en particulier de la manière dont les notions de liberté d’expression et de nuisance ont été articulées, contribue à répondre à cette question générale. À travers l’analyse des débats portant sur la restriction des représentations sexuelles, nous tentons de montrer que les parties en présence ne sont pas parvenues à définir la notion de « nuisance » de manière claire et satisfaisante, et ne permettent pas, dès lors, de définir avec précision les limites légitimes de la liberté d’expression. Les deux voies théoriques alternatives que nous avons identifiées, les conceptions instrumentales et déontologiques de la liberté d’expression, ne se révèlent pas plus convaincantes. Nous montrons néanmoins qu’il est possible de préciser le principe de non-nuisance en y intégrant deux éléments auparavant négligés : la subjectivité du récepteur, et les rapports d’autorité qui existent entre le locuteur et le récepteur. Nous défendons ainsi l’idée que le principe de non-nuisance reste l’instrument le plus clair et le plus cohérent pour fonder les limites de la liberté d’expression, à condition de l’amender et de le compléter. / Acknowledging that the main justification for restricting some forms of expression lies in the harm they may cause to others, this thesis aims to answer the following question: how do we define clear and coherent limits to freedom of expression? The study of the controversies regarding pornography, and particularly of the way in which the concepts of freedom of expression and harm have been linked together, contributes to answering this general question. Through the analysis of debates on the restriction of sexual representations, this thesis seeks to show that the parties involved have failed to define the notion of « harm » in a clear and convincing way, and therefore do not allow the legitimate limits of freedom of expression to be set precisely. The two alternative theoretical approaches that were identified, the instrumental and deontological conceptions of freedom of expression, prove no more convincing. However, this research shows that the harm principle can be clarified if two previously neglected aspects are included in the analysis: the receiver’s subjectivity, and the authority relationship established between the speaker and the receiver. Thus, it is argued that the harm principle, provided that it is amended and completed, remains the clearest and most coherent tool for grounding the limits of freedom of expression.
|
396 |
A Hybrid Dynamic Modeling of Time-to-event Processes and Applications. Appiah, Emmanuel A. 31 May 2018
In survival and reliability data analysis, parametric and nonparametric methods are used to estimate the hazard/risk rate and survival functions. A parametric approach is based on the assumption that the underlying survival distribution belongs to some specific family of closed-form distributions (normal, Weibull, exponential, etc.). A nonparametric approach, on the other hand, is centered around the best-fitting member of a class of survival distribution functions. Moreover, Kaplan-Meier and Nelson-Aalen type nonparametric approaches assume neither a distribution class nor a closed-form distribution. Historically, the best-known time-to-event processes are the death of living species in populations and the failure of components in engineering systems. More recently, human mobility, electronic communications, technological change, and advances in engineering, medical, and social sciences have further diversified the role and scope of time-to-event processes in the cultural, epidemiological, financial, military, and social sciences. To incorporate extensions and generalizations and to overcome the limited scope of existing methods, we initiate an innovative alternative modeling approach for time-to-event dynamic processes. The approach is composed of the following basic components: (1) development of a continuous-time state for the dynamic process, (2) introduction of a discrete-time dynamic intervention process, (3) formulation of an interconnected continuous- and discrete-time dynamic system, (4) development of theoretical dynamic algorithms using Euler-type discretized schemes, and (5) introduction of conceptual and computational state and parameter estimation procedures. The presented approach is motivated by state and parameter estimation of time-to-event processes in biological, chemical, engineering, epidemiological, medical, military, multiple-market and social dynamic processes under the influence of discrete-time intervention processes. We begin by (1) treating a time-to-event process as a probabilistic dynamic process with a unitary state. Action, normal, operational, radical, survival, susceptible, etc., and the corresponding complementary states, reaction, abnormal, nonoperational, non-radical, failure, infective, and so on (quantitative and qualitative variables), are illustrations of a unitary state of a time-to-event dynamic process. A unitary state is measured by a probability distribution function. Employing a Newtonian dynamic modeling approach and observing the definition of the hazard rate as a specific rate, a probabilistic state dynamic model for survival or failure is developed. This dynamic model is further extended to incorporate internal or external discrete-time dynamic intervention processes acting on unitary-state time-to-event processes (2). This in turn demands the formulation and development of an interconnected continuous-discrete-time hybrid dynamic model and a totally discrete-time dynamic model for time-to-event processes (3). Employing the developed hybrid model and Euler-type discretized schemes, a very general fundamental conceptual analytic algorithm is outlined (4). Using the theoretical computational procedure in (4), general conceptual computational data-organizational and simulation schemes are presented (5) for state and parameter estimation problems in unitary-state time-to-event dynamic processes. Well-known existing theoretical results in the literature are exhibited as special cases in a systematic and unified manner (6).
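As a reading aid (an editor's minimal sketch, not the authors' exact formulation), the unitary survival state S(t) and the hazard rate h(t) in items (1)-(5) can be related through the state dynamic equation

dS(t)/dt = -h(t) S(t), with S(t_0) = 1,

and an Euler-type discretization over a partition t_0 < t_1 < ... < t_k gives

S_{k+1} = S_k [ 1 - h_k (t_{k+1} - t_k) ], so that S_k = \prod_{j=0}^{k-1} [ 1 - h_j \Delta t_j ],

a product form that, with a data-driven choice of h_j, connects the dynamic model to the Kaplan-Meier and Nelson-Aalen type estimators discussed below.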
In fact, the Kaplan-Meier and Nelson-Aalen type nonparametric estimation approaches are systematically analyzed by the developed totally discrete-time hybrid dynamic modeling process. The developed approach is applied to two data sets. Moreover, this approach does not require knowledge of either a closed-form distribution or a class of distribution functions. The hazard rate need not be constant. The procedure is dynamic.
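To make this concrete, the following sketch (an editor's illustration, not the authors' code; the toy data are hypothetical) updates a survival state with an empirical discrete hazard at each observed event time. With the hazard chosen as d_k / n_k (events over units at risk), the recursion reproduces the Kaplan-Meier product-limit estimate, illustrating how a dynamic state update can recover a classical nonparametric result without assuming any closed-form distribution.

from collections import Counter

def dynamic_survival_estimate(times, events):
    """times: observed times; events: 1 = event (failure), 0 = censored."""
    n = len(times)
    # Count failures and censorings at each distinct observation time.
    d = Counter(t for t, e in zip(times, events) if e == 1)
    c = Counter(t for t, e in zip(times, events) if e == 0)
    at_risk = n
    S = 1.0
    curve = [(0, S)]
    for t in sorted(set(times)):
        if d[t] > 0:
            h = d[t] / at_risk        # empirical discrete hazard at time t
            S *= (1.0 - h)            # survival-state update: S <- S * (1 - h)
            curve.append((t, S))
        at_risk -= d[t] + c[t]        # units leaving the risk set at time t
    return curve

# Hypothetical toy data: observation times and event indicators.
times  = [2, 3, 3, 5, 6, 8, 8, 9, 12, 15]
events = [1, 1, 0, 1, 1, 1, 0, 1, 1, 0]
for t, s in dynamic_survival_estimate(times, events):
    print(f"t = {t:>3}: S(t) = {s:.3f}")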
In the existing literature, the failure and survival distribution functions are treated as evolving mutually exclusively, corresponding to two mutually exclusive time-varying events. We refer to these two functions (failure and survival) as cumulative distributions of two mutually disjoint state output processes associated with two mutually exclusive, time-varying, complementary unitary states of a time-to-event process in any discipline of interest (7). This kind of time-to-event process can be thought of as a Bernoulli-type deterministic/stochastic process. Corresponding to these two complementary output processes of the Bernoulli-type stochastic process, we associate two unitary dynamic states corresponding to binary choice options/actions (8), namely {action, reaction}, {normal, abnormal}, {survival, failure}, {susceptible, infective}, {operational, nonoperational}, {radical, non-radical}, and so on. Under this consideration, we extend the unitary-state time-to-event dynamic model to a binary-state time-to-event dynamic model. Using basic tools in the mathematical sciences, we initiate a Newtonian-type dynamic approach for binary-state time-to-event processes in the sciences, technologies, and engineering (9). Introducing the innovative concept of a “survival state dynamic principle”, an interconnected nonlinear non-stationary large-scale hybrid dynamic model for the number of units/species and its unitary survival state corresponding to a binary-state time-to-event process is formulated (10). The model developed in (10) includes the dynamic model in (3) as a special case. The developed approach is directly applicable, in a coherent manner, to binary-state time-to-event dynamic processes in the biological, chemical, engineering, financial, medical, physical, military, and social sciences. A by-product of this is a transformed interconnected nonlinear hybrid dynamic model with a theoretical discrete-time conceptual computational dynamic process (11). Employing the transformed discrete-time conceptual computational dynamic process, we systematically introduce notions of data coordination, state data decomposition and aggregation, theoretical conceptual iterative processes, conceptual and computational parameter estimation and simulation schemes, and conceptual and computational state simulation schemes (12). The usefulness of the developed interconnected algorithm is validated using three real-world data sets (13). We note that the presented algorithm does not need a closed-form representation of the distribution/likelihood function. In fact, it is free from the assumptions required by the classical maximum likelihood approach in survival and reliability analysis.
Rapid electronic communication and human mobility have made it possible to transmit information, knowledge, and ideas almost instantly around the globe. This generates heterogeneity, which in turn gives rise to nonlinear and non-stationary dynamic processes. Moreover, the heterogeneity, non-linearity, and non-stationarity further generate two types of uncertainty, namely deterministic and stochastic. In view of this, it is clear that nothing is purely deterministic. In short, twenty-first-century problems are highly nonlinear, non-stationary, and under the influence of internal and external random perturbations. Using tools from stochastic analysis, the interconnected deterministic models in (3) and (10) are extended to an interconnected stochastic hybrid dynamic model for binary-state time-to-event processes (14). The developed model is described by large-scale nonlinear and non-stationary stochastic differential equations. Moreover, a stochastic version of the survival function is also introduced (15). Analytical, computational, statistical, and simulation algorithms/procedures are likewise extended and analyzed in a systematic and unified way (16). The presented interconnected stochastic model motivates conceptual computational parameter and state estimation schemes for time-to-event statistical data (17). Again, the stochastic versions of the computational algorithms are validated in the context of three real-world data sets. The obtained parameter and state estimates show that the algorithm is independent of the choice of nonlinear transformation (18).
Utilizing the developed alternative procedure, the recently modified deterministic version of the Local Lagged Adapted Generalized Method of Moments (LLGMM) is also extended to a stochastic version in a natural way (19). This approach provides a degree of confidence for prediction and planning assessments (20). In addition, it initiates conceptual computational parameter and state estimation and simulation schemes that are suitable for the use of a mean-square sub-optimal procedure (21). The usefulness and significance of the approach are illustrated by application to three data sets (22). The approach provides insight for investigating various types of invariant sets, namely sustainable/unsustainable, survival/failure, and reliable/unreliable (23), and qualitative properties such as sustainability versus unsustainability, reliability versus unreliability, etc. (24). Once again, the presented algorithm is independent of any particular form of survival distribution function or data set. Moreover, it does not require a closed-form survival distribution. We also note that the introduction of intervention processes provides a measure of influence and confidence for the use of new tools/procedures/approaches in continuous-time binary-state time-to-event dynamic processes (25). Moreover, the presented dynamic modeling is better suited to investigating more complex time-to-event dynamic processes (26). The developed procedure is dynamic and indeed nonparametric (27). The dynamic approach adapts to current changes and updates the statistical process (28). This dynamic character is more natural than the existing static, single-shot techniques (29).
|
397 |
Contribuição ao estudo dos direitos fundamentais em matéria tributária: restrições a direitos do contribuinte e proporcionalidade / Contributo allo studio dei diritti fondamentali in materia fiscale: restrizioni sui diritti del contribuente e proporzionalità. Rocha, Paulo Victor Vieira da. 09 June 2014
O presente trabalho tem por objeto os direitos fundamentais em matéria tributária, mais especificamente as restrições a que tais direitos se sujeitam sob a justificativa da realização de bens coletivos. Toma-se como premissa que o controle de proporcionalidade é o método que melhor cumpre os propósitos de: a) garantir a máxima eficácia possível a esses direitos; b) permitir o mais intenso controle intersubjetivo de decisões judiciais acerca de restrições a eles impostas. Assume-se como parâmetro de trabalho os direitos de igualdade assegurados ao contribuinte, especialmente o direito à graduação de impostos conforme a capacidade contributiva, em busca da definição dos pressupostos e limites à aplicação do controle de proporcionalidade em matéria tributária. Discute-se o forte consenso que existe sobre a classificação das normas tributárias em fiscais e extrafiscais, concluindo-se que tal classificação é possível. Por conta disso, também se conclui não poder tal classificação servir de critério para delimitar os casos em que o controle de proporcionalidade é devido ou possível. Os pressupostos para aplicação do controle de proporcionalidade em matéria tributária, portanto, são em relação a cada direito fundamental: de um lado, sua construção interpretativa sob a forma de princípio, com um amplo âmbito de proteção; de outro, identificarem-se, em medidas estatais, efeitos restritivos sobre tal direito. / This work addresses constitutional rights in tax matters, especially the restrictions to which such rights are subjected under the justification of realizing collective goods. It takes as a premise that the proportionality principle is the method which: a) ensures the best possible realization of these rights; and b) enables the most intense inter-subjective control over judicial decisions regarding the restrictions imposed on them. The equality rights granted to taxpayers, especially the ability-to-pay principle, were chosen as the benchmark in seeking to define the requirements and limits of the application of the proportionality principle in tax matters. The work questions the strong doctrine of classifying tax norms according to their fiscal or non-fiscal purposes, concluding that such a classification is not possible. Because of this, it also concludes that this distinction cannot serve as the criterion for delimiting the cases in which proportionality review is due or possible. The requirements for the application of the proportionality principle in tax matters are therefore, with respect to each constitutional right: on the one hand, its interpretative construction in the form of a legal principle, with a wide realm of protection; and, on the other, the identification of restrictive effects of State measures on that realm.
|
398 |
Planification de trajectoire pour drones de combat / Path planning of unmanned combat aircraft vehicles. Maillot, Thibault. 03 October 2013
L’objectif principal de ce travail est l’étude de la planification de trajectoires pour des drones de type HALE ou MALE. Les modèles cinématiques de ces drones sont étudiés. Les drones HALE sont modélisés par le système de Dubins. Pour les drones MALE, le modèle est construit en étudiant le repère cinématique du drone. Nous considérons les problèmes de planification de trajectoires point-point et point-pattern. Il s’agit, à partir de la position courante du drone, de rejoindre un point ou une figure prédéfinie dans l’espace. La planification point-point est abordée sous forme d’un problème de contrôle optimal. Deux méthodes sont proposées pour résoudre le problème point-pattern. D’abord nous présentons la synthèse en temps minimal pour le système de Dubins. Ensuite, nous développons une méthode basée sur le principe de LaSalle. La première méthode est utilisée au sein d’un algorithme de planification pour des drones HALE. La deuxième permet de stabiliser les deux types de drones considérés vers un pattern. Nous proposons une extension des algorithmes de planification développés, basée sur une discrétisation de l’espace grâce aux graphes de Voronoï et une méthode de planification discrète, pour construire des trajectoires dans des milieux encombrés. Nous étudions également le problème de couplage drone/capteur. Il s’agit de calculer une trajectoire permettant de satisfaire les objectifs du drone et de son capteur (une caméra). L’algorithme proposé est construit à partir de la résolution d’un problème quadratique sous contraintes. Dans une seconde partie, nous analysons un problème de contrôle optimal inverse. Celui-ci permet d’améliorer les résultats des méthodes de planification en s’inspirant du comportement des pilotes. Après avoir posé le problème, les résultats théoriques sont exposés et le cas particulier du système de Dubins est étudié en pratique. / This thesis is about path planning for HALE or MALE UAVs (Unmanned Aerial Vehicles), possibly under mission constraints. As such, the study is performed at the kinematic level: HALE UAVs are represented as Dubins systems, and a model for MALE UAVs is constructed by studying their kinematic frame. In the first part, we tackle the path planning problem for a UAV that must reach a target (a point or a pattern), starting from any position. The point-to-point path planning problem is addressed as an optimal control problem. Regarding the point-to-pattern path planning problem, two different methods are proposed. The former consists in solving the minimum-time synthesis for the Dubins system, in order to obtain the basis of a planning algorithm for HALE UAVs. The latter method relies on the LaSalle principle; it makes it possible to stabilize a HALE or MALE UAV to a pattern. In addition, extensions of the previously developed algorithms to cluttered environments are provided. This extension is achieved thanks to a space discretization using Voronoi diagrams and a discrete planning method. Finally, the mission constraints are dealt with as a coupling problem between the UAV and its sensors. The proposed algorithm is presented in the form of a constrained quadratic problem. In the second part of this thesis, we refine the planning algorithm to obtain results closer to the trajectories flown by pilots. In order to do that, we solve an inverse optimal control problem where the cost to be identified is computed from the experience of pilots. Theoretical results are presented and applied to the particular case of the Dubins system.
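For reference (a standard textbook form assumed by the editor; the thesis's exact state and control conventions may differ), the Dubins kinematic model used for the HALE UAVs is of the type

\dot{x} = v \cos\theta, \qquad \dot{y} = v \sin\theta, \qquad \dot{\theta} = u, \qquad |u| \le v/\rho_{\min},

with constant forward speed v and minimum turning radius \rho_{\min}. Dubins' classical result, that minimum-time paths between two configurations are concatenations of at most three pieces, arcs of radius \rho_{\min} (C) and straight segments (S), of type CSC or CCC, is what makes the minimum-time synthesis mentioned above tractable.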
|
399 |
我國制定政府審計準則可行性之研究 / Research on the Feasibility of Making Governmental Auditing Principles in Our Country. 許明昌. Unknown Date
近幾年來，我國之政府預算一直不斷地膨脹，民國八十二年之政府歲出預算更突破一億台幣之歷史性關卡。由此可知，我國之政府就如同一個大企業，擁有很多資源，作很多的事，但支出與效益是否成比例，則是一個很嚴肅的問題。所以，如何促進政府對資源作更有效的利用，避免舞弊以及浪費的發生，乃為當前政府所面臨的重大課題。尤其，最近因為重大工程弊案的不斷發生，使得大家逐漸瞭解到，唯有透過一套完整的監督及控制系統，才可徹底解決此問題。亦即，唯有建立一套健全的政府審計制度，才可避免政府的資源被不當或無效率的使用。政府審計的目的乃在瞭解政府的預算是否被妥善運用，是否遵照有關法令規定辦理，並對政府資金是否被濫用、誤用及有無達成既定目標尤其重視與關切。由此可知，政府審計主要著重在遵循審計及績效審計，與一般民營組織所著重的財務審計有所區別。由於我國為大陸法系國家，政府審計主要係根據審計法及審計法施行細則來執行，然而由於法律不易修訂，無法隨著需要作即時的改變，再加上審計法及審計法施行細則所規定都是一些程序問題。因此，近幾年來審計部乃醞釀制訂政府審計準則，以彌補審計法及審計法施行細則規定的不足，並使政府審計有一套原則性的規範，以提昇我國政府審計的品質。唯我國之情況是否適合制定政府審計準則，相關之條件是否成熟，都頗令人質疑，值得我們深入去探討。 / In recent years, our country's government budget has expanded continuously; in 1993 (ROC year 82) the government's annual expenditure budget broke through the historic mark of NT$100 million. The government is thus like a large enterprise: it commands many resources and undertakes many activities, but whether expenditures are proportionate to benefits is a very serious question. How to promote more effective use of government resources and prevent fraud and waste is therefore a major issue currently facing the government. In particular, the recent string of major public-works corruption cases has made it increasingly clear that only a complete system of supervision and control can thoroughly solve this problem; that is, only by establishing a sound governmental auditing system can improper or inefficient use of government resources be avoided. The purpose of governmental auditing is to determine whether the government budget has been properly used and administered in accordance with relevant laws and regulations, with particular attention to whether government funds have been abused or misused and whether established objectives have been achieved. Governmental auditing thus emphasizes compliance auditing and performance auditing, which distinguishes it from the financial auditing emphasized by private-sector organizations. Because our country follows the civil-law tradition, governmental auditing is carried out mainly under the Audit Act and the Enforcement Rules of the Audit Act; however, statutes are difficult to amend and cannot be changed promptly as needs arise, and these provisions deal mostly with procedural matters. Consequently, in recent years the National Audit Office has been considering the formulation of governmental auditing standards to remedy the deficiencies of the Audit Act and its Enforcement Rules, to give governmental auditing a set of principle-based norms, and to raise the quality of governmental auditing in our country. Whether our country's circumstances are suited to the formulation of governmental auditing standards, and whether the relevant conditions are ripe, remain open to question and merit in-depth investigation.
|
400 |
Formulary approach to the taxation of transnational corporations: A realistic alternative? Celestin, Lindsay Marie France Clement. January 2000
The Formulary Approach to the Taxation of Transnational Corporations: A Realistic Alternative? Synopsis The central hypotheses of this thesis are: that global formulary apportionment is the most appropriate method for the taxation of transnational corporations (TNCs) in lieu of the present system commonly referred to as the separate accounting/arm's length method; and that it is essential, in order to implement the proposed global formulary model, to create an international organisation which would fulfil, in the taxation field, a role equivalent to that of the World Trade Organisation (WTO) in international trade. The world economy is fast integrating and is increasingly dominated by the activities of transnational enterprises. These activities create a dual tax problem for various revenue authorities seeking to tax gains derived thereon: Firstly, when two or more countries entertain conflicting tax claims on the same base, there arises what is commonly referred to as a double taxation problem. Secondly, an allocation problem arises when different jurisdictions seek to determine the quantum of the gains to be allocated to each jurisdiction for taxation purposes. The traditional regime for solving both the double taxation and the allocation problem is enshrined in a series of bilateral treaties signed between various nations. These are, in general, based on the Organisation for Economic Co-operation and Development (OECD) Model Treaty.1 It is submitted, in this thesis, that while highly successful in an environment characterised by the coexistence of various national taxation systems, the traditional regime lacks the essential attributes suitable to the emerging 'borderless world'. The central theme of this thesis is the allocation problem. The OECD Model attempts to deal with this issue on a bilateral basis. Currently, the allocation problem is resolved through the application of Articles 7 and 9 of the OECD Model. In both instances the solution is based on the 'separate enterprise' standard, also known as the separate entity theory. This separate accounts/arm's length system was articulated in the 1930s when international trade consisted of flows of raw materials and other natural products as well as flows of finished manufactured goods. Such trade is highly visible and may be adequately valued both at the port of departure and at the port of entry in a country. It follows that within this particular system of international trade the application of the arm's length principle was relatively easy and proved to be extremely important in resolving both the double taxation and apportionment problems. Today, however, the conditions under which international trade is conducted are substantially different from those that prevailed until the 1960s. * Firstly, apart from the significant increase in the volume of traditionally traded goods, trade in services now forms the bulk of international exchanges. In addition, the advent of the information age has dramatically increased the importance of specialised information whose value is notoriously difficult to ascertain for taxation purposes. * Secondly, the globalisation phenomenon which gathered momentum over the last two decades has enabled existing TNCs to extend their global operations and has favoured the emergence of new transnational firms. Thus, intra-firm trade conducted outside market conditions accounts for a substantial part of international trade. 
* Thirdly, further economic integration has been achieved following the end of the Cold War and the acceleration of the globalisation phenomenon. In this new world economic order only TNCs have the necessary resources to take advantage of emerging opportunities. The very essence of a TNC is 'its ability to achieve higher revenues (or lower costs) from its different subsidiaries as a whole compared to the results that would be achieved under separate management on an arm's length basis.'2 Yet, the prevailing system for the taxation of TNCs overlooks this critical characteristic and is therefore incapable of fully capturing, for taxation purposes, the aggregate gains of TNCs. The potential revenue loss arising from the inability of the present system to account for and to allocate synergy gains is substantial. It follows that the perennial questions of international taxation can no longer be addressed within the constraints of the separate entity theory and a narrow definition of national sovereignty. Indeed, in order to mirror the developments occurring in the economic field, taxation needs to move from a national to an international level. Moreover, a profound reform of the system is imperative in order to avoid harmful tax competition between nations and enhance compliance from TNCs. Such a new international tax system needs to satisfy the test of simplicity, equity, efficiency, and administrative ease. To achieve these objectives international cooperation is essential. The hallmark of international cooperation has been the emergence, after World War II, of a range of international organisations designed to facilitate the achievement of certain goals deemed essential by various nations. The need for an organisation to deal specifically with taxation matters is now overwhelming. Consequently, this thesis recommends the creation of an international organisation to administer the proposed system. The main objective of this international organisation would be to initiate and coordinate the multilateral application of a formulary apportionment system which, it is suggested, would deal in a more realistic way with 'the difficult problems of determining the tax base and allocating it appropriately between jurisdictions'.3 The global formulary apportionment methodology is derived from the unitary entity theory. The unitary theory considers a TNC as a single business which, for convenience, is divided into 'purely formal, separately-incorporated subsidiaries'.4 Under the unitary theory the global income of TNCs needs to be computed, then such income is apportioned between the various component parts of the enterprise by way of a formula which reflects the economic contribution of each part to the derivation of profits. The question that arises is whether the world of international taxation is ready for such a paradigm shift. It is arguable that this shift has already occurred albeit cautiously and in very subtle ways. Thus, the latest of the OECD Guidelines on the transfer pricing question provides that 'MNE [Multinational Enterprise] groups retain the freedom to apply methods not described in this Report to establish prices provided those prices satisfy the arm's length principle in accordance with these Guidelines.'5 Arguably, the globalisation process has created 'the specific situation' allowed for by the OECD. 
This thesis, therefore, explores the relative obsolescence of the bilateral approach to the taxation of TNCs and then suggests that a multilateral system is better adapted to the emerging globalised economy. The fundamental building blocks of the model proposed in this thesis are the following: * First, the administration and coordination of the proposed system is to be achieved by the creation of a specialised tax organisation, called Intertax, to which member countries would devolve a limited part of their fiscal sovereignty. * Second, in order to enable the centralised calculation of TNC's profits, the proposed system requires the formulation of harmonised methods for the measurement of the global profits of TNCs. Therefore, the efforts of the International Accounting Standards Committee (IASC) to produce international accounting standards and harmonised consolidation rules must be recognised and, if needs be, refined and ultimately implemented. * Third, the major function of Intertax would be to determine the commercial profits of TNCs on a standardised basis and to apportion the latter to relevant countries by way of an appropriate formula/formulas. Once this is achieved, each country would be free, starting from its share of commercial profits, to determine the taxable income in accordance with the particular tax base that it adopts and, ultimately, the tax payable within its jurisdiction. In the proposed system, therefore, a particular country would be able to independently set whatever depreciation schedules or investment tax credits it chooses, and adopt whatever tax accounting rules it deems fit relative to its policy objectives. Moreover, this thesis argues that the global formulary apportionment model it proposes is not dramatically opposed to the arm's length principle. Indeed, it suggests that the constant assumption to the contrary, even with regard to the usual formulary apportionment methodology, is extravagant because both methodologies are based on a common endeavour, that is, to give a substantially correct reflex of a TNC's true profits. It has often been objected that global formulary apportionment is arbitrary and ignores market conditions. This thesis addresses such concerns by rejecting the application of a single all-purpose formula. Rather, it recognises that TNCs operating in different industries require different treatment and, therefore, suggests the adoption of different formulas to satisfy specific industry requirements. For example, the formula applicable to a financial institution would be different to that applicable to the pharmaceutical industry. Each formula needs to be based on the fundamental necessity to capture the functions, taking into consideration assets used, and risks assumed within that industry. In addition, if the need arises, each formula should be able to be fine-tuned to fit specific situations. Moreover, it is also pertinent to note that the OECD already accepts 'the selected application of a formula developed by both tax administrations in cooperation with a specific taxpayer or MNE group...such as it might be used in a mutual agreement procedure, advance transfer pricing agreement, or other bilateral or multilateral determination.'6 The system proposed in this thesis can thus be easily reconciled with the separate accounting/arm's length which the OECD so vehemently advocates. 
Both models have the same preoccupations, so that what is herein proposed may simply be characterised as an institutionalised version of the system advocated by the OECD. Multilateral formulary apportionment addresses both the double taxation and the allocation problems in international taxation. It resolves the apportionment question 'without depending on an extraordinary degree of goodwill or compliance from taxpayers.'7 It is therefore submitted that, if applied on a multilateral basis with a minimum of central coordination, it also seriously addresses the double taxation problem. Indeed, it is a flexible method given that different formulas may be devised to suit the needs of TNCs operating in different sectors. Consequently, formulary apportionment, understood in this sense, is a realistic alternative to the limitations of the present system.
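As a purely illustrative aside (the factor choice, weights, and figures below are the editor's hypothetical assumptions, not taken from the thesis, which argues for industry-specific formulas), the mechanics of apportioning a TNC's consolidated profit by formula can be sketched with a classic equal-weighted three-factor (property, payroll, sales) formula:

def apportion(global_profit, factors, weights=(1/3, 1/3, 1/3)):
    """factors: {jurisdiction: (property, payroll, sales)} in a common currency."""
    totals = [sum(f[i] for f in factors.values()) for i in range(3)]
    shares = {}
    for jurisdiction, f in factors.items():
        # Each jurisdiction's share is the weighted average of its factor ratios.
        share = sum(w * (f[i] / totals[i]) for i, w in enumerate(weights))
        shares[jurisdiction] = global_profit * share
    return shares

# Hypothetical group with consolidated profit of 300.
factors = {
    "Country A": (500.0, 200.0, 700.0),   # property, payroll, sales
    "Country B": (300.0, 250.0, 200.0),
    "Country C": (200.0,  50.0, 100.0),
}
for jurisdiction, profit in apportion(300.0, factors).items():
    print(f"{jurisdiction}: {profit:.1f}")   # A: 160.0, B: 100.0, C: 40.0

Industry-specific variants of the kind the thesis envisages would simply change the factors and their weights.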
|