  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Conscience and Attestation : The Methodological Role of the “Call of Conscience” (Gewissensruf) in Heidegger’s Being and Time

Kasowski, Gregor Bartolomeus
Cette étude vise à exposer le rôle méthodologique que Martin Heidegger attribue à la conscience (Gewissen) dans Être et temps et à faire ressortir les implications de son interprétation de « l’appel de la conscience » comme le moyen de produire l’attestation (Bezeugung) de l’existence authentique en tant que possibilité du Dasein (ou être-dans-le-monde). Notre objectif initial est de montrer comment la notion heideggérienne de conscience a évolué avant la publication d’Être et temps en 1927 et d’identifier les sources qui ont contribué à l’interprétation existentiale de la conscience comme « l’appel du souci. » Notre analyse historique révèle notamment que Heidegger n’a jamais décrit la conscience comme un « appel » avant sa lecture du livre Das Gewissen (1925) par Hendrik G. Stoker, un jeune philosophe sud-africain qui a étudié à Cologne sous la direction de Max Scheler. Nous démontrons plus spécifiquement comment l’étude phénoménologique de Stoker—qui décrit la conscience comme « l’appel du devoir (Pflichtruf) » provenant de l’étincelle divine (synteresis) placée dans l’âme de chaque personne par Dieu—a influencé l’élaboration du concept de « l’appel existentiel » chez Heidegger. Mettant l’accent sur le rôle méthodologique de la conscience dans Être et temps, nous soulignons aussi l’importance des liens entre son concept de la conscience et la notion de « l’indication formelle » que Heidegger a mise au cœur de sa « méthode » dans ses cours sur la phénoménologie à Freiburg et Marbourg. Alors que de nombreux commentateurs voient dans « l’appel de la conscience » une notion solipsiste qui demeure impossible en tant qu’expérience, nous proposons un moyen de lever cette difficulté apparente en tentant de faire ressortir ce qui est « indiqué formellement » par la notion même de la conscience (Gewissen) dans Être et temps. 
Cette approche nous permet d’affirmer que le concept de conscience chez Heidegger renvoie à un phénomène de « témoignage » qui est radicalement différent de la notion traditionnelle de conscientia. Guidé par les principes mêmes de la phénoménologie heideggérienne, nous procédons à une analyse « destructrice » de l’histoire du mot allemand Gewissen qui nous révèle que la signification originelle de ce mot (établie dans le plus ancien livre préservé dans la langue allemande : le Codex Abrogans) était testimonium et non conscientia. À l’origine, Gewissen signifiait en effet « attestation »—ce qui est précisément le rôle assigné à la conscience par Heidegger dans Être et temps. Sur la base de cette découverte, nous proposons une manière de comprendre cette « attestation » comme une expérience possible : l’écoute du « témoignage silencieux » du martyr qui permet à Dasein de reconnaître sa propre possibilité d’authenticité. / This study aims to exhibit the methodological role that Martin Heidegger assigns to conscience (Gewissen) in Being and Time and to reveal the implications of his interpretation of the “call of conscience” as the means of producing the attestation (Bezeugung) of authentic existence as a possibility of Being-in-the-world (or Dasein). We begin by seeking to understand how Heidegger’s notion of conscience evolved prior to the 1927 publication of Being and Time and to identify the sources which contributed to his interpretation of conscience as the “call of care.” Our historical analysis notably reveals that Heidegger never once describes conscience as a “call” before reading Das Gewissen (1925) by Hendrik G. Stoker, a young South African philosopher who studied under Max Scheler’s direction at the University of Cologne. 
We specifically examine how Stoker’s phenomenological study—which describes conscience as the “call-of-duty” issued to each human being by the divine “spark” (synteresis) placed in his or her soul by God—contributed to shaping Heidegger’s account of the “existential call.” Focusing on the methodological role of conscience in Being and Time, we analyze Heidegger’s major work in light of his early lectures on phenomenology at Freiburg and Marburg. This approach confirms the relation between conscience in Being and Time and the concept of “formal indication” that Heidegger placed at the heart of his evolving “method” of phenomenological investigation. While many commentators have argued that Heidegger’s “call of conscience” is solipsistic and impossible to experience, we propose a way of reconsidering this apparent impasse by examining what Being and Time itself “formally indicates” with regard to conscience. We show that Heidegger’s conscience points to a phenomenon of existential “testimony” which is radically different from the traditional notion of conscientia. Guided by Heidegger’s “formal indication” of conscience, we “destructively” review the history of the German word Gewissen and reveal its original meaning to be “testimonium,” not “conscientia.” In recognizing that Gewissen originally meant “attestation,” we show how Heidegger’s existential phenomenon of conscience can be understood as Dasein’s experience of hearing the “silent testimony” of the martyr. / Thesis completed in cotutelle (joint supervision) with the Université de Paris IV-La Sorbonne.
32

Secure network programming in wireless sensor networks

Tan, Hailun, Computer Science & Engineering, Faculty of Engineering, UNSW, January 2010
Network programming is one of the most important applications in wireless sensor networks, as it provides an efficient way to update the program images running on sensor nodes without physical access to them. Securing these updates, however, remains a challenging and important issue, given the open deployment environment of sensor nodes. Though several security schemes have been proposed to provide authenticity and integrity protection for network programming applications, they are either energy-inefficient, as they tend to rely on digital signatures, or lack data confidentiality. In addition, due to the absence of secure memory management in current sensor hardware, an attacker could inject malicious code into the program flash by exploiting a buffer overflow in memory, despite secure code dissemination. The contribution of this thesis is to provide two software-based security protocols and one hardware-based remote attestation protocol for network programming applications. Our first protocol deploys multiple one-way key chains for a multi-hop sensor network. The scheme is shown to have lower computational, power-consumption, and communication costs, yet is still able to secure multi-hop propagation of program images. Our second protocol applies an iterative hash structure to the data packets of the network programming application, ensuring data confidentiality and authenticity. In addition, we integrated confidentiality and DoS-attack resistance into a multi-hop code dissemination protocol. Our final solution is a hardware-based remote attestation protocol for verifying the code running on sensor nodes. An additional piece of tamper-proof hardware, a Trusted Platform Module (TPM), is added to the sensor nodes. It protects sensitive information (e.g., the session key) from attackers and monitors platform environment changes with its internal registers. 
With these TPM features, code injection attacks can be detected and removed when the contaminated nodes are challenged in our remote attestation protocol. We implement the first two software-based protocols with Deluge as the reference network programming protocol in TinyOS, evaluate them through extensive simulation using TOSSIM, and validate the simulation results with experiments on Tmote. We implement the remote attestation protocol on Fleck, a sensor platform developed by CSIRO that integrates an Atmel TPM chip.
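The one-way key chain mentioned in the first protocol can be sketched in a few lines. This is an illustrative reconstruction, not the thesis's actual implementation: keys are generated by repeated hashing, the final element is pre-distributed as a public anchor, and each key later disclosed is authenticated by hashing it back to the previously authenticated one.

```python
import hashlib

def make_key_chain(seed: bytes, length: int) -> list[bytes]:
    """Generate a one-way key chain where chain[i+1] = H(chain[i]).
    The last element is the public anchor (commitment); keys are
    disclosed in reverse order of generation."""
    chain = [seed]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

def verify_key(disclosed: bytes, last_authentic: bytes) -> bool:
    """A disclosed key is authentic if hashing it yields the most
    recently authenticated key in the chain."""
    return hashlib.sha256(disclosed).digest() == last_authentic

chain = make_key_chain(b"secret-seed", 5)
anchor = chain[-1]                      # pre-distributed to receivers
assert verify_key(chain[-2], anchor)    # first disclosed key checks out
assert not verify_key(b"forged-key", anchor)
```

Because hash functions are one-way, a node holding only the anchor can authenticate each newly disclosed key, while an attacker cannot forge future keys from past ones.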
34

Hardening High-Assurance Security Systems with Trusted Computing

Ozga, Wojciech, 12 August 2022
We are living in the time of the digital revolution, in which the world we know changes beyond recognition every decade. The positive aspect is that these changes also drive progress in the quality and availability of digital assets crucial to our societies: broadly available communication channels allowing the quick exchange of knowledge over long distances, systems controlling the automatic sharing and distribution of renewable energy in international power grids, easily accessible applications for early disease detection that enable self-examination without burdening the health service, and governmental systems that let citizens settle official matters without leaving their homes. Unfortunately, digitalization also opens opportunities for malicious actors to threaten our societies if they gain control over these assets by successfully exploiting vulnerabilities in the complex computing systems that build them. Protecting these systems, which are called high-assurance security systems, is therefore of utmost importance. For decades, humanity has struggled to find methods to protect high-assurance security systems. Advancements in the computing systems security domain have led to the popularization of hardware-assisted security techniques, nowadays available in commodity computers, which open perspectives for building more sophisticated defense mechanisms at lower cost. However, none of these techniques is a silver bullet. Each targets particular use cases, suffers from limitations, and is vulnerable to specific attacks. I argue that some of these techniques are synergistic and, when used together, help overcome limitations and mitigate specific attacks. My reasoning is supported by regulations that legally bind the owners of high-assurance security systems to provide strong security guarantees. These requirements can be fulfilled with the help of diverse technologies that have been standardized in recent years. 
In this thesis, I introduce new techniques for hardening high-assurance security systems that execute in remote execution environments, such as public and hybrid clouds. I implemented these techniques as part of a framework that provides technical assurance that high-assurance security systems execute in a specific data center, on top of a trustworthy operating system, in a virtual machine controlled by a trustworthy hypervisor or in strong isolation from other software. I demonstrated the practicality of my approach by leveraging the framework to harden real-world applications, such as machine learning applications in the eHealth domain. The evaluation shows that the framework is practical. It induces low performance overhead (<6%), supports software updates, requires no changes to the legacy application's source code, and can be tailored to individual trust boundaries with the help of security policies. The framework consists of a decentralized monitoring system that offers better scalability than traditional centralized monitoring systems. Each monitored machine runs a piece of code that verifies that the machine's integrity and geolocation conform to the given security policy. This piece of code, which serves as a trusted anchor on that machine, executes inside the trusted execution environment, i.e., Intel SGX, to protect itself from the untrusted host, and uses trusted computing techniques, such as trusted platform module, secure boot, and integrity measurement architecture, to attest to the load-time and runtime integrity of the surrounding operating system running on a bare metal machine or inside a virtual machine. The trusted anchor implements my novel, formally proven protocol, enabling detection of the TPM cuckoo attack. 
The framework also implements a key distribution protocol that, depending on the individual security requirements, shares cryptographic keys only with high-assurance security systems executing in the predefined security settings, i.e., inside trusted execution environments or inside an integrity-enforced operating system. Such an approach is particularly appealing in the context of machine learning systems, where some algorithms, like machine learning model training, require temporary access to large amounts of computing power. These algorithms can execute inside a dedicated, trusted data center at higher performance because they are not limited by the security features required in a shared execution environment. The evaluation of the framework showed that training a machine learning model on real-world datasets achieved 0.96x of native GPU execution performance and a speedup of up to 1560x compared to the state-of-the-art SGX-based system. Finally, I tackled the problem of software updates, which makes the operating system's integrity monitoring unreliable due to false positives, i.e., software updates move the updated system to an unknown (untrusted) state that is reported as an integrity violation. I solved this problem by introducing a proxy to a software repository that sanitizes software packages so that they can be safely installed. The sanitization consists of predicting and certifying the future (after the specific updates are installed) operating system's state. The evaluation of this approach showed that it supports 99.76% of the packages available in Alpine Linux main and community repositories. The framework proposed in this thesis is a step forward in verifying and enforcing that high-assurance security systems execute in an environment compliant with regulations. 
I anticipate that the framework might be further integrated with industry-standard security information and event management tools as well as other security monitoring mechanisms to provide a comprehensive solution hardening high-assurance security systems.
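The load-time integrity attestation described above is built on the TPM's extend-and-quote mechanism. The following is a minimal, illustrative sketch (not the thesis's implementation) of how a verifier replays a measurement log against an expected register value, using SHA-256 in place of a real TPM register:

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style extend: new PCR = H(old PCR || H(measurement)).
    The register can only be folded forward, never set directly."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def replay_log(event_log: list[bytes]) -> bytes:
    """Replay a measurement log from an all-zero register, as a
    verifier does when checking a quoted PCR value."""
    pcr = bytes(32)
    for event in event_log:
        pcr = pcr_extend(pcr, event)
    return pcr

# Hypothetical boot-time measurements of a machine's software stack.
log = [b"bootloader", b"kernel", b"initrd"]
expected = replay_log(log)
# Any tampered component yields a different final register value,
# so the attestation check fails.
assert replay_log([b"bootloader", b"evil-kernel", b"initrd"]) != expected
```

Because each extend folds the previous value into the next, an attacker who modifies one component cannot produce a log that replays to the expected register value without breaking the hash function.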
35

Формирование профессиональных компетенций фитнес-инструктора в рамках создания корпоративной книги спортивного центра «Я фитнес» : магистерская диссертация / Formation of the professional competencies of a fitness instructor as part of creating a corporate book for the sports center "I am fitness"

Филиппова, В. А., Filippova, V. A., January 2016
Данная работа рассматривает систему формирования профессиональных компетенций фитнес-инструктора, уровень которых в значительной степени влияет на конкурентоспособность специалиста в сфере фитнес-индустрии. / This paper examines the system for developing the professional competencies of a fitness instructor, whose level largely determines the specialist's competitiveness in the fitness industry.
36

Le rôle et l'identité professionnelle des intervenants-accompagnants en reconnaissance des acquis et des compétences (RAC) auprès de la clientèle immigrante : une exploration / The role and professional identity of practitioners supporting immigrant clients through recognition of prior learning and competencies (RAC): an exploration

Bélanger, Mireille, 24 April 2018
Introduction: For several decades, and particularly in recent years, the issue of recognition of prior learning and competencies (RAC) in Québec for immigrant clients has been raised repeatedly because of its importance for the fair integration of qualified workers into the labour market, and the services offered in this area have become increasingly diversified (CSE 2000, p. 10; Delors, J., 1996; MEQ, 2002a, 2005; MICC, 2005; CIQ, 2015). The professionals working in this sector of practice, known as recognition of prior learning and competencies (RAC), are responsible for offering quality services to support clients through what sometimes proves to be a genuine obstacle course. In this thesis, our objective was to explore the competencies required to support these individuals adequately, and to examine the subjective contribution of these competencies to the definition of a particular role and a distinct professional identity for guidance counsellors working in the sector of competency assessment and recognition of prior learning. Objectives: 1) Explore the professional identity and the professional competencies deployed by practitioners working with immigrants in the process of having their prior learning and competencies recognized, which define and nourish the construction of this identity; 2) Identify their representations of the obstacles encountered by immigrants, as well as the gaps, difficulties, and factors potentially explaining the observed absence of guidance counsellors in this sector of practice; 3) Describe the practices and role of the practitioner in supporting immigrants through the RAC process. 
Methodology: A qualitative study conducted through individual interviews with participants, mostly women (6/8), with work experience ranging from one to 24 years in the educational sector of recognition of prior learning and competencies (RAC). The interview transcripts were coded using QDA Miner software. Results: The main themes identified show: 1) multiple and diversified professional competencies, and a sense of competence in providing adequate support to immigrant clients engaged in a RAC process and in transition toward professional integration; 2) a collective professional identity characterized by a sense of belonging to the group of professionals working in RAC; 3) expressed dissatisfaction with the lack of financial, material, and human resources in the work environment, as well as major problems in the transmission of information related to the RAC process and to the services offered to clients, partners, and businesses. Conclusion: Since the RAC process is rooted in teamwork requiring the participation and collaboration of all the staff involved, a valued collective professional identity, expressed as a professional culture of recognition of prior learning and competencies, emerges from the discourse on the professional self. Numerous professional competencies are thus necessary to enable RAC practitioners to support their Québec-born and immigrant clients and to meet their respective needs. 
Keywords: recognition of prior learning and competencies (RAC), role, professional identity, support, professional competencies (knowledge, know-how, interpersonal skills), immigrants, guidance counsellors.
37

Hardware-Aided Privacy Protection and Cyber Defense for IoT

Zhang, Ruide, 8 June 2020
With recent advances in electronics and communication technologies, our daily lives are immersed in an environment of Internet-connected smart things. Despite the great convenience brought by these technologies, privacy concerns and security issues are two topics that deserve more attention. On one hand, as smart things grow in their ability to sense the physical world and send information out through the Internet, they have the potential to be used secretly for the surveillance of individuals. Nevertheless, people tend to adopt wearable devices without fully understanding what private information can be inferred and leaked through sensor data. On the other hand, security issues become even more serious and lethal as the world embraces the Internet of Things (IoT). Failures in computing systems are common; in IoT, however, a failure may harm people's lives. As demonstrated in both academic research and industrial practice, a software vulnerability hidden in a smart vehicle may lead to a remote attack that subverts a driver's control of the vehicle. Our approach to the aforementioned challenges starts by understanding privacy leakage in the IoT era and follows with adding defense layers to the IoT system as attackers gain increasing capabilities. The first question we ask ourselves is: what new privacy concerns does IoT bring? We focus on discovering information leakage beyond people's common sense, even from seemingly benign signals. We explore how much private information we can extract by designing information extraction systems. Through our research, we argue for stricter access control on newly introduced sensors. After noting the importance of the data collected by IoT, we trace where sensitive data goes. In the IoT era, edge nodes are used to process sensitive data. However, a capable attacker may compromise edge nodes. 
Our second study focuses on applying trusted hardware to build trust in large-scale networks under this circumstance. The application of trusted hardware protects sensitive data from compromised edge nodes. Nonetheless, if an attacker becomes more powerful and embeds malicious logic into code for trusted hardware during the development phase, he can still secretly steal private data. In our third study, we design a static analyzer for detecting malicious logic hidden inside code for trusted hardware. Beyond the privacy of the data collected, another important aspect of IoT is that it affects the physical world. Our last piece of research enables a user to verify the continuous execution state of an unmanned vehicle. This way, people can trust the integrity of the past and present state of the unmanned vehicle. / Doctor of Philosophy / The past few years have witnessed a rise in computing and networking technologies. Such advances enable the new paradigm, IoT, which brings great convenience to people's lives. Large technology companies like Google, Apple, and Amazon are creating smart devices such as smartwatches, smart homes, drones, etc. Compared to the traditional Internet, IoT can provide services beyond digital information by interacting with the physical world through its sensors and actuators. While the deployment of IoT brings value to various aspects of our society, the lucrative reward from cyber-crime also increases in the upcoming IoT era. Two unique privacy and security concerns are emerging for IoT. On one hand, IoT brings a large volume of new sensors that are deployed ubiquitously and collect data 24/7. Users' privacy is a big concern in this circumstance because collected sensor data may be used to infer users' private activities. On the other hand, cyber-attacks now harm not only cyberspace but also the physical world. A failure in IoT devices could result in the loss of human life. 
For example, a remotely hacked vehicle could shut down its engine on the highway regardless of the driver's operation. Our approach to these emerging privacy and security concerns consists of two directions. The first direction targets privacy protection. We first look at the privacy impact of upcoming ubiquitous sensing and argue for stricter access control on smart devices. Then, we follow the flow of private data and propose solutions to protect it from the networking and cloud computing infrastructure. The other direction aims at protecting the physical world. We propose an innovative method to verify the cyber state of IoT devices.
38

Une ontologie de la liberté dans l'œuvre de Paul Ricoeur / An ontology of freedom in the works of Paul Ricœur

Kuang, Quan, 5 December 2016
Penser l’être humain comme un être libre : tel est le souci constant de la réflexion de Paul Ricœur. Au niveau anthropologique, le philosophe cherche à remettre la liberté en lien avec la condition concrète de l’homme, où la nature, le corps, autrui et la société sont en jeu. Ce n’est qu’en reconnaissant une telle condition que la liberté devient réelle, affirmative et puissante. Au niveau ontologique, c’est la liberté humaine qui offre un accès privilégié à la compréhension de l’être en général. Après avoir écarté la notion de l’être comme objectivité, la liberté le dévoile comme un acte affirmatif. Au niveau méthodologique, le développement d’une herméneutique phénoménologique constitue un élément essentiel de l’ontologie de la liberté, dans la mesure où la pensée est elle-même toujours impliquée dans cette interrogation ontologique. C’est par un tel « engagement » que la pensée atteste l’être de la liberté, surtout face à l’énigme du mal. / One of the persistent concerns of Paul Ricœur’s philosophy is to reflect upon the human being as being free. At the anthropological level, Ricœur always considers freedom within concrete human condition, in which nature, body, others and society are all involved. One’s freedom becomes real, affirming and powerful only when such condition is recognized. At the ontological level, it is only as a free being that one has the privileged access to the understanding of being in general. Human as free being reveals that being should not be understood as objectivity, but as an affirming act. Finally, at the methodological level, Ricœur’s elaboration of phenomenological hermeneutics constitutes an essential element of his ontology of freedom. From his methodological development, it can be seen that the thinking philosopher himself, as a free being, is also engaged in his inquiry. In this regard, Ricœur’s philosophy as such becomes an attestation of free being, especially in his confrontation with the enigma of evil.
39

Artikel 2(3) van die Wet op Testamente 7 van 1953 / Section 2(3) of the Wills Act 7 of 1953

Du Plessis, Jan Abraham
Text in Afrikaans, abstract in Afrikaans and English / Title page in Afrikaans and English / Die verhandeling handel oor die howe se hantering van testamentere geskrifte wat nie aan die testamentsformaliteite voldoen nie. Die posisie voor die inwerkingtreding van artikel 2(3) word eerste bespreek. Daarna bespreek ek die wysigings wat aanbeveel is deur die regskommissie. Vervolgens bespreek ek die inwerkingtreding van artikel 2(3) wat aan die howe die bevoegdheid gee om 'n testamentere geskrif wat nie aan die formaliteite voldoen nie tot geldige testament te verhef. Die spesifieke probleemareas met die interpretasie van artikel 2(3) word uitgelig en in detail bespreek. Daarna maak ek 'n aanbeveling dat 'n handtekening of merk op 'n testamentere geskrif 'n drempelvereiste moet wees alvorens 'n hof dit kan kondoneer. Ek sluit af met 'n opinie oor hoe die artikel in die toekoms geinterpreteer behoort te word. / The dissertation is about the way in which the courts handle testamentary writings which do not comply with the formalities of a will. Firstly I discuss the position before the implementation of section 2(3). Thereafter I discuss the recommendations of the law commission. Then I discuss the implementation of section 2(3) which empowers the court to legalise a document which does not comply with the formal requirements of a will. The specific problem areas with regard to the interpretation of section 2(3) are highlighted and discussed in detail. Thereafter a recommendation is made that a signature or a mark on a testamentary writing must be a prerequisite before a court can consider condoning it. I conclude my dissertation with an opinion on how this section should be interpreted in the future. / Jurisprudence / LL. M. (Legum)
40

會計師簽證、稅務選查與租稅逃漏之關聯性:以自我選擇模型分析 / The Effects of Audit Selection and Usage of CPA Attestation on Tax Evasions: A Self-selection Analysis

洪聖閔 (Hung, Sheng-Min)
本研究使用台灣的營利事業所得稅資料並且以托賓自我選擇模型來檢視租稅逃漏、稅務選查與會計師稅務簽證三者的關聯性。實證結果顯示台灣的稅務機關在查核預算的限制之下會進行有系統的稅務選查程序;進一步來說,使用會計師稅務簽證的企業組織較不易被稅務機關選查。此外,企業組織的重要特徵如規模、財務狀況、獲利能力以及會計師稅務簽證的使用與否將會影響企業組織的租稅逃漏額。最後,我們發現會計師稅務簽證的使用對於租稅逃漏的影響會隨著公司的規模大小而改變,營收大於一億的企業組織若使用會計師稅務簽證將會比未使用會計師稅務簽證的企業組織有更高的逃漏稅額。因此,我們認為與會計師稅務簽證相關的政策法規可能需要重新評估是否適切。 / This study uses business income tax data from Taiwan to empirically examine the relationships between audit selection, tax evasion, and the usage of CPA-attestation. A Tobit model with self-selection is used. Empirical results provide evidence that tax agencies in Taiwan employ a systematic audit selection process, subject to the constraints of audit budgets. Additionally, firms using CPA-attestation are less likely to be selected for audit. The results also provide evidence that some important characteristics of firms, such as size, financial position, profitability, and the usage of CPA-attestation, may influence the amount of tax evasion. Finally, we find that the effect of CPA-attestation on tax evasion may vary with the size of the firms using it. Hence, the policy and tax regulations associated with the usage of CPA-attestation may need to be re-evaluated.
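The Tobit specification named in the abstract above models an outcome that is censored at zero (many firms evade no tax at all). A minimal sketch of the censored log-likelihood follows; it omits the self-selection correction for brevity, and all variable names and the simulated data are illustrative, not the study's actual model or dataset:

```python
import numpy as np
from scipy.stats import norm

def tobit_loglik(beta, sigma, X, y):
    """Log-likelihood of a Tobit model left-censored at zero:
    y* = X @ beta + e, e ~ N(0, sigma^2), observed y = max(y*, 0)."""
    xb = X @ beta
    censored = y <= 0
    # Censored observations contribute P(y* <= 0) = Phi(-xb / sigma);
    # uncensored observations contribute the normal density of the residual.
    ll_cens = norm.logcdf(-xb[censored] / sigma)
    ll_obs = norm.logpdf(y[~censored], loc=xb[~censored], scale=sigma)
    return ll_cens.sum() + ll_obs.sum()

# Simulated example: intercept plus one covariate (e.g., firm size).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y_star = X @ np.array([0.5, 1.0]) + rng.normal(size=200)
y = np.maximum(y_star, 0.0)  # left-censoring at zero
ll = tobit_loglik(np.array([0.5, 1.0]), 1.0, X, y)
```

In practice this likelihood is maximized over (beta, sigma), and a selection equation (here omitted) is estimated jointly to account for non-random audit selection.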
