1

A case study: an exploration of the implications of computer-assisted audit techniques on the audit approach in terms of the key elements of an assurance engagement.

MacDonald, Darren Kyle 08 1900 (has links)
A research report submitted to the Faculty of Commerce, Law and Management, University of the Witwatersrand, Johannesburg, in partial fulfillment of the requirements for the degree of Master of Commerce in Accounting / Not only has IT become more prominent in the business environment, but it has also expanded the tools at auditors' disposal, commonly known as computer-assisted audit techniques (CAATs). The implications of CAATs have not been addressed adequately in the academic literature, which leads to the purpose of this research report: to illustrate the implications of introducing CAATs into the audit process for the five key elements of an assurance engagement. A case study methodology was selected to explore this audit approach in detail by focusing on one client and its audit firm; this methodology was chosen to illustrate the context of a computerised audit and its specific consequences over a period of time. The study demonstrates the benefits of introducing CAATs throughout each key area of the audit process. To achieve these benefits, the auditor needs to consider several matters to ensure an efficient IT-based audit is realised. / PH2020
2

A survey on the current practices in computer audit carried out by internal auditors of organizations in Hong Kong.

January 1984 (has links)
by Ho Tai-wai, David [and] Woo Hou-kwong, Paul. / Bibliography: leaves 82-84 / Thesis (M.B.A.)--Chinese University of Hong Kong, 1984
3

En applicering av generaliserade linjära modeller på interndata för operativa risker / An application of generalized linear models to internal operational risk data.

Bengtsson Ranneberg, Emil, Hägglund, Mikael January 2015 (has links)
This Master's thesis uses generalized linear models to identify and analyse unit-specific attributes that affect the risk of operational losses. Companies are rarely exposed to operational losses, so little information about these losses is available. Generalized linear models use statistical methods that make it possible to analyse all available internal data even though it is limited. In addition, the method makes it possible to analyse the frequency of the losses and the magnitude of the losses separately. It is advantageous to perform two separate, independent analyses to identify which unit-specific attributes affect the loss frequency and the loss magnitude respectively. A Poisson distribution is used to model the frequency of the losses, and a Tweedie distribution based on a semi-parametric distribution is used to model the magnitude of the losses. The frequency and magnitude models are combined into a joint model to analyse what drives the total cost of operational losses. The results show that the unit's region, income per hour worked, size, internal rating and staff experience affect the cost of operational losses. / The objective of this Master's Thesis is to identify and analyze explanatory variables that affect operational losses. This is achieved by applying Generalized Linear Models and selecting a number of explanatory variables that are based on the company's unit attributes. An operational loss is a rare event and as a result, there is a limited amount of internal data. Generalized Linear Models use a range of statistical tools to give reliable estimates although the data is scarce. By performing two separate and independent analyses, it is possible to identify and analyze various unit attributes and their impact on the loss frequency and loss severity. When modeling the loss frequency, a Poisson distribution is applied. When modeling the loss severity, a Tweedie distribution that is based on a semi-parametric distribution is applied. To analyze the total cost as a consequence of operational losses for a single unit with certain attributes, the frequency model and the severity model are combined to form one common model. The result from the analysis shows that the geographical location of the unit, the size of the unit, the income per working hour, the working experience of the employees and the internal rating of the unit are all attributes that affect the cost of operational losses.
4

Towards the Development of Business Intelligence : The Role of Business Intelligence in Managerial Decision Making - Evidence from the B2B Sector

Bravo, Mariangeles, Appelkvist, Jesper January 2018 (has links)
Information is the key for managers to make well-informed decisions. In recent years, technological advancements have made it possible for organizations to store and manage large quantities of data. Business intelligence is used to structure and narrow down data in order to acquire relevant information that can assist managers. BI is formed by a variety of systems and concepts which are interconnected and can work simultaneously. Furthermore, there are claims that BI can assist in the decision making of an organization. This research focuses on how business intelligence does so, with a specific emphasis on marketing managers working in large business-to-business organizations. It follows a qualitative, exploratory approach, comparing relevant literature with the results obtained from ten interviews, in which it was observed how BI helps managers by providing useful, selected information.
5

Assessment of structural damage using operational time responses

Ngwangwa, Harry Magadhlela 31 January 2006 (has links)
The problem of vibration-induced structural faults has been a real one in engineering over the years. If left unchecked, it has led to the unexpected failure of many structures, causing both economic and human losses. Therefore, for over forty years, structural damage identification has been one of the important research areas for engineers. There has been a thrust to develop global structural damage identification techniques to complement and/or supplement the long-practised local experimental techniques. In that respect, studies have shown that vibration-based techniques prove to be more potent. Most of the existing vibration-based techniques monitor changes in modal properties like natural frequencies, damping factors and mode shapes of the structural system to infer the presence of structural damage. Literature also reports other techniques which monitor changes in other vibration quantities like the frequency response functions, transmissibility functions and time-domain responses. However, none of these techniques provide a complete identification of structural damage. This study presents a damage detection technique based on operational response monitoring, which can identify all four levels of structural damage and be implemented as a continuous structural health monitoring technique. The technique is based on monitoring changes in internal data variability measured by a test statistic, the χ²O value. Structural normality is assumed when the χ²Om value calculated from a fresh set of measured data is within the limits prescribed by a threshold χ²OTH value. On the other hand, abnormality is assumed when this threshold value has been exceeded. The quantity of damage is determined by matching the χ²Om value with the χ²Op values predicted using a benchmark finite element model.
The use of χ²O values is noted to provide better sensitivity to structural damage than the natural frequency shift technique. The analysis carried out in a numerical study showed that the sensitivity of the proposed technique ranged from three to a thousand times the sensitivity of the natural frequencies. The results from a laboratory structure showed that accurate estimates of damage quantity and remaining service life could be achieved for crack lengths of less than 0.55 times the structural thickness, since linear elastic fracture mechanics theory was applicable up to this value. Therefore, the study achieved its main objective of identifying all four levels of structural damage using operational response changes. / Dissertation (MSc (Mechanics))--University of Pretoria, 2007. / Mechanical and Aeronautical Engineering / unrestricted
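The threshold-exceedance scheme in this abstract can be illustrated with a textbook variance-ratio statistic. This is a minimal sketch under assumed normality, not the thesis's exact formulation: a fresh batch of operational response samples is reduced to a chi-squared-distributed statistic against a baseline variance, and damage is flagged when it crosses a prescribed threshold (the role played by χ²OTH above).

```python
# Minimal sketch of chi-squared monitoring of response variability.
# Simulated data; the exact statistic in the thesis may differ.
import numpy as np
from scipy import stats

def chi2_statistic(samples, baseline_var):
    """(n-1) * s^2 / sigma0^2, chi-squared with n-1 dof under normality."""
    n = len(samples)
    return (n - 1) * np.var(samples, ddof=1) / baseline_var

rng = np.random.default_rng(1)
baseline_var = 1.0                            # variance of the undamaged state
healthy = rng.normal(0.0, 1.0, size=50)       # fresh responses, undamaged
damaged = rng.normal(0.0, 2.0, size=50)       # increased variability after damage

# Threshold: 99th percentile of chi-squared with n-1 degrees of freedom.
threshold = stats.chi2.ppf(0.99, df=len(healthy) - 1)

# Normality is assumed while the statistic stays below the threshold;
# exceedance signals abnormality and triggers damage quantification.
print(chi2_statistic(healthy, baseline_var), threshold)
print(chi2_statistic(damaged, baseline_var), threshold)
```

In the continuous-monitoring setting the abstract describes, each new batch of measured data would produce one such statistic, and exceedances would then be matched against model-predicted values to estimate damage extent.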
6

Reducing Internal Theft and Loss in Small Businesses

Luster, Eric L 01 January 2018 (has links)
Every year, several documented data breaches happen in the United States, resulting in the exposure of millions of electronic records. The purpose of this single-case study was to explore strategies some information technology managers used to monitor employees and reduce internal theft and loss. The population for this study consisted of 5 information technology managers who work within the field of technology in the southwestern region of the United States. Participants were selected using purposeful sampling. The conceptual framework for this study included elements from information and communication boundary theories. Data were collected from semistructured interviews, company standard operating procedures, and policy memorandums, which provided detailed information about technology managers' experiences with data security. The collected data were transcribed, member checked, and triangulated to validate credibility and trustworthiness. Two themes emerged from data analysis: the development of policies, procedures, and standards on internal theft and loss, and the use of technology-driven systems to monitor employees and control theft and loss. Technology-based interventions allow leaders within an organization to protect the integrity of systems and networks while monitoring employee actions and behaviors. Study findings could be used by leaders of business organizations to identify and respond to theft and fraud in the workplace. Business leaders may also be able to use study findings to develop employee monitoring programs that help to prevent the loss of both organizational and customers' data, enhancing public trust as a potential implication for positive social change.
7

Product Information Management - bohatství ukryté v datech o produktu / Product Information Management - the fortune hidden in product data

Bort, Tomáš January 2008 (has links)
Supply exceeding demand and very hard competitive conditions are nowadays the main features of the majority of sectors. A successful company is one that is able to satisfy specific customers' needs, that cooperates efficiently with its suppliers throughout the whole supply chain, and that is able to speed up in-house information exchange. Thus the company has to constantly seek new and innovative solutions, which is not possible without standardization and automation of business processes. This master's thesis is dedicated to one of the possible solutions -- Product Information Management (PIM). Since it is intended for business managers (without deep IT knowledge), it begins by answering the question of why it is so important to know and manage master data. It specializes in managing product data, brings a comprehensive overview of it, and identifies the advantages and drawbacks of implementation as well as financial and organizational impacts. The following chapter deals with a simplified yet applicable approach to data management analysis (with emphasis on PIM) and, based on research, notes the main mistakes of implementation. In addition to an overview of the main vendors of PIM solutions, it presents the latest trends in PIM. Besides internal data synchronization, the thesis analyses several product standards -- the fundamental step towards external data synchronization, the key topic of the practical part. The whole thesis is conceived to provide an organization with a simple yet compact and therefore very effective tool for insight into master product data and thus to help it gain a competitive advantage.
