131 |
ISA 315R: In Light of the Revision: A qualitative study of auditors' work on risk identification and risk assessment. Ek, Sara, Lundberg, Simon January 2023 (has links)
According to the industry body FAR (2006), auditing is about independently assisting companies with professional assessments of the financial reporting they prepare. The purpose is to provide the company's stakeholders with reliable information to use as a basis for their decision-making. An auditor's principal task is therefore to assess whether a company's financial reporting contains material misstatements. Messier et al. (2016, p. 17) make this point, stressing that auditors must understand the risks tied to a company's operations, and how those risks are managed, in order to assess the risks of material misstatement in the financial reporting as well as possible. To know how to identify and assess risks of material misstatement in a company's financial reporting, auditors rely on an auditing standard for risk assessment as a framework and guide. This standard, issued by the international standard-setting body IAASB, is called ISA 315 and is the subject of this study. The standard was revised in 2019. According to Alfredsson and Irle (2022), the main purpose of the revision was to give auditors clearer guidance, a decision prompted by the technological development that has transformed the economic environment. This study examines the revised risk-assessment standard, referred to as ISA 315R. Its main aim is to investigate whether the revised standard has changed how authorised auditors and audit associates assess the risks of material misstatement in an audit, with a particular focus on IT-related risks.
Using a qualitative method, the study finds that risk assessment is unchanged after ISA 315R, but that the audit process has gained additional mandatory working documents. The revision has therefore not affected how authorised auditors and audit associates assess material risks in audits. Understanding the client and its business was as important before the standard was revised as it is now.
|
132 |
Security policies at Swedish companies: How compliance is ensured among employees. Cederstrand, Christopher, Berg, Elias January 2023 (has links)
In the digital world we live in, we are more connected and handle more information than ever before. In the wrong hands, that information can lead to catastrophic consequences for today's companies. One frequently discussed aspect is the employees, who are often governed by company-specific policies meant to keep the company on a safe path. Employee compliance is not as simple as telling staff to follow the rules; something more is needed to motivate employees and thereby create a secure environment to work in. This report presents a study of how companies in Sweden work to raise compliance with company policies, focusing on information security and information-security awareness. Using grounded theory combined with qualitative, semi-structured interviews with one representative from each company responsible for its information security, we studied which methods companies use to increase compliance with their policies. The results show that training is a key method, but there are several variations of training, and companies differ in how they inform employees about the risks connected to the policies. Another finding of this research is that responsibility for training and policies is similar across the companies. The report concludes with recommendations that could help companies increase compliance with their IT policies, based on the methods that emerged from the interviews conducted in this study.
|
133 |
Approaches to DIVA vaccination for fish using infectious salmon anaemia and koi herpesvirus disease as models. Monaghan, Sean J. January 2013 (has links)
The expanding aquaculture industry continues to encounter major challenges in the form of highly contagious aquatic viruses. Control and eradication measures targeting the most lethal and economically damaging virus-induced diseases, some of which are notifiable, currently involve ‘stamping out’ policies and surveillance strategies. These approaches to disease control are performed through mass-culling followed by restriction in the movement of fish and fish products, resulting in considerable impacts on trade. Although effective, these expensive, ethically complex measures threaten the sustainability and reputation of the aquatic food sector, and could possibly be reduced by emulating innovative vaccination strategies that have proved pivotal in maintaining the success of the terrestrial livestock industry. DIVA ‘differentiating infected from vaccinated animal’ strategies provide a basis to vaccinate and contain disease outbreaks without compromising ‘disease-free’ status, as antibodies induced specifically to infection can be distinguished from those induced in vaccinated animals. Various approaches were carried out in this study to assess the feasibility of marker/DIVA vaccination for two of the most important disease threats to the global Atlantic salmon and common carp/koi industries, i.e. infectious salmon anaemia (ISA) and koi herpesvirus disease (KHVD), respectively. Antibody responses of Atlantic salmon (Salmo salar L.), following immunisation with an ISA vaccine, administered with foreign immunogenic marker antigens (tetanus toxoid (TT), fluorescein isothiocyanate (FITC) and keyhole limpet hemocyanin (KLH)) were assessed by antigen-specific enzyme linked immunosorbent assay (ELISA). Although antibodies were induced to some markers, these were unreliable and may have been affected by temperature and smoltification. 
Detectable antibodies to ISAV antigen were also largely inconsistent despite low serum dilutions of 1/20 being employed for serological analysis. The poor antibody responses of salmon to the inactivated ISA vaccine suggested that DIVA vaccination is not feasible for ISA. A similar approach for KHV, utilising green fluorescent protein (GFP) as the marker, similarly failed to induce sufficiently detectable antibody responses in vaccinated carp (Cyprinus carpio L.). However, as high anti-KHV antibody titres were obtained with an inactivated KHV vaccine (≥1/3200), alternative approaches were carried out to assess the feasibility of DIVA vaccination for carp. Investigations of early KHV pathogenesis in vivo and antigen expression kinetics in vitro (0-10 days post infection (dpi)) provided valuable data for the diagnostics necessary for DIVA surveillance strategies. Following viral infection, molecular methods were shown to be the most effective approach for early detection of KHV infected fish prior to sero-conversion, during which time antibodies are not detectable. An experimental immersion challenge with KHV, however, revealed complications in molecular detection during early infection. The KHV DNA was detected in external biopsies of skin and gills, but also internally in gut and peripheral blood leukocytes ≤ 6 hours post infection (hpi), suggesting rapid virus uptake by the host. The gills and gut appeared to be possible portals of entry, supported by detection of DNA in cells by in situ hybridisation (ISH). However, many false negative results using organ biopsies occurred during the first 4 dpi. The gills were the most reliable lethal biopsy for KHV detection by various polymerase chain reaction (PCR) assays, with a PCR targeting a glycoprotein-gene (ORF56) and a real-time PCR assay being the most sensitive of the 7 methods investigated. 
Importantly, non-lethal mucus samples reduced the number of false negative results obtained by all KHV PCR assays during the earliest infection stages, with large amounts of viral DNA being detected in mucus (up to 80,000 KHV DNA genomic equivalents per 200 μL). KHV DNA was consistently detected in the mucus as a consequence of virus being shed from the skin. Determining the expression kinetics of different viral structural proteins can be useful for DIVA serological tests. Analysis of KHV antigen expression in tissues by immunohistochemistry and indirect fluorescent antibody test was inconclusive; therefore 2 novel semi-quantitative immunofluorescence techniques were developed for determining KHV antigen expression kinetics in susceptible cell lines. During the course of KHV infection in vitro, a greater abundance of capsid antigen was produced in infected cells compared to a glycoprotein antigen (ORF56), as determined by detection with antigen-specific monoclonal antibodies (MAbs). The capsid antigen was characterised as a ~100 kDa protein by SDS-PAGE and identified as a product of KHV ORF84 by matrix-assisted laser desorption ionisation time-of-flight mass spectrometry (MALDI-TOF/TOF MS). This antigen was subsequently detected by Western blot in the serum of >25% of KHV infected/exposed carp (6/17), as well as in carp vaccinated with a live attenuated vaccine (3/4), but not with an inactivated vaccine (0/7), making it a potential DIVA target for an inactivated vaccine. Attempts were made to improve the sensitivity of KHV serological testing by taking advantage of recombinant proteins specific for KHV (CyHV-3), rORF62 and rORF68, and by eliminating interference from cross-reacting antibodies to carp pox (CyHV-1). These proteins successfully reacted with anti-KHV antibodies. The feasibility of DIVA strategies for KHVD was determined using these recombinant antigens to coat ELISA plates.
Differential antibody responses were detected from carp sera to an internal virus tegument protein (rORF62) and external region of a transmembrane protein (rORF68). Fish vaccinated with an inactivated vaccine produced significantly lower antibody responses to rORF62 than to rORF68, whereas infected, exposed and live attenuated vaccinated fish recognised both proteins allowing differentiation between vaccinated and infected carp. However, the sensitivity of the assay was limited, possibly by high levels of natural antibodies detected at the relatively low serum dilutions (1/200) used. As the capsid antigen (ORF84) and tegument protein (ORF62) are derived from internal KHV structural proteins, they induce non-neutralising antibodies, which may be useful for DIVA strategies. Such antibodies are longer lasting than neutralising antibodies and often comprise the majority of fish anti-viral antibodies. This was noted in a fish surviving experimental challenge, which had an antibody titre of 1/10,000, but neutralising titre of 1/45. Such antigens may therefore hold potential for developing effective serological diagnostic tests for KHV and provide the potential for DIVA strategies against KHVD. Natural antibodies will, however, continue to present a challenge to the development of sensitive and reliable KHV serological tests, and hence the application of DIVA strategies.
|
134 |
MIGRATING FROM A VAX/VMS TO AN INTEL/WINDOWS-NT BASED GROUND STATION. Penna, Sergio D., Rios, Domingos B. October 1999 (has links)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Upgrading or replacing production systems is always a very resource-consuming task, in particular if the systems being replaced are quite specialized, such as those serving any Flight Test Ground Station. In the recent past a large number of Ground Station systems were based on Digital's VAX/VMS architecture. The computer industry then expanded very fast, and by 1990 real-time PCM data processing systems totally dependent on hardware and software designed for IBM-PC compatible micro-computers were becoming available. A complete system replacement in a typical Ground Station can take from one to several years to become a reality, depending on how complex the original system is, how complex the resulting system needs to be, how many resources are available to support the operation, how soon the organization needs it, etc. This paper reviews the main concerns encountered during the replacement of a typical VAX/VMS-based Ground Station with an Intel/Windows NT-based one. It covers the transition from original requirements to totally new requirements, from mini-computers to micro-computers, from DMA to high-speed LAN data transfers, while conserving some key architectural features. This 8-month development effort will expand EMBRAER's capability in acquiring, processing and archiving PCM data in the next few years at a lower cost, while preserving compatibility with old legacy flight test data.
|
135 |
Driver assistance in degraded weather conditions through consideration of road risk. Gallen, Romain 10 December 2010 (has links) (PDF)
Degraded weather conditions such as rain and fog temporarily alter driving conditions. On secondary roads, the excess accident rate observed in these conditions reflects poor adaptation of driving behaviour, speed in particular. We propose a methodology for estimating a reference speed along a route, together with a method based on road-risk analysis for modulating that speed under degraded grip and visibility conditions. Our road-risk estimate relies on replaying scenarios, drawn from accident studies, with a driver-vehicle-infrastructure interaction model. We take into account static characteristics of the infrastructure and its environment, as well as the weather conditions estimated in real time in the vehicle's immediate surroundings. We show that tools exist to feed the models with these static characteristics. Finally, we present camera-based methods for detecting and characterising degraded weather conditions online. In particular, we detail our contribution: a method for detecting and characterising fog at night. It is a dual system relying on the detection of halos around light sources and on the detection of the backscattered veil from the vehicle's headlamps. We finally propose a static camera-based method for calibrating the system in real-world conditions.
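The two night-fog cues described in the abstract (halos around light sources, and the backscattered veil from the headlamps) both come down to light spreading more widely around bright points. The toy metric below, computed on synthetic images with a Gaussian stand-in for a light source, only illustrates that idea; it is not the thesis's actual detector:

```python
import numpy as np

def make_source(size=101, sigma=2.0):
    """Synthetic night image: a single light source with Gaussian falloff.
    A larger sigma mimics the wider halo that fog scatters around the source."""
    y, x = np.mgrid[:size, :size]
    c = size // 2
    return np.exp(-((x - c) ** 2 + (y - c) ** 2) / (2.0 * sigma ** 2))

def halo_halfwidth(img):
    """Effective radius of the region above half the peak intensity: a wide
    halo (fog) gives a larger half-width than a sharp source (clear night)."""
    area = (img >= 0.5 * img.max()).sum()
    return float(np.sqrt(area / np.pi))

clear = make_source(sigma=2.0)   # sharp source on a clear night
foggy = make_source(sigma=8.0)   # same source seen through fog
print(halo_halfwidth(clear), halo_halfwidth(foggy))
```

A real detector would have to find the light sources first and separate halo growth from headlamp backscatter; this sketch only shows why halo width is a usable fog signal.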
|
136 |
Contribution to modelling lengths of stay at Grenoble University Hospital (CHU). Delhumeau, Cécile 06 December 2002 (has links) (PDF)
This thesis proposes a methodology for identifying the diagnosis-related groups (groupes homogènes de malades, GHM) of Grenoble University Hospital (CHU) whose length of stay (LOS) deviates from the national reference, and for detecting possible groups of outliers (patients with extreme LOS) within those GHMs. The stakes are high: long stays tie up beds beyond what the financial reimbursement for the treated pathology covers. It is therefore important to identify these GHMs, which drive up the cost of the ISA point (Indice Synthétique d'Activité), the activity unit used by the Programme de Médicalisation des Systèmes d'Information to measure hospital activity, allocate the corresponding budget, and push for optimal productivity. The LOS distributions of the Grenoble GHMs were compared with those of their counterparts in the national database. For each GHM, a profile was built from a quartile-by-quartile comparison of the national and Grenoble LOS distributions. Statistical classification methods (principal component analysis, hierarchical clustering, discriminant analysis, and mixture models) were used to flag economically costly GHMs easily and reliably. Empirically, the Grenoble profiles appear to fall into 9 categories. Hierarchical clustering identifies 4 categories of GHMs, including one of 16 GHMs in which a third of the patients contribute heavily to the hospital's ISA-point overrun, and for which an economically worthwhile corrective action would be easy to implement. The mixture model structures the GHMs into 3 categories and validates the classification obtained by the multidimensional approach.
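The classification pipeline sketched in the abstract (quartile-gap profiles, principal component analysis, then hierarchical clustering) can be imitated roughly as follows. The synthetic data, the two-group structure, and the cluster count used here are illustrative assumptions, not the thesis's results:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Illustrative profiles: for each diagnosis group (GHM), the gap between the
# local and national length-of-stay quartiles (Q1, median, Q3), in days.
shorter = rng.normal([-1.0, -1.5, -2.0], 0.3, size=(20, 3))  # below national
longer = rng.normal([2.0, 3.0, 5.0], 0.5, size=(16, 3))      # costly groups
profiles = np.vstack([shorter, longer])

# Standardize, then reduce with PCA (computed via SVD) before clustering.
z = (profiles - profiles.mean(axis=0)) / profiles.std(axis=0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
scores = z @ vt[:2].T  # projection onto the first two principal components

# Ward hierarchical clustering on the PCA scores; the thesis cut its
# dendrogram into 4 categories, while 2 match this synthetic data.
labels = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
print(sorted(set(labels)))
```

The real study additionally used discriminant analysis and mixture models to cross-validate the grouping, which this sketch omits.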
|
137 |
Detecting financial reporting fraud: the impact and implications of management motivations for external auditors: evidence from the Egyptian context. Kassem, Rasha January 2016 (has links)
Financial reporting fraud is a concern for investors, regulators, external auditors, and the public. Although the responsibility for fraud detection lies with management and those charged with governance, external auditors are likely to come under scrutiny if fraud scandals come to light. Despite audit regulators' efforts in fighting fraud, evidence from prior literature revealed that external auditors still need guidance in assessing and responding to fraud risks. Hence the current study aims to help external auditors properly assess and respond to the risk of financial reporting fraud in an effort to increase the likelihood of detecting it. To achieve this, the current study sought to explore the significance of various fraud factors in assessing the risks of financial reporting fraud and examined how external auditors could assess these fraud factors. The current study also explored the likely motivations behind management fraud, the impact of management motivations on the financial statements, and how external auditors could assess that impact. The data for the current study was collected from external auditors working at various audit firms in Egypt via mixed research methods, namely an online questionnaire and semi-structured interviews. The findings revealed that management motives are the most significant factor in assessing the risk of financial reporting fraud. Hence the current study suggests that an external audit should be viewed in terms of management motivations rather than just the audit of financial statement figures and disclosures. The current study offers detailed guidance to external auditors in this area. 
The findings also revealed that management integrity is a significant factor in assessing the risk of financial reporting fraud, and that rationalisation of fraud should be assessed as part of management integrity rather than as a separate fraud risk factor. The current study found that fraud perpetrators' capabilities are as significant as the opportunity to commit fraud, yet they are currently ignored by the auditing standards and should therefore be assessed as part of the opportunity to commit fraud. The current study was the first to explore financial reporting fraud and the extent to which external auditors comply with ISA 240 in the Egyptian context. The current study offered recommendations to external auditors, audit firms, audit regulators, and the Egyptian government on how to combat financial reporting fraud. Potential areas for future research were also identified.
|
138 |
Automated Verification in HW/SW Co-design. Charvát, Lukáš Unknown Date (has links)
This dissertation proposes new hardware verification techniques optimised for use in HW/SW co-design. In this development style, hardware and software are developed in parallel to speed up the delivery of new systems. Current microprocessor design tools built around this style typically let developers check their design with various simulation techniques and/or so-called functional verification. The common drawback of these approaches is that they only search for bugs, so the resulting product may still contain undiscovered non-trivial defects. For this reason, the deployment of formal methods has become increasingly desirable in recent years. Unlike the bug-hunting approaches above, formal verification aims to deliver a rigorous proof that the system really satisfies the required properties. Although considerable progress has been made in this area, current formal approaches are far from being able to fully automatically check all relevant properties of a design without significant, and often costly, human involvement in the verification process. This thesis addresses the automation problem by focusing on verification techniques that deliberately trade some precision and generality for full automation (for example, by eliminating the need to hand-craft environment models). The work also targets the efficiency of the proposed techniques and their ability to give continuous feedback about the verification process (for example, by reporting the current state of coverage). Particular attention is paid to formal methods that check the equivalence of microprocessor designs at different levels of abstraction. These designs may differ in how program instructions are processed internally, but, viewed externally (for example, through the contents of the programmer-visible registers), their behaviour when executing the same input program must be identical. The thesis also addresses methods for verifying the correctness of mechanisms that prevent data and control hazards in an instruction pipeline. All methods described in this work have been implemented in several tools, and applying these tools to the verification of non-trivial processor designs yielded promising experimental results.
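The kind of equivalence the abstract describes, comparing an instruction-accurate model against a pipelined model through the programmer-visible registers, can be illustrated with a toy sketch. The miniature accumulator machine, its three instructions, and the random-program comparison below are all invented for illustration; the thesis's actual techniques are formal and exhaustive, whereas this sketch merely samples programs:

```python
import random

def step(regs, instr):
    """Execute one instruction of the toy 3-instruction ISA on 8-bit registers."""
    op, d, s = instr
    if op == "add":
        regs[d] = (regs[d] + regs[s]) & 0xFF
    elif op == "mov":
        regs[d] = regs[s]
    elif op == "xor":
        regs[d] ^= regs[s]

def run_isa(program, regs):
    """Instruction-accurate reference model: one instruction at a time."""
    regs = list(regs)
    for instr in program:
        step(regs, instr)
    return regs

def run_pipelined(program, regs):
    """Two-stage model: fetch runs one cycle ahead of execute. Externally
    (through the programmer-visible registers) it must match the reference."""
    regs = list(regs)
    fetched = None
    for instr in list(program) + [None]:  # trailing None drains the pipeline
        if fetched is not None:
            step(regs, fetched)
        fetched = instr
    return regs

# Cheap equivalence check over random programs: compare final register files.
random.seed(0)
ops = ["add", "mov", "xor"]
for _ in range(200):
    prog = [(random.choice(ops), random.randrange(4), random.randrange(4))
            for _ in range(6)]
    start = [random.randrange(256) for _ in range(4)]
    assert run_isa(prog, start) == run_pipelined(prog, start)
print("models agree on all sampled programs")
```

A formal equivalence check would replace the random sampling with a symbolic proof over all programs and initial states, and would also have to model the hazard-resolution logic a real pipeline needs.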
|
139 |
Continuous/Batch Production in an Industry 4.0 Environment. Ručka, Petr January 2017 (has links)
This master's thesis, Continuous/Batch Production in an Industry 4.0 Environment, deals with software and principles designed for discrete manufacturing and their application to continuous and batch production. First, the functions and properties of individual programs for discrete manufacturing were described; this knowledge was then related to continuous and batch production. On that basis, a program for processing liquid materials using two tanks was created, aimed at customisation of the final product.
|
140 |
The Spatial and Temporal Distribution of the Metal Mineralisation in Eastern Australia and the Relationship of the Observed Patterns to Giant Ore Deposits. Robinson, Larry J. Unknown Date (has links)
The mineral deposit model (MDM) introduced here is the product of a trans-disciplinary study based on Complexity and General Systems Theory. Both investigate the abstract organization of phenomena, independent of their substance, type, or spatial or temporal scale of existence. The focus of the research has been on giant, hydrothermal mineral deposits. They constitute <0.001% of the total number of deposits yet contain 70-85% of the world's metal resources. Giants are the definitive exploration targets: they are more profitable to exploit and less susceptible to fluctuations of the market. Consensus has it that the same processes that generate small deposits also form giants, but that those processes are longer, vaster, and larger. Heat is the dominant factor in the genesis of giant mineral deposits. A paleothermal map shows where the vast heat required to generate a giant has been concentrated in a large space, and even allows us to deduce the duration of the process. Generating a paleothermal map acceptable to the scientific community requires reproducibility. Experimentation with various approaches to pattern recognition of geochemical data showed that the AUTOCLUST algorithm not only gave reproducibility but also gave the most consistent, most meaningful results. It automatically extracts boundaries based on Voronoi and Delaunay tessellations. The user does not specify parameters; however, the modeller does have tools to explore the data. This approach is near ideal in that it removes much of the human-generated bias. The algorithm reveals the radial spatial distribution of gold deposits in the Lachlan Fold Belt of southeastern Australia at two distinct scales, with repeating patterns every ~80 km and ~230 km. Both scales of patterning are reflected in the geology. The ~80 km patterns are nested within the ~230 km patterns, revealing a self-similar geometrical relationship. It is proposed that these patterns originate from Rayleigh-Bénard convection in the mantle.
At the Rayleigh number appropriate for the mantle, the stable planform is the spoke pattern, in which hot mantle material moves upward near the centre of the pattern and outward along the radial arms. Discontinuities in the mantle, Rayleigh-Bénard convection in the mantle, and the spatial distribution of giant mineral deposits are correlative. The discontinuities in the Earth act as platforms from which Rayleigh-Bénard convection can originate. Shallow discontinuities give rise to plumelets, which manifest at the crust as repeating patterns ranging from ~100 to ~1,000 km in diameter. Deeper discontinuities give rise to plumes, which become apparent at the crust as repeating patterns ranging from >1,000 to ~4,000 km in diameter. The deepest discontinuities give rise to superplumes, which become detectable at the crust as repeating patterns ranging from >4,000 to >10,000 km in diameter. Rayleigh-Bénard convection concentrates the reservoir of heat in the mantle into specific locations in the crust, thereby providing the vast heat requirements for the processes that generate giant, hydrothermal mineral deposits. The radial spatial distribution patterns observed for gold deposits are also present for base metal deposits. At the supergiant Broken Hill deposit in far western New South Wales, Australia, the higher-temperature Broken Hill-type deposits occur in a radial pattern while the lower-temperature deposits occur in concentric patterns. The supergiant Broken Hill deposit occurs at the very centre of the pattern. If the supergiant Broken Hill deposit were buried beneath alluvium, water or younger rocks, it would now be possible to predict its location with accuracy measured in tens of square kilometres. This predictive accuracy is desired by every exploration manager of every exploration company. The giant deposits at Broken Hill, Olympic Dam, and Mount Isa all occur on the edge of an annulus.
There are at least two ways of creating an annulus on the Earth's surface. One is through Rayleigh-Bénard convection and the other is through meteor impact. It is likely that only 'large' meteors (those >10 km in diameter) would have any permanent impact on the mantle. Lesser meteors would leave only a superficial scar that would be eroded away. The permanent scars in the mantle act as ‘accidental templates’ consisting of concentric and possibly radial fractures that impose those structures on any rocks that were subsequently laid down or emplaced over the mantle. In southeastern Australia, the proposed Deniliquin Impact structure has been an 'accidental template' providing a 'line-of-least-resistance' for the ascent of the ~2,000 km diameter, offshore, Cape Howe Plume. The western and northwestern radial arms of this plume have created the very geometry of the Lachlan Fold Belt, as well as giving rise to the spatial distribution of the granitic rocks in that belt and ultimately to the gold deposits. The interplay between the templating of the mantle by meteor impacts and the ascent of plumelets, plumes or superplumes from various discontinuities in the mantle is quite possibly the reason that mineral deposits occur where they do.
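The AUTOCLUST step mentioned above derives edge-length statistics from the Delaunay triangulation to remove inter-cluster edges without user-specified parameters. The sketch below keeps only the core idea (triangulate, prune unusually long edges, take connected components) and replaces AUTOCLUST's automatic local statistics with a fixed cutoff chosen for the synthetic coordinates, which are themselves invented:

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(1)
# Illustrative "deposit" coordinates: two well-separated spatial clusters.
pts = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(20, 1, (30, 2))])

tri = Delaunay(pts)
# Collect the unique edges of the triangulation.
edges = set()
for simplex in tri.simplices:
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))
edges = np.array(list(edges))
lengths = np.linalg.norm(pts[edges[:, 0]] - pts[edges[:, 1]], axis=1)

# AUTOCLUST computes per-vertex edge statistics to find this cutoff by
# itself; here a fixed threshold suited to the synthetic data stands in.
keep = lengths < 5.0
g = coo_matrix((np.ones(keep.sum()), (edges[keep, 0], edges[keep, 1])),
               shape=(len(pts), len(pts)))
n_clusters, labels = connected_components(g, directed=False)
print(n_clusters)
```

The boundaries of the surviving components are what the thesis reads as cluster boundaries; the parameter-free local statistics are what make the published algorithm reproducible across analysts.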
|