21 | Entanglement quantification and quantum benchmarking of optical communication devices. Killoran, Nathan (January 2012)
In this thesis, we develop a number of operational tests and tools for benchmarking the quantum nature of optical quantum communication devices. Using the laws of quantum physics, ideal quantum devices can fundamentally outperform their classical counterparts, or even achieve objectives which are classically impossible. Actual devices will not be ideal, but they may still be capable of facilitating quantum communication. Benchmarking tests, based on the presence of entanglement, can be used to verify whether or not imperfect quantum devices offer any advantage over their classical analogs. The general goal of this thesis is to provide strong benchmarking tools that require minimal experimental resources while offering a wide range of applicability. Another major component is the extension of existing qualitative benchmarks ('Is it quantum or classical?') to more quantitative forms ('How quantum is it?'). We provide a number of benchmarking results applicable to two main situations, namely discrete remote state preparation protocols and continuous-variable quantum device testing. The theoretical tools derived throughout this thesis are also applied to the tasks of certifying a remote state preparation experiment and a continuous-variable quantum memory.
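As a hedged aside in generic notation (standard in the entanglement-verification literature, not quoted from the thesis), the logic of an entanglement-based benchmark can be stated compactly:

```latex
% Hedged summary of the standard entanglement-based benchmark logic
% (generic notation, not taken from the thesis): a device modeled as a
% channel $\Lambda$ beats every classical (measure-and-prepare) strategy
% iff it preserves the entanglement of half of a maximally entangled state,
\[
(\Lambda \otimes \mathrm{id})\, |\Phi\rangle\langle\Phi| \ \text{entangled},
\qquad
|\Phi\rangle = \frac{1}{\sqrt{d}} \sum_{i=1}^{d} |i\rangle|i\rangle ,
\]
% since measure-and-prepare channels are exactly the entanglement-breaking
% ones, whose output on $|\Phi\rangle$ is always separable.
```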
22 | Development of benchmarks for multiagent systems: the case of event-oriented patrolling [Desenvolvimento de Benchmarks para sistemas multiagente: o caso da patrulha orientada a eventos]. ALMEIDA, Marcelo José Siqueira Coutinho de (25 February 2013)
Multiagent Systems (MAS) have proven to be an efficient approach to the study of intelligent behavior, as well as an important paradigm for the development of distributed, complex software. However, the validation of MAS products is almost always carried out through isolated tests, which compromises the scientific value of the results. One solution to this limitation is to use benchmarks: systematic, rigorous means of studying, planning, and improving techniques and theories by comparing the work developed. In the context of scientific research, benchmarks serve as validation instruments, making it possible to measure progress in areas where the criteria for success are not formal. To demonstrate their viability in MAS research, we developed a benchmark called PMAOE (Event Oriented MultiAgent Patrolling), based on the patrolling problem, which has been attracting growing attention from the MAS and Robotics communities. Since benchmarks are still developed informally, we also devised a process to help researchers conduct benchmark development in a systematic manner.
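As a hedged aside (not part of the abstract), patrolling benchmarks such as this are typically scored with an idleness metric: how long nodes go unvisited. A minimal sketch, assuming a simple visit log rather than PMAOE's actual interface:

```python
def idleness_stats(visits, nodes, horizon):
    """Average and worst idleness of patrolled nodes.

    visits:  list of (time, node) pairs, one per agent visit.
    nodes:   all graph nodes that must be patrolled.
    horizon: total simulated time.
    """
    last_visit = {n: 0.0 for n in nodes}  # convention: every node "visited" at t=0
    gaps = []
    for t, node in sorted(visits):
        gaps.append(t - last_visit[node])  # idleness erased by this visit
        last_visit[node] = t
    # idleness still accumulating at the end of the run
    tail = [horizon - t for t in last_visit.values()]
    worst = max(gaps + tail) if gaps or tail else 0.0
    avg = sum(gaps) / len(gaps) if gaps else float("inf")
    return avg, worst

# toy usage: two nodes, one agent bouncing between them
print(idleness_stats([(1.0, "a"), (2.0, "b"), (3.0, "a")], ["a", "b"], 4.0))
```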
23 | Development of different technical, economic and financial benchmarks as management tool for intensive milk producers on the Highveld of South Africa. Maree, David Andreas (22 April 2008)
Extensive studies have been done in the various fields of dairy production, such as reproduction, herd and animal health, feeding and nutrition, and the economics of milk production. This study aims to incorporate the standards or benchmarks set out in these studies in order to identify different technical and financial benchmarks that can be used as a management tool by intensive milk producers.

Benchmarking can be described as a process whereby a firm (farm) compares its processes, results, or actions against those of the competitors with the best practice in the industry. To become competitive, a farm business must be able to compare (benchmark) itself against others, preferably against others that perform better, and then make adjustments according to the comparison. Benchmarking is therefore a continuous process of comparing and adjusting where necessary.

The dairy industry in South Africa has changed dramatically since deregulation in the early 1990s, moving from a highly regulated one-channel market to a completely free-market system. Farmers therefore had to become more competitive, both locally and internationally. Three types of production system are used across the six production regions in South Africa, which fall into two main areas: the coastal regions and the Highveld. Production in the coastal regions is normally pasture-based, with additional concentrate feeding in some cases. On the Highveld and in the Western Cape, production is based on a total mixed ration (TMR), where cows are fed the complete ration in an intensive production system.

Benchmarks were identified for herd health and reproduction, feeding and nutrition, and economic and financial performance, together with additional general benchmarks for bio-security and capacity utilisation. Herd health and reproduction covers three main areas: fertility performance, udder health, and general herd health. Nutrition and feeding comprises benchmarks for intake, nutrient requirements, body condition scoring, calf and heifer feeding, and general feeding. The economic and financial performance of the dairy farm business can be evaluated against benchmarks for costs, solvency, liquidity, profitability, debt repayment, and capital efficiency.

When a dairy farm is evaluated, all the norms or benchmarks must be viewed holistically: every parameter, and therefore every benchmark, is interrelated and cannot be judged or applied in isolation. Feeding, for example, affects production and reproduction, and therefore financial performance.

This study focuses only on benchmarks for intensive milk producers, but it is recommended that it be extended to include benchmarks for pasture-based milk production as well. Since the dairy industry operates in a free-market system and South Africa is an open economy, it is important to be globally competitive, which can only be achieved if local producers benchmark themselves against international standards. Benchmarks can only be used if they are quantified; it is therefore recommended that quantified benchmarks be published for the dairy farmer to use in his evaluations. The Nominal Group Technique worked well for establishing the parameters and their benchmarks, and farmers can also benefit from this technique.
All parties related to the dairy farm, such as the financial consultant or agricultural economist, the animal nutritionist, the veterinarian, and other input suppliers, can form a specialist group that evaluates the performance of the dairy together with the producer or herd manager. This specialist group can then recommend adjustments after discussing the effects on every aspect of production. / Dissertation (MSc (Agric): Agricultural Economics)--University of Pretoria, 2008. / Agricultural Economics, Extension and Rural Development / MSc(Agric)
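As a hedged sketch (the parameter names and threshold values below are hypothetical, not the thesis's published norms), quantified benchmarking of this kind reduces to comparing measured farm parameters against target values:

```python
# Hypothetical benchmark table: each entry is (target value, direction),
# where "max" means the farm should be at or below the target.
BENCHMARKS = {
    "calving_interval_days": (400, "max"),  # illustrative values only,
    "somatic_cell_count_k":  (250, "max"),  # not the thesis's actual norms
    "milk_per_cow_kg_day":   (28,  "min"),
}

def evaluate_farm(farm):
    """Return (parameter, measured, target, verdict) tuples for one farm."""
    report = []
    for param, (target, direction) in BENCHMARKS.items():
        value = farm[param]
        ok = value <= target if direction == "max" else value >= target
        report.append((param, value, target, "meets" if ok else "misses"))
    return report

for row in evaluate_farm({"calving_interval_days": 410,
                          "somatic_cell_count_k": 180,
                          "milk_per_cow_kg_day": 30}):
    print("%-24s %6s vs %6s -> %s" % row)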
24 | Essays on Audit Report Lag. Tanyi, Paul N (14 June 2011)
Audit report lag remains an issue of significant interest to regulators, financial statement users, public companies, and auditors. The SEC has recently acted to reduce the deadline for filing annual and quarterly financial statements. Such focus on audit report lag arises because, as noted by the Financial Accounting Standards Board, relevance and reliability are the two primary qualities of accounting information, and to be relevant, information has to be timely.
In my dissertation, I examine three issues related to the audit report lag. The first essay focuses on the association between audit report lag and the meeting or beating of earnings benchmarks. I do not find any association between audit report lag and just meeting or beating earnings benchmarks. However, I find that a longer audit report lag is negatively associated with the probability of using discretionary accruals to meet or beat earnings benchmarks. These results suggest that audit effort, for which audit report lag is a proxy, reduces earnings management.
The second part of my dissertation examines the association between types of auditor changes and audit report lag. I find that the resignation of an auditor is associated with a longer audit report lag than the dismissal of an auditor. I also find a significant positive association between the disclosure of a reportable event and audit report lag.
The third part of my dissertation investigates the association between senior executive changes and audit report lag. I find that audit report lag is longer when client firms have a new CEO or CFO. Further, I find that audit report lag is longer when the new executive is someone from outside the firm. These results provide empirical evidence about the importance of senior management in the financial reporting process.
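As a hedged sketch of the kind of association test the first essay describes (the variable names and simulated data below are hypothetical, not the dissertation's actual specification or sample), one can regress an indicator for meeting or beating the analyst forecast on audit report lag and controls:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Hypothetical firm-year data: audit report lag in days, log assets as a control.
lag_days = rng.normal(60, 15, n)
log_assets = rng.normal(7, 1.5, n)
meet_beat = (rng.random(n) < 0.3).astype(int)  # 1 if EPS just meets/beats forecast

# Logit of meet/beat on audit report lag; the lag coefficient captures the
# association of interest.
X = sm.add_constant(np.column_stack([lag_days, log_assets]))
model = sm.Logit(meet_beat, X).fit(disp=0)
print(model.summary(xname=["const", "audit_report_lag", "log_assets"]))
```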
25 | Is Revenue Management to Meet Earnings Benchmarks Informative? [Je nadhodnocení účetních výnosů pro překonání očekávání finančních analytiků informativní?]. Habětínek, Jan (January 2020)
We propose and empirically test a new hypothesis that managers rationally choose between specific channels of earnings management to meet earnings benchmarks. Prior research documents that managers are ready to interfere with the neutrality of the financial reporting process in order to report earnings above zero, earnings above last year's earnings, and earnings above analysts' forecasts. However, there is controversy over whether this earnings management to meet or beat earnings benchmarks is intended to distort investors' view by delaying the disclosure of bad news, or to communicate managers' private information about the firm's strong future performance. We argue that the credibility of the earnings management signal crucially depends on the cost of its imitation. As revenue management is more costly to imitate than cost management, managers who intend to send a credible signal about their firm's future performance are likely to boost revenues rather than depress costs. To test this prediction, we use a recently developed model of discretionary revenues that is arguably more powerful in detecting earnings management than traditional techniques. The empirical results are consistent with our predictions for the most important earnings benchmark - the consensus of analysts'...
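The abstract does not name the discretionary-revenue model it uses; as a hedged sketch of the general idea behind such models (e.g., Stubben-style specifications), discretionary revenue can be proxied by the residual from regressing the change in receivables on the change in revenues. All data below are simulated:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Hypothetical firm-year panel: changes scaled by lagged total assets.
d_revenue = rng.normal(0.05, 0.10, n)                        # change in annual revenues
d_receivables = 0.15 * d_revenue + rng.normal(0, 0.02, n)    # change in accounts receivable

# Fit d_receivables = a + b * d_revenue; the residual proxies for the
# portion of receivables growth not explained by revenue growth,
# i.e. (in Stubben-style models) discretionary revenue.
X = np.column_stack([np.ones(n), d_revenue])
coef, *_ = np.linalg.lstsq(X, d_receivables, rcond=None)
resid = d_receivables - X @ coef
print("slope on revenue change: %.3f" % coef[1])
print("std of discretionary-revenue proxy: %.4f" % resid.std())
```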
26 | Modeling Elevator System with Coloured Petri Nets. Assiri, Mohammed (January 2015)
A fairly general model of the elevator system is presented, with Coloured Petri Nets (CPN) and CPN Tools adopted as the modeling formalism and tool. The model, which is independent of the number of floors and elevators, covers the different stages of the elevator system in substantial detail and supports simulation-based analysis of the different algorithms and rules that govern real elevator systems. The results demonstrate the applicability of the model in various situations, as well as the expressive power and convenience of CPN. / Thesis / Master of Applied Science (MASc)
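Actual CPN models are built in CPN Tools with ML inscriptions; purely as a hedged toy (not the thesis's model), the colored-token idea, where tokens carry data and transitions fire only when enabled, can be sketched as follows:

```python
# Toy coloured-Petri-net flavour of an elevator call: places hold tokens
# that carry colour (data); a transition fires only when it is enabled.
places = {
    "waiting_calls": [{"floor": 3, "dir": "up"}, {"floor": 7, "dir": "down"}],
    "idle_cars":     [{"car": "A", "floor": 4}],
    "moving_cars":   [],
}

def dispatch(places):
    """Fire the 'assign call' transition: bind an idle car to the nearest call."""
    if not (places["waiting_calls"] and places["idle_cars"]):
        return False  # transition not enabled: an input place is empty
    car = places["idle_cars"].pop()
    # binding: choose the call minimising travel distance for this car
    call = min(places["waiting_calls"], key=lambda c: abs(c["floor"] - car["floor"]))
    places["waiting_calls"].remove(call)
    places["moving_cars"].append({**car, "target": call["floor"]})
    return True

while dispatch(places):
    pass
print(places["moving_cars"])  # car A heads to floor 3 (nearer than 7)
```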
27 | Trends in accrual quality and real activity-based earnings management in the pre- and post-Sarbanes-Oxley eras. Lynch, Nicholas Christopher (3 May 2008)
An increase in the prevalence of earnings restatements and cases of financial statement fraud in the early 21st century led to a significant loss of market capitalization and investor confidence in the attestation process. In an effort to restore such confidence, Congress passed the Sarbanes-Oxley Act (SOX) in July of 2002. The Act significantly increased the penalties for engaging in accrual activities aimed at either misleading users of the financial statements concerning the underlying economic condition of the firm or influencing contractual outcomes.

Recent literature separates earnings management into accrual and real activities. Accrual activities include the management of accounts that have not yet been realized in cash, such as receivables and payables. Real activities include actions that deviate from normal business practices, such as price discounts aimed at temporarily increasing sales, excessive inventory production aimed at lowering the cost of goods sold, and aggressive cuts to discretionary expenditures such as R&D to improve profit margins. As a result of the increased penalties for engaging in accrual activities, one would expect a relative shift from accrual activities to real activities to facilitate earnings management in the post-SOX period.

As with most academic social disciplines, the test employed in my dissertation is a joint test of the sensitivity of the tools available to detect management activities, the research design, and the presence and strength of the effect for which I am searching. This dissertation is the first to test for changes in both accrual quality and real activity-based earnings management in the post-SOX period. To test for a change in accrual quality, I utilize the model developed by Dechow and Dichev (2002), which is an appropriate measure of accrual information risk and may therefore be superior to discretionary accrual models for testing an economic effect (Francis et al. 2004). I also utilize three empirical measures of real activity-based earnings management developed by Roychowdhury (2006) to document a change in real earnings management in the post-SOX period.

The findings empirically support a change in earnings management techniques in the post-SOX period compared to the pre-SOX period. Specifically, the quality of the accruals incorporated into the accounting earnings figure has significantly increased in the post-SOX period; however, instances of earnings management using real activities have also significantly increased. These findings inform academics about the power of the tools used in academic accounting research and the overall quality of the argument. They inform users of financial statements about where to direct their attention in reading and evaluating the financials. Finally, they inform regulators, practitioners, and policy makers of the effectiveness of the law at improving the quality of accruals, and bring to their attention a potential substitution in the techniques used to manage earnings.
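As a hedged illustration of the Dechow and Dichev (2002) approach named above (the data are simulated and the scaling only indicative): working-capital accruals are regressed on past, current, and future operating cash flows, and accrual quality is measured as the standard deviation of the residuals, with lower values indicating higher quality:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 800

# Hypothetical firm-years, all variables scaled by average total assets.
cfo_prev = rng.normal(0.08, 0.05, n)
cfo_curr = rng.normal(0.08, 0.05, n)
cfo_next = rng.normal(0.08, 0.05, n)
noise = rng.normal(0, 0.03, n)  # estimation error in accruals
d_wc = 0.2 * cfo_prev + 0.5 * cfo_curr + 0.2 * cfo_next + noise

# Dechow-Dichev (2002): dWC_t = b0 + b1*CFO_{t-1} + b2*CFO_t + b3*CFO_{t+1} + e
X = np.column_stack([np.ones(n), cfo_prev, cfo_curr, cfo_next])
beta, *_ = np.linalg.lstsq(X, d_wc, rcond=None)
resid = d_wc - X @ beta
print("accrual quality (std of residuals): %.4f" % resid.std())
```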
28 | InfiniBand Network Analysis and Monitoring using OpenSM. Dandapanthula, Nishanth (19 October 2011)
No description available.
29 | Impact of Increased Cache Misses on Runtime Performance of MPX-enabled Programs. Sharma, Niti (10 June 2019)
Low-level languages like C and C++ provide high performance and direct control over memory management, but they are prone to memory safety violations. Intel introduced a new ISA extension, Memory Protection Extensions (MPX), a hardware-assisted full-stack solution to protect against such violations. While MPX efficiently prevents memory errors like buffer overflows and out-of-bounds memory accesses, it comes at the cost of high performance overheads, and cache locality worsens in MPX-protected applications.

In our research, we analyze whether there is a correlation between the increase in cache misses and the runtime degradation of programs compiled with MPX support. We analyze 15 SPEC CPU benchmark programs at different input sizes on the Windows platform, compiled with Intel's ICC compiler. For the train (medium) and ref (large) input sizes, the average performance overheads are 140% and 144%, respectively. Five of the 15 benchmarks have no runtime overhead and no change in cache misses at any level; for the remaining 10, we find a strong correlation between runtime overheads and cache-miss overheads, with correlation coefficients ranging from 0.8 to 0.36 across input sizes. Based on these findings, we conclude that there is a direct correlation between runtime overheads and the increase in cache misses. Instruction overheads and runtime overheads are also positively correlated, with coefficients ranging from 0.7 to 0.33 across input sizes. / Master of Science /

Low-level programming languages like C and C++ are primary choices for writing low-level systems software such as operating systems, virtual machines, embedded software, and performance-critical applications, but they are considered unsafe and prone to memory safety errors. Intel introduced a new technique, Memory Protection Extensions (MPX), to protect against these memory errors, yet prior research found that applications supported by MPX have increased runtimes (slowdowns). In our research, we analyze these slowdowns for different input sizes (medium and large) in 15 benchmark applications; depending on input size, the average slowdowns range from 140% to 144%. We then examine whether there is a correlation between the increase in cache misses under MPX and the slowdowns. A hardware cache is a component that stores data so that future requests for that data can be served faster; a cache miss occurs when the data requested by a component or application is not found in the cache. Whenever a cache miss happens, the processor waits for the data to be fetched from the next cache level or from main memory before it can continue to execute, and this wait influences the runtime performance of the application. Our evaluation finds that 10 of the 15 applications with increased runtimes also show increased cache misses, indicating a positive correlation between the two parameters. We also find that the increase in instruction count in MPX-protected applications correlates directly with the runtime degradation. We quantify these relationships with a statistical measure called the correlation coefficient.
30 | Evolution of the Cost Effective, High Performance Ground Systems: A Quantitative Approach. Hazra, Tushar K.; Stephenson, Richard A.; Troendly, Gregory M. (October 1994)
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California

During recent years of small-satellite space access missions, the trend has been towards designing low-cost ground control centers to maintain the space/ground cost ratio. The use of personal computers (PCs) in combination with high-speed transputer modules as embedded parallel processors provides a relatively affordable, highly versatile, and reliable desktop workstation upon which satellite telemetry systems can be built to meet the ever-growing challenge of the space missions of today and of the future.

This paper presents the feasibility of cost-effective, high-performance ground systems, together with a quantitative analysis in terms of performance, speedup, efficiency, and the compatibility of the architecture with commercial off-the-shelf (COTS) tools, and finally introduces an operational high-performance, low-cost ground system to reinforce the concept.
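The quantitative terms used above have standard definitions in parallel processing; as a hedged reminder in generic notation (not taken from the paper):

```latex
% Standard parallel-performance metrics for p processors (here, transputers):
% T_1 is the single-processor runtime, T_p the runtime on p processors.
\[
S_p = \frac{T_1}{T_p} \qquad\text{(speedup)}, \qquad
E_p = \frac{S_p}{p} \qquad\text{(efficiency, ideally close to 1)}.
\]
```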