291 |
Visits: An Essential and Effective Pillar. Pearson, Graham S., Dando, Malcolm R. 01 1900 (has links)
No description available.
|
292 |
System Level Techniques for Verification and Synchronization after Local Design Refinements. Raudvere, Tarvo January 2007 (has links)
Today's advanced digital devices are enormously complex and incorporate many functions. In order to capture the system functionality and to analyze the requirements of a final implementation more efficiently, the entry point of the system development process is pushed to a higher level of abstraction. System-level design methodologies describe the initial system model without considering lower-level implementation details, and the objective of the design development process is to introduce those details through design refinement. In practice, this kind of refinement process may entail non-semantic-preserving changes in the system description and introduce new behaviors in the system functionality. Despite these new behaviors, a model formed by the refinement may still satisfy the design constraints and realize the expected system. Due to the size of the involved models and the huge abstraction gap, direct verification of a detailed implementation model against the abstract system model is practically impossible. However, the verification task can be considerably simplified if each refinement step and its local implications are verified separately. One main idea of the Formal System Design (ForSyDe) methodology is to break the design process into smaller refinement steps that can be individually understood, analyzed and verified. The topic of this thesis is the verification of refinement steps in ForSyDe and similar methodologies. It proposes verification attributes attached to each non-semantic-preserving transformation. The attributes include critical properties that have to be preserved by the transformation. Verification properties are defined as temporal logic expressions, and the actual verification is done with the SMV model checker; mapping rules from ForSyDe models to the SMV language are provided. In addition to properties, the verification attributes include abstraction techniques that reduce the size of the models and make verification tractable. For computation refinements, the author defines the polynomial abstraction technique, which addresses the verification of DSP applications at a high abstraction level. Because of model size, the predefined properties target only the local correctness of refined design blocks, and the global influence has to be examined separately. To compensate for the influence of temporal refinements, the thesis provides two novel synchronization techniques. The proposed verification and synchronization techniques have been applied to relevant applications in the computation domain and to communication protocols. / QC 20100816
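As a purely illustrative sketch (not drawn from the thesis), a local refinement property of this kind might require that whenever the abstract block produces a valid output, the refined block eventually produces an equivalent value. In CTL, the branching-time logic accepted by the SMV model checker, such a requirement could be written as

    \mathbf{AG}\,\bigl( \mathit{valid}_{abs} \rightarrow \mathbf{AF}\,( \mathit{out}_{ref} = \mathit{out}_{abs} ) \bigr)

where valid_abs, out_abs and out_ref are hypothetical signals of the abstract and refined blocks, introduced here only for illustration.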
|
293 |
Examining Methods and Practices of Source Data Verification in Canadian Critical Care Randomized Controlled Trials. Ward, Roxanne E. 21 March 2013 (has links)
Statement of the Problem: Source data verification (SDV) is the process of comparing data collected at the source to data recorded on a Case Report Form, either paper or electronic (1), to ensure that the data are complete, accurate and verifiable. Good Clinical Practice (GCP) guidelines are vague about the required degree of SDV and provide little evidence on whether SDV affects study outcomes.
Methods of Investigation: We performed systematic reviews to establish the published evidence base for methods of SDV and to examine the effect of SDV on study outcomes. We then conducted a national survey of Canadian critical care investigators and research coordinators about their attitudes and beliefs regarding SDV. We followed this with an audit of the completed and in-progress randomized controlled trials (RCTs) of the Canadian Critical Care Trials Group (CCCTG).
Results: Systematic Review of Methods of SDV: The most commonly reported or recommended approach (10/14, 71%) was either to base the frequency of SDV on the level of risk or to conduct it early (i.e., after the first patient is enrolled). The amount of SDV recommended or reported varied from 5% to 100%. Systematic Review of the Impact of SDV on Study Outcomes: One trial showed no difference in study outcomes, and the effect could not be assessed in the other. National Survey of Critical Care Investigators and Research Coordinators: Survey data showed that 95.8% (115/120) of respondents believed that SDV was an important part of quality assurance; 73.3% (88/120) felt that academic studies should do more SDV; and 62.5% (75/120) felt that there is insufficient funding available for SDV. Audit of Source Data Verification Practices in CCCTG RCTs: In the national audit of in-progress and completed CCCTG RCTs, 9/15 (60%) included a plan for SDV and 8/15 (53%) actually conducted SDV. Of the 9 completed published trials, 44% (4/9) conducted SDV.
Conclusion: There is little published evidence on the methods of SDV or its effect on study outcomes. Based on the results of the systematic review, survey, and audit, more research is needed to build this evidence base.
|
294 |
Auditing Fair Value Measurements and Disclosures: A Case of the Big 4 Audit Firms. Ahmed, Kemal January 2013 (has links)
Problem: In today's business environment, rising demand for financial reporting and frequent changes in accounting frameworks have led to an increased focus on the reliability of Fair Value Measurement (FVM) and disclosures. The frequent changes in accounting frameworks make it exceedingly difficult for managers to measure accounting estimates accurately, and they pose an equally difficult task for auditors: how can auditors endorse and ensure the reliability and relevance of financial statements, and how can they evaluate the accuracy of the fair values presented in those statements? (IFAC, 2011, ISA 540). Purpose: The purpose of this thesis is to explore the methods and approaches used by auditors when auditing fair values, from a practical perspective. Method: A multiple case study with purely qualitative methods and an inductive approach has been adopted; data were collected through semi-structured interviews. Result: The results show that by understanding the challenges and following the phases of auditing, auditors can maintain the quality of financial reporting. Four audit phases are key to auditing FVM: understanding the client's business environment, engagement, internal control, and planning. Furthermore, the results revealed key challenges of auditing FVM and disclosures: insufficient information in the market (reliability), competence, auditors' lack of exposure to fair value audits, and the manager's leadership role and style. Moreover, as previous studies on FV have primarily relied on syntheses of academic literature, the thesis contributes knowledge to academia by using an empirical approach.
|
295 |
Application of a Bayesian network to integrated circuit tester diagnosis. Mittelstadt, Daniel Richard 06 December 1993 (has links)
This thesis describes research to implement a Bayesian belief network based expert system that solves a real-world diagnostic problem: troubleshooting integrated circuit (IC) testing machines. Several models of the IC tester diagnostic problem were developed as belief networks, and one of these models was implemented using Symbolic Probabilistic Inference (SPI). The difficulties and advantages encountered in the process are described in this thesis.
It was observed that modelling with interdependencies in belief networks simplified the knowledge engineering task for the IC tester diagnosis problem, because it avoids procedural knowledge and captures only the interdependencies among diagnostic components. Several general model frameworks evolved through knowledge engineering to capture diagnostic expertise, and these frameworks facilitated expanding and modifying the networks. However, implementation was restricted to a small portion of the model (contact resistance failures) because the probability distributions could not be evaluated fast enough to extend the code to a complete model with real-time diagnosis. Further research is recommended to create new methods, or refine existing ones, that speed up evaluation of the models created in this research. If this can be done, more complete diagnosis can be achieved. / Graduation date: 1994
|
296 |
Modelling the impact of total stress changes on groundwater flow. Dissanayake, Nalinda 29 April 2008
The research study involved using the modified FEMWATER code to investigate the impact of total stress changes on groundwater flow in the vicinity of a salt tailings pile. Total stress and pore-pressure data observed at the Lanigan and Rocanville potash-mine sites were used to assist the development of a generic FEMWATER model. The original 3-D mesh considered for the model study covers a region of 7.6 km x 7.6 km x 60 m. The simulated pile itself covers a surface area of 1.6 km x 1.6 km within the region. Symmetry of the idealized system allowed half of the system to be modelled, reducing the size of the mesh. The model was layered so that different materials could represent different hydrostratigraphic scenarios. The GMS release of the FEMWATER code (version 2.1) was modified to simulate the pore-pressure response to total stress changes caused by tailings-pile loading at the ground surface. The modified code was verified before being applied to the present study.

Long-term pore pressure generation and dissipation due to pile construction was investigated for eleven hydrostratigraphic scenarios consisting of plastic clays, stiff till and dense sand layers commonly found in Saskatchewan potash mining regions. The model was run for two distinct pile loading patterns. Model results indicated that the loading pattern has a significant influence on pore pressure generation beneath the pile. The model was initially run for a 30-year pile construction period and later simulated for 15-, 25- and 35-year construction periods to investigate the impact of loading rate. These results showed that, as expected, the peak pore water pressure head is proportional to the pile construction rate. A sensitivity analysis, carried out by changing the hydraulic conductivity of the stiff till, revealed that the lower the hydraulic conductivity, the greater the pore pressure generation beneath the pile.

Overall, the research study helped to understand and predict the influence of pile construction and hydrostratigraphy on pore-pressure changes beneath salt tailings piles. Materials with low K/Ss (or cv), such as compressible tills, show slow dissipation rates and high excess pressures. Compared to dense sand, which has a very high K/Ss, till has a very low K/Ss, which results in high excess pore pressure generation. Sand layers act as drains, rapidly dissipating pore pressures. Thicker low-K/Ss units result in slower dissipation and higher pressures: as the thickness of the low-K/Ss layer increases, the drainage path lengthens and the peak pressures increase. Thin plastic clay layers give rise to the highest pressures.

The model study showed that hydrostratigraphic scenarios similar to those found at Saskatchewan potash mine sites can generate the high pore pressures observed in the vicinity of salt tailings piles as a result of pile loading. Peak pressures are very sensitive to pile construction rates, loading patterns and the hydrostratigraphy of the region. Peak pressures can reach levels that would be of concern for pile stability in the presence of adverse geological conditions.
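The qualitative effect of K/Ss on dissipation can be illustrated with a simple one-dimensional excess-pore-pressure diffusion sketch. This is not the FEMWATER model from the thesis; the geometry, coefficients and boundary conditions below are invented purely to show why a low K/Ss (till-like) layer retains excess pressure while a high K/Ss (sand-like) layer drains quickly:

    # Hedged sketch: 1-D dissipation of excess pore pressure head u(z, t),
    # governed by du/dt = c_v * d2u/dz2 with c_v ~ K/Ss.
    # All parameters are illustrative only.

    def dissipate(c_v, thickness=10.0, nz=51, t_end=1.0e6, u0=100.0):
        """Return the peak excess pressure head (m) remaining after t_end seconds."""
        dz = thickness / (nz - 1)
        dt = 0.4 * dz * dz / c_v            # explicit-scheme stability limit
        u = [u0] * nz
        u[0] = u[-1] = 0.0                  # drained boundaries at top and bottom
        t = 0.0
        while t < t_end and max(u) > 0.01:
            u = ([0.0]
                 + [u[i] + c_v * dt / dz**2 * (u[i + 1] - 2 * u[i] + u[i - 1])
                    for i in range(1, nz - 1)]
                 + [0.0])
            t += dt
        return max(u)

    # A till-like layer barely dissipates in the same time a sand-like layer drains.
    print("till-like c_v = 1e-7 m^2/s ->", round(dissipate(1e-7), 1), "m remaining")
    print("sand-like c_v = 1e-3 m^2/s ->", round(dissipate(1e-3), 1), "m remaining")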
|
297 |
Robust Consistency Checking for Modern Filesystems. Sun, Kuei 19 March 2013 (has links)
A runtime file system checker protects file-system metadata integrity. It checks the consistency of file system update operations before they are committed to disk, thus preventing corrupted updates from reaching the disk. In this thesis, we describe our experiences with building Brunch, a runtime checker for an emerging Linux file system called Btrfs. Btrfs supports many modern file-system features that pose challenges in designing a robust checker. We find that the runtime consistency checks need to be expressed clearly so that they can be reasoned about and implemented reliably, and thus we propose writing the checks declaratively. This approach reduces the complexity of the checks, ensures their independence, and helps identify the correct abstractions in the checker. It also shows how the checker can be designed to handle arbitrary file system corruption. Our results show that runtime consistency checking is still viable for complex, modern file systems.
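As a loose illustration of the declarative style (this is an invented Python sketch, not Brunch's actual rules and not Btrfs's real on-disk format), each consistency check can be written as an independent predicate over the metadata touched by one update, and the checker simply evaluates every predicate before allowing the commit:

    # Invented sketch of declarative runtime consistency checks.
    # "items" is a hypothetical, simplified view of the metadata in one update.

    def keys_are_sorted(items):
        keys = [it["key"] for it in items]
        return keys == sorted(keys)

    def extents_do_not_overlap(items):
        extents = sorted((it["start"], it["length"]) for it in items
                         if it["kind"] == "extent")
        return all(a_start + a_len <= b_start
                   for (a_start, a_len), (b_start, _b_len) in zip(extents, extents[1:]))

    CHECKS = [keys_are_sorted, extents_do_not_overlap]

    def commit_allowed(items):
        """Run every declarative check; reject the update if any check fails."""
        return all(check(items) for check in CHECKS)

    update = [
        {"key": 1, "kind": "inode"},
        {"key": 2, "kind": "extent", "start": 0,    "length": 4096},
        {"key": 3, "kind": "extent", "start": 4096, "length": 4096},
    ]
    print(commit_allowed(update))   # True: keys sorted, extents disjoint

Because each predicate is independent of the others, checks can be added, removed or reasoned about in isolation, which is the property the thesis argues for.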
|
298 |
Conditions Affecting the Relationship between Power and Identity Verification in Power Imbalanced Dyads. Davis, Jennifer 1983- 14 March 2013 (has links)
In the present study, I look at the relationship between power and identity verification and the conditions under which this relationship can be disrupted. Specifically, I look at the role of information in disrupting power differences within identity processes. I examine these processes through an experiment with task-oriented, power-imbalanced dyads (N = 144). Priming participants with a task-leader identity, I test how the introduction of negotiation resources (information discrepant with, and external to, a high-power actor's self-presentation) affects presentation power (the degree to which an actor can maintain identity meanings in light of partner negotiations).
In contrast with existing literature, I did not find a direct relationship between power and identity verification. I did, however, find that those in higher positions of power experience greater identity stability, while those in lower positions of power experience increased identity change. Interestingly, I found that identity change and identity verification varied with identity valence, such that those with dominant task-leader identity meanings experienced greater identity stability but less identity verification than their more submissive counterparts. These relationships, however, were power-dependent: differences disappeared among high-power actors and were magnified for low-power actors. Negotiation resources did not have a significant main effect, but showed a significant interaction with identity valence when predicting identity verification among low-power actors.
|
299 |
Safety Verification of Material Handling Systems Driven by Programmable Logic Controller: Consideration of Physical Behavior of Plants. OKUMA, Shigeru, SUZUKI, Tatsuya, KONAKA, Eiji 01 April 2004 (has links)
No description available.
|
300 |
Reasoning About Multi-stage Programs. Inoue, Jun 24 July 2013 (has links)
Multi-stage programming (MSP) is a style of writing program generators (programs which generate programs), supported by special annotations that direct the construction, combination, and execution of object programs. Various researchers have shown MSP to be effective in writing efficient programs without sacrificing genericity. However, correctness proofs of such programs have so far received limited attention, and the approaches and challenges involved in that task have been largely unexplored. In this thesis, I establish formal equational properties of the multi-stage lambda calculus and related proof techniques, as well as results that delineate the intricacies of multi-stage languages that one must be aware of.
In particular, I settle three basic questions that naturally arise when verifying multi-stage functional programs. Firstly, can adding MSP staging annotations to a language compromise the interchangeability of terms that held in the original language? Unfortunately it can, and more care is needed to reason about terms with free variables. Secondly, staging annotations, as the term "annotations" suggests, are often thought to be orthogonal to the behavior of a program, but when is this formally guaranteed to be the case? I give termination conditions that characterize when this guarantee holds. Finally, do multi-stage languages satisfy extensional facts, for example that functions agreeing on all arguments are equivalent? I develop a sound and complete notion of applicative bisimulation, which can establish not only extensionality but, in principle, any other valid program equivalence as well. These results improve our general understanding of staging and enable us to prove the correctness of complicated multi-stage programs.
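To give a rough feel for the program-generator idea (this is an invented sketch, not the thesis's calculus: real MSP languages such as MetaOCaml use typed quotation and splice annotations, whereas plain Python can only mimic staging by generating and compiling source text), a classic example is a generator that specializes the power function to a fixed exponent:

    # Rough Python sketch of the program-generator idea behind MSP.
    # Object programs are represented as strings fed to compile(); a true MSP
    # language would instead use typed staging annotations.

    def gen_power(n):
        """Generate a specialized function x -> x**n with the multiplication unrolled."""
        body = " * ".join(["x"] * n) if n > 0 else "1"
        src = f"def power_{n}(x):\n    return {body}\n"
        namespace = {}
        exec(compile(src, "<generated>", "exec"), namespace)
        return namespace[f"power_{n}"]

    power5 = gen_power(5)   # first stage: build and compile the object program
    print(power5(2))        # second stage: run the generated code; prints 32

The correctness question the thesis studies is, informally, whether such a generated power_5 is guaranteed to behave like the generic, unstaged power function it was derived from.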
|