311 |
NONE
Yang, Dennis, 27 July 2001 (has links)
The inspection (verification) or certification firm utilizes professional specialists, technology, and equipment, standing in an independent, impartial, and objective position, to conduct inspection, testing, and assessment of the quantity and quality of commodities, the performance of machinery and components, and the implementation of an established quality system. It then provides the supplier, buyer, and other stakeholders with a certificate or report that fulfills the contractual obligations or commitments made in trade transactions. Inspection and certification play a pioneering role in the process of technological development, and they are also a necessary driver for business and industrial enterprises seeking to enhance product quality, obtain international accreditation, and increase international competitiveness.
In today's environment of free trade and a free economy, inspection and certification firms face rising consumer awareness and drastic competition within the industry. Confronted with ever-increasing competition and consumer demands, they should study how to adopt an appropriate competitive strategy, find a market niche, enhance service quality, and understand customers' real needs and satisfaction, so as to broaden their service scope and customer base and build a firm that lasts.
This study explains the characteristics, current status, and outlook of the inspection and certification industry, and analyzes the development history, performance, and achievements of the SGS Group in Taiwan, the world's leading verification, testing, and certification organization. Employing the industry analysis method introduced by Porter and a SWOT analysis of strengths, weaknesses, opportunities, and threats, we discuss the effect of the "five forces" on SGS Taiwan's business strategy, the correlation between service quality and customer satisfaction, and the firm's current use of the Balanced Scorecard and the ISO 9001 quality management system as managing tools. In summary, we derive the key success factors for the inspection and certification industry and offer suggestions for strengthening the business model and management strategy of other inspection and certification firms, as a reference and benchmark enabling them to provide the best integrated services for the quality-pursuing business and industrial enterprises in Taiwan.
Keywords: Inspection, Verification, Certification, Business Strategy, Service Quality, Customer Satisfaction.
|
312 |
Systematic Generation of Instruction Test Patterns Based on Architectural Parameters
Mu, Peter, 30 August 2001 (has links)
Surveys of hardware design groups show that 60 to 80 percent of the design effort is now dedicated to verification. Generating test patterns from instruction set architecture information is therefore a feasible and reasonable way to verify the function of a microprocessor. In this thesis, we present an instruction test pattern generation method for microprocessors based on the instruction set architecture. It helps users generate instruction test patterns efficiently.
The generation flow in this thesis contains three major flows: individual instruction, instruction pair, and manual generation, which are used for different verification cases. The "individual instruction" flow can be used to verify the function of each implemented instruction. The "instruction pair" flow can be used to verify the interaction of instructions executing in the pipeline of an HDL implementation of a microprocessor. The "manual generation" flow can be used to verify corner cases (behaviors) of the microprocessor.
To assess the quality of our test patterns, we generated patterns for 32-bit instruction sets (the ARM and SPARC instruction sets) and used them to verify a synthesizable RTL core. Combined with some hand-written test patterns (34.7%), our automatic generation method approaches 100% HDL code coverage of the microprocessor design; we use HDL code coverage as the measure of test pattern quality.
Because our generation method is based on instruction fields, we can describe most instruction sets for the generator. Hence, the method can be retargeted to most instruction set architectures without modifying the generator. Besides RISC instructions, even CISC instructions can be generated.
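As an illustration of field-based generation, consider the sketch below. It is a minimal, assumed example (the field layout, helper names, and opcode values are illustrative, not the thesis's actual generator): an instruction is described as a list of named bit fields, and patterns are produced by randomizing the free fields and packing them into 32-bit words.

```python
import random

# Hypothetical field layout for an ARM-like data-processing instruction:
# (field name, width in bits), listed from most significant to least.
DATA_PROCESSING_FIELDS = [
    ("cond", 4), ("op", 2), ("imm", 1), ("opcode", 4),
    ("s", 1), ("rn", 4), ("rd", 4), ("operand2", 12),
]

def pack(fields, values):
    """Pack named field values into one 32-bit instruction word."""
    word = 0
    for name, width in fields:
        value = values[name]
        assert 0 <= value < (1 << width), f"{name} out of range"
        word = (word << width) | value
    return word

def random_instruction(fields, fixed=None):
    """Randomize every field except those pinned by `fixed`."""
    fixed = fixed or {}
    values = {name: random.randrange(1 << width) for name, width in fields}
    values.update(fixed)
    return pack(fields, values)

# Generate a few ADD-like instructions with random registers and operands,
# pinning the opcode and condition fields to fixed (assumed) encodings.
for _ in range(3):
    word = random_instruction(DATA_PROCESSING_FIELDS,
                              {"cond": 0b1110, "op": 0b00, "opcode": 0b0100})
    print(f"{word:08x}")
```

Because only the field table is architecture-specific, supplying a different table (for example, one describing SPARC formats) retargets the same generator without code changes, which is the retargetability property claimed above.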
|
313 |
Machine Assisted Reasoning for Multi-Threaded Java Bytecode
Lagerkvist, Mikael Zayenz, January 2005 (has links)
In this thesis an operational semantics for a subset of the Java Virtual Machine (JVM) is developed and presented. The subset contains standard operations such as control flow, computation, and memory management. In addition, the subset contains a treatment of parallel threads of execution.
The operational semantics is embedded into a µ-calculus based proof assistant called the VeriCode Proof Tool (VCPT). VCPT has been developed at the Swedish Institute of Computer Science (SICS) and has powerful features for proving inductive assertions.
Some examples of proving properties of programs using the embedding are presented.
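To give the flavor of such a semantics, here is a minimal sketch (the instruction subset, configuration shape, and names below are assumptions for illustration, not the thesis's formalization): a small-step transition function over configurations of a stack machine. Parallel threads would correspond to several such configurations sharing a heap, with the step relation choosing a thread nondeterministically.

```python
from dataclasses import dataclass, field

@dataclass
class Config:
    """A single-thread configuration: program counter, operand stack, local variables."""
    pc: int
    stack: list = field(default_factory=list)
    locals: dict = field(default_factory=dict)

def step(program, cfg):
    """One small-step transition for a tiny JVM-like bytecode subset."""
    op, *args = program[cfg.pc]
    if op == "iconst":                      # push a constant
        return Config(cfg.pc + 1, cfg.stack + [args[0]], cfg.locals)
    if op == "iadd":                        # pop two operands, push their sum
        *rest, a, b = cfg.stack
        return Config(cfg.pc + 1, rest + [a + b], cfg.locals)
    if op == "istore":                      # pop the top of stack into a local
        *rest, v = cfg.stack
        return Config(cfg.pc + 1, rest, {**cfg.locals, args[0]: v})
    if op == "goto":                        # unconditional jump
        return Config(args[0], cfg.stack, cfg.locals)
    raise ValueError(f"unhandled opcode {op}")

program = [("iconst", 2), ("iconst", 3), ("iadd",), ("istore", 0)]
cfg = Config(pc=0)
while cfg.pc < len(program):
    cfg = step(program, cfg)
print(cfg.locals)   # {0: 5}
```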
|
314 |
FPGA-based implementation of concatenative speech synthesis algorithm
Bamini, Praveen Kumar, January 2003 (has links)
Thesis (M.S.Cp.E.)--University of South Florida, 2003. Includes bibliographical references.
ABSTRACT: The main aim of a text-to-speech synthesis system is to convert ordinary text into an acoustic signal that is indistinguishable from human speech. This thesis presents an architecture, targeted to FPGAs, that implements a concatenative speech synthesis algorithm. Many current text-to-speech systems are based on the concatenation of acoustic units of recorded speech, and current concatenative synthesizers are capable of producing highly intelligible speech; however, speech quality often suffers from discontinuities between the acoustic units due to contextual differences. Concatenation is nevertheless the easiest method of producing synthetic speech: prerecorded acoustic elements are joined to form a continuous speech signal. The software implementation of the algorithm is written in C, whereas the hardware implementation is done in structural VHDL. A database of acoustic elements is first formed by recording sounds for the different phones. The architecture concatenates the acoustic elements corresponding to the phones that form the target word, i.e., the word to be synthesized. The architecture does not address the discontinuities between the acoustic elements, as its ultimate goal is the synthesis of speech. The hardware implementation is verified on a Virtex (v800hq240-4) FPGA device.
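A software-level sketch of the concatenation step is shown below. It is a hedged illustration only (the phone symbols, waveform stand-ins, and sample rate are assumptions, and it does not reproduce the thesis's C or VHDL implementations): synthesis reduces to looking up each phone's recorded waveform and joining them end to end.

```python
import numpy as np

SAMPLE_RATE = 16000  # Hz, assumed

# Hypothetical phone database: phone symbol -> recorded waveform (float samples).
# In a real system these would be the recorded acoustic elements; random
# stand-ins are used here so the sketch runs on its own.
phone_db = {
    "h":  np.random.randn(800) * 0.1,
    "eh": np.random.randn(1600) * 0.1,
    "l":  np.random.randn(1200) * 0.1,
    "ow": np.random.randn(2000) * 0.1,
}

def synthesize(phones):
    """Plain concatenation of prerecorded phone waveforms, with no smoothing
    at the joins (mirroring the note above that unit discontinuities are not
    addressed)."""
    return np.concatenate([phone_db[p] for p in phones])

word = synthesize(["h", "eh", "l", "ow"])   # a "hello"-like phone sequence
print(len(word) / SAMPLE_RATE, "seconds of audio")
```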
|
315 |
Design, Testing and Implementation of a New Authentication Method Using Multiple Devices
Cetin, Cagri, 01 January 2015 (has links)
Authentication protocols are common mechanisms for confirming the legitimacy of someone's or something's identity in digital and physical systems.
This thesis presents a new and robust authentication method based on users' multiple devices. Owing to the popularity of mobile devices, users are increasingly likely to own more than one device (e.g., smartwatch, smartphone, laptop, tablet, smart car, smart ring). The authentication system presented here takes advantage of these multiple devices to implement its authentication mechanisms. In particular, the system requires the devices to collaborate with each other in order for authentication to succeed. The protocol is robust against theft-based attacks on a single device; an attacker would need to steal multiple devices in order to compromise the authentication system.
The new authentication protocol comprises an authenticator and at least two user devices, where the user devices are associated with each other. To perform an authentication on one device, the user needs to respond to a challenge by using the associated device. After describing how this authentication protocol works, this thesis discusses three different implementations of the protocol. In the first implementation, the authentication process is performed with two smartphones, and a QR code is used as the challenge. In the second implementation, NFC technology is used instead of a QR code for challenge transmission. The last implementation demonstrates usability across different platforms: instead of two smartphones, a laptop computer and a smartphone are used. Furthermore, the authentication protocol has been verified with an automated protocol-verification tool to check whether it satisfies authenticity and secrecy properties. Finally, these implementations are tested and analyzed to demonstrate the performance variations across the different versions of the protocol.
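A rough sketch of the two-device challenge-response idea follows. The message flow, key handling, and HMAC construction here are assumptions for illustration; the thesis's formally verified protocol and its QR/NFC transports are not reproduced.

```python
import hmac, hashlib, os

def mac(key, msg):
    """HMAC-SHA256 as a stand-in for whatever primitive the real protocol uses."""
    return hmac.new(key, msg, hashlib.sha256).digest()

# Shared secrets assumed to be established at enrollment: the authenticator
# holds one key per registered device, and both devices belong to one user.
k_phone, k_laptop = os.urandom(32), os.urandom(32)

# 1. The authenticator issues a fresh challenge to the device requesting login.
challenge = os.urandom(16)

# 2. The requesting device (laptop) forwards the challenge to the associated
#    device (phone), e.g. via QR code or NFC; the phone answers with its MAC.
phone_response = mac(k_phone, challenge)

# 3. The laptop adds its own MAC over the challenge and the phone's response.
laptop_response = mac(k_laptop, challenge + phone_response)

# 4. The authenticator recomputes both values; login succeeds only if both
#    devices contributed, so stealing a single device is not enough.
expected_phone = mac(k_phone, challenge)
expected_laptop = mac(k_laptop, challenge + expected_phone)
print("authenticated:", hmac.compare_digest(laptop_response, expected_laptop))
```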
|
316 |
Semi-formal verification of analog mixed signal systems using multi-domain modeling languages
Ramirez, Ricardo (active 2013), 18 December 2013 (has links)
The verification of analog designs has been a challenging task for several years now. Several approaches have been taken to tackle the main problem: the complexity that such a task presents to design and verification teams. The methodology presented in this document is based on the experiences and research work carried out by Concordia University's Hardware Verification group and the University of Texas IC systems design group.
The representation of complex systems in which different interactions, whether mechanical or electrical, take place requires an intricate set of mathematical descriptions that vary greatly according to the system under test. A simple and very relevant example is the integration of RF-MEMS as active elements in system-on-chip architectures. To capture such heterogeneous interactions in a consistent model, stochastic hybrid models are described and implemented for very simple examples, using high-level modeling tools for a succinct and precise description.
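To give a concrete, if toy-sized, flavor of a stochastic hybrid model (this sketch and its parameters are invented for illustration and are unrelated to the RF-MEMS models above): a system with discrete modes, mode-dependent continuous dynamics, and random mode switches can be simulated as follows.

```python
import random

# Toy stochastic hybrid system: a heater that is either ON or OFF (discrete
# modes), continuous temperature dynamics in each mode, and a small
# probability of a spontaneous (random) mode switch at every time step.
def simulate(t_end=10.0, dt=0.01, temp=20.0, mode="OFF", p_jump=0.001):
    t, trace = 0.0, []
    while t < t_end:
        # Continuous dynamics depend on the current discrete mode.
        temp += (2.0 if mode == "ON" else -0.5) * dt
        # Deterministic guard-based switching (thermostat behavior).
        if mode == "OFF" and temp < 19.0:
            mode = "ON"
        elif mode == "ON" and temp > 21.0:
            mode = "OFF"
        # Stochastic switching models noise-induced jumps or faults.
        if random.random() < p_jump:
            mode = "OFF" if mode == "ON" else "ON"
        trace.append((round(t, 2), mode, round(temp, 3)))
        t += dt
    return trace

print(simulate()[-1])   # final (time, mode, temperature) sample
```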
|
317 |
The developmental interplay of behavioral confirmation and self-verification
Rosen, Lisa Helene, 04 May 2015 (has links)
Philosophers, psychologists, and authors have long pondered the question of whether others’ expectations or one’s own self-views are more important in determining behavior and personality. Researchers have designated these two processes behavioral confirmation and self-verification, respectively, and the interaction of these processes is often referred to as identity negotiation. Little research has examined the process of identity negotiation during adolescence, a period during which individuals are attempting to forge unique identities. Therefore, the primary purpose of the present studies was to examine the identity negotiation process during adolescence. In Study 1, I examined whether adolescents (11-15 years of age) solicit self-verifying feedback. Adolescents first completed a measure of self-perceptions and then selected whether to receive positive or negative feedback from an unknown peer in areas of perceived strength and weakness. Adolescents desired feedback congruent with their own self-views; those with higher self-esteem tended to request more positive feedback than those with lower self-esteem. Further, adolescents were more likely to seek negative feedback regarding a self-perceived weakness than a self-perceived strength. In Study 2, I examined the joint operation of behavioral confirmation and self-verification in dyadic interactions among unacquainted adolescents. One member of each dyad (the target) completed a measure of self-perception. The second member of each dyad (the perceiver) was provided with false information regarding the attractiveness of their partner. I compared whether targets’ self-views or perceivers’ expectations of them were stronger determinants of behavior. Self-verification strivings were evident in these interactions; targets’ self-views influenced the perceivers’ final evaluations of their partners. Support for behavioral confirmation was lacking in same-sex dyads and dyads composed of male perceivers and female targets. Appearance-based expectations influenced target behavior in dyads composed of female perceivers and male targets. The current findings suggest that adolescents’ self-views are important determinants of behavior. Significant implications for adolescent mental health and peer selection are discussed.
|
318 |
Efficient, mechanically-verified validation of satisfiability solvers
Wetzler, Nathan David, 04 September 2015 (has links)
Satisfiability (SAT) solvers are commonly used for a variety of applications, including hardware verification, software verification, theorem proving, debugging, and hard combinatorial problems. These applications rely on the efficiency and correctness of SAT solvers. When a problem is determined to be unsatisfiable, how can one be confident that a SAT solver has fully exhausted the search space? Traditionally, unsatisfiability results have been expressed using resolution or clausal proof systems. Resolution-based proofs contain perfect reconstruction information, but these proofs are extremely large and difficult to emit from a solver. Clausal proofs rely on rediscovery of inferences using a limited number of techniques, which typically takes several orders of magnitude longer than the solving time. Moreover, neither of these proof systems has been able to express contemporary solving techniques such as bounded variable addition. This combination of issues has left SAT solver authors unmotivated to produce proofs of unsatisfiability. The work from this dissertation focuses on validating satisfiability solver output in the unsatisfiability case. We developed a new clausal proof format called DRAT that facilitates compact proofs that are easier to emit and capable of expressing all contemporary solving and preprocessing techniques. Furthermore, we implemented a validation utility called DRAT-trim that is able to validate proofs in a time similar to that of the discovery time. The DRAT format has seen widespread adoption in the SAT community and the DRAT-trim utility was used to validate the results of the 2014 SAT Competition. DRAT-trim uses many advanced techniques to realize its performance gains, so why should the results of DRAT-trim be trusted? Mechanical verification enables users to model programs and algorithms and then prove their correctness with a proof assistant, such as ACL2. We designed a new modeling technique for ACL2 that combines efficient model execution with an agile and convenient theory. Finally, we used this new technique to construct a fast, mechanically-verified validation tool for proofs of unsatisfiability. This research allows SAT solver authors and users to have greater confidence in their results and applications by ensuring the validity of unsatisfiability results.
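As a rough illustration of what clausal-proof checking involves (a sketch only: DRAT-trim's actual algorithms, the RAT check for clauses that are not RUP, deletion information, and optimizations such as backward checking are not reproduced, and the tiny formula below is invented for the example), each proof clause can be validated as a reverse unit propagation (RUP) inference: assume the negation of the clause and confirm that unit propagation over the current formula reaches a conflict.

```python
def unit_propagate(clauses, assignment):
    """Propagate unit clauses; return (conflict_found, extended_assignment)."""
    assignment = set(assignment)
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            pending = [l for l in clause if -l not in assignment]
            if not pending:
                return True, assignment            # clause falsified: conflict
            if len(pending) == 1 and pending[0] not in assignment:
                assignment.add(pending[0])         # unit clause forces a literal
                changed = True
    return False, assignment

def has_rup(clauses, lemma):
    """A lemma has the RUP property if assuming its negation yields a conflict."""
    conflict, _ = unit_propagate(clauses, {-l for l in lemma})
    return conflict

# Tiny unsatisfiable formula: (a or b), (a or -b), (-a or b), (-a or -b).
formula = [[1, 2], [1, -2], [-1, 2], [-1, -2]]
proof = [[1], []]          # derive the unit clause "a", then the empty clause
for lemma in proof:
    assert has_rup(formula, lemma), f"lemma {lemma} is not a RUP inference"
    formula.append(lemma)  # accepted lemmas may be used by later proof steps
print("proof validated: formula is unsatisfiable")
```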
|
319 |
Efficient and effective symbolic model checking
Iyer, Subramanian Krishnan, 28 August 2008 (has links)
Not available
|
320 |
Productivity with performance: property/behavior-based automated composition of parallel programs from self-describing components
Mahmood, Nasim (1976-), 28 August 2008 (has links)
Development of efficient and correct parallel programs is a complex task. These parallel codes have strong requirements for performance and correctness and must operate robustly and efficiently across a wide spectrum of application parameters and on a wide spectrum of execution environments. Scientific and engineering programs increasingly use adaptive algorithms whose behavior can change dramatically at runtime. Performance properties are often not known until programs are tested, and performance may degrade during execution. Many errors in parallel programs arise from incorrect programming of interactions and synchronizations. Testing has proven to be inadequate. Formal proofs of correctness are needed. This research is based on systematic application of software engineering methods to the effective development of efficiently executing families of high performance parallel programs. We have developed a framework (P-COM²) for development of parallel program families which addresses many of the problems cited above. The conceptual innovations underlying P-COM² are a software architecture specification language based on self-describing components, a timing and sequencing algorithm which enables execution of programs with both concrete and abstract components, and a formal semantics for the architecture specification language. The description of each component incorporates compiler-usable specifications for the properties and behaviors of the component, the functionality the component implements, preconditions and postconditions on the inputs and outputs, and state machine based sequencing control for invocations of the component. The P-COM² compiler and runtime system implement these concepts to enable: (a) evolutionary development, where a program instance is evolved from a performance model to a complete application with performance known at each step of evolution; (b) automated composition of program instances targeting specific application instances and/or execution environments from self-describing components, including generation of all parallel structuring; (c) runtime adaptation of programs on a component-by-component basis; (d) runtime validation of pre- and post-conditions and sequencing of interactions; and (e) formal proofs of correctness for interactions among components based on model checking of the interaction and synchronization properties of the program. The concepts and their integration are defined, the implementation is described, and the capabilities of the system are illustrated through several examples.
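A hedged sketch of what a self-describing component might look like (the attribute names, matching logic, and example component below are invented for illustration and are not the P-COM² specification language):

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Component:
    """A self-describing component: functionality plus machine-readable
    properties that a composer can match against an application's needs."""
    name: str
    provides: str                          # functionality implemented
    precondition: Callable[[dict], bool]   # must hold on the inputs
    postcondition: Callable[[dict], bool]  # checked on the outputs at runtime
    run: Callable[[dict], dict]            # the component's functionality
    properties: Dict[str, str] = field(default_factory=dict)

def compose(requirements: List[str], library: List[Component]) -> List[Component]:
    """Select one component per required functionality by matching on the
    self-descriptions rather than on hand-written glue code."""
    plan = []
    for req in requirements:
        candidates = [c for c in library if c.provides == req]
        if not candidates:
            raise LookupError(f"no component provides {req}")
        plan.append(candidates[0])
    return plan

def execute(plan: List[Component], data: dict) -> dict:
    """Invoke the composed components in order, validating pre/post-conditions."""
    for comp in plan:
        assert comp.precondition(data), f"{comp.name}: precondition violated"
        data = comp.run(data)
        assert comp.postcondition(data), f"{comp.name}: postcondition violated"
    return data

solver = Component(
    name="jacobi_solver", provides="solve",
    precondition=lambda d: "grid" in d,
    postcondition=lambda d: "solution" in d,
    run=lambda d: {**d, "solution": [x * 0.5 for x in d["grid"]]},
    properties={"parallel_model": "mpi", "precision": "double"},
)
print(execute(compose(["solve"], [solver]), {"grid": [1.0, 2.0, 3.0]}))
```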
|