Utah Commercial Motor Vehicle Weigh-in-Motion Data Analysis and Calibration Methodology
Seegmiller, Luke W. (30 November 2006)
In preparation for changes in pavement design methodologies, and to begin to assess the effectiveness of the weigh-in-motion (WIM) system in Utah, the Utah Department of Transportation (UDOT) contracted with a Brigham Young University (BYU) research team to conduct a statewide evaluation of its commercial motor vehicle (CMV) data collection system. The objective of this research was to evaluate the CMV data collection program in the state of Utah and to make limited recommendations for potential improvements and changes that will aid in more detailed and accurate CMV data collection across the state. To accomplish the research objectives, several tasks were conducted, including: 1) a review of literature to establish the state of the practice for CMV monitoring, 2) collection of WIM data for the state of Utah, 3) analysis of the collected WIM data, 4) development of a calibration methodology for use in the state, and 5) presentation of recommendations and conclusions based on the research. The analysis of collected WIM data indicated that the CMV data collection system in the state of Utah currently produces data consistent with expectations, with a few exceptions. Recommendations for improvements to the CMV data collection system come in the form of a proposed calibration methodology that is in line with current standards and the practices of other states. The proposed calibration methodology includes calibration, verification, and quality assurance programs.
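A hedged sketch of what such a calibration methodology involves (the thesis's actual procedure and tolerances are not given in this abstract; the weights and the 15% tolerance below are hypothetical): site calibration typically compares WIM weight estimates against static scale weights over repeated runs of a test truck.

```python
# Illustrative sketch, not UDOT's procedure: derive a WIM site
# calibration factor from repeated test-truck runs and flag the site
# when the per-run error exceeds a hypothetical 15% tolerance.
static_gvw = 36.0                          # static-scale gross weight (kips)
wim_gvw = [33.8, 34.5, 33.1, 34.9, 34.2]   # WIM estimates, one per run

mean_wim = sum(wim_gvw) / len(wim_gvw)
calibration_factor = static_gvw / mean_wim  # scales future WIM readings
errors = [(w - static_gvw) / static_gvw for w in wim_gvw]

print(f"calibration factor: {calibration_factor:.3f}")
if any(abs(e) > 0.15 for e in errors):
    print("site out of tolerance: recalibrate and re-verify")
```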
Machine Code Verification Using The Bogor Framework
Edelman, Joseph R. (22 May 2008)
Verification and validation of embedded systems software is tedious and time consuming. Software model checking uses a tool-based approach to automate this process. To model software more accurately, it is necessary to provide hardware support that enables the software to execute as it would on native hardware. Hardware support often requires the creation of model checking tools specific to the instruction set architecture. The creation of software model checking tools is non-trivial. We present a strategy for using an "off-the-shelf" model checking tool, Bogor, to provide support for multiple instruction set architectures. Our strategy supports key hardware features such as instruction execution, exceptional control flow, and interrupt servicing as extensions to Bogor. These extensions work within the tool framework using existing interfaces and require significantly less code than creating an entire model checking tool.
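Bogor's actual extension interfaces are not shown in this abstract; the following sketch only illustrates, with hypothetical names, the kind of extension the strategy describes: instruction execution as a state-transition function in which nondeterministic interrupt arrival gives a single machine state several successors for the model checker to explore.

```python
# Hedged sketch of an instruction-set extension for a model checker.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class State:
    pc: int            # program counter
    acc: int           # accumulator register
    irq_enabled: bool  # interrupts enabled?

PROGRAM = {0: ("LOADI", 5), 1: ("ADDI", 3), 2: ("HALT", 0)}
ISR_ENTRY = 100  # hypothetical interrupt service routine address

def successors(s):
    """All possible next states: normal execution, plus entry into the
    ISR whenever interrupts are enabled (arrival is nondeterministic)."""
    out = []
    op, arg = PROGRAM.get(s.pc, ("HALT", 0))
    if op == "LOADI":
        out.append(replace(s, pc=s.pc + 1, acc=arg))
    elif op == "ADDI":
        out.append(replace(s, pc=s.pc + 1, acc=s.acc + arg))
    if s.irq_enabled and op != "HALT":
        out.append(replace(s, pc=ISR_ENTRY, irq_enabled=False))
    return out

# Exhaustive exploration of the reachable state space, as a model
# checker would perform it.
frontier, seen = [State(0, 0, True)], set()
while frontier:
    s = frontier.pop()
    if s not in seen:
        seen.add(s)
        frontier.extend(successors(s))
print(len(seen), "reachable states")
```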
BackFlip: A Principled Approach to Online Attribute Verification
Daley, Devlin R. (12 August 2010)
As traditional interactions in the real world move online, services that require verified personal information from web users will increase. We propose an architecture for the verification of web user attributes without the use of cryptographic-based credentials. In this architecture, service providers are delegated a user's ability to directly contact a certifying party and retrieve attribute data. We demonstrate that this approach is simple for both developers and users, can be applied to existing Internet facilities, and is sufficiently secure for typical web use cases.
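A hedged, in-memory sketch of the proposed flow, with all names hypothetical: rather than presenting a cryptographic credential, the user delegates a single-use token that the service provider redeems directly with the certifying party.

```python
# Illustrative sketch of delegated attribute retrieval (not BackFlip's
# actual implementation); in practice redemption would be an HTTPS call.
import secrets

class CertifyingParty:
    def __init__(self):
        self.records = {"alice": {"age": 34}}   # attributes it can certify
        self.tokens = {}

    def delegate(self, user, attribute):
        """User authorizes release of one attribute; returns a token."""
        token = secrets.token_urlsafe(16)
        self.tokens[token] = (user, attribute)
        return token

    def redeem(self, token):
        """Service provider exchanges the single-use token for the value."""
        user, attribute = self.tokens.pop(token)
        return self.records[user][attribute]

cp = CertifyingParty()
token = cp.delegate("alice", "age")        # user-side delegation
print("verified age:", cp.redeem(token))   # service-provider-side retrieval
```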
An Incremental Trace-Based Debug System for Field-Programmable Gate-Arrays
Keeley, Jared Matthew (07 November 2013)
Modern society increasingly relies upon integrated circuits (ICs). It can be very costly if ICs do not function properly, and large portions of designer effort are spent on their verification. The use of field-programmable gate arrays (FPGAs) for verification and debug of ICs is increasing. FPGAs are faster than simulation and cost less than fabricating an ASIC prototype. However, the major challenge of using FPGAs for verification and debug is observability: designers must use special techniques to observe the values of an FPGA's internal signals. This thesis proposes a new method for increasing the observability of FPGAs and demonstrates its feasibility. The new method incrementally inserts trace buffers controlled by a trigger into already placed-and-routed FPGA designs. Incremental insertion avoids several drawbacks of typical trace-based approaches, such as influencing the placement and routing of the design, large area overheads, and slow turnaround times when changes must be made to the instrumentation. It is shown that it is possible to observe every flip-flop in Xilinx Virtex-5 designs using the method, given that enough trace buffer capacity is available. We investigate factors that influence the results of the method. It is shown that making the trace buffers wide may lead to routing failures. Congested areas of the circuit must be avoided when placing the trigger, or this may also lead to routing failures. A drawback of the method is that it may increase the minimum period of the design, but we show that pipelining can reduce these effects. The method proves to be a promising way to observe thousands of signals in a design, potentially allowing designers to fully reconstruct the internal values of an FPGA over multiple clock cycles to assist in verification and debug.
Architecture for a Symbolic Execution Environment
Norlén, Joacim (January 2022)
Program testing is an important aspect of software development. Symbolic execution can be used as a tool to automatically verify the correctness of programs for all feasible paths of execution. Moreover, for embedded systems symbolic execution can be used to generate test cases to estimate run times, helping determine the worst-case execution time (WCET) and schedulability of systems. This thesis explores an architecture for symbolic execution for use in embedded Rust, accompanied by implementation details of a prototype, Symex, that can handle small programs. Symex evaluates all feasible paths of execution looking for errors and assertions, and reports which concrete inputs lead to errors. Included with the prototype is a command-line tool to automatically build and analyze Rust projects. The tool allows for easy analysis of projects, and an included library provides functions to manipulate symbolic variables to aid the analysis. The method of evaluating all feasible paths works well for the purpose of evaluating embedded systems, where the aim is typically to keep code complexity low. Low code complexity makes the software resilient to path explosion. For the cases where this cannot be helped, the library functions that manipulate symbolic variables can be used to further constrain the variables and lower the number of feasible paths. The evaluation shows the architecture is feasible for the intended use case in embedded systems. Furthermore, evaluation of the prototype shows how the system can be used to show the absence of errors, verify functions, and check for functional equivalence. As is inherent to the symbolic execution approach, the system cannot handle programs with a large branching factor.
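Symex's internals are not described in this abstract, but the core idea, enumerating every feasible path and reporting a concrete error-triggering input, can be sketched with an off-the-shelf SMT solver. The z3-solver dependency and the hand-unrolled paths below are assumptions for illustration; a real executor derives the paths by forking at each branch.

```python
# Hedged sketch, not Symex itself. Paths of the Rust-like function
#   fn f(x: i32) { if x > 10 { if x < 20 { panic!() } } }
# are conjunctions of branch conditions; the solver decides which are
# feasible and produces a concrete witness input for each one.
import z3

x = z3.Int("x")
paths = [
    ([x > 10, x < 20], "panic"),         # the error path
    ([x > 10, z3.Not(x < 20)], "ok"),
    ([z3.Not(x > 10)], "ok"),
]

for conds, outcome in paths:
    solver = z3.Solver()
    solver.add(*conds)
    if solver.check() == z3.sat:         # path is feasible
        print(outcome, "reachable, e.g. x =", solver.model()[x])
```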
A Language-Recognition Approach to Unit Testing Message-Passing Systems
Ubah, Ifeanyi (January 2017)
This thesis addresses the problem of unit testing components in message-passing systems. A message-passing system is one that comprises components communicating with each other solely via the exchange of messages. Testing aids developers in detecting and fixing potential errors, and with unit testing in particular, the focus is on independently verifying the correctness of single components, such as functions and methods, in a system whose behavior is well understood. With the aid of unit testing frameworks such as those of the xUnit family, this process can not only be automated and done iteratively, but easily interleaved with the development process, facilitating rapid feedback and early detection of errors in the system. However, such frameworks work in an imperative manner and, as such, are unsuitable for verifying message-passing systems where the behavior of a component is encoded in its stream of exchanged messages. In this work, we recognise that, as with streams of symbols in the field of formal languages and abstract machines, one can specify properties of a component's message stream such that they form a language. Unit testing a component thus becomes the description of an automaton that recognizes such a specified language. We propose a platform-independent, language-recognition approach to creating unit testing frameworks for describing and verifying the behavior of message-passing components, and use this approach to create a prototype implementation for the Kompics component model. We show that this approach can be used to perform both black-box and white-box testing of components, and that it is easy to work with while preventing common mistakes in practice.
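As a hedged sketch of the idea (not the Kompics prototype's API): a unit test specifies the allowed message stream as the language of a small automaton, then runs the recognizer over a recorded trace.

```python
# Illustrative sketch: the expected protocol, "every request is answered
# before the next one", expressed as a DFA over message symbols.
EXPECTED = {
    ("idle", "request"): "pending",
    ("pending", "response"): "idle",
}

def accepts(dfa, start, accepting, stream):
    state = start
    for symbol in stream:
        state = dfa.get((state, symbol))
        if state is None:
            return False          # message not allowed in this state
    return state in accepting

def test_request_response_protocol():
    good = ["request", "response", "request", "response"]
    bad = ["request", "request"]  # second request before a response
    assert accepts(EXPECTED, "idle", {"idle"}, good)
    assert not accepts(EXPECTED, "idle", {"idle"}, bad)

test_request_response_protocol()
```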
Connecting Self Enhancement And Self Verification Messages In Friendships
Bloch, Ann (01 January 2009)
This study investigates the connection between self-enhancement and self-verification on the one hand and confirmation and emotional support on the other. The hypotheses predicted a positive relationship between confirmation and both self-enhancement and self-verification: people feel good about themselves when confirmed by friends, and people feel that friends know them well when they are confirmed. The hypotheses also predicted a positive relationship between emotional support and both self-enhancement and self-verification: people feel good when friends provide emotional support, and people feel that friends know them well when provided emotional support. A research question was also posed: does family functioning have an effect on perceptions of self-enhancement and self-verification messages? To find the answers, a questionnaire was completed by 279 individuals. The results indicate two types of enhancement messages: a more specific and positive form of enhancement, and a more global (and negative) self-perception of rejection. The findings are interesting and unique to self-enhancement in communication research, providing many avenues for continued research. Results also suggest that different elements of confirming communication influence perceptions of enhancement in different ways, and that emotional support predicts verification.
An Automated Test Station Design Used to Verify Aircraft Communication Protocols
Berrian, Joshua (01 October 2011)
Requirements verification is typically the costliest part of the systems engineering design process. In the commercial aircraft industry, as the software and hardware design evolves, it must be verified to conform to requirements. In addition, when new design releases are made, regression analysis must be performed, which usually requires repeat testing. To streamline verification, this document describes a suite of automated verification tools that can reduce the test effort. This test suite can be qualified for use in verifying systems at any DO-178B design assurance level. Some of the software tools are briefly described below.
This automated verification effort has major advantages. The tools can either be internally developed by a company or purchased "off the shelf", depending upon budget and staff constraints. Every automated test case can be run with the click of a button, and failures caused by human factors are reduced. The station can be qualified per DO-178B guidelines, and can also be expanded to support ARINC 429, AFDX, Ethernet, and MIL-STD-1553 interfaces. The expansion of these test programs would enable the creation of a universal avionics test suite with minimal cost and a reduction of the overall program verification effort.
The following is a presentation of an automated test station capable of reducing verification time and cost. The hardware and software aspects needed to create the test station are examined. Also, steps are provided to help guide a designer through the tool qualification process. Lastly, a full suite of test functions is included that can be implemented and customized to verify a wide range of avionics communication characteristics.
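As one hedged example of such a test function (a simplified check, not the station's actual code): ARINC 429 words carry an odd-parity bit, so a reusable verification routine can assert odd parity on every received 32-bit word.

```python
# Simplified sketch of an automated parity check on received
# ARINC 429 words; the sample word values are illustrative.
def has_odd_parity(word):
    return bin(word & 0xFFFFFFFF).count("1") % 2 == 1

def test_received_word_parity():
    assert has_odd_parity(0x00000165)      # 5 one-bits: passes
    assert not has_odd_parity(0x00000164)  # 4 one-bits: flagged corrupt

test_received_word_parity()
```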
The Use of Short-Interval GPS Data for Construction Operations Analysis
Hildreth, John C. (05 March 2003)
The global positioning system (GPS) makes use of extremely accurate time measurements to determine position. The times required for electronic signals to travel at the speed of light from at least four orbiting satellites to a receiver on earth are measured precisely and used to calculate the distances from the satellites to the receiver. The calculated distances are used to determine the position of the receiver through trilateration.
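In the standard formulation (not quoted from the thesis), each travel-time measurement yields a pseudorange, and four pseudoranges determine the three position coordinates plus the receiver clock bias:

```latex
\rho_i = c\,(t_{\mathrm{recv}} - t_{\mathrm{send},i})
       = \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2} + c\,b,
\qquad i = 1, \dots, 4
```

Here $(x_i, y_i, z_i)$ is the broadcast position of satellite $i$, $(x, y, z)$ is the receiver position, and $b$ is the receiver clock bias; four unknowns are why at least four satellites are needed.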
This research takes an approach opposite the original GPS research, focusing on the use of position to determine the time at which events occur. Specifically, this work addresses the question: Can the information pertaining to position and speed contained in a GPS record be used to autonomously identify the times at which critical events occur within a production cycle?
The research question was answered by determining the hardware needs for collecting the desired data in a usable format and developing a unique data collection tool to meet those needs. The tool was field-evaluated, and the collected data were used to determine the software needs for automated reduction of the data to the times at which key events occurred. The software tools were developed in the form of Time Identification Modules (TIMs). The TIMs were used to reduce data collected from a load-and-haul earthmoving operation to duration measures for the load, haul, dump, and return activities.
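The TIMs' internal logic is not detailed in this abstract; a hedged sketch of the kind of reduction they perform is to segment the speed record into stopped and moving intervals, since loading and dumping appear as sustained near-zero speed. The threshold and data below are illustrative.

```python
# Illustrative sketch: recover candidate event times from a GPS speed
# trace by finding intervals where speed stays below a threshold.
def stop_intervals(times, speeds, threshold=0.5):
    """Return (start, end) times of periods with speed below threshold."""
    intervals, start = [], None
    for t, v in zip(times, speeds):
        if v < threshold and start is None:
            start = t
        elif v >= threshold and start is not None:
            intervals.append((start, t))
            start = None
    if start is not None:
        intervals.append((start, times[-1]))
    return intervals

times = [0, 5, 10, 15, 20, 25, 30]            # seconds
speeds = [0.1, 0.2, 6.0, 7.5, 0.3, 0.2, 5.0]  # meters per second
print(stop_intervals(times, speeds))          # [(0, 10), (20, 30)]
```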
The value of the developed system was demonstrated by investigating correlations between performance times in construction operations and by using field data to verify the results obtained from productivity estimating tools. Use of the system was shown to improve knowledge and provide additional insight into operations analysis studies.
Feasible Workspace for Robotic Fiber Placement
Moutran, Serge Riad (21 May 2002)
Online consolidation fiber placement is emerging as an automated manufacturing process for the fabrication of large, complex composite material structures. While traditional composite manufacturing techniques limited product size, geometrical shape, and laminate pattern, robotic automation of the fiber placement process allows the manufacture of complex bodies with any desired surface pattern or towpreg direction. Therefore, a complete understanding of the robot's kinematic capabilities is needed to accurately position the structure's substrate in the workcell and to compute the feasible product dimensions and sizes.
A Matlab algorithm is developed to verify the feasibility of straight-line trajectory paths and to locate all valid towpreg segments in the workspace, with no focus on optimization. The algorithm is first applied to a three-link planar arm; a 6-dof Merlin robot is subsequently considered to verify the towpreg layouts in three-dimensional space. The workspace is represented by the longest feasible segments, plotted on parallel two-dimensional planes. The analysis is extended to locate valid square areas with predetermined dimensions. The fabrication of isotropic circular coupons is then tested with two different compaction heads. The results allow the formulation of a geometric correlation between the end-effector's dimensional measures and its orientation with respect to the towpreg segments.
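A heavily reduced, hedged sketch of the feasibility test (two links instead of three, no joint limits, illustrative dimensions): a straight towpreg segment is feasible only if every sampled point along it lies inside the arm's reachable annulus.

```python
# Illustrative sketch, not the thesis's Matlab algorithm.
import math

L1, L2 = 1.0, 0.8  # hypothetical link lengths

def reachable(x, y):
    """A planar two-link arm reaches (x, y) iff the distance from the
    base lies between |L1 - L2| and L1 + L2."""
    d = math.hypot(x, y)
    return abs(L1 - L2) <= d <= L1 + L2

def segment_feasible(p0, p1, samples=50):
    """Sample a straight towpreg segment and test every point."""
    for i in range(samples + 1):
        t = i / samples
        x = p0[0] + t * (p1[0] - p0[0])
        y = p0[1] + t * (p1[1] - p0[1])
        if not reachable(x, y):
            return False
    return True

print(segment_feasible((0.5, 0.5), (1.2, 0.9)))  # True: stays in annulus
print(segment_feasible((0.0, 0.0), (1.0, 0.0)))  # False: starts at the base
```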