  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

The implementation of an input/output consistency checker for a requirements specification document

Welmers, Laura Hazel January 2010 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries / Department: Computer Science.
2

Graphics function standard specification validation with GKS

Fraser, Steven D. January 1987 (has links)
A validation methodology is proposed for natural language software specifications of standard graphics functions. Checks are made for consistency, completeness and lack of ambiguity in data element and function descriptions. Functions and data elements are maintained in a relational database representation. The appropriate checks are performed by sequences of database operations. The relational database manager INGRES was used to support a prototype implementation of the proposed technique. / The methodology supports the development of a scenario-based prototype from the information available in the specification. This permits various function sequences to be checked without implementation of the environment specified. / The application of a prototype implementation of the proposed methodology, to the specification of the GKS software package, demonstrates the practicability of the method. Several inconsistencies in GKS, related to the definition of data elements, have been identified.
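The relational checks the abstract describes can be sketched as database queries. A minimal illustration follows, using SQLite in place of INGRES and an invented schema and data, since the thesis's actual tables are not given here:

```python
import sqlite3

# Hypothetical schema sketching the relational consistency/completeness
# checks described above; the thesis used INGRES, and its real schema
# and data-element names are not published in this abstract.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE data_elements (name TEXT PRIMARY KEY);
    CREATE TABLE functions (name TEXT PRIMARY KEY);
    CREATE TABLE uses (func TEXT, element TEXT);  -- function -> data element
""")
db.executemany("INSERT INTO data_elements VALUES (?)", [("window",), ("viewport",)])
db.executemany("INSERT INTO functions VALUES (?)", [("SET_WINDOW",), ("SET_CLIP",)])
db.executemany("INSERT INTO uses VALUES (?, ?)",
               [("SET_WINDOW", "window"), ("SET_CLIP", "clip_rect")])

# Completeness check as a sequence of database operations: find function
# descriptions that reference a data element never defined in the spec.
undefined = db.execute("""
    SELECT func, element FROM uses
    WHERE element NOT IN (SELECT name FROM data_elements)
""").fetchall()
print(undefined)  # -> [('SET_CLIP', 'clip_rect')]
```

Each check in the methodology reduces to a query of this shape, so adding a new consistency rule means adding a query rather than new parsing code.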
3

Graphics function standard specification validation with GKS

Fraser, Steven D. January 1987 (has links)
No description available.
4

Testing concurrent software systems

Kilgore, Richard Brian 28 August 2008 (has links)
Not available
5

Error and occurrence analysis of Stanfins redesign at Computer Sciences Corporation

Khan, Irshad A. January 1990 (has links)
At Ball State University, Dr. Wayne Zage and Professor Dolores Zage are working on a design metrics project to develop a metrics approach for analyzing software design. The purpose of this thesis is to test the hypotheses of this metric by calculating the external design component De, and to show the correlation of errors and stress points in the design phase for a large Ada software system professionally developed at Computer Sciences Corporation. From these studies we can tentatively conclude that De does indicate the error-prone modules. Since D(G) comprises both an internal and an external component, it is necessary to evaluate Di as well to support this hypothesis on a large project. Using external complexity alone, the metric does a relatively good job of pointing out high-error modules: by viewing only 10% of the modules, we found 33% of the errors. Comparing the results of STANFINS-R with the results of the BSU projects, the BSU projects did better at finding the errors (53% versus 33%). However, in the STANFINS project we had a better success rate in identifying error modules: of the modules highlighted, 72% did contain errors. Thus, if we had loosened the criteria for selecting error-prone modules, we might have captured a larger percentage of the errors. / Department of Computer Science
6

Incremental Lifecycle Validation Of Knowledge-based Systems Through Commonkads

Batarseh, Feras 01 January 2011 (has links)
This dissertation introduces a novel validation method for knowledge-based systems (KBS). Validation is an essential phase in the development lifecycle of knowledge-based systems: it ensures that the system is valid and reliable, and that it reflects the knowledge of the expert and meets the specifications. Although many validation methods have been introduced for knowledge-based systems, there is still a need for an incremental validation method based on a lifecycle model. Lifecycle models provide a general framework for the developer and a mapping technique from the system into the validation process. They support reusability and modularity and offer guidelines for knowledge engineers to achieve high-quality systems. CommonKADS is a set of models that helps to represent and analyze knowledge-based systems, and it offers a de facto standard for building them. Additionally, CommonKADS is knowledge-representation-independent, with powerful models that can represent many domains. Defining an incremental validation method based on a conceptual lifecycle model (such as CommonKADS) has a number of advantages, such as reducing time and effort, ease of implementation when there is a template to follow, well-structured design, and better tracking of errors when they occur. Moreover, the validation method introduced in this dissertation is based on case testing and on selecting an appropriate set of test cases to validate the system. The method makes use of the results of prior test cases in an incremental validation procedure. This facilitates defining a minimal set of test cases that provides complete and effective system coverage. CommonKADS does not define validation, verification or testing in any of its models. This research seeks to establish a direct relation between validation and lifecycle models, and introduces a validation method for KBS embedded into CommonKADS.
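The incremental idea of reusing prior test-case coverage results to arrive at a minimal set of test cases can be sketched as a greedy set cover. The function name, test identifiers, and rule sets below are illustrative assumptions, not taken from the dissertation:

```python
# Hypothetical sketch: each prior test case is known (from earlier
# validation increments) to exercise a set of system rules; pick a small
# subset of tests that together cover every required rule.
def select_tests(coverage, required):
    """Greedy set cover: coverage maps test-case id -> set of rules it exercises."""
    chosen, uncovered = [], set(required)
    while uncovered:
        # Pick the test covering the most still-uncovered rules.
        best = max(coverage, key=lambda t: len(coverage[t] & uncovered))
        gained = coverage[best] & uncovered
        if not gained:
            break  # remaining rules are unreachable by any known test
        chosen.append(best)
        uncovered -= gained
    return chosen, uncovered

coverage = {"t1": {"r1", "r2"}, "t2": {"r2", "r3"}, "t3": {"r4"}}
chosen, missed = select_tests(coverage, {"r1", "r2", "r3", "r4"})
print(chosen, missed)  # -> ['t1', 't2', 't3'] set()
```

Greedy set cover is not guaranteed minimal, but it is a standard approximation and matches the spirit of building on results already accumulated in earlier increments.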
7

A validation software package for discrete simulation models

Florez, Rossanna E. January 1986 (has links)
This research examined the simulation model validation process. After a model is developed, its reliability should be evaluated using validation techniques. This research was concerned with the validation of discrete simulation models which simulate an existing physical system. While many validation techniques are available in the literature, only the techniques which compare available real-system data to model data were considered by this research. Three of the techniques considered were selected and automated in a microcomputer software package. The package consists of six programs which are intended to aid the user in the model validation process. DATAFILE allows for real and model data input, and creates files using the DIF format. DATAGRAF plots real against model system responses and provides histograms of the variables. These two programs are based on the approach used in McNichol's statistical software. Hypothesis tests comparing real and model responses are conducted using TESTHYPO. The potential cost of using an invalid model, in conjunction with the determination of the alpha level of significance, is analyzed in COSTRISK. A non-parametric hypothesis test can be performed using NOTPARAM. Finally, a global validity measure can be obtained using VALSCORE. The software includes brief explanations of each technique and its use. The software was written in the BASIC computer language and was demonstrated using a simulation model and hypothetical but realistic system data. The hardware chosen for the package was the IBM Personal Computer with 256K of memory. / M.S.
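A TESTHYPO-style comparison of real and model responses can be illustrated with a two-sample (Welch) t statistic. The `welch_t` helper and the sample data below are invented for illustration; they are not taken from the thesis's BASIC package:

```python
from statistics import mean, variance

# Hedged sketch of comparing real-system and model output samples.
# statistics.variance is the sample variance (n - 1 denominator).
def welch_t(real, model):
    n1, n2 = len(real), len(model)
    se2 = variance(real) / n1 + variance(model) / n2
    return (mean(real) - mean(model)) / se2 ** 0.5

real = [10.2, 9.8, 10.5, 10.1, 9.9]    # measured system responses (invented)
model = [10.0, 10.3, 9.7, 10.2, 10.1]  # simulated responses (invented)
t = welch_t(real, model)
# Compare |t| with the critical value for the chosen alpha; a small |t|
# fails to reject the hypothesis that the model matches the system.
print(round(t, 3))  # -> 0.25
```

The alpha level itself is where a COSTRISK-style analysis enters: the cost of wrongly accepting an invalid model informs how strict the critical value should be.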
8

Verification and validation of computer simulations with the purpose of licensing a pebble bed modular reactor

Bollen, Rob 12 1900 (has links)
Thesis (MBA)--Stellenbosch University, 2002. / ENGLISH ABSTRACT: The Pebble Bed Modular Reactor is a new and inherently safe concept for a nuclear power generation plant. In order to obtain the necessary licences to build and operate this reactor, numerous design and safety analyses need to be performed. The results of these analyses must be supported with substantial proof to provide the nuclear authorities with a sufficient level of confidence in these results to be able to grant the required licences. Besides the obvious need for a sufficient level of confidence in the safety analyses, the analyses concerned with investment protection also need to be reliable from the investors' point of view. The process to be followed to provide confidence in these analyses is the verification and validation process. It is aimed at presenting reliable material against which to compare the results from the simulations. This material for comparison will consist of a combination of results from experimental data, extracts from actual plant data, analytical solutions and independently developed solutions for the simulation of the event to be analysed. Besides comparison with these alternative sources of information, confidence in the results will also be built by providing validated statements on the accuracy of the results and the boundary conditions with which the simulations need to comply. Numerous standards exist that address the verification and validation of computer software, for instance by organisations such as the American Society of Mechanical Engineers (ASME) and the Institute of Electrical and Electronics Engineers (IEEE). The focal points of the verification and validation of the design and safety analyses performed on typical PBMR modes and states, and the requirements imposed by both the local and overseas nuclear regulators, are not entirely enveloped by these standards.
For this reason, PBMR developed a systematic and disciplined approach for the preparation of the Verification and Validation Plan, aimed at capturing the essence of the analyses. This approach aims to make a definite division between software development and the development of technical analyses, while still using similar processes for the verification and validation. The reasoning behind this is that technical analyses are performed by engineers and scientists who should only be responsible for the verification and validation of the models and data they use, but not for the software they depend on. Software engineers should be concerned with the delivery of qualified software to be used in the technical analyses. The PBMR verification and validation process is applicable to both hand calculations and computer-aided analyses, addressing specific requirements in clearly defined stages of the software and Technical Analysis life cycle. The verification and validation effort of the Technical Analysis activity is divided into the verification and validation of models and data, the review of calculational tasks, and the verification and validation of software, with the applicable information to be validated captured in registers or databases. The resulting processes are as simple as possible, concise and practical. Effective use of resources is ensured, and internationally accepted standards have been incorporated, helping to build confidence in the process among all stakeholders, including investors, nuclear regulators and the public.
9

Projeto e validação de software automotivo com o método de desenvolvimento baseado em modelos / Automotive software project and validation with model based design

Nunes, Lauro Roberto 07 July 2017 (has links)
Heavy-duty automotive vehicles have particular functionalities and operate in an aggressive environment. To ensure better performance, safety and reliability of embedded electronic equipment, it is necessary to improve the methods and processes used in automotive embedded software development. Considering Model-Based Design (MBD) a rising method in the automotive industry, this work investigates contributions to requirements engineering, software optimization and validation, in order to demonstrate the effectiveness of the method and tools in pursuing final product quality (a heavy-duty commercial vehicle). The work covers the integration of requirements with simulation (MIL, Model in the Loop), a comparison of the optimization of automatically generated code between conventional tools (IDEs) and model-based tools, validation and coverage of the generated software, and an alternative way of increasing the coverage of the tested code.
10

Uma contribuição ao teste baseado em modelo no contexto de aplicações móveis / A contribution to the model-based testing in the context of mobile applications

Farto, Guilherme de Cleva 08 March 2016 (has links)
Due to the increasing number and diversity of users, new testing approaches are necessary to reduce the presence of faults and ensure better quality in mobile applications. The particularities of this class of software require that traditional testing techniques be revisited and new approaches proposed. The event-driven nature and functionalities of mobile applications demand tests that can be performed automatically. Model-Based Testing (MBT) is a valid and promising approach that favors the use of a defined process, as well as mechanisms and formal techniques, for the testing of mobile applications. This dissertation investigates the adoption of MBT along with the Event Sequence Graph (ESG) modeling technique to test Android applications. Initially, we evaluate MBT supported by ESG and the Robotium tool. Based on the results and challenges identified, we propose a specific approach based on the reuse of test models to (i) reduce the manual effort required for the concretization of test cases and (ii) test different characteristics inherent to the mobility context. A supporting tool was designed and implemented to automate the proposed approach. Finally, we conducted an experimental study in an industrial environment to evaluate the proposed approach and tool regarding their effectiveness in reducing concretization effort, as well as their fault-detection capability in Android mobile applications.
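The ESG idea of deriving test sequences from an event model can be sketched as path generation over a small app graph. The `[` and `]` entry/exit markers follow common ESG notation; the event names and the graph itself are invented for illustration and are not from the dissertation:

```python
# Hypothetical ESG for a mobile app: events are nodes, allowed event
# successions are edges, and a test case is a path from entry to exit.
esg = {
    "[": ["login"],
    "login": ["search", "]"],
    "search": ["open_item", "]"],
    "open_item": ["]"],
}

def edge_covering_paths(graph, entry="[", exit_="]"):
    """Enumerate simple entry-to-exit paths until every edge is covered."""
    edges = {(u, v) for u, vs in graph.items() for v in vs}
    paths, stack = [], [[entry]]
    while stack and edges:
        path = stack.pop()
        node = path[-1]
        if node == exit_:
            used = set(zip(path, path[1:]))
            if used & edges:          # keep only paths adding new coverage
                paths.append(path)
                edges -= used
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:       # avoid revisiting an event (simple paths)
                stack.append(path + [nxt])
    return paths

paths = edge_covering_paths(esg)
for p in paths:
    print(" -> ".join(p))
```

Each printed sequence then needs concretization (binding abstract events to concrete UI actions), which is exactly the step whose manual effort the proposed approach aims to reduce.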
