51

Model-driven aviation training family of systems architecture

Holden, Trevor January 2017 (has links)
The Ph.D. project evolved from the technical problem of integration and interoperability of an assemblage of complex systems and SoS within a flight training system to the development of a workflow process that uses frameworks to aid the decision-making process for selecting optimal flight training blending mixes. The focus of the research was developing a methodology to satisfy the research project proposal requirements agreed upon with the industrial sponsor. This thesis investigates the complexity of a modern flight training system and the need to understand that it is supported by a complex Family of Systems (FoS), ranging from Virtual Reality Training Environments such as flight simulators to live training aircraft with various configurations of avionic controls. One of the key technical problems today is how best to develop and assemble a family of flight training systems into an integrated Live/Synthetic mix for aircrew training that optimises organisational and training objectives. With the increased use of emulation/synthetic data on aircraft for live training, the live/synthetic boundary is becoming increasingly blurred, and systematic consideration of the most appropriate blend is needed. The methodology used in the research is model-driven, and the architecture produced is described at a level of abstraction that enables communication to all stakeholders as a means of understanding the structure involved in the system design process. Relational Oriented Systems Engineering and Technology Trade-Off Analysis (ROSETTA) frameworks are described using Model Based Systems Engineering (MBSE) techniques for supporting capability-based trade-off decisions in the selection of optimal flight training FoS mixes.
The research proposes a methodology and associated methods including a high-level systematic closed loop information management structure for blended device/tool aircrew training and a modelling and analysis approach for the FoS aviation training problem to enhance the existing training programmes to provide a more efficient and agile training environment. The mathematical formalisms used provide a method of quantifying subjective opinions and judgements for trade studies to be accomplished on the suitability of technology for each student pilot in relation to training and organisational objectives. The methodology presented is by no means a final solution, but a path for further research to enable a greater understanding of the suitability of training tools/technology used to train individual pilots at various stages throughout the training pipeline lifecycle(s).
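A capability-based trade study of this kind can be reduced, at its simplest, to a weighted-sum utility over candidate blends. The sketch below illustrates that idea only; the blend names, capability scores and stakeholder weights are invented for illustration and are not values from the thesis.

```python
# Hypothetical sketch of a capability-based trade-off score for candidate
# live/synthetic training blends, in the spirit of the ROSETTA-style trade
# studies described above. All names and numbers are illustrative assumptions.

def blend_score(capabilities: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted-sum utility of one candidate blend (each capability in 0..1)."""
    total = sum(weights.values())
    return sum(capabilities[c] * w for c, w in weights.items()) / total

# Stakeholder weights for training objectives (assumed).
weights = {"fidelity": 0.4, "cost_efficiency": 0.3, "availability": 0.3}

candidates = {
    "mostly_live":      {"fidelity": 0.9, "cost_efficiency": 0.3, "availability": 0.4},
    "blended_50_50":    {"fidelity": 0.7, "cost_efficiency": 0.6, "availability": 0.7},
    "mostly_synthetic": {"fidelity": 0.5, "cost_efficiency": 0.9, "availability": 0.9},
}

best = max(candidates, key=lambda name: blend_score(candidates[name], weights))
print(best, round(blend_score(candidates[best], weights), 3))
```

In practice the thesis quantifies subjective judgements before such a comparison; the point here is only the shape of the trade-off calculation, not its inputs.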
52

Um processo de verificação e validação para os componentes do núcleo comum do middleware Ginga / A verification and validation process for the common core components of the Ginga middleware

Caroca, Caio Regis 27 September 2010 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / Ginga is the official, standardized middleware specification for the Brazilian Digital TV System. The complexity demanded by the construction of this software layer is high, which also increases the complexity of testing it. The importance of software testing and its relationship with quality should be emphasized, since this type of system still has a high degree of complexity inherent to its development, mainly because its specification is recent and proposes innovative features. It is also considered critical software, since failures in the middleware implementation can compromise the success of Digital TV as a whole. The middleware is a key player within a Digital TV system, since it dictates the rules under which applications can run on the platform. Thus, the correctness of the middleware is of vital importance to enable interactive applications to run successfully. The Ginga CDN (Ginga Code Development Network) project is responsible for the collaborative, distributed development of a reference PC version of Ginga. This implementation is based on software components and is open to universities and companies. In this context, this work proposes a verification and validation process for the Ginga middleware, to be deployed in parallel with the Ginga CDN development process and focused on the common core components (Ginga-CC). For this purpose, a set of tests was defined to check the operation of the middleware and to validate the different component configurations generated by the Ginga CDN network.
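The kind of per-component check such a process runs can be sketched as a small conformance test against a component's interface contract. The `Demultiplexer` stub below and its interface are hypothetical placeholders, not the real Ginga-CC API; only the testing pattern is the point.

```python
# Minimal sketch of a conformance-style check for one middleware component,
# illustrating the kind of verification harness the proposed process would
# run against each Ginga-CC configuration. The Demultiplexer stub and its
# interface are invented for illustration, not the real Ginga-CC API.

class Demultiplexer:
    """Stub of a common-core component that filters transport-stream PIDs."""
    def __init__(self):
        self.filters = set()

    def add_filter(self, pid: int) -> None:
        # MPEG-2 transport-stream PIDs are 13-bit values.
        if not 0 <= pid <= 0x1FFF:
            raise ValueError("PID out of range")
        self.filters.add(pid)

def check_demultiplexer_contract() -> bool:
    """Verification test: valid PIDs accepted, invalid PIDs rejected."""
    d = Demultiplexer()
    d.add_filter(0x100)
    if 0x100 not in d.filters:
        return False
    try:
        d.add_filter(0x2000)
        return False                    # should have raised
    except ValueError:
        return True

print(check_demultiplexer_contract())   # True: component meets its contract
```

A real suite would repeat such checks across every component configuration the CDN network generates.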
53

PRECISE - Um processo de verificação formal para modelos de características de aplicações móveis e sensíveis ao contexto / PRECISE - A Formal Verification Process for Feature Models for Mobile and Context-Aware Applications

Fabiana Gomes Marinho 27 August 2012 (has links)
Conselho Nacional de Desenvolvimento Científico e Tecnológico / SPLs have been used to develop different types of applications, including ones that run on mobile devices and are able to adapt when the context elements in which they are located change. These applications can change due to variations in their execution environment, and inconsistent adaptations can occur, compromising the expected behavior. There is therefore a need for a verification process to check the correctness and consistency of these SPLs, as well as the correctness of both the derived and the adapted products of these SPLs. Thus, this work proposes PRECISE - a Formal Verification Process for Feature Models of Mobile and Context-Aware Applications. PRECISE helps to identify defects in the variability modeling of an SPL for mobile and context-aware applications, minimizing problems that could take place during the execution of products generated from this SPL. It is worth noting that PRECISE is defined based on a formal specification and a set of well-formedness properties developed using First-Order Logic, which are prerequisites for unambiguous variability modeling. To evaluate PRECISE, a validation is performed based on the formal specification and the well-formedness properties defined in the process. This validation intends to show that PRECISE is able to identify defects, anomalies and inconsistencies in a variability model of an SPL for mobile and context-aware applications. In this validation, five different techniques are used: UML Profile, OCL, Propositional Logic, Prolog and Simulation. Besides minimizing defects and inconsistencies in SPL variability models, PRECISE also benefits from the generality and flexibility intrinsic to the formal notation used in its specification.
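One of the validation techniques named above, propositional logic, can be illustrated on a toy feature model: encode the model's constraints as a Boolean formula and check by brute force that at least one valid product exists (i.e., the model is not void). The feature names and constraints below are invented for illustration, not the thesis' case study.

```python
# Illustrative check in the spirit of PRECISE's propositional-logic
# validation: encode a tiny (hypothetical) feature model and verify by
# brute-force enumeration that it admits at least one valid product.
from itertools import product

FEATURES = ["app", "gps", "wifi", "offline_maps"]

def valid(cfg: dict[str, bool]) -> bool:
    return (
        cfg["app"]                                     # root is mandatory
        and (cfg["gps"] or cfg["wifi"])                # or-group: at least one
        and (not cfg["offline_maps"] or cfg["gps"])    # requires-constraint
    )

products = [dict(zip(FEATURES, bits))
            for bits in product([False, True], repeat=len(FEATURES))
            if valid(dict(zip(FEATURES, bits)))]

print(len(products) > 0)   # True: the model is consistent (not void)
```

Brute force is only viable for small models; a real tool would hand the same formula to a SAT solver or, as in PRECISE, to Prolog or OCL machinery.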
54

An Empirical Evaluation & Comparison of Effectiveness & Efficiency of Fault Detection Testing Techniques

Natraj, Shailendra January 2013 (has links)
Context: This thesis analyses a replication of a software experiment conducted by Natalia and Sira at the Technical University of Madrid, Spain. The empirical study was conducted for the verification and validation of the experimental data, and to evaluate the effectiveness and efficiency of the testing techniques. The blocks considered for the analysis were observable fault, failure visibility and observed faults. The statistical data analysis involved the ANOVA and classification tree packages of SPSS. Objective: To evaluate and compare the results obtained from the statistical data analysis, and to establish the verification and validation of the effectiveness and efficiency of the testing techniques by using ANOVA and classification tree analysis for percentage subject, percentage defect-subject and values (Yes/No) for each of the blocks. RQ1: Empirical evaluation of the effectiveness of fault detection testing techniques for the blocks (observable fault, failure visibility and observed faults), using ANOVA and classification trees. RQ2: Empirical evaluation of the efficiency of fault detection techniques, based on time and number of test cases, using ANOVA. RQ3: Comparison and inference of the obtained results for both effectiveness and efficiency. Method: The research focuses on statistical data analysis to empirically evaluate the effectiveness and efficiency of the fault detection techniques for the experimental data collected at UPM (Technical University of Madrid, Spain). Empirical strategy used: software experiment. Results: The analysis results obtained for the observable fault types were standardized (Ch. 5). Within the observable fault block, both techniques, functional and structural, were equally effective. In the failure visibility block, the results were partially standardized; the program types nametbl and ntree were more effective for fault detection than cmdline. The results for the observed fault block were partially standardized and diverse; the significant factors in this block were program type, fault type and technique. In the efficiency block, subjects took less time to isolate faults in the program type cmdline, and efficient fault detection was also seen in cmdline with the help of the generated test cases. Conclusion: This research will help practitioners in industry and academia understand the factors influencing the effectiveness and efficiency of testing techniques. The work also presents a comprehensive analysis and comparison of the results for the blocks observable fault, failure visibility and observed faults, and discusses the factors influencing the efficiency of the fault detection techniques.
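The one-way ANOVA underlying such an effectiveness comparison reduces to an F ratio of between-group to within-group mean squares. The sketch below computes it from scratch; the fault-detection percentages are made-up illustration data, not the UPM experiment's values.

```python
# Sketch of the one-way ANOVA used for the effectiveness comparison:
# F = (between-group sum of squares / (k-1)) / (within-group sum of squares / (n-k)).
# The detection percentages below are illustrative, not the experiment's data.

def one_way_anova_F(groups: list[list[float]]) -> float:
    n = sum(len(g) for g in groups)          # total observations
    k = len(groups)                          # number of techniques
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

functional = [70.0, 75.0, 72.0, 68.0]   # % faults detected (illustrative)
structural = [71.0, 74.0, 69.0, 73.0]
F = one_way_anova_F([functional, structural])
print(round(F, 3))
```

A small F like this one (compared against the critical F value for the chosen significance level) would indicate no significant difference between the techniques, which is the "equally effective" kind of conclusion reported above.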
55

Contribution à la Spécification et à la Vérification des Exigences Temporelles : Proposition d’une extension des SRS d’ERTMS niveau 2 / Contribution for the Specification and the Verification of Temporal Requirements : Proposal of an extension for the ERTMS-Level 2 specifications

Mekki, Ahmed 18 April 2012 (has links)
The work developed in this thesis aims to assist the engineering process of temporal requirements for time-constrained complex systems. Our contributions concern three phases: specification, behaviour modelling and verification. For the specification of temporal requirements, a new typology of temporal properties is introduced, covering the requirements most commonly met in requirements specification. Then, to guide the user in expressing requirements, a specification grammar based on predefined natural-language patterns is developed. Nevertheless, even if each requirement taken individually is correct, there is no guarantee that a set of temporal properties is consistent; we have therefore proposed mechanisms, based on graph theory techniques, to detect certain types of inconsistency between temporal requirements. For the behaviour modelling, we have proposed an algorithm for transforming UML State Machines with time annotations into Timed Automata (TA). The idea is to let the user manipulate a quite intuitive notation (UML SM diagrams) during the modelling phase and thereby automatically generate formal models (TA) that can be used directly by the verification process. Finally, for the verification phase, we have adopted an observer-based technique relying on model checking. Concretely, we have developed a repository of observation patterns (observers), where each pattern is relative to a particular temporal requirement class in our classification. The verification process is thereby reduced to a reachability analysis of the observers' KO states corresponding to the violation of the associated requirement.
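The reduction of verification to reachability of an observer's KO state can be sketched, in untimed form, as a graph search over the composed system and observer states. The transition relation below is a made-up example (a request/response requirement with a timeout abstracted as an event), not a model from the thesis.

```python
# Untimed sketch of the observer idea: verification reduces to checking
# whether the observer's "KO" state is reachable in the composed system.
# The transition relation is an invented example, not the thesis' model.
from collections import deque

# (system_state, observer_state) -> successors; the observer enters "KO"
# when the response deadline is missed (abstracted here as a timeout event).
transitions = {
    ("idle", "wait"):  [("req",  "armed")],
    ("req",  "armed"): [("resp", "ok"), ("timeout", "KO")],
    ("resp", "ok"):    [("idle", "wait")],
}

def ko_reachable(start=("idle", "wait")) -> bool:
    seen, todo = {start}, deque([start])
    while todo:
        state = todo.popleft()
        if state[1] == "KO":
            return True                 # requirement violated
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return False

print(ko_reachable())   # True: the timeout path violates the requirement
```

A timed model checker performs the same reachability question over clock zones rather than plain states, but the verdict has the same shape: KO reachable means the temporal requirement can be violated.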
56

Zásobník tepla solární soustavy / Solar hot water storage tank

Vyhlídalová, Karolína January 2020 (has links)
The solar hot water storage tank is of great importance in a solar collector system. It allows the accumulation of the transformed energy and thus deals with the mismatch between supply and demand. A suitable design of the storage tank can improve system efficiency. The storage capacity represents a balance between the amount of stored hot water and the tank's heat losses. The design of the storage capacity is based on three hypotheses: the coverage of hot water demand by solar energy; the ratio between storage capacity and solar thermal collector area; and the assumption that the storage capacity corresponds to one to two times the daily hot water demand. The purpose of this thesis is to provide an understanding of solar storage tank design and to improve the design through numerical simulation, experiments and general calculations. It also aims to confirm the hypotheses used and to determine the best way to design a solar storage tank for general practice and further discussion. The simulation model has three variables: the storage capacity, the collector area and the number of occupants. The intent is to find the interdependence of these three variables. The purpose of the simulations is to modify the design of the solar tank based on the mutual influence of the studied parameters; the modifications are performed based on the users' needs.
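The third sizing hypothesis, storage capacity equal to one to two times the daily hot water demand, can be worked as a short calculation. The occupancy and per-person demand figures below are generic illustrative assumptions, not data from the thesis.

```python
# Worked example of the one- to twofold demand sizing hypothesis above.
# Occupancy and per-person demand are illustrative assumptions only.

def storage_volume(occupants: int, demand_per_person_l: float = 40.0,
                   factor: float = 1.5) -> float:
    """Storage volume in litres for a given occupancy.

    factor: between 1.0 and 2.0 per the one- to twofold demand hypothesis.
    """
    return occupants * demand_per_person_l * factor

# 4 occupants at an assumed 40 l/person/day of hot water:
print(storage_volume(4, factor=1.0))    # lower bound: 160.0 l
print(storage_volume(4, factor=2.0))    # upper bound: 320.0 l
```

The thesis then narrows this interval using the other two hypotheses (solar coverage and the capacity-to-collector-area ratio) together with simulation.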
57

Stability And Objectivity Of A Bubbly And Slug Flow Two-Fluid Model With Wake Entrainment

Krishna chaitanya Chetty anamala (9746450) 15 December 2020 (has links)
The current study is aimed at developing a well-posed and objective, i.e., frame-invariant, Eulerian one-dimensional (1D) Two-Fluid Model (TFM) to predict the flow regime transition from dispersed to clustered bubbly and slug flow in vertical adiabatic two-phase flows. Two-phase flows in general are characterized by local material wave or void fraction wave instabilities, and flow regime transitions are one of the important consequences of these instabilities. The physical mechanism of wake entrainment for the clustering of dispersed bubbles is proposed, leading to the formation of bubble clusters and Taylor bubbles. The focus of the work is on simulation of the local interfacial structures of bubble clusters and Taylor bubbles, using a well-posed, unstable and non-linearly bounded 1D Shallow Water TFM.

The first part of the study investigates the dynamic behavior of the well-posed 1D mechanistic TFM, obtained from the averaging approach of Ishii [1], under the wake entrainment instability. For this, a 1D Shallow Water TFM derived from the 1D mechanistic TFM is used, which retains the same dynamic behavior as the latter at short wavelengths, and the required wake entrainment force is derived mechanistically. Three stability approaches are followed to study the dynamic behavior of the 1D Shallow Water TFM: characteristics, dispersion analysis, and nonlinear numerical simulations. An in-house code is used for the 1D numerical simulations of the growth of void fraction waves due to wake entrainment, and the simulation results are validated with the experimental data of Cheng and Azzopardi [2] and Song et al. [3]. To conclude the first part, the 1D results of the two-equation Shallow Water TFM are carried over to the complete four-equation TFM for quasi-1D simulations using the commercial CFD code ANSYS Fluent.

As an alternative to the mechanistic approach, which is based on Newtonian mechanics, a variational approach based on Lagrangian and Hamiltonian mechanics is used in the second part of the thesis. While the mechanistic approach operates in terms of forces acting on the two-phase mixture, the variational approach operates in terms of energies of the two-phase system. To derive the equations of motion with the variational approach, the extended Hamilton principle of least action is applied to the Lagrangian density of the two-phase mixture. One of the appealing features of this procedure is that the derived equations of motion are objective (Geurst [4]), in particular the added mass terms. Thus, the second part of the study focuses on deriving an objective, well-posed and unstable 1D TFM, as well as developing a constitutive model for the wake entrainment effect, using the variational method. Additional momentum transfer terms present in both the liquid-phase and gas-phase momentum equations, which render the variational TFM objective, are discussed. The variational method is then used to derive the 1D Shallow Water TFM under the fixed-flux assumption. The conservative interfacial momentum transfer terms require a formulation of the inertial coupling between the phases. Potential flow theory is first used to derive the inertial coupling coefficient for a single bubble and then for a pair of bubbles, to account for the interaction between the two bubbles; a lumped parameter model is then used to derive the inertial coupling coefficient for the wake entrainment effect. A local drag coefficient is obtained for the non-conservative interfacial drag force from the experimental data using the kinematic approximation, i.e., a force balance between drag and gravity. Linear and non-linear stability analyses are used to address the stability of the 1D variational Shallow Water TFM. The presence of appropriate short-wave physics makes the 1D Shallow Water TFM hyperbolic, well-posed and kinematically unstable. Finally, numerical simulations are performed to demonstrate the development of void fraction waves due to wake entrainment. The growth of void fraction waves is non-linearly bounded, i.e., Lyapunov stable. The simulation results are compared with the experimental data to validate the propagation properties of void fraction waves for bubble clusters and Taylor bubbles. This work illustrates the short-wave two-phase flow simulation capability of the TFM for the bubbly to slug flow regime transition.
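The characteristics test for well-posedness mentioned above has a simple core: a 1D two-equation model of the form dU/dt + A dU/dx = 0 is hyperbolic (well-posed) exactly when the 2x2 matrix A has real eigenvalues. The sketch below checks that condition; the matrix entries are generic placeholders, not the thesis' Shallow Water TFM coefficients.

```python
# Sketch of the characteristics analysis: the system dU/dt + A dU/dx = 0
# is hyperbolic iff A = [[a11, a12], [a21, a22]] has real eigenvalues.
# The matrix entries below are generic placeholders, not the TFM's terms.
import cmath

def characteristics(a11, a12, a21, a22):
    """Eigenvalues of a 2x2 matrix via the quadratic formula."""
    tr, det = a11 + a22, a11 * a22 - a12 * a21
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def hyperbolic(a11, a12, a21, a22, tol=1e-12) -> bool:
    """True when both characteristic speeds are real (well-posed model)."""
    return all(abs(lam.imag) < tol for lam in characteristics(a11, a12, a21, a22))

print(hyperbolic(1.0, 0.5, 0.5, 1.0))   # real eigenvalues -> well-posed
print(hyperbolic(1.0, -1.0, 1.0, 1.0))  # complex pair -> ill-posed
```

Complex characteristics signal the unbounded short-wavelength growth of an ill-posed TFM, which is why the short-wave physics added in the thesis matters for keeping the model hyperbolic while still kinematically unstable.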
58

Simuleringsarbete i medicinteknisk produktion : En fallstudie med dataanalys, verifiering och validering samt flaskhalsanalys av en simuleringsmodell / Simulation in medical technology production

Sjölander, Martina, Viklund, Philip January 2020 (has links)
The world is constantly changing and evolving faster and faster. In industry, it is important to constantly evolve in order to adapt to the market and maintain competitiveness. The ongoing fourth industrial revolution is urging companies to increase digitalization and automate production. Simulation is a tool that can be used to develop, test and control production. In the pharmaceutical industry, customers require the manufacturing process and the product to ensure good quality, while the demand for pharmaceuticals and medical devices is high. With simulation, production systems can be evaluated, developed and improved in a safer and more efficient way.

The purpose of the study is to investigate, through a simulation, whether the mass-producing pharmaceutical industry can use Discrete Event Simulation to improve production planning and control. The purpose was partly fulfilled by a situation assessment, which, together with a literature study, was used to find out what data are required to establish a simulation model. Finally, a simulation model representative of reality was established, and a bottleneck analysis was performed as part of the validation. The study is based on a case study in which qualitative and quantitative methods were used to answer the research questions. The method was designed to be carried out remotely, since it was not possible to be present at the company where the study was conducted.

Two simulation models, differing in level of detail, are presented in the results of the study. The more detailed model could be used as a basis for improvement work, while the simplified model reflects production at an overview level and can be used for production planning and production capacity estimations. The bottleneck analysis shows which process is the bottleneck and corresponds to the company's perception: the largest bottleneck was assembly and blister packing. The most important improvement suggestion was to increase the availability of the blister packing process, which increases production throughput by 6 percent and reduces lead time by 1.5 days. To make use of the detailed simulation model, further data collection at sub-process level is also proposed.
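The bottleneck identification step in such an analysis often reduces to comparing station utilizations: in a serial line, the station with the highest utilization (processing time relative to takt time) limits throughput. The station names and times below are illustrative assumptions, not the company's data.

```python
# Minimal sketch of utilization-based bottleneck identification for a
# serial production line. Station names and times are invented examples.

def bottleneck(stations: dict[str, float], takt_time_min: float) -> str:
    """Return the station with the highest utilization (the bottleneck)."""
    util = {name: t / takt_time_min for name, t in stations.items()}
    return max(util, key=util.get)

stations = {              # effective minutes per batch (assumed)
    "granulation": 6.0,
    "compression": 7.5,
    "blister_packing": 9.0,
}
print(bottleneck(stations, takt_time_min=10.0))   # -> blister_packing
```

A discrete event simulation refines this static picture with queueing, breakdowns and availability, which is what lets the study quantify the throughput gain from raising blister-packing availability.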
59

Experimental evaluation of an electro-hydrostatic actuator for subsea applications in a hyperbaric chamber

Duarte da Silva, João Pedro, Neto, Amadeu Plácido, De Negri, Victor Juliano, Orth, Alexandre 23 June 2020 (has links)
A novel Electro-Hydrostatic Actuator (EHA) prototype, designed to operate subsea gate valves in deep and ultra-deep water, is analysed and qualified in terms of functionality under design and normative constraints. The prototype is assembled in a test bench for load control in a hyperbaric chamber, where the high subsea environmental pressure can be emulated. The process variables under evaluation are monitored through a set of pressure and position sensors that are part of the prototype design. The experimental results demonstrate robust behaviour of the actuator with respect to the imposed external pressure and load forces, even with a forced limitation of its power input. Moreover, the prototype performs consistently throughout the entire endurance trial, indicating high reliability. With the results obtained, the subsea EHA concept is effectively eligible for Technology Readiness Level 4 according to API 17N.
60

Testing for verification and validation of an onboard orbit determination system exploiting GNSS : A nanosatellite application for HERMES-SP / Testning for verifikation och validering av ett ombord banbestämningssystem med GNSS : En applikation av nanosatelliter för HERMES-SP

Nermark, Clara January 2023 (has links)
When developing products for space, including nanosatellites, the verification and validation process is a mandatory part of any project conducted within the European space industry. Within such a process, testing is a method for verification and validation. In this degree project, the appropriate tests for verification and validation of a nanosatellite were investigated. The project was conducted at the KTH Royal Institute of Technology and Politecnico di Milano, as part of a larger research project named HERMES-SP. The research project was, at the time of the degree project, in the first phase of its verification process; therefore, tests for verification and validation of the Orbit Determination System (ODS) had not yet been defined. HERMES-SP is developing a nanosatellite platform with a very precise and reliable ODS, combining an Inertial Navigation System (INS) with a Global Navigation Satellite System (GNSS). This degree project was thus conducted with HERMES-SP as an applicative case to investigate tests for a nanosatellite's onboard ODS, focusing on the GNSS. The ODS developed for the nanosatellite platform was studied, along with the underlying theory of ODS and GNSS. The verification plan defined within HERMES-SP was also examined, and the methodology it presents for test development was followed. To fully answer the project's research question, the appropriate tests had to be identified and defined. This was done by first determining the requirements related to the ODS, then identifying the tests needed to verify those requirements, and lastly defining the tests in test specifications and procedures. It was found that the relevant tests in the verification process were a handful of tests at the Equipment Test (ET), Software Test (SWT) and Subsystem Integration Test (SSIT) levels. The tests were needed for verification of individual components in the system, as well as of integrated components and their interfaces. The defined tests were considered appropriate for verification and validation in the first phase of the verification process. The project contributed to the identification and definition of tests for a restricted part of the verification process, related to the specified system of the HERMES-SP nanosatellite. The findings could be used in other nanosatellite projects with similar ODS by following the process and the methodology for test development documented in this report.
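The requirements-to-tests mapping step described above amounts to a traceability check: every requirement must be covered by at least one defined test at some test level. The requirement and test identifiers below are invented placeholders, not actual HERMES-SP identifiers.

```python
# Sketch of the traceability check behind test identification: every ODS
# requirement must be verified by at least one test. IDs are invented
# placeholders, not HERMES-SP identifiers.

requirements = {"ODS-REQ-001", "ODS-REQ-002", "ODS-REQ-003"}

tests = {
    "ET-01":   {"ODS-REQ-001"},                  # equipment-level test
    "SWT-01":  {"ODS-REQ-002"},                  # software test
    "SSIT-01": {"ODS-REQ-002", "ODS-REQ-003"},   # subsystem integration test
}

covered = set().union(*tests.values())
uncovered = requirements - covered
print(sorted(uncovered))   # [] -> every requirement is verified by some test
```

Any non-empty `uncovered` set would flag requirements still lacking a test specification, which is exactly the gap the degree project set out to close for the ODS.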
