51

Detekce oblasti otisku prstu v obraze / Detection of Fingerprint Area in Image

Doležel, Michal January 2010 (has links)
This master's thesis deals with the design and implementation of a system for detecting the fingerprint area in an image. The first task was to elaborate the theory necessary for understanding the problems of fingerprint area detection. The next task was to propose a specific detection system, either by enhancing existing methods or by designing a new one, such that the selected method avoids the problems that arise during fingerprint area detection. The implementation of the proposed system and its testing on a fingerprint database are described in the following part. The last part discusses all of the achieved results.
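The abstract gives no algorithmic detail; a classic baseline for fingerprint area detection is block-wise gray-level variance thresholding, since ridge blocks vary strongly while background is flat. A minimal sketch under that assumption follows — the block size, threshold, and synthetic image are illustrative, not values from the thesis:

```python
import numpy as np

def segment_fingerprint(img, block=16, var_threshold=150.0):
    """Block-wise variance mask: True where a block looks like ridge area."""
    h, w = img.shape
    mask = np.zeros((h // block, w // block), dtype=bool)
    for by in range(h // block):
        for bx in range(w // block):
            patch = img[by*block:(by+1)*block, bx*block:(bx+1)*block]
            mask[by, bx] = patch.astype(np.float64).var() > var_threshold
    return mask

# Synthetic demo: flat background (low variance) vs. striped "ridges".
img = np.full((64, 64), 200, dtype=np.uint8)
img[16:48, 16:48] = 255 * (np.indices((32, 32)).sum(axis=0) // 4 % 2)
print(segment_fingerprint(img).astype(int))  # True only in the striped center
```

Real systems typically post-process such a block mask with morphological opening and closing to remove isolated blocks before extracting the fingerprint region.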
52

Run-time Anomaly Detection with Process Mining: Methodology and Railway System Compliance Case-Study

Vitale, Francesco January 2021 (has links)
Detecting anomalies in computer-based systems, including Cyber-Physical Systems (CPS), has recently attracted considerable interest. Behavioral anomalies are deviations from what is regarded as the nominal, expected behavior of the system. Both process science and data science can yield satisfactory results in detecting behavioral anomalies. Within Process Mining, Conformance Checking addresses data retrieval and the connection of data to behavioral models with the aim of detecting behavioral anomalies. Nowadays, computer-based systems are increasingly complex and require appropriate validation, monitoring, and maintenance techniques. Among such systems, the European Rail Traffic Management System/European Train Control System (ERTMS/ETCS) is the specification of a standard railway system that integrates heterogeneous hardware and software components, with the aim of providing international interoperability, with trains seamlessly interacting within standardized infrastructures. Compliance with the standard, as well as with the expected behavior, is essential given the criticality of the system in terms of performance, availability, and safety. To this end, a Process Mining Conformance Checking process can be employed to validate the requirements through run-time model-checking techniques against design-time process models. A Process Mining Conformance Checking methodology was developed and applied with the goal of validating the behavior exposed by an ERTMS/ETCS system during the execution of specific scenarios. The methodology was tested and correctly classified the valid behaviors exposed by the ERTMS/ETCS system prototype. The results also showed that the fitness metric developed in the methodology allows latent errors to be detected in the system before they can generate failures.
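In practice, Conformance Checking fitness is usually computed by token replay or alignments over a Petri-net model (e.g., with a process mining library such as pm4py). As a hedged, self-contained stand-in for that idea, the toy sketch below scores each observed trace by the fraction of its consecutive steps allowed by the model's directly-follows relation; the event names and model are invented, not taken from the thesis:

```python
# Toy "fitness" check: the model is a set of allowed directly-follows pairs
# (a crude stand-in for replaying traces on a Petri net).
MODEL = {("start", "request_ma"), ("request_ma", "receive_ma"),
         ("receive_ma", "start_mission"), ("start_mission", "end")}

def trace_fitness(trace):
    """Fraction of consecutive event pairs permitted by the model."""
    pairs = list(zip(trace, trace[1:]))
    if not pairs:
        return 1.0
    return sum(1 for p in pairs if p in MODEL) / len(pairs)

nominal = ["start", "request_ma", "receive_ma", "start_mission", "end"]
anomalous = ["start", "receive_ma", "request_ma", "start_mission", "end"]
print(trace_fitness(nominal))    # 1.0  -> conforming behavior
print(trace_fitness(anomalous))  # 0.25 -> flagged as a behavioral anomaly
```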
53

Combining Conformance Quality and Experiential Quality in the Delivery of Health Care

Senot, Claire 24 June 2014 (has links)
No description available.
54

Cross-fertilizing formal approaches for protocol conformance and performance testing / Approches formelles croisées pour les tests de protocole de conformité et de performance

Che, Xiaoping 26 June 2014 (has links)
While today's communications are essential and a huge set of services is available online, computer networks continue to grow and novel communication protocols are continuously being defined and developed. De facto, protocol standards are required to allow different systems to interwork. Although these standards can be formally verified, developers may produce errors leading to faulty implementations, which is why implementations must be strictly tested. However, most current testing approaches require stimulation of the implementation under test (IUT); if the system cannot be accessed or interrupted, the IUT cannot be tested. Besides, most existing works are based on formal models, and few works study the formalization of performance requirements. To address these issues, we propose a novel logic-based approach for testing protocol conformance and performance passively. In our approach, conformance and performance requirements can be accurately formalized using a Horn-logic based syntax and semantics. These formalized requirements are then tested against millions of messages collected from real communicating environments. The satisfying results returned by the experiments proved the functionality and efficiency of our approach. To satisfy the increasing need for real-time distributed testing, we also propose a distributed testing framework and an online testing framework, which we deployed successfully in a real small-scale environment. Applying our approach to billions of messages and optimizing the algorithms remain future work.
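The thesis's Horn-clause syntax is not reproduced in the abstract. As a hedged illustration of passive testing, the sketch below checks one conformance-plus-performance rule of the kind described — "every request must be followed by a response within a deadline" — against a captured message trace; the message kinds and the 0.3 s deadline are invented for the example:

```python
# Passive check of a Horn-style rule over a captured trace:
#   holds(request x)  =>  exists (response y) with y.time - x.time < DEADLINE
DEADLINE = 0.3  # seconds; illustrative performance requirement

def check_request_response(trace):
    """trace: time-ordered list of (timestamp, kind) tuples.
    Returns the requests that violate the rule."""
    violations = []
    for i, (t, kind) in enumerate(trace):
        if kind != "request":
            continue
        ok = any(k == "response" and 0 <= t2 - t < DEADLINE
                 for t2, k in trace[i+1:])
        if not ok:
            violations.append((t, kind))
    return violations

trace = [(0.00, "request"), (0.12, "response"),
         (1.00, "request"), (1.55, "response")]  # second response too late
print(check_request_response(trace))  # [(1.0, 'request')]
```

Because the checker only reads the trace, the IUT never has to be stimulated or taken offline — the property that motivates passive testing in the first place.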
55

Process capability assessment for univariate and multivariate non-normal correlated quality characteristics

Ahmad, Shafiq, Shafiq.ahmad@rmit.edu.au January 2009 (has links)
In today's competitive business and industrial environment, it is more crucial than ever to assess precisely the process losses due to non-compliance with customer specifications. To assess these losses, industry makes extensive use of process capability indices (PCIs) for the performance evaluation of its processes. Determining the performance capability of a stable process using standard process capability indices such as Cp and Cpk requires that the underlying quality-characteristic data follow a normal distribution. However, it is an undisputed fact that real processes very often produce non-normal quality-characteristic data, and these quality characteristics are often correlated with each other. For such non-normal and correlated multivariate quality characteristics, applying standard capability measures with conventional methods can lead to erroneous results. The research undertaken in this PhD thesis presents several capability assessment methods to estimate process performance more precisely and accurately, based on univariate as well as multivariate quality characteristics. The proposed methods take into account the correlation, variance, and covariance, as well as the non-normality, of the quality-characteristic data. A comprehensive review of existing univariate and multivariate PCI estimation methods is provided. We propose fitting Burr XII distributions to continuous positively skewed data; the proportion of nonconformance (PNC) for process measurements is then obtained from the fitted Burr XII distribution, rather than through the traditional practice of fitting different distributions to the real data. The maximum likelihood method is deployed to improve the accuracy of the Burr XII-based PCI, and numerical methods such as evolutionary and simulated annealing algorithms are used to estimate the parameters of the fitted Burr XII distribution. We also introduce a new transformation method, called the Best Root Transformation, to transform non-normal data to normal data and then apply the traditional PCI method to estimate the proportion of non-conforming data. Another approach introduced in this thesis deploys the Burr XII cumulative distribution function (CDF) for PCI estimation; this contrasts with the approach adopted in the research literature, i.e., using the best-fitting density function from known distributions for PCI estimation on non-normal data. The proposed CDF technique is also extended to estimate process capability for bivariate non-normal quality-characteristic data. Finally, a new multivariate capability index based on the Generalized Covariance Distance (GCD) is proposed. This approach reduces the dimension of multivariate data by transforming correlated variables into univariate ones through a metric function, and evaluates process capability for correlated non-normal multivariate quality characteristics. Unlike the Geometric Distance approach, the GCD approach takes into account the scaling effect of the variance-covariance matrix and produces a covariance-distance variable based on the Mahalanobis distance. A further novelty of this research is to approximate the distribution of these distances by a Burr XII distribution and then estimate its parameters using a numerical search algorithm. It is demonstrated that the PNC obtained with the proposed method is very close to the actual PNC value.
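As a rough illustration of the Burr XII route to the PNC — not the thesis's own code — the sketch below fits SciPy's burr12 distribution by maximum likelihood and integrates the tail mass outside the specification limits; the data, spec limits, and the PNC-to-index conversion convention are assumptions of the example:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.gamma(shape=2.0, scale=1.5, size=2000)  # positively skewed "process data"
LSL, USL = 0.2, 12.0                               # illustrative spec limits

# Fit a Burr XII distribution by maximum likelihood (SciPy's generic fit).
c, d, loc, scale = stats.burr12.fit(data)

# Proportion of nonconformance: probability mass outside the spec limits.
pnc = 1.0 - (stats.burr12.cdf(USL, c, d, loc, scale)
             - stats.burr12.cdf(LSL, c, d, loc, scale))
print(f"estimated PNC = {pnc:.5f}")

# Equivalent capability index via the normal quantile, so the result is
# comparable with a conventional index (one common convention, not the only one).
cpk_equiv = -stats.norm.ppf(pnc / 2) / 3
print(f"equivalent capability index ~ {cpk_equiv:.3f}")
```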
56

The role of process conformance and developers' skills in the context of test-driven development

Fucci, D. (Davide) 26 April 2016 (has links)
Abstract: Modern software development must adapt to demanding schedules while keeping the software at a high level of quality, and agile software development has been adopted in recent years to meet this need. Test-driven development (TDD) is one practice that has arisen within the agile software development movement; it leverages unit tests to develop software in incremental cycles. TDD supporters claim that the practice increases the productivity of the practitioners who employ it, as well as the internal and external quality of the software they develop. In order to validate or refute such claims, the software engineering research community has studied TDD for the last decade; the results of the empirical studies on the effects of TDD have been mostly inconclusive. This dissertation studies two factors that may affect the manifestation of the claimed effects of TDD on software's external quality and developers' productivity: the developers' conformance to the process (i.e., their ability to follow TDD) and their skills. The research was performed in four phases. In the first phase, the literature was reviewed to identify a set of factors that have been considered to affect TDD. In the second phase, two experiments were executed within academia; a total of 77 students at the University of Oulu took part in the studies. These experiments investigated the quality of the software, as well as the subjects' productivity, with respect to their programming and testing skills. A follow-up study, using data collected during the second experiment, explored the relation between quality, productivity, and the subjects' process conformance. In the third phase, four industrial experiments involving 30 professionals were performed; process conformance and skills were investigated in relation to TDD's effects on external quality and productivity. The fourth phase synthesized the evidence gathered in the two previous phases. The results show that TDD is not associated with improvements in external quality or developers' productivity. Further, improvements in both external quality and productivity are associated with skills rather than with the process, at least in the case of professional developers; hence, process conformance has a negligible impact. The productivity of novice developers, on the other hand, can benefit from the test-first approach promoted by TDD.
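As a minimal, hedged illustration of the test-first cycle the dissertation examines (the function under test is invented for the example):

```python
import unittest

# Step 1 (red): write the unit test first; it fails while leap_year
# does not exist yet.
class LeapYearTest(unittest.TestCase):
    def test_century_rule(self):
        self.assertTrue(leap_year(2000))   # divisible by 400
        self.assertFalse(leap_year(1900))  # divisible by 100 only
        self.assertTrue(leap_year(2024))
        self.assertFalse(leap_year(2023))

# Step 2 (green): write the minimal production code that makes the test pass.
def leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Step 3 (refactor): clean up, re-run the tests, then start the next cycle.
if __name__ == "__main__":
    unittest.main()
```

Process conformance, in the dissertation's sense, is the degree to which a developer actually follows this red-green-refactor ordering rather than writing tests after the code.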
57

A symbolic approach for the verification and the test of service choreographies / Une approche symbolique pour la vérification et le test des chorégraphies de services

Nguyễn, Hữu Nghĩa 31 October 2013 (has links)
Service-oriented engineering is an emerging software development paradigm for distributed collaborative applications. Such an application is made up of several entities abstracted as services, each of them being, for example, a Web application, a Web service, or even a human. The services can be developed independently and are composed to achieve common requirements through interactions among them. Service choreographies define such requirements from a global perspective, based on interactions among a set of participants. This thesis formalizes the related problems and develops a framework by which service choreographies can be developed correctly for both top-down and bottom-up approaches. It consists in analyzing the relation between a choreography specification and a choreography implementation, at both the model level and the real implementation level. In particular, it concerns composition/decomposition in service design, verification, and the testing of the choreography implementation. The first key point of our framework is to support value-passing among services by using symbolic techniques and an SMT solver. This overcomes the false negatives and state-space explosion issues caused by abstracting or limiting the data domain of value-passing in existing approaches. The second key point is black-box passive testing of the choreography implementation, which requires neither access to the source code nor making the implementation unavailable during the testing process. Our framework is fully implemented in our toolchain, which can be downloaded or used online at http://schora.lri.fr.
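The abstract does not show how the SMT solver is used. As an assumed illustration of the symbolic treatment of value-passing (using the z3-solver Python bindings, with an invented guard), satisfiability checking replaces enumerating concrete message values when deciding whether a choreography branch is reachable:

```python
from z3 import Int, Solver, sat  # pip install z3-solver

amount = Int("amount")  # symbolic value carried by a message, never enumerated

s = Solver()
s.add(amount > 0, amount <= 100)   # domain constraint from the specification
s.add(amount > 100)                # guard of a choreography branch

if s.check() == sat:
    print("branch reachable, witness:", s.model()[amount])
else:
    print("branch unreachable for every possible value")  # printed here
```

An enumeration-based checker would either test a bounded sample of values (risking false negatives) or explode on large domains; the solver settles the question for the whole domain at once.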
58

Behavioral service substitution

Parnjai, Jarungjit 22 April 2013 (has links)
Service evolution allows one service to be refined into or substituted by another service. Substituting one service by another should guarantee that all, or selected, partners of the original service are preserved. In this thesis, we develop an approach that shall assist a service designer, such as a domain expert, in performing analysis and synthesis tasks for service substitution. We model the control flow of services, which describes the ordering of asynchronously communicated events over an implicit, unordered message buffer. We study the behavioral aspect of correct interaction between services and concentrate on two variants of deadlock freedom as correctness criteria for service substitution. The major contribution of this thesis is an approach for characterizing the set of all substitutes for a given service. We systematically investigate the relationship between a service and all its partners under a given correctness criterion, and employ this relationship to synthesize from a given service its canonical partner and its canonical substitute with respect to all partners. A service that refines the canonical substitute for a given service is regarded as a substitute for the given service if the set of all its partners includes every partner of the given service. With the canonical substitute of a given service, we identify a specific subset of the set of all substitutes, each of which has exactly the same set of partners as the given service. Parts of the results in this thesis build upon previous works on service substitution and correctness of services; consequently, our results can be combined with related existing techniques to perform more sophisticated analysis and synthesis tasks on service substitution.
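To make the correctness criterion concrete, here is a hedged, self-contained sketch — invented toy services, not the thesis's formal model — that explores the composition of two services communicating asynchronously over an unordered buffer and reports non-final states with no enabled step, i.e., deadlocks:

```python
from collections import deque

# Each service maps a state to its outgoing steps: ("!", msg, next) sends into
# an unordered buffer, ("?", msg, next) consumes from it. Toy example: B waits
# for a "quit" that A never sends, so the composition gets stuck.
A = {"a0": [("!", "order", "a1")], "a1": [("?", "ack", "a2")], "a2": []}
B = {"b0": [("?", "quit", "b1")], "b1": [("!", "ack", "b2")], "b2": []}
FINAL = {("a2", "b2")}

def deadlocks():
    init = ("a0", "b0", ())            # buffer as a sorted tuple (multiset)
    seen, queue, stuck = {init}, deque([init]), []
    while queue:
        p, q, buf = queue.popleft()
        succs = []
        for svc, here, there in ((A, p, q), (B, q, p)):
            for kind, msg, nxt in svc[here]:
                if kind == "!":
                    new_buf = tuple(sorted(buf + (msg,)))
                elif msg in buf:       # "?" enabled only if msg is buffered
                    lst = list(buf); lst.remove(msg); new_buf = tuple(lst)
                else:
                    continue
                state = (nxt, there, new_buf) if svc is A else (there, nxt, new_buf)
                succs.append(state)
        if not succs and ((p, q) not in FINAL or buf):
            stuck.append((p, q, buf))
        for s in succs:
            if s not in seen:
                seen.add(s); queue.append(s)
    return stuck

print(deadlocks())  # [('a1', 'b0', ('order',))] -> A waits forever for "ack"
```

A substitute for A in the thesis's sense must, among other things, avoid introducing any such stuck state for every partner of the original service.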
59

NAT2TEST: generating test cases from natural language requirements based on CSP

CARVALHO, Gustavo Henrique Porto de 26 February 2016 (has links)
High trustworthiness levels are usually required when developing critical systems, and model-based testing (MBT) techniques play an important role in generating test cases from specification models. For critical systems, these models are usually created using formal or semi-formal notations. Moreover, it is desirable to state clearly and formally, by means of a conformance relation, the conditions necessary to guarantee that an implementation is correct with respect to its specification; such a relation can then be used to prove that the test generation strategy is sound. Despite the benefits of MBT, those who are not familiar with the models' syntax and semantics may be reluctant to adopt these formalisms. Furthermore, most of these models are not available at the very beginning of the project, when usually only natural-language requirements are available, and so the use of MBT is postponed. Here, we propose an MBT strategy for generating test cases from controlled natural-language (CNL) requirements: NAT2TEST. It spares the user from knowing the syntax and semantics of the underlying notations, besides allowing early use of MBT via natural-language processing techniques; the formal and semi-formal models internally used by our strategy are generated automatically from the natural-language requirements. Our approach is tailored to data-flow reactive systems: a class of embedded systems whose inputs and outputs are always available as signals. These systems can also have time-based behaviour, which may be discrete or continuous. The NAT2TEST strategy comprises a number of phases. Initially, the requirements are syntactically analysed according to a CNL we propose for describing data-flow reactive systems. Then, the informal semantics of the requirements are characterised based on case grammar theory. Afterwards, we derive a formal representation of the requirements considering a model of data-flow reactive systems we have defined. Finally, this formal model is translated into communicating sequential processes (CSP) to provide the means for generating test cases. We prove that our test generation strategy is sound with respect to our CSP-based timed input-output conformance relation: csptio. Besides CSP, we explore the generation of other target notations (SCR and IMR), from which we can generate test cases using commercial tools (T-VEC and RT-Tester, respectively). The whole process is fully automated by the NAT2TEST tool. Our strategy was evaluated on examples from the literature and from the aerospace (Embraer) and automotive (Mercedes) industries. We analysed performance and the ability to detect defects generated via mutation; in general, our strategy outperformed the considered baseline, random testing. We also compared our strategy with relevant commercial tools.
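The thesis's CNL grammar is not reproduced in the abstract. As an assumed illustration of the first phase, a requirement written in a when/shall pattern can be split into case-grammar-like roles before any formal model is built; the regex pattern and the requirement text below are invented stand-ins for the real grammar:

```python
import re

# One simplified when/shall pattern; the actual CNL of NAT2TEST is richer.
PATTERN = re.compile(
    r"^When (?P<condition>.+?), the (?P<agent>.+?) shall (?P<action>.+)\.$")

def parse_requirement(text):
    """Map a CNL sentence to case-grammar-like roles, or None if ill-formed."""
    m = PATTERN.match(text)
    return m.groupdict() if m else None

req = ("When the coin sensor is activated, "
       "the vending machine shall assign warming to the system mode.")
print(parse_requirement(req))
# {'condition': 'the coin sensor is activated',
#  'agent': 'vending machine',
#  'action': 'assign warming to the system mode'}
```

Rejecting sentences that do not match the controlled grammar is what makes the later, fully automatic translation into formal models feasible.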
60

Domain Specific Modeling Support for ArCon / Stöd för domänspecifik modellering med ArCon

Azari, Leila January 2013 (has links)
One important phase in the software development process is to create a design model of the system that follows all the architectural rules. Often the architectural rules are defined by the system architect and the system model is designed by the system designer. The architect defines the rules in a text file where no standard or pattern is followed; therefore, there is always the risk that the designer violates the architectural rules, and the architect must manually review the system model to ensure it is valid.

In order to remove this manual checking, which can be erroneous and time-consuming, ArCon (Architecture Conformance Checker) was developed by Combitech AB. ArCon is a tool that lets the architect define the architectural rules in the format of UML (Unified Modeling Language) models, where the elements of the model have a different meaning than in standard UML. ArCon can read such a model, extract the architectural rules from it, check the system model against those rules, and then report all rule violations.

ArCon is an open-source tool, i.e., free for everyone to download and use. Currently, it supports Papyrus as the UML modeling tool. Papyrus is integrated into the Eclipse platform and is a general-purpose modeling tool; it supports users with all types of UML diagrams and elements.

The idea of this thesis work was to implement a new feature for ArCon to facilitate the design process for system designers. The feature should offer the system designers only those types of elements which they are permitted to add to a specific fraction of the system model. The list of permitted element types should be extracted from the architecture model, where all the architectural rules are defined in advance. This new support in ArCon was named Domain Specific Modeling (DSM) support.

To evaluate the effect of DSM support on system designers' performance, a few test sessions, called usability tests, were performed with a representative sample of software designers as participants. After analyzing the data collected from the test sessions, the pros and cons of the new support were identified, and a few new ideas for enhancing DSM support were generated.
