31

Optimal utilization of historical data sets for the construction of software cost prediction models

Liu, Qin January 2006 (has links)
The accurate prediction of software development cost at an early stage of the development life-cycle can have a vital economic impact and provides fundamental information for management decision making. However, it is not well understood in practice how to optimally utilize historical software project data for the construction of cost predictions, because the analysis of historical data sets for software cost estimation raises many practical difficulties, and little research has been done to demonstrate the benefits. To overcome these limitations, this research proposes a preliminary data analysis framework, which extends Maxwell's study. The proposed framework is based on a set of statistical analysis methods, such as correlation analysis, stepwise ANOVA, and univariate analysis, and provides a formal basis for the construction of cost prediction models from historical data sets. The framework is empirically evaluated against commonly used prediction methods, namely Ordinary Least-Squares Regression (OLS), Robust Regression (RR), Classification and Regression Trees (CART), and K-Nearest Neighbour (KNN), and is applied to both heterogeneous and homogeneous data sets. Formal statistical significance testing was performed for the comparisons. The results of the comparative evaluation suggest that the proposed preliminary data analysis framework is capable of constructing more accurate prediction models for all selected prediction techniques. The predictor variables processed by the framework are statistically significant at the 95% confidence level for both parametric techniques (OLS and RR) and one non-parametric technique (CART). Both the heterogeneous and the homogeneous data set benefit from the application of the proposed framework in terms of improved project effort prediction accuracy, with the homogeneous data set benefiting more.
Overall, the evaluation results demonstrate that the proposed framework has excellent applicability. Further research could pursue two main directions: first, improve applicability by integrating missing-data techniques, such as listwise deletion (LD) and mean imputation (MI), for handling missing values in historical data sets; second, apply benchmarking to enable comparisons, i.e. allowing companies to compare themselves with respect to their productivity or quality.
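As a rough, hypothetical illustration of the kind of comparison the abstract describes, the sketch below fits two of the named techniques, OLS regression and K-Nearest Neighbour, on a toy "historical" data set and scores them with MMRE. The data, the metric choice, and both minimal implementations are ours, not the thesis's.

```python
# Toy sketch: comparing OLS and KNN effort predictors on invented
# historical project data (size in KLOC -> effort in person-months).

def ols_fit(xs, ys):
    """Fit effort = a + b * size by ordinary least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return lambda x: a + b * x

def knn_predict(xs, ys, x, k=2):
    """Predict as the mean effort of the k nearest historical projects."""
    nearest = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x))[:k]
    return sum(ys[i] for i in nearest) / k

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error, a common accuracy measure."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

# hypothetical historical projects
sizes  = [10, 25, 40, 60, 80]
effort = [12, 30, 55, 95, 130]
ols = ols_fit(sizes, effort)
test_sizes, test_effort = [30, 70], [40, 110]
print("OLS MMRE:", round(mmre(test_effort, [ols(x) for x in test_sizes]), 3))
print("KNN MMRE:", round(mmre(test_effort,
                              [knn_predict(sizes, effort, x)
                               for x in test_sizes]), 3))
```

A framework like the one proposed would sit in front of such models, selecting and transforming the predictor variables before fitting.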
32

An agile, reliable and minimal approach to preserving the quality of service of business-process-based applications during their evolution

Feugas, Alexandre 08 October 2014 (has links)
Current software is built using agile, iterative development methods, where each iteration can be seen as a software evolution that takes the users' new needs into account. In the Service-Oriented Architecture (SOA) world, software is designed by orchestrating services with business processes. Evolution then becomes a complex step, where a simple modification to a sub-part of a business process can have consequences on the entire system, for example slowing it down or degrading its Quality of Service (QoS). It is therefore necessary to preserve the QoS of software built from business processes by proposing evolution mechanisms that are agile, reliable, and minimal, so as not to degrade the software's performance. The contributions of this thesis are Blink, a development cycle centred on maintaining QoS during evolution, and Smile, a framework for maintaining QoS during the evolution of service-oriented software built from business processes. Smile relies on an analysis of the evolution that determines its effect on the software's QoS by establishing causal relations between variables, operations, services, and other parts of the system. By identifying the elements causally affected by the evolution and ruling out those that are not, our approach limits the number of elements to re-check, guaranteeing a reliable evolution step with a minimal re-verification step.
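The core pruning idea above, re-verify only what the change can causally reach, can be sketched as a reachability query over a dependency graph. The graph, the element names, and the variables-to-operations-to-services structure below are invented for illustration and are not Smile's actual model:

```python
# Minimal sketch: given causal dependencies between process elements,
# only elements reachable from the modified one need re-verification.

from collections import deque

def affected(deps, changed):
    """Return every element causally reachable from `changed` (inclusive)."""
    seen, queue = {changed}, deque([changed])
    while queue:
        node = queue.popleft()
        for succ in deps.get(node, ()):
            if succ not in seen:
                seen.add(succ)
                queue.append(succ)
    return seen

# hypothetical business process: variables feed operations feed services
deps = {
    "var_price": ["op_compute_total"],
    "op_compute_total": ["svc_billing"],
    "svc_billing": [],
    "var_address": ["svc_shipping"],
    "svc_shipping": [],
}
print(sorted(affected(deps, "var_price")))
# → ['op_compute_total', 'svc_billing', 'var_price']
# svc_shipping is causally unaffected, so it is excluded from re-checking.
```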
33

An interpretive field study of packaged software selection processes

Light, B. January 2003 (has links)
Packaged software is pre-built with the intention of licensing it to users in domestic settings and work organisations. This thesis focuses on the work organisation, where packaged software has been characterised as one of the latest 'solutions' to the problems of information systems. The study investigates the packaged software selection process, which has, to date, largely been viewed as objective and rational. In contrast, this interpretive study is based on a 2½-year field study of organisational experiences with packaged software selection at T.Co, a consultancy organisation based in the United Kingdom. An alternative theory of packaged software selection emerges from the iterative process of case study and action research. The research argues that packaged software selection is far from the rationalistic and linear process that previous studies suggest. Instead, the study finds that aspects of the traditional selection process (gathering requirements, evaluation, and selection based on 'best fit') may or may not take place. Furthermore, even where these aspects occur, they may not carry the weight or have the impact on implementation and usage that might be expected. This is due to the influence of the multiple realities originating from the organisational and market environments within which packages are created, selected, and used; the lack of homogeneity in organisational contexts; and the variously interpreted characteristics of the package in question.
34

Towards a practically extensible Event-B methodology

Maamria, Issam January 2013 (has links)
Formal modelling is increasingly recognised as an important step in the development of reliable computer software. Mathematics provides a solid theoretical foundation upon which it is possible to specify and implement complex software systems. Event-B is a formalism that uses typed set theory to model and reason about complex systems. Event-B and its associated toolset, Rodin, provide a methodology that can be incorporated into the development process of software and hardware. Refinement and mathematical proof are key features of Event-B that can be exploited to rigorously specify and reason about a variety of systems. Successful and usable formal methodologies must possess certain attributes in order to appeal to end-users; expressiveness and extensibility, among other qualities, are of major importance. In this thesis, we present techniques that enhance the extensibility of (1) the mathematical language of Event-B, in order to increase the expressiveness of the formalism, and (2) the proving infrastructure of the Rodin platform, in order to cope with an extensible mathematical language. This thesis makes important contributions towards a more extensible Event-B methodology. Firstly, we show how the mathematical language of Event-B can be made extensible in a way that does not hinder the consistency of the underlying formalism. Secondly, we describe an approach whereby the prover used for reasoning can be augmented with proof rules without compromising the soundness of the framework. The theory component is the placeholder for mathematical and proof extensions. The theoretical contribution of this thesis is the study of rewriting in the presence of partiality. Finally, from a practical viewpoint, proof obligations are used to ensure the soundness of user-contributed extensions.
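The idea of guarding rewrite rules against partiality can be sketched in miniature: a rule fires only when its well-definedness (WD) condition holds, mirroring how proof obligations guard user-contributed extensions. Everything below (the tuple encoding of terms, the example rule, its WD condition) is invented for illustration and is not Event-B or Rodin code:

```python
# Sketch of rewriting in the presence of partiality: a rule is applied
# only if both its pattern matches and its WD condition is satisfied.

def rewrite(expr, rules):
    """expr is a tuple tree, e.g. ('div', ('lit', 5), ('lit', 5))."""
    for wd, match, rhs in rules:
        if match(expr) and wd(expr):
            return rhs(expr)
    return expr  # no sound rule applies; leave the term unchanged

# example rule: ('div', x, x) -> ('lit', 1), well-defined only when x != 0
rules = [(
    lambda e: e[1] != ('lit', 0),               # WD condition: divisor nonzero
    lambda e: e[0] == 'div' and e[1] == e[2],   # pattern: x / x
    lambda e: ('lit', 1),                       # replacement: 1
)]

print(rewrite(('div', ('lit', 5), ('lit', 5)), rules))  # ('lit', 1)
print(rewrite(('div', ('lit', 0), ('lit', 0)), rules))  # unchanged: WD fails
```

In the actual methodology, the analogue of the WD lambda would be discharged as a proof obligation rather than evaluated at run time.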
35

Supporting the architectural design of acknowledged software-intensive Systems-of-Systems

Benites Gonçalves, Marcelo 12 October 2016 (has links)
Systems-of-Systems (SoS) are complex, large-scale, and often critical software-intensive systems whose importance has grown rapidly over this decade. This is the case, for instance, of SoS for energy grids, multimodal traffic control, and smart cities; SoS are found in application areas as diverse as the environment, transportation, energy, healthcare, aerospace, aeronautics, and the automotive industry. To master the complexity of such evolutionary software-intensive systems, it is fundamental to be able to design their software architectures so as to satisfy both functional and non-functional requirements. In particular, there is a need to model and enact systematic processes for designing SoS software architectures. Despite this growing necessity, most SoS software architectures are still developed in an ad hoc manner; in general, there is a lack of structured processes for architecting SoS. This state of the art hinders SoS development, particularly for critical applications. This thesis presents SOAR (General Process for Acknowledged SoS Software Architectures), which supports the modeling and enactment of architectural design processes for acknowledged SoS. Conceived to provide different levels of support according to different SoS development contexts, it comprises a high-level kernel that describes what must be done when architecting SoS, together with three practices, with specific activities and work products, that guide how to perform architectural analysis, synthesis, and evaluation. SOAR was implemented using Essence (Kernel and Language for Software Engineering Methods), an OMG/SEMAT standard. To validate SOAR, three surveys, a viability study, and an experiment were conducted. The results of these studies indicate that SOAR positively meets the expressed need.
36

Theoretical and practical tools for validating discrete and real-time systems

Qu, Hongyang January 2005 (has links)
System validation has been investigated for a long time. Testing is used to find errors inside a system; in contrast, model checking is used to verify whether a given property holds in the system. Both methods have their own advantages and interact with each other. This thesis focuses on four methodologies for model checking and testing, which are finally integrated into a practical validation tool set described in the thesis. Many techniques have been developed to manage the state space of a complicated system, but they still fail to reduce the state space of some large-scale concurrent systems. We propose using code annotation as a means of manually controlling the state space; this solution provides a trade-off between computability and exhaustiveness. When a suspicious execution is found, either by testing or by model checking, it can be difficult to repeat in a real environment because of nondeterministic choices in the system. We suggest enforcing a given execution by code transformation, and we extend our method from a single path to partial-order executions. In order to repeat at least one such execution, we need to provide appropriate values satisfying the path's initial precondition in its environment. Obtaining the precondition is easy in a discrete environment but difficult in a real-time one, especially for a partial order, since the computation then involves time constraints. We first present a real-time model, and then a methodology to compute the precondition on this model. When every action in the system is associated with a probability density function, it is possible to calculate the probability of the occurrence of a particular execution. We give a method to calculate this probability by integration over a group of independent continuous random variables, each of which corresponds to an action that is either executed, or enabled but not fired. The research described in this thesis provides some new ideas for applying formal methods in classical software development tools.
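The last idea above, computing the probability of a particular execution from the density functions of independent action delays, can be illustrated on the smallest possible case: two enabled actions with exponentially distributed delays, where the probability that A fires before B has the closed form rate_a / (rate_a + rate_b). The thesis computes such probabilities by symbolic integration; the Monte Carlo estimate below, with rates we chose arbitrarily, is only a simplified stand-in:

```python
# Monte Carlo sketch: probability that action A fires before action B
# when both delays are exponential random variables.

import random

def prob_a_first(rate_a, rate_b, trials=200_000, seed=1):
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    hits = sum(rng.expovariate(rate_a) < rng.expovariate(rate_b)
               for _ in range(trials))
    return hits / trials

estimate = prob_a_first(2.0, 1.0)  # closed form: 2 / (2 + 1) = 2/3
print(round(estimate, 2))
```

For longer executions the same principle applies: one integrates (or samples) over one independent variable per action that is executed, or enabled but not fired, exactly as the abstract describes.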
37

A formal island for qualifiable model transformations

Bach, Jean-Christophe 12 September 2014 (has links)
The software development process is composed of many steps involving an increasing number of tools. Development chains for critical systems (avionics, the medical field) rely on model-based code generation tools. This increase in complexity has consequences for the verification of critical software: since legal constraints require such software to be certified, the tools used during its development must be qualified. In this thesis, we propose to support the qualification process by providing methods and tools for reliable development. To do so, we present a hybrid model transformation method based on rewriting. We rely on the Tom language, which adds dedicated constructs to general-purpose languages. We also propose a traceability mechanism for these transformations in order to meet qualification requirements; the generated trace can then be used a posteriori for verification purposes.
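The combination described above, rewriting-based transformation plus a trace for later verification, can be sketched as follows. The rules, the flat list model, and the trace format are invented for illustration; Tom itself embeds this style of rewriting in Java with dedicated pattern-matching constructs:

```python
# Sketch: apply rewrite rules to model elements while recording a trace
# entry (rule name, source element, target element) for each application.

def transform(model, rules):
    """Apply the first matching rule to each element; log a trace entry."""
    result, trace = [], []
    for elem in model:
        for name, match, rhs in rules:
            if match(elem):
                new = rhs(elem)
                result.append(new)
                trace.append((name, elem, new))
                break
        else:
            result.append(elem)  # no rule applies; element kept as-is
    return result, trace

# hypothetical rules over a toy model of string-encoded elements
rules = [
    ("int-to-long", lambda e: e == "int", lambda e: "long"),
    ("inline-call", lambda e: e.startswith("call:"), lambda e: e[5:]),
]
model = ["int", "call:f", "float"]
out, trace = transform(model, rules)
print(out)    # ['long', 'f', 'float']
print(trace)  # [('int-to-long', 'int', 'long'), ('inline-call', 'call:f', 'f')]
```

A verifier can later replay the trace, checking that every target element is justified by a named rule, which is the role the generated trace plays in qualification.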
38

Articulation between formal and semi-formal activities in software development

Sayar, Imen 28 March 2019 (has links)
The development of correct formal specifications for systems and software begins with the analysis and understanding of client requirements. Between these requirements, described in natural language, and their specification, defined in a precise formal language, a gap exists that makes the development task increasingly difficult: we are facing two distinct worlds. This thesis aims to make explicit and establish the interactions between these two worlds and to make them evolve together. By interaction, we mean the links, exchanges, and activities taking place between the different documents. Among these activities, we present validation as a rigorous process that starts with requirements analysis and continues throughout the development of the formal specification. As development progresses, choices are made, and feedback from verification and validation tools can detect shortcomings in the requirements as well as in the specification. The evolution of the two worlds is described via the introduction of a new requirement into an existing system and through the application of development patterns. These patterns manage both the requirements and their associated formal specifications; they are elaborated from the description of the form of the requirements in the client document. They facilitate the development task and help avoid the risk of oversights. Whatever the choice, the proposed approach is guided by questions accompanying the evolution of the whole system, which make it possible to detect imperfections, omissions, or ambiguities in the existing artefacts.
39

Tools for evaluating software quality

Κόρδας, Αθανάσιος 12 June 2015 (has links)
This thesis surveys various commercial tools for software quality evaluation (both open source and paid). Trial analyses of large commercial programs were carried out, and the results were evaluated and compared. Finally, a static software analysis tool was developed as part of the work.
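As a hint at what a minimal static-analysis tool of the kind mentioned can look like (this sketch is ours, not the thesis's tool), the snippet below walks a Python syntax tree and computes a crude cyclomatic complexity, 1 plus the number of branch points, per function. The metric definition and the sample code are illustrative only:

```python
# Tiny static analyzer: per-function branch-point count via the ast module.

import ast

BRANCHES = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

def complexity(source):
    """Map each function name to 1 + its number of branch points."""
    tree = ast.parse(source)
    out = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            branches = sum(isinstance(n, BRANCHES) for n in ast.walk(node))
            out[node.name] = 1 + branches
    return out

sample = """
def abs_val(x):
    if x < 0:
        return -x
    return x
"""
print(complexity(sample))  # {'abs_val': 2}
```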
40

Web Engineering Resources Portal: design and implementation of the Web Engineering resources portal (WEP), with information storage and management in XML/XML Schema/XML StyleVision forms, use of a wiki, and advanced ways of navigating hierarchical content

Γκιζάς, Ανδρέας 25 October 2007 (has links)
Design and implementation of the Web Engineering Resources Portal (WEP): an XML-based system using XML Schema and XML StyleVision forms, a wiki platform, and advanced ways of navigating hierarchical content.
