251 |
Exploring the socio-technical impact of continuous integration: tools, practices, and humans. Elazhary, Omar M., 12 November 2021
Continuous software engineering is a rapidly growing discipline within software engineering. Among its many reported benefits are increased development velocity, faster feedback for developers, and better software quality. It also comes with its own share of challenges, most of which center on making automated builds more efficient or detecting problems with build configuration. However, the majority of the literature in this area does not take into account software developers, who are arguably the cornerstone of software development. Software development is still a human-driven endeavour: it is a developer who writes the code, tests it, makes the final decision while factoring in the build results, and so on. Furthermore, software development does not happen in a vacuum. Development takes place within the context of practices dictating how it should be done and perceived benefits that drive practice adoption and implementation. Software development, and by extension continuous software development, is a socio-technical endeavour featuring interactions between human aspects (developers, testers, etc.), technical aspects (automation), and environmental aspects (process, project-specific characteristics, infrastructure, etc.). While the software engineering field has its share of theories, frameworks, and models, or borrows them from other fields, we still do not have a human-centric framework for software engineering that takes into account the other socio-technical aspects (technical and environmental). My dissertation addresses this need for a socio-technical framework by presenting a series of studies that ultimately resulted in a socio-technical theory of continuous software engineering focused on phenomena involving both humans and automation.
In particular, I focus on the role of continuous software engineering tools (automation) in the software development process and how they displace existing tools, disrupt existing workflows, and feature in software developer decision making. This theory will enable further research in this area as well as allow researchers to make more grounded recommendations for industrial applications. / Graduate
|
252 |
Complying with the GDPR in the context of continuous integration. Li, Ze Shi, 08 April 2020
The full enforcement of the General Data Protection Regulation (GDPR), which began on May 25, 2018, forced any organization that collects and/or processes personal data from European Union citizens to comply with a series of stringent and comprehensive privacy regulations. Many software organizations struggled to comply with the entirety of the GDPR's regulations both leading up to and after the GDPR deadline. Previous studies on the GDPR have primarily focused on finding implications for users and organizations using surveys or interviews. However, there is a dearth of in-depth studies that investigate compliance practices and compliance challenges in software organizations. In particular, small and medium enterprises are often neglected in these previous studies, despite representing the majority of organizations in the EU. Furthermore, organizations that practice continuous integration have largely been ignored in studies on GDPR compliance. Using design science methodology, we conducted an in-depth study over the span of 20 months regarding GDPR compliance practices and challenges in collaboration with a small startup organization. Our first step helped identify our collaborator's business problems. Subsequently, we iteratively developed two artifacts to address those business problems: a set of privacy requirements operationalized from GDPR principles, and an automated GDPR tool that tests these GDPR-derived privacy requirements. This design science approach resulted in five implications for research and practice about ongoing challenges to compliance. For instance, our research reveals that GDPR regulations can be partially operationalized and tested through automated means, which is advantageous for achieving long-term compliance. By contrast, more research is needed to create more efficient and effective means to disseminate and manage GDPR knowledge among software developers. / Graduate
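The thesis does not publish its tool's code here, but the idea of operationalizing a GDPR principle as an automated, repeatable test can be sketched. The example below is a minimal, hypothetical illustration of one such privacy requirement (the right to erasure, Art. 17); the `UserStore` class and its API are invented for illustration and are not the authors' implementation.

```python
# Hypothetical sketch: one GDPR principle (right to erasure, Art. 17)
# operationalized as an automated check, in the spirit of the tool the
# thesis describes. UserStore and its API are invented for illustration.

class UserStore:
    """Toy data store holding personal data keyed by user id."""
    def __init__(self):
        self._records = {}

    def save(self, user_id, personal_data):
        self._records[user_id] = personal_data

    def erase(self, user_id):
        # A compliant erasure removes the record entirely,
        # not merely blanks its fields.
        self._records.pop(user_id, None)

    def holds_data_for(self, user_id):
        return user_id in self._records


def check_right_to_erasure(store):
    """Automated privacy-requirement test: after an erasure request,
    no personal data for that subject may remain."""
    store.save("alice", {"email": "alice@example.com"})
    store.erase("alice")
    return not store.holds_data_for("alice")
```

A check like this can run on every build, which is what makes automated means attractive for long-term compliance: the requirement is re-verified continuously rather than audited once.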
|
253 |
A systematic literature review and industrial survey addressing the possible impacts of continuous testing and delivery during DevOps transformation. Zaib, Shah; Lakshmisetty, Pavan Kumar, January 2021
Context: Digital transformation poses new challenges to organizations. Market needs and competition in business have changed the continuous testing and continuous delivery environment in software organizations. There is a great possibility of conflict between development and testing operations because they are separate entities in large organizations. Organizations where testers, developers, and operations teams do not communicate well and lack collaboration can see their productivity affected negatively, which leads to defects and errors early in the development process. The DevOps approach enhances integration, delivery, performance, and communication between developers, testers, and operations members. Organizations are reluctant to apply DevOps practices because there is a lack of agreement on DevOps characteristics. The most difficult part for a large organization is DevOps adaptation and implementation because of its legacy structure. Organizations need to understand DevOps implementation before they start transforming. Objectives: The thesis aims to identify the challenges organizations face on the way to continuous delivery and to provide a list of techniques and strategies to overcome continuous testing and DevOps challenges. It also identifies the communication challenges between continuous testing and delivery teams during the COVID-19 pandemic and the effect of software architecture on testing in a DevOps environment. Methods: To achieve the research goal, multiple research methods are applied. A systematic literature review (SLR) is conducted to identify the relevant literature. A survey is then conducted to verify the data from the SLR, using interviews as the data collection method and exploring the actual process of continuous testing and delivery in large DevOps companies.
Results: A list of challenges large organizations face on the way to continuous delivery is generated, along with a list of strategies and solutions for the challenges of continuous testing and DevOps. A list of post-COVID-19 communication challenges between testing and delivery groups in DevOps is created, as is a list of software architecture and production environment effects on testing. After analyzing the SLR results, a survey is conducted to validate them with software practitioners, and thematic analysis is performed on the results. Finally, the findings from the SLR and the survey are compared. Conclusions: This research's findings can help researchers, industry practitioners, and anyone who wants to further investigate the possible effects of continuous testing and delivery during DevOps transformation. We observed that industry practitioners could enhance their communication channels by reviewing the post-COVID-19 communication challenges between testing and delivery teams. We also observed that more research is required on this topic.
|
254 |
Intelligent and Context-Aware Information Filtering in Continuous Integration Pipeline using the Eiffel Protocol. Gustafsson, Robin, January 2021
Software development has become more complex, and certain parts of it are increasingly automated. Continuous integration practices with automated build and testing greatly benefit the development process. Combined with continuous deployment, the software can go directly from commit to deployment within hours or days, which means that every commit is a possible deployment. The ability to trace links between artifacts is known as software traceability, which has become a necessity and a requirement in the industry. Following these traces, answering questions based on them, and basing decisions on them is a complex problem. Tools already used in the industry are hard to adapt since every stakeholder has different needs. The Eiffel protocol aims to be as flexible and scalable as possible to fit as many stakeholder needs as necessary. This thesis provides an extension to Eiffel-store, an existing open-source application that visualizes events in the Eiffel protocol, adding functionality to filter events and answer some of the questions stakeholders might have.
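The kind of filtering described above can be sketched over Eiffel-style events. Eiffel events are JSON objects carrying a `meta` section (id, type) and `links` to other events; the sample events, the link type, and the two functions below are simplified stand-ins for illustration, not Eiffel-store's actual code.

```python
# Illustrative sketch of event filtering and link-following over
# Eiffel-style events. The sample events, link types, and functions are
# simplified stand-ins, not code from Eiffel-store itself.

events = [
    {"meta": {"id": "a1", "type": "EiffelArtifactCreatedEvent"}, "links": []},
    {"meta": {"id": "t1", "type": "EiffelTestCaseFinishedEvent"},
     "links": [{"type": "CAUSE", "target": "a1"}]},
]

def filter_events(events, wanted_type):
    """Keep only events of one type, e.g. all test-case verdicts."""
    return [e for e in events if e["meta"]["type"] == wanted_type]

def trace_links(events, event_id):
    """Follow an event's link targets to reconstruct its context."""
    by_id = {e["meta"]["id"]: e for e in events}
    return [by_id[link["target"]]
            for link in by_id[event_id].get("links", [])
            if link["target"] in by_id]
```

A stakeholder asking "which artifact did this test verdict concern?" would filter for the verdict event and then follow its links back to the artifact event, which is the traceability question the thesis sets out to answer.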
|
255 |
Create React App vs NextJS: A comparison of two ReactJS based web application frameworks. Johansson, Jens, January 2021
Web applications are built on many different web frameworks, and developers have a plethora of frameworks to choose from when developing a web application. Two popular web frameworks on the market are NextJS and Create React App (CRA). Each framework has its own advantages and disadvantages from different perspectives. The objective of this study is to review whether there are any differences between these two popular web frameworks from a continuous integration and continuous delivery perspective, looking closer at the differences in the development process when extending applications and the time it takes to build and deploy applications in the different frameworks.
To gain knowledge about the subject, a theoretical study of web-based sources was made, and an application was created in both frameworks and then extended with further tools to be able to perform the comparison. The study shows that the frameworks result in similar build and deployment times but differ in the configurations required when extending applications, with NextJS providing more straightforward configuration.
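A comparison of build and deployment times like the one above needs a repeatable timing harness. The sketch below is a minimal, assumed version of such a harness; the `npm run build` command is the typical build script for CRA and NextJS projects, and the project paths and repetition count are assumptions, not details from the study.

```python
# Minimal sketch of a build-timing harness for comparing frameworks.
# The commands, paths, and run counts are assumptions for illustration.
import subprocess
import time

def time_command(cmd, cwd, runs=3):
    """Run a shell command several times; return the mean wall-clock time."""
    durations = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, shell=True, cwd=cwd, check=True,
                       capture_output=True)
        durations.append(time.perf_counter() - start)
    return sum(durations) / len(durations)

# Example usage (assumed project layout):
# cra_time  = time_command("npm run build", cwd="./cra-app")
# next_time = time_command("npm run build", cwd="./next-app")
```

Averaging over several runs matters here because build times vary with caching and machine load, which would otherwise swamp the small differences the study is looking for.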
|
256 |
FROM CHAOS TO ORDER: A study on how data-driven development can help improve decision-making. Ilebode, Terry; Mukherjee, Annwesh, January 2019
The increasing amount of data available from software systems has given software development organizations a unique opportunity to make use of it in decision-making. Several types of data, such as bug reports, website interaction information, product usage extent, or test results, come into software-intensive companies, and there is a perceived lack of structure associated with this data. The data is mostly scattered and not in an organized form that can be utilized further. If analyzed effectively, it can be useful for many purposes, especially in decision-making, whether on the level of business or on the level of product execution. In this paper, through a literature review, an interview study, and a qualitative analysis, we categorize the different types of data that organizations collect nowadays. Based on the categorization, we order the different types of decisions that are generally taken in a software development process cycle. Combining the two, we create a model to explain a recommended process for handling the surge of data and making effective use of it. The model is a tool to help both practitioners and academics who want a clearer understanding of which type of data can best be used for which type of decision. An outline of how further research can be conducted in the area is also provided.
|
257 |
Traceability in continuous integration pipelines using the Eiffel protocol. Hramyka, Alena; Winqvist, Martin, January 2019
The current migration of companies towards continuous integration and delivery, as well as service-oriented business models, brings great benefits but also challenges. One challenge for a company striving to establish continuous practices is the need for pipeline traceability, which can greatly enhance continuous integration and delivery pipelines as well as offer a competitive edge. This exploratory case study looks at the current and desired states at Axis Communications, a global leader in network video solutions based in Lund, Sweden. It further evaluates the practical and organizational aspects of adopting the Eiffel protocol in the company's pipeline tools by developing a proof-of-concept Eiffel plugin for Artifactory. Based on the discovered technical and organizational needs and obstacles, it draws conclusions and makes recommendations on a possible strategy for introducing Eiffel in a company.
|
258 |
Exploring the Encounter of Continuous Deployment and the Financial Industry. FRIE, FELIX; HAMMARLUND, GUSTAV, January 2016
The digitisation of the financial markets has led to IT becoming a vital part of financial institutions. The principles and practices of Continuous Deployment (CD) are utilised at many IT companies to increase innovation through flexibility and swiftness. This thesis explores the encounter of CD and the financial industry through participant observations and semi-structured interviews with developers. We find in our study that practitioners in the financial industry use practices that are part of a CD process. The specialisation of the systems that is evident in the industry could be considered a barrier to the adoption of a CD process. However, the improved transparency that may come as a result of CD is well aligned with the demands that are evident in the industry. Furthermore, the requirement for code reviews might impact the ability to attain a continuous process, as reviews must be performed manually.
|
259 |
A Comparison of CI/CD Tools on Kubernetes. Johansson, William, January 2022
Kubernetes is a fast-emerging technological platform for developing and operating modern IT applications. The capacity to deploy new apps and change old ones at a faster rate with less chance of error is one of the key value propositions of the Kubernetes platform. A continuous integration and continuous deployment (CI/CD) pipeline is a crucial component of the technology. Such pipelines compile all updated code, run specific tests, and may then automatically deploy the produced code artifacts to a running system. There is a thriving ecosystem of CI/CD tools. Tools can be divided into two types: integrated and standalone. Integrated tools are utilized for both pipeline phases, CI and CD. Standalone tools are used for just one of the phases, which requires two independent programs to build up the pipeline. Some tools predate Kubernetes and may be adapted to operate on Kubernetes, while others are new and designed specifically for use with Kubernetes clusters. CD systems are classified as push-style (artifacts from outside the cluster are pushed into the cluster) or pull-style (a CD tool running inside the cluster pulls built artifacts into the cluster). Pull- and push-style pipelines have an impact on how cluster credentials are managed and whether they ever need to leave the cluster. This thesis investigates the deployment time, fault tolerance, and access security of pipelines. Using a simple microservices application, a testing setup is created to measure the metrics of the pipelines. Drone, Argo Workflows, ArgoCD, and GoCD are the tools compared in this study. These tools are coupled to form various pipelines. The pipeline using the Kubernetes-specific tools, Argo Workflows and ArgoCD, is the fastest; the pipeline with GoCD is somewhat slower, and the Drone pipeline is the slowest. The pipeline that used Argo Workflows and ArgoCD could also withstand failures.
The other pipelines, which used Drone and GoCD, were unable to recover and timed out. Pull pipelines handle Kubernetes access differently from push pipelines: the Kubernetes cluster credentials do not have to leave the cluster, whereas push pipelines need the cluster credentials in the external environment where the CD tool is running.
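The pull-style model can be illustrated with a toy reconciliation loop: an agent running inside the cluster compares the desired state (typically read from a git repository) with the live state and applies the difference, so cluster credentials never leave the cluster. All names below are invented for illustration; this is a conceptual sketch, not ArgoCD's implementation.

```python
# Toy model of pull-style CD reconciliation. An in-cluster agent diffs
# desired state (e.g. from git) against live state and applies the
# difference; nothing outside the cluster ever needs its credentials.
# All names are invented for illustration; this is not ArgoCD's code.

def reconcile(desired, live):
    """Return the operations an in-cluster agent would apply to make
    the live state (name -> version) match the desired state."""
    ops = []
    for name, version in desired.items():
        if live.get(name) != version:
            ops.append(("apply", name, version))
    for name in live:
        if name not in desired:
            ops.append(("delete", name))
    return ops
```

In a push pipeline, by contrast, this diff-and-apply step runs outside the cluster, which is exactly why the external CD tool must hold cluster credentials.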
|
260 |
Stetigkeit in der Statistik (Continuity in Statistics). Huschens, Stefan, 30 March 2017
Various continuity concepts that play a role in statistical theory and methodology are explained.
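One example of such a continuity concept, central to asymptotic statistics, is the continuous mapping theorem (chosen here as context; the abstract does not specify which concepts are treated):

```latex
% Continuous mapping theorem: if $X_n$ converges in distribution to $X$
% and $g$ is continuous on a set $C$ with $P(X \in C) = 1$, then
X_n \xrightarrow{\,d\,} X
\quad\Longrightarrow\quad
g(X_n) \xrightarrow{\,d\,} g(X).
```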
|