  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
181

Towards Understanding Software Craftsmanship

Sundelin, Anders January 2021 (has links)
The concept of software craftsmanship has roots in the earliest days of computing but has received comparatively little attention from the research community. As a reaction to how Agile methods were practiced and taught in industry, the Manifesto for Software Craftsmanship was formulated and published in 2009, drawing attention to the concept. Subsequent books and research papers have elaborated on it further. With this dissertation, we aim to study the software craftsmanship phenomenon using empirical software engineering methods. We developed an anatomy of software craftsmanship through a systematic literature study and a longitudinal case study, following a project consisting of multiple teams over several years. We also illustrate some consequences of not following through on the espoused craftsmanship practice of managing and accounting for technical debt. We find that some areas exhibited high growth in technical debt, while others remained comparatively untouched. This indicates that it is important to keep track of existing technical debt, but that repayment should consider how each kind of technical debt is distributed across the codebase. Our studies are empirical, using mixed methods and analyzing both quantitative and qualitative data. We used thematic coding to structure the qualitative data into themes, principles, and practices. We provide our systematically derived anatomy of the principles and practices of software craftsmanship and discuss how these relate to other principles within software engineering in general.
182

Similarity-Based Test Effort Reduction

Flemström, Daniel January 2017 (has links)
Embedded computer systems are all around us. We find them in everything, from dishwashers to cars and airplanes. They must always work correctly and, moreover, often within certain time constraints. The software of such a system can be very large and complex, e.g. in the case of a car or a train. Hence, we develop the software for embedded systems in smaller, manageable parts. These parts can be successively integrated, until they form the complete software for the embedded system, possibly at different levels. This phase of the development process is called the system integration phase and is one of the most critical phases in the development of embedded systems. In this phase, substantial effort is spent on testing activities. Studies have found that a considerable amount of test effort is wasteful due to people, unknowingly or by necessity, performing similar (or even overlapping) test activities. Consequently, test cases may end up being similar, partially or wholly. We identified such test similarities in a case study of 2500 test cases, written in natural language, from four different projects in the embedded vehicular domain. Such information can be used for reducing effort when maintaining or automating similar test cases. In another case study in the same domain, we investigated several approaches for prioritizing test cases to automate, with the objective of reducing manual test effort as quickly as possible, given that similar automated tests could be reused (similarity-based reuse). We analyzed how the automation order affects the test effort for four projects with a total of 3919 integration test cases, written in natural language. The results showed that similarity-based reuse of automated test case script code, together with the best-performing automation order, can reduce the expected manual test effort by 20 percentage points. Another way of reducing test effort is to reuse test artifacts from one level of integration to another, instead of duplicating them.
We studied such reuse methods, which we denote vertical reuse, in a systematic mapping study. While the results of our systematic mapping study showed the viability of vertical test reuse methods, our industrial case studies showed that keeping track of similarities and test overlaps is both possible and feasible for test effort reduction. We further conclude that the test case automation order affects the manual test execution effort when there exist similar steps that cannot be removed but can be reused with respect to test script code. / IMPRINT
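As a hedged sketch of how such step-level similarity between natural-language test cases could be detected (the studies' actual matching approach is not specified here; the test names, steps, and 0.6 threshold below are invented for illustration):

```python
# Illustrative only: token-based Jaccard similarity between test steps.
# The example data and threshold are assumptions, not taken from the studies.

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two test steps."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

def find_similar_steps(tests: dict[str, list[str]], threshold: float = 0.6):
    """Return pairs of (test name, step index) whose steps look similar."""
    flat = [(name, i, step) for name, steps in tests.items()
            for i, step in enumerate(steps)]
    pairs = []
    for x in range(len(flat)):
        for y in range(x + 1, len(flat)):
            n1, i1, s1 = flat[x]
            n2, i2, s2 = flat[y]
            # Only compare steps from different test cases.
            if n1 != n2 and jaccard(s1, s2) >= threshold:
                pairs.append(((n1, i1), (n2, i2)))
    return pairs
```

Pairs flagged this way could then feed a reuse or automation-ordering decision, which is where the reported 20-percentage-point effort reduction comes from.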
183

Drone Interactive Map : En webbapplikation för styrning av drönare / Drone Interactive Map : A web application for controlling drones

Arvidsson, Albin, Dahl, Marcus, Dyremark, Edvin, Franked, Andreas, Johansson, Liza, Larsson, Noah, Munoz, Alrik, Nyman, Oscar, Pilotti Wiger, Thomas, Purgal, Daniel January 2023 (has links)
This report describes the work of implementing Drone Interactive Map, carried out on behalf of RISE, Research Institutes of Sweden AB. The project's goal is to make it easier to get an overview of an unfolding event using camera-equipped drones that document it. The product can be used by, for example, rescue-service personnel to gain an overview of a forest fire or industrial fire. The project mainly concerns automatic control and dynamic reallocation of a swarm of drones operated through a web application. The result is a product with a user-friendly interface that scored 78.3 on average in usability tests using the SUS method. The product can divide an area into smaller sub-areas and then generates routes for individual drones in a drone swarm. The project work was carried out by a group of ten people in the course TDDD96 Kandidatprojekt i programvaruutveckling at Linköping University.
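The area decomposition and route generation described above could, in a deliberately simplified form that is not the project's actual algorithm, be sketched as equal vertical strips with back-and-forth (lawnmower) sweeps:

```python
# Hypothetical sketch of area partitioning for a drone swarm: one vertical
# strip per drone, each covered by alternating horizontal sweeps. The real
# project's decomposition and routing are more sophisticated.

def split_area(x0, y0, x1, y1, n_drones):
    """Divide the bounding box into n_drones equal vertical strips."""
    width = (x1 - x0) / n_drones
    return [(x0 + i * width, y0, x0 + (i + 1) * width, y1)
            for i in range(n_drones)]

def lawnmower_route(strip, n_passes=4):
    """Back-and-forth waypoints covering one strip with n_passes sweeps."""
    sx0, sy0, sx1, sy1 = strip
    step = (sy1 - sy0) / (n_passes - 1)
    route = []
    for i in range(n_passes):
        y = sy0 + i * step
        # Alternate sweep direction on every other pass.
        if i % 2 == 0:
            route += [(sx0, y), (sx1, y)]
        else:
            route += [(sx1, y), (sx0, y)]
    return route
```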
184

Optimizing ESRGAN for Mobile Deployment : Enhancing Image Super-Resolution on Android Devices

Fredin, Arvid January 2024 (has links)
This report presents the work that was carried out for a bachelor's thesis in computer science. The original task was to investigate how the deep learning architecture ESRGAN, used for super-resolution, can be compressed such that minimal accuracy is lost. The project resulted in the evaluation of three optimization methods: dynamic range, full integer, and float16 quantization. Dynamic range quantizes the weights of the neural network into 8 bits of precision, full integer quantizes all floating-point parameters, and float16 halves the floating-point precision.
The benchmarks were performed using two mobile devices: a Samsung Galaxy S9+ tablet and an S10+ Android phone. Measurements were conducted using the metrics inference time, PSNR, SSIM, and compression ratio. The results showed that Dynamic Range had a significantly slower inference time compared to Full Integer and Float16 quantization. Dynamic Range had a validation PSNR score of 27.0 and a testing PSNR score of 22.3. The resulting SSIM values were 0.81 for the validation dataset and 0.67 for the testing dataset. Full Integer ended up with PSNR scores of 26.3 and 21.9 for validation and testing, respectively. As for SSIM, Full Integer yielded the scores 0.77 (validation) and 0.64 (testing). Finally, Float16 generated PSNR scores of 27.1 and 22.3, and SSIM scores of 0.81 and 0.67. The PSNR and SSIM evaluations showed that the compressed models needed more calibration to achieve higher scores in these metrics, and consequently higher accuracy.
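As a rough, standalone illustration of the arithmetic these quantization modes rely on (the thesis used TensorFlow Lite's built-in converters, not this code), symmetric 8-bit weight quantization and the PSNR metric can be written as:

```python
# Hedged sketch: per-tensor symmetric int8 quantization of weights, plus
# PSNR to quantify reconstruction error. Example values are illustrative.
import math

def quantize_int8(weights):
    """Map floats to int8 values with a per-tensor symmetric scale."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return [v * scale for v in q]

def psnr(original, reconstructed, peak=1.0):
    """Peak signal-to-noise ratio in dB between two signals."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```

The larger the quantization error, the lower the PSNR, which is why the compressed models above trade some PSNR/SSIM for a smaller, faster model.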
185

Identifying and Managing Key Challenges in Architecting Software-Intensive Systems

Wallin, Peter January 2011 (has links)
In many traditional industry applications, such as automotive, process automation, and manufacturing automation, software plays a crucial role as an enabler for the introduction of new functionality and for retaining competitiveness. The system and software architecture plays an important part in ensuring the systems' qualities. However, the design of the architecture may be neglected during system development, while development efforts are centered on implementing new functionality. The architecture is supposed to support and enable key quality attributes such as safety, reliability, maintainability, and flexibility. This thesis identifies some of the key issues in architecting these software-intensive systems. In total, 21 issues have been identified; examples include (1) there is a lack of process for architecture development, (2) there is a lack of a method or model to evaluate business value when choosing an architecture, (3) there is a lack of a clear long-term architectural strategy, and (4) processes and methods are less valued than individuals' knowledge and competence. Through a series of workshops, root causes were identified for a selection of these issues. Based on these root causes, five success factors were identified: (1) define an architectural strategy, (2) implement a process for architectural work, (3) ensure authority for architects, (4) clarify the business impact of the architecture, and (5) optimize at the project portfolio level instead of optimizing each project. In an attempt to provide a possible solution to some of the issues, a method has been created to evaluate how new functionality can be successfully integrated into an existing architecture. The method is a combination of the Architecture Tradeoff Analysis Method (ATAM) and the Analytic Hierarchy Process (AHP). The method firstly supports a structured way of listing system goals, and secondly, it also supports design decision-making.
Since several issues relate to the organization and are affected by management, a comparison was made between the view of management and architects. This study revealed that one cause for the lack of focus on architecture could be that the existing performance measurement systems used by management all focus on the later phases of development when the architecture is already set. / CoSy
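The AHP component of the combined method above can be sketched in a few lines: a priority vector is derived from a reciprocal pairwise comparison matrix using the geometric-mean approximation. The comparison values below are invented for illustration; the thesis embeds this step in a more elaborate ATAM-based method.

```python
# Minimal AHP sketch (geometric-mean approximation of the priority
# eigenvector). Quality attributes and judgments are hypothetical.
import math

def ahp_priorities(matrix):
    """Approximate AHP priority vector from a reciprocal comparison matrix."""
    n = len(matrix)
    # Geometric mean of each row, then normalize to sum to 1.
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]
```

For example, if safety is judged 3x as important as maintainability and 5x as important as flexibility, the resulting weights rank safety first, which can then be traded off against architectural design decisions in the ATAM step.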
186

Digital Inlärningsportal: Ett mer intressant sätt att lära sig / Digital Educationportal: A more interesting way to learn

Sundqvist, Christoffer, Djuvfeldt, Joakim January 2019 (has links)
This essay describes the implementation of a web solution where gamification is applied. The purpose of the web solution is to create a more engaging platform focused on education and information sharing between the company's employees. The web solution lets administrators create articles that can be linked to quizzes and similar activities. The articles are shown on a news page available to everyone employed at the company. Each article is linked with a quiz or similar activity that the employees can complete. To create a feeling of gamification, the scores are stored and displayed on a high-score board that is available to everyone. The essay also describes the design choices made during the project, including the Application Programming Interface (API) layer, frameworks, database models, data exchange, user management, visual design, and security. The product that was created is presented both visually and functionally, and the results are compared to what was requested by the employer. The essay ends with the authors' reflections on what went right or wrong with the implementation and design, what they learned, and how the project can be developed further.
187

An Introduction to the DevOps Tool Related Challenges

Bheri, Sujeet, Vummenthala, SaiKeerthana January 2019 (has links)
Introduction: DevOps bridges the gap between development and operations by improving collaboration while automating as many steps as possible, from developing the software to releasing the product to the customers. To automate software development activities, DevOps relies on tools. There are many challenges associated with tool implementation, such as choosing suitable tools and integrating them with existing tools and practices. There must be a clear understanding of what kinds of tools are used by DevOps practitioners and what challenges each tool creates for them. Objectives: The main aim of our study is to investigate the challenges faced by DevOps practitioners related to tools and to compare the findings with the related literature. Our contributions are (i) a comprehensive set of tools used by developers and operators in the software industry; (ii) tool-related challenges faced by practitioners; and (iii) suggested recommendations, and their effectiveness, for mitigating those challenges. Methods: We adopted a case study to achieve our research objectives, with literature review and semi-structured interviews as our data collection methods. Results: In our study we identified seven tools used by developers and operators that were not reported in the literature, such as IntelliJ, Neo4j, and Postman. We identified tool-related challenges reported by practitioners, such as difficulty in choosing suitable tools, lack of maturity in tools such as Git, and the effort of learning new tools. We also identified recommendations for addressing tool-related challenges, such as tech talks and seminars, and using complementary tools to overcome the limitations of other tools. We also identified benefits related to the adoption of such recommendations.
Conclusion: We expect the DevOps tool landscape to change as old tools either become more sophisticated or outdated and new tools are developed to better support DevOps and integrate more easily with the deployment pipeline. With regard to tool-related challenges, both the literature review and the interviews show that a lack of knowledge on how to select appropriate tools, and the time it takes to learn DevOps practices, are common challenges. Regarding the suggested recommendations, the most feasible one appears to be seminars and knowledge-sharing events that educate practitioners on how to use tools better and how to identify suitable tools.
188

Digital besökslogg: Ett steg mot ett modernare kundbemötande / Digital Visitor Log: A Step Towards a Modernized Customer Encounter

Pettersson Strömsjö, Lucas January 2019 (has links)
This thesis describes the development of a solution intended to modernize the reception that guests receive at the client's office. The solution consists of an application running on a tablet, an application for administering its data, and a cloud-based database, with Entity Framework serving as a central component connecting these parts. When a guest arrives, the guest registers in the tablet application, and the employee who is to receive the guest gets a mobile notification alerting them to the guest's arrival. To identify the guest, a specially designed guest label is also printed, containing information entered in the tablet application.
189

Partitioned Scheduling of Real-Time Tasks on Multi-core Platforms

Nemati, Farhang January 2010 (has links)
In recent years multiprocessor architectures have become mainstream, and multi-core processors are found in products ranging from small portable cell phones to large computer servers. In parallel, research on real-time systems has mainly focused on traditional single-core processors. Hence, in order for real-time systems to fully leverage the extra capacity offered by new multi-core processors, new design techniques, scheduling approaches, and real-time analysis methods have to be developed. In the multi-core and multiprocessor domain there are mainly two scheduling approaches: global and partitioned scheduling. Under global scheduling each task can execute on any processor at any time, while under partitioned scheduling tasks are statically allocated to processors and migration of tasks among processors is not allowed. Besides the simplicity and efficiency of partitioned scheduling protocols, existing scheduling and synchronization methods developed for single-core processor platforms can more easily be extended to partitioned scheduling. This also simplifies the migration of existing systems to multi-cores. An important issue related to partitioned scheduling is the distribution of tasks among processors, which is a bin-packing problem. In this thesis we propose a partitioning framework for distributing tasks on the processors of multi-core platforms. Depending on the type of performance we desire to achieve, the framework may distribute a task set differently; for example, in an application in which tasks process huge amounts of data, the goal of the framework may be to decrease cache misses. Furthermore, we propose a blocking-aware partitioning heuristic algorithm to distribute tasks onto the processors of a multi-core architecture.
The objective of the proposed algorithm is to decrease the blocking overhead of tasks, which reduces the total utilization and has the potential to reduce the number of required processors. Finally, we have implemented a tool to facilitate the evaluation and comparison of different multiprocessor scheduling and synchronization approaches, as well as different partitioning heuristics. We have applied the tool in the evaluation of several partitioning heuristic algorithms, and the tool is flexible in that any new scheduling or synchronization protocol, as well as any new partitioning heuristic, can easily be added.
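As a hedged illustration of the bin-packing nature of the partitioning problem (not the thesis's blocking-aware algorithm), a plain first-fit decreasing heuristic assigning task utilizations to cores might look like:

```python
# Sketch only: first-fit decreasing task partitioning under a simple
# per-core utilization cap, ignoring blocking overhead entirely.

def partition_ffd(utilizations, n_cores, cap=1.0):
    """Assign task utilizations to cores, first-fit decreasing.

    Returns a list of task-index lists per core, or None if some task
    fits on no core under the cap.
    """
    order = sorted(range(len(utilizations)),
                   key=lambda i: utilizations[i], reverse=True)
    cores = [[] for _ in range(n_cores)]
    load = [0.0] * n_cores
    for i in order:
        for c in range(n_cores):
            if load[c] + utilizations[i] <= cap:
                cores[c].append(i)
                load[c] += utilizations[i]
                break
        else:
            return None  # task i fits on no core
    return cores
```

A blocking-aware variant, as proposed in the thesis, would additionally account for synchronization-induced blocking when computing each core's load.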
190

Efficiency of Different Encoding Schemes in Swarm Intelligence for Solving Discrete Assignment Problems: A Comparative Study

Pettersson, Richard January 2019 (has links)
Background: Solving problems classified as either NP-complete or NP-hard has long been an active topic in the research community and has brought about many new algorithms for approximating an optimal solution (essentially the best possible solution). A fundamental aspect to consider when developing such an algorithm is how to represent a given solution. Finding the right encoding scheme is key for any algorithm to function as efficiently as possible. That said, there appears to be a lack of research studies offering a comprehensive comparison of these encoding schemes. Objectives: This study sets out to provide an extensive comparative analysis of five existing encoding schemes for a population-based meta-heuristic algorithm, with a focus on two discrete combinatorial problems: the 0/1 knapsack problem and the task scheduling problem. The most popular of these schemes is also determined by reviewing the literature. Methods: The encoding schemes were implemented and incorporated into a recently proposed algorithm known as the Coyote Optimization Algorithm. Their differences in performance were then compared through several experiments. In addition, the popularity of said schemes was measured by their number of occurrences among a set of surveyed research studies (on the knapsack problem). Conclusions: Compared to the real-valued encoding scheme, we found that both qubits (the smallest unit in quantum computing) and complex numbers were more efficient for solving the 0/1 knapsack problem, due to their broader search space. Our chosen variant of the quantum-inspired encoding scheme contributed to a slightly better result than its complex-valued counterpart.
The binary and boolean encoding schemes worked well in conjunction with a repair function for the knapsack problem, to the extent that their produced solutions converged at a faster rate than the rest. Interestingly, the real-valued encoding scheme was by far the most popular choice of them all (as far as the knapsack problem is concerned), which we attribute to its generally simple and convenient implementation, and to the fact that it has been around longer. Finally, we saw that the matrix-based encoding scheme contributed to a faster convergence rate for approximate solutions to the task scheduling problem when the hardware for each resource differed greatly in computing capacity. On the other hand, the SPV (small position value) decoder for both the real-valued and complex-valued encoding schemes was more advantageous when the resources had nearly identical computing power, as it is more suitable for distributing tasks equally.
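The binary encoding plus repair function mentioned above can be sketched as follows; the greedy value-per-weight repair rule and the example data are assumptions for illustration, not the exact operator used in the study:

```python
# Illustrative sketch: a 0/1 knapsack candidate is a bit vector; a repair
# function greedily drops the least value-dense items until the capacity
# constraint holds, keeping every candidate in the feasible region.

def repair(bits, values, weights, capacity):
    """Drop items with the lowest value/weight ratio until feasible."""
    bits = list(bits)
    selected = [i for i, b in enumerate(bits) if b]
    # Worst ratio first, so the least efficient items are removed first.
    selected.sort(key=lambda i: values[i] / weights[i])
    total = sum(weights[i] for i in selected)
    for i in selected:
        if total <= capacity:
            break
        bits[i] = 0
        total -= weights[i]
    return bits

def fitness(bits, values, weights, capacity):
    """Total value of the selection, or 0 if it exceeds the capacity."""
    w = sum(wt for b, wt in zip(bits, weights) if b)
    return sum(v for b, v in zip(bits, values) if b) if w <= capacity else 0
```

Because repaired candidates are always feasible, the search never wastes evaluations on overweight solutions, which is consistent with the faster convergence observed for the binary and boolean schemes.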
