511 |
Observational Studies of Software Engineering Using Data from Software Repositories
Delorey, Daniel Pierce 06 March 2007 (has links) (PDF)
Data for empirical studies of software engineering can be difficult to obtain. Extrapolations from small controlled experiments to large development environments are tenuous, and observation tends to change the behavior of the subjects. In this thesis we propose the use of data gathered from software repositories in observational studies of software engineering. We present tools we have developed to extract data from CVS repositories and the SourceForge Research Archive. We use these tools to gather data from 9,999 Open Source projects. By analyzing these data we are able to provide insights into the structure of Open Source projects. For example, we find that the vast majority of the projects studied have never had more than three contributors and that the vast majority of authors studied have never contributed to more than one project. However, there are projects that have had up to 120 contributors in a single year and authors who have contributed to more than 20 projects, which raises interesting questions about team dynamics in the Open Source community. We also use these data to empirically test the belief that productivity, in terms of lines of code per programmer per year, is constant regardless of the programming language used. We find that yearly programmer productivity is not constant across programming languages; rather, developers using higher-level languages tend to write fewer lines of code per year than those using lower-level languages.
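The productivity comparison described in this abstract can be sketched as a simple aggregation over mined commit records. The commit-record schema and function names below are illustrative assumptions, not the thesis's actual tooling:

```python
from collections import defaultdict

def yearly_loc_per_author(commits):
    """Total lines added per (author, language, year) from commit records.

    Each commit is a dict like {"author": ..., "language": ..., "year": ...,
    "lines_added": ...} -- a simplified stand-in for data mined from CVS logs.
    """
    totals = defaultdict(int)
    for c in commits:
        totals[(c["author"], c["language"], c["year"])] += c["lines_added"]
    return dict(totals)

def mean_productivity_by_language(commits):
    """Average yearly lines of code per programmer, grouped by language."""
    totals = yearly_loc_per_author(commits)
    by_lang = defaultdict(list)
    for (_author, lang, _year), loc in totals.items():
        by_lang[lang].append(loc)
    return {lang: sum(v) / len(v) for lang, v in by_lang.items()}
```

Comparing the per-language averages produced this way is the kind of test the thesis applies to the constant-productivity belief.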
|
512 |
Prevalence of Reflexivity and Its Impact on Success in Open Source Software Development: An Empirical Study
Foushee, Brandon D. 23 April 2013 (has links) (PDF)
Conventional wisdom, inspired in part by Eric Raymond, suggests that open source developers primarily develop software for developers like themselves. In our studies we distinguish between reflexive software (software written primarily for other developers) and irreflexive software (software written primarily for passive users). In the first study, we present four criteria which we then use to assess project reflexivity in SourceForge. These criteria are based on three specific indicators: intended audience, relevant topics, and supported operating systems. Based on our criteria, we find that 68% of SourceForge projects are reflexive (in the sense described by Raymond). In the second study, we randomly sample and statistically estimate reflexivity within SourceForge. Our results support Raymond's assertions that 1) OSS projects tend to be reflexive and 2) reflexive OSS projects tend to be more successful than irreflexive projects. We also find a decrease in reflexivity from a high in 2001 to a low in 2011.
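A metadata-based reflexivity check of the kind the abstract describes might look like the sketch below. The field names and keyword lists are illustrative assumptions, not the study's actual criteria:

```python
# Hypothetical keyword sets standing in for the study's indicator values.
DEVELOPER_AUDIENCES = {"developers", "system administrators"}
DEVELOPER_TOPICS = {"software development", "libraries", "frameworks"}

def is_reflexive(project):
    """Flag a project as reflexive if its metadata targets other developers.

    `project` is assumed to carry SourceForge-style metadata lists under
    "intended_audience" and "topics".
    """
    audience_hit = bool(set(project.get("intended_audience", [])) & DEVELOPER_AUDIENCES)
    topic_hit = bool(set(project.get("topics", [])) & DEVELOPER_TOPICS)
    return audience_hit or topic_hit
```

Applying such a predicate across a project dump yields the kind of prevalence figure (68% here) the study reports.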
|
513 |
A Low-Cost, Compact Electrochemical Analyzer Based on an Open-Source Microcontroller
Addo, Michael 25 April 2023 (has links)
Electrochemical measurements are utilized in various fields, including healthcare (e.g., potentiometric measurements of electrolytes in blood and blood gas, amperometric biosensing of glucose as in blood glucose meters), water quality (e.g., pH measurement, voltammetric analyses for heavy metals), and energy. Much of the appeal of electrochemical analysis can be attributed to the relative simplicity, low cost, and minimal maintenance of electrochemical instruments, along with techniques that can exhibit high sensitivity and selectivity, wide linear dynamic range, and low limits of detection for many analytes. While commercial electrochemical analyzers are less expensive than many other instruments for chemical analysis and are available from various manufacturers, versatility and performance often come at added expense. Recently, the development of low-cost, adaptable, open-source chemical instruments, including electrochemical analyzers, has emerged as a topic of great interest in the scientific community. In contrast to commercial instruments, whose schematics and underlying operational details are often obscured (severely limiting modification and improvement), creators of open-source instruments release all the information necessary to reproduce the hardware and software. As a result, open-source instruments not only serve as excellent teaching tools for novices gaining experience in electronics and programming, but also present opportunities to design and develop low-cost, portable instruments, which have particular significance for point-of-care sensing applications, use in resource-limited settings, and the rapidly developing field of on-body sensors. In this work, we report the design of a low-cost, compact electrochemical analyzer based on an open-source Arduino microcontroller.
The instrument is capable of performing electrochemical analyses such as cyclic and linear sweep voltammetry with an operating range of ±138 μA and ±1.65 V. The performance of the platform is investigated with low-cost pencil graphite electrodes, and the results are compared to those of commercial potentiostats.
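The sweep techniques named above can be illustrated with a small waveform generator that honors the stated ±1.65 V range. The staircase approximation, step size, and function name are illustrative assumptions, not the instrument's firmware:

```python
def cyclic_sweep(v_start, v_vertex, step):
    """One cyclic-voltammetry staircase: v_start -> v_vertex -> v_start.

    Potentials are checked against the analyzer's stated +/-1.65 V range;
    a linear sweep is simply the forward half of this waveform.
    """
    if not (abs(v_start) <= 1.65 and abs(v_vertex) <= 1.65):
        raise ValueError("potential outside the +/-1.65 V operating range")
    n = round(abs(v_vertex - v_start) / step)
    direction = 1 if v_vertex >= v_start else -1
    forward = [v_start + direction * step * i for i in range(n + 1)]
    return forward + forward[-2::-1]  # retrace back to the start potential
```

In a real potentiostat each staircase level would be written to a DAC and the cell current sampled at each step.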
|
514 |
Integration of Digital Signal Processing Block in SymbiFlow FPGA Toolchain for Artix-7 Devices
Hartnett, Andrew T 28 October 2022 (has links)
The open-source community is a valuable resource for many hobbyists and researchers interested in collaborating and contributing towards publicly available tools. In the area of field programmable gate arrays (FPGAs) this is no exception. Contributors seek to reverse-engineer the functions of large proprietary FPGA devices. An interesting challenge for open-source FPGA engineers has been reverse-engineering the operation and bitstreams of digital signal processing (DSP) blocks located in FPGAs. SymbiFlow is an open-source FPGA toolchain designed as a free alternative to proprietary computer-aided design tools like Xilinx’s Vivado. For SymbiFlow, mapping logical multipliers to DSP blocks and generating DSP block bitstreams has been left unimplemented for the Artix-7 family of FPGAs. This research seeks to rectify this shortcoming by introducing DSP information for the place and route functions into SymbiFlow. By delving into the SymbiFlow architecture definitions and creating functioning FPGA assembly code (FASM) files for Project X-Ray, a bitstream generator for Artix-7, we have been able to determine the desired output of the open-source Versatile Place & Route tool that will generate a working DSP bitstream. We diagnose and implement changes needed throughout the SymbiFlow toolchain, allowing for DSP design bitstreams to be successfully generated with open-source tools.
|
515 |
Evaluation of Open-source Threat Intelligence Platforms Considering Developments in Cyber Security
Andrén, Love January 2024 (has links)
Background. With the increase in cyberattacks and cyber-related threats, it is of great concern that the field still lacks a sufficient number of practitioners. Open-source threat intelligence platforms are free platforms hosting information related to cyber threats. These platforms can act as a gateway for new practitioners and be of use in research at all levels. For this to be the case, they need to be up to date, have an active user base, and show a correlation to commercial companies and platforms. Objectives. In this research, data is gathered from a multitude of open-source threat intelligence platforms to determine whether they show increased usage and correlate with other sources. Furthermore, the research examines whether certain countries are overrepresented and whether the platforms are affected by real-world events. Methods. Platforms were gathered using articles and user-curated lists, then filtered based on whether their data could be used and whether they were free or partially free. The data was processed to include only unique entries reported after 2017 and was run through a tool to remove potential false positives. For IP addresses and domains, a WHOIS query was performed for each entry to obtain additional information. Results. There was a noticeable increase in the number of unique submissions for the CVE and IP address categories; the other categories showed no clear increase or decrease. The United States was the most represented country in the analysis of domains and IP addresses. The WannaCry ransomware had a notable effect on the platforms, with an increase in submissions during the month of the attack and after, and samples of the malware making up 7.03% of the yearly submissions. The Russian invasion of Ukraine did not show any effect on the platforms. Comparing the results to the annual Microsoft security reports, there was a clear correlation for some years and sources, while others showed none at all. This held for all applicable statistics: reported countries, noticeable trend increases, and most prominent malware. Conclusions. While some results showed an increase in cyberattacks and a correlation to real-world events, others did not. Open-source threat intelligence platforms often provide the necessary data, but problems arise when analyzing it. The results are extremely sensitive to the processing methods used, which can lead to varying outcomes.
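The post-2017 filtering and deduplication step described in the Methods section can be sketched as follows. The entry schema, field names, and cutoff convention are assumptions for illustration, not the thesis's pipeline:

```python
from datetime import date

def clean_indicators(entries, cutoff=date(2018, 1, 1)):
    """Keep unique indicators reported on or after the cutoff date.

    Each entry is assumed to look like {"indicator": "203.0.113.7",
    "type": "ip", "reported": date(...)} -- a simplified stand-in for
    feeds pulled from open-source threat intelligence platforms.
    """
    seen = set()
    kept = []
    for e in entries:
        key = (e["type"], e["indicator"])  # dedupe per indicator type
        if e["reported"] >= cutoff and key not in seen:
            seen.add(key)
            kept.append(e)
    return kept
```

In the study, the surviving IP addresses and domains would then be enriched via WHOIS lookups and screened for false positives.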
|
516 |
Assessing the utility of 3D modeling with photogrammetry in assigned sex estimation from the greater sciatic notch
Carrière, Chelsea Madison 15 February 2024 (has links)
Assigned sex estimation via the greater sciatic notch (GSN) is traditionally performed through physical/visual examination and ordinal scoring; however, this relies on subjective assessment of morphology for typological classification, which may not reflect human variation. Three-dimensional (3D) photogrammetry may offer a technologically advanced, low-cost, and more objective alternative for assessing the complex curvature of anatomical landmarks. This research explores the accuracy of photogrammetry-derived 3D models by comparing digital measurements to those obtained from the skeletal elements, and seeks to streamline the application of curvature analysis for the estimation of assigned sex from the GSN. This study utilizes the left and right os coxae of 15 skeletal individuals (5 females, 10 males) from the Boston University Chobanian & Avedisian School of Medicine. A Fujifilm X-Pro2 camera with a Fujifilm 35 mm prime lens captured 123 images per element, which were processed in Meshroom by AliceVision® to create a 3D textured mesh. The mesh was exported into Blender for cleanup, scaling, measurement, and curvature analysis. The measurements were between 96.54% and 99.94% consistent across methods and observations. The consistency between digital metric observations increased by an average of 0.07% compared to the consistency of the dry-bone measurements. Additionally, curvature analysis of the GSN correctly estimated the assigned sex of all os coxae in the sample. This study demonstrates that photogrammetry is an accurate and reliable method for the digitization of remains, enabling analytical techniques that better capture skeletal variation than traditional methods.
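One plausible reading of the consistency percentages reported above is relative agreement between a dry-bone measurement and its digital counterpart; the exact metric the thesis uses may differ, so the formula below is an assumption for illustration:

```python
def percent_consistency(reference, measured):
    """Percent agreement between a reference (dry-bone) measurement and a
    digital measurement: 100 * (1 - relative error).

    This is an assumed definition; the study's actual consistency statistic
    across observers and methods may be computed differently.
    """
    if reference <= 0:
        raise ValueError("reference measurement must be positive")
    return 100.0 * (1.0 - abs(measured - reference) / reference)
```

Under this definition, a 49.0 mm digital measurement against a 50.0 mm caliper measurement scores 98% consistent, in the spirit of the 96.54%-99.94% range reported.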
|
517 |
Theoretical methods for electron-mediated processes
Gayvert, James R. 01 February 2024 (links)
Electron-driven processes lie at the core of a large variety of physical, biological, and chemical phenomena. Despite their crucial roles in science and technology, detailed description of these processes remains a significant challenge, and there is a need for the development of accurate and efficient computational tools that enable predictive simulation. This work is focused on the development of novel software tools and methodologies aimed at two classes of electron-mediated processes: (i) electron-molecule scattering, and (ii) charge transfer in proteins.
The first major focus of this thesis is the electronic structure of autoionizing electronic resonances. The theoretical description of these metastable states is intractable by means of conventional quantum chemistry techniques, and specialized techniques are required to accurately describe their energies and lifetimes. In this work, we have utilized the complex absorbing potential (CAP) method, and describe three developments which have advanced the applicability, efficiency, and accessibility of the CAP methodology for molecular resonances: (1) implementation and investigation of the smooth Voronoi potential, (2) implementation of CAP in the projected scheme, and (3) development of the OpenCAP package, which extends the CAP methodology to popular electronic structure packages.
The second major focus is the identification of electron and hole transfer (ET) pathways in biomolecules. Both experimental and theoretical inquiries into electron/hole transfer processes in biomolecules generally require targeted approaches, which are complicated by the existence of numerous potential pathways. To this end, we have developed an open-source web platform, eMap, which exploits a coarse-grained model of the protein crystal structure to (1) enable pre-screening of potentially efficient ET pathways, and (2) identify shared pathways/motifs in families of proteins.
Following introductory chapters on motivation and theoretical background, we devote a chapter to each new methodology mentioned above. The open-source software tools discussed herein are under active development, and have been utilized in published work by several unaffiliated experimental and theoretical groups across the world. We conclude the dissertation with a summary and discussion of the outlook and future directions of the OpenCAP and eMap software packages.
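The pathway pre-screening idea behind eMap can be illustrated as shortest-path search over a weighted residue graph. The graph structure, residue names, and edge weights below are toy assumptions, not eMap's actual coarse-grained model:

```python
import heapq

def shortest_pathway(graph, source, target):
    """Dijkstra over a weighted residue graph; returns (cost, path).

    graph: {node: {neighbor: weight}}, where a weight might encode a
    distance or attenuation penalty between aromatic residues or cofactors
    (an illustrative stand-in for eMap's protein graph).
    """
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    done = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in done:
            continue
        done.add(node)
        if node == target:
            break
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    if target not in dist:
        return float("inf"), []
    path, n = [target], target
    while n != source:
        n = prev[n]
        path.append(n)
    return dist[target], path[::-1]
```

Ranking candidate source-to-acceptor paths by such a cost is one way to pre-screen potentially efficient ET routes before targeted calculation or mutagenesis.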
|
518 |
Three Essays on Economic Agents' Incentives and Decision Making
Lee, Dongryul 04 June 2009 (links)
This dissertation consists of three essays on theoretical analysis of economic agents' decision making and incentives. Chapter 1 gives an outline of the subjects to be examined in the subsequent chapters and shows their conclusions in brief.
Chapter 2 explores the decision problem of a superordinate (a principal) regarding whether to delegate its authority or right to make a decision to a subordinate (an agent) in an organization. We first study the optimal contracting problem of the superordinate that specifies the allocation of the authority and wage in a principal-agent setting with asymmetric information, focusing on two motives for delegation, "informative" and "effort-incentive-giving" delegation. Further, we suggest delegating to multiple agents as a way of addressing the asymmetric information problem within an organization, focusing on another motive for delegation, "strategic" delegation.
Chapter 3 analyzes the behavior of players in a particular type of contest, called "the weakest-link contest". Unlike a usual contest in which the winning probability of a group in a contest depends on the sum of the efforts of all the players in the group, the weakest-link contest follows a different rule: the winning probability of a group is determined by the lowest effort of the players in the group. We first investigate the effort incentives of the players in the weakest-link contest, and then check whether the hungriest player in each group, who has the largest willingness to exert effort, has an incentive to incentivize the other players in his group in order to make them exert more effort.
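The contrast between the usual rule and the weakest-link rule can be made concrete with a simple contest success function. The ratio-form (Tullock-style) functional form and tie-breaking convention here are illustrative assumptions, not necessarily the chapter's exact model:

```python
def group_win_probability(efforts_a, efforts_b, rule=min):
    """Win probability of group A under a ratio-form contest success function.

    The aggregation `rule` maps member efforts to the group's effective
    effort: `min` gives the weakest-link contest, `sum` the usual contest.
    """
    xa, xb = rule(efforts_a), rule(efforts_b)
    if xa + xb == 0:
        return 0.5  # assumed tie-breaking convention when no effort is exerted
    return xa / (xa + xb)
```

Note how, under the `min` rule, extra effort by the strongest member changes nothing unless the weakest member's effort rises too, which is exactly why the hungriest player may want to incentivize teammates.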
Chapter 4 examines the decision making of software programmers in the software industry between an open source software project and a commercial software project. Incorporating both intrinsic and extrinsic motivations for open source project participation into a stylized economic model based on utility theory, we study the programmers' decision problem and more clearly establish the rationale for open source project participation. Specifically, we examine how the programmers' intrinsic motivation, extrinsic motivation, and abilities affect their project choices between an open source project and a commercial project, as well as their effort incentives. / Ph. D.
|
519 |
Assessment of Open-Source Software for High-Performance Computing
Rapur, Gayatri 13 December 2003 (links)
High-quality software is a key component of many technology systems that are crucial to software producers, users, and society in general. Software application development today incorporates software from external sources to achieve implementation goals. Numerous methods, activities, and standards have been developed to realize quality software; nevertheless, the pursuit of new methods for realizing and assuring software quality is incessant, and researchers in software engineering seek methods that keep pace with changing technology. Assessment of open-source software can be supported by a methodology that uses data from prior releases of a software product to predict the quality of a future release. The proposed methodology is validated using a case study of MPICH, an open-source software product from the field of high-performance computing. A quantitative model and a module-order model have been developed that can predict which modules are expected to have code-churn and the amount of code-churn in each module. Code-churn is defined as the amount of update activity applied to a software product in order to fix bugs. Further validation of the proposed methodology on other software, and development of classification models for the quality factor code-churn, are recommended as future work.
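A module-order model, as described above, ranks modules by predicted churn so that quality-assurance effort can target the riskiest ones first. The naive "prior churn predicts next churn" stand-in and the file names below are illustrative assumptions, not the thesis's fitted quantitative model:

```python
def module_order(prior_churn):
    """Rank modules by predicted code-churn, most churn-prone first.

    Uses the naive predictor "next-release churn ~ prior-release churn";
    the study instead fits a quantitative model on repository metrics.
    """
    return sorted(prior_churn, key=prior_churn.get, reverse=True)

def top_k_hit_rate(predicted_order, actual_churn, k):
    """Fraction of the k truly most-churned modules caught in the top-k prediction."""
    actual_top = set(sorted(actual_churn, key=actual_churn.get, reverse=True)[:k])
    return len(actual_top & set(predicted_order[:k])) / k
```

Evaluating the predicted ordering against next-release churn with a hit-rate like this is one standard way to validate a module-order model.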
|
520 |
Community Revisited: Invoking the Subjectivity of the Online Learner
Rybas, Sergey 25 July 2008 (links)
No description available.
|