About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
131

Modeling Eye Movement for the Assessment of Programming Proficiency

Al Madi, Naser S. 26 July 2020 (has links)
No description available.
132

Towards Understanding and Securing the OSS Supply Chain

Vu Duc, Ly 14 March 2022 (has links)
Free and Open-Source Software (FOSS) has become an integral part of the software supply chain over the past decade. Various entities (automated tools and humans) are involved at different stages of that chain, and some of their actions may result in vulnerabilities or malicious code being injected into a published artifact distributed through a package repository. At the end of the chain, developers or end-users may consume artifacts that were altered in transit, whether benignly or maliciously. This dissertation starts from the first link in the software supply chain: developers. Many developers do not update their vulnerable software libraries, exposing the users of their code to security risks. To understand how developers choose, manage, and update the libraries, packages, and other Open-Source Software (OSS) that become the building blocks of the finished products companies ship to end-users, twenty-five semi-structured interviews were conducted with developers at both large and small-to-medium enterprises in nine countries. All interviews were transcribed, coded, and analyzed following applied thematic analysis. Although the interviews yielded many observations about how developers select dependencies for their projects, quantitative work was needed to check whether stated attitudes match actual behavior. Therefore, an extensive empirical analysis of twelve quality and popularity factors that should explain the adoption of PyPI packages was conducted using our tool py2src. At the far end of the software supply chain, software libraries (or packages) are usually downloaded directly from package registries via package dependency management systems, under the comfortable assumption that no discrepancies are introduced in the last mile between the source code and the corresponding packages. However, such discrepancies might be introduced by manual or automated build tools (e.g., metadata, Python bytecode files) or for malicious purposes (injected malicious code). To identify differences between the Python packages published on PyPI and the source code stored on GitHub, we developed a new approach called LastPyMile, which has shown promise for integration into current package dependency management systems or company workflows for vetting packages at minimal cost. Finally, with the ever-increasing number of software bugs and security vulnerabilities, the burden of secure software supply chain management on developers and project owners grows. Although automated program repair (APR) approaches promise to reduce the burden of bug fixing by suggesting likely correct patches for software bugs, little is known about the practical aspects of using APR tools, such as how long one should wait for a tool to generate a fix. To provide a realistic evaluation, five state-of-the-art APR tools were run on 221 bugs from 44 open-source Java projects within a realistic budget of developer time and effort.
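The "last mile" check at the heart of LastPyMile can be sketched in a few lines: hash every file served by the registry and flag anything that has no counterpart in the public repository. The sketch below illustrates that general idea only; it is not the dissertation's tool, and the directory names and bare-bones hash comparison are assumptions made for illustration (the published approach is more careful, e.g., about repository history and benign build artifacts).

```python
import hashlib
from pathlib import Path

def file_hashes(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*")
        if p.is_file()
    }

def last_mile_diff(package_dir: Path, repo_dir: Path) -> dict[str, list[str]]:
    """Illustrative sketch, not the dissertation's LastPyMile tool.
    Flags files in an unpacked package that are absent from the repository
    checkout, or whose content matches no file anywhere in it; those are
    the 'last mile' material worth reviewing (build artifacts, injected code).
    """
    pkg, repo = file_hashes(package_dir), file_hashes(repo_dir)
    repo_digests = set(repo.values())
    return {
        "only_in_package": sorted(f for f in pkg if f not in repo),
        "content_differs": sorted(
            f for f, h in pkg.items()
            if f in repo and h != repo[f] and h not in repo_digests
        ),
    }

report = last_mile_diff(Path("dist/pkg-1.0.0"), Path("repo-checkout"))
for kind, files in report.items():
    print(kind, "->", files[:10])
```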
133

An Approach to Design, Consolidation and Transformations of Database Schema Check Constraints Based on Platform Independent Models

Obrenović Nikola 10 October 2015 (has links)
The usage of platform-independent modelling and prototype generation in information systems development reduces development time and improves the quality of the process. The goal is for the development of all aspects of an information system to be supported by this approach, and this dissertation contributes towards that goal. The author presents algorithms for transforming check constraint models into executable code and for testing the consolidation of subschemas with a unified database schema with respect to check constraints.
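To make the idea of transforming a constraint model into executable code concrete, here is a toy sketch. It is not the dissertation's algorithm: the model shape and the generated SQL dialect are assumptions chosen purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class CheckConstraintModel:
    """A platform-independent description of a tuple (row) constraint."""
    name: str
    table: str
    predicate: str  # boolean expression over the table's columns

def to_sql(model: CheckConstraintModel) -> str:
    """Transform the platform-independent model into executable DDL
    (toy illustration; a real transformation targets each platform's dialect)."""
    return (
        f"ALTER TABLE {model.table} "
        f"ADD CONSTRAINT {model.name} CHECK ({model.predicate});"
    )

salary_rule = CheckConstraintModel(
    name="chk_salary_range",
    table="employee",
    predicate="salary >= min_salary AND salary <= max_salary",
)
print(to_sql(salary_rule))
```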
134

Capturing JUnit Behavior into Static Programs : Static Testing Framework

Siddiqui, Asher January 2010 (has links)
This research paper evaluates the benefits achievable from a static testing framework by analyzing and transforming the JUnit3.8 source code and executing the transformed code statically. A static structure makes it possible to analyze the code during the creation and execution of test cases. Work of this kind is well established in static analysis and testing research, and it is particularly valuable for anyone who wants to understand the reflective behavior of the JUnit3.8 Framework.

The JUnit3.8 Framework uses the Java Reflection API to invoke its core functionality (test case creation and execution) dynamically. The Java Reflection API lets developers access and modify the structure and behavior of a program, which gives a flexible way to create test cases and control their execution: reflection encapsulates the test cases in a single object representing the test suite and associates each test method with a test object. But while reflection is a powerful tool, it hinders static analysis: static analysis tools often cannot work effectively with reflective code.

To avoid reflection, the Static Testing Framework provides a static platform that analyzes the JUnit3.8 source code and transforms it into a non-reflective version emulating the dynamic behavior of JUnit3.8. The transformed source code replaces reflection with static code that does, in the execution environment of the Static Testing Framework, what reflection does in JUnit3.8; it also lets that environment run test methods statically. To measure efficiency, the implemented tool was evaluated on several Java projects and the resulting statistics were compared with JUnit3.8 results. The evaluation shows that STF can statically create and execute test cases up to JUnit3.8, provided test cases are not created within a test class and the real definition of constructors is not required. These limitations can be addressed in future work by introducing a middle layer that executes test fixtures for each test method and by generating test classes according to the real definition of constructors.
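The contrast the thesis draws, reflective discovery of test methods versus statically spelled-out calls, is easy to see in miniature. JUnit3.8 itself is Java; the sketch below re-creates the idea in Python, so the tiny test class and runner names are assumptions made for illustration.

```python
class CalculatorTest:
    """A miniature xUnit-style test class (Python stand-in for a JUnit TestCase)."""
    def test_add(self):
        assert 1 + 2 == 3
    def test_sub(self):
        assert 5 - 2 == 3

def run_reflective(test_cls) -> None:
    """What JUnit3.8-style frameworks do: discover test methods by name
    at runtime. A static analyzer cannot tell which methods get called."""
    for name in dir(test_cls):
        if name.startswith("test"):
            getattr(test_cls(), name)()  # resolved only at runtime
            print(f"reflective: {name} passed")

def run_static(test: CalculatorTest) -> None:
    """The transformed, non-reflective version: every call is spelled out,
    so a static analyzer sees the exact call graph."""
    test.test_add()
    print("static: test_add passed")
    test.test_sub()
    print("static: test_sub passed")

run_reflective(CalculatorTest)
run_static(CalculatorTest())
```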
135

Adoção do modelo aberto de desenvolvimento de software pelas empresas (Adoption of the Open Software Development Model by Companies)

Milan, Luiz Fernando Albertin Bono 26 February 2018 (has links)
In the last decades, the open software development model has gone from a pastime for programmers, to an enemy of technology companies, to, more recently, a strategy for those same companies. Under the philosophical paradigm of Critical Realism, this study aims to identify the mechanisms involved in companies' decisions to adopt the open software development model. Using a multiple case study, the data cover nine Brazilian technology companies. The qualitative material was analyzed with inductive and deductive techniques, seeking evidence to support a preliminary research model based on the literature and to identify new factors. With the use of citation mapping techniques, a broad and structured review of the literature on the open software development model traces the main path the literature has followed. From the empirical data, a mechanism of organizational practice change was identified, with mutual influence between the individual and organizational levels, along with two factors: the use of open-source software and the intention to protect the initial investment. A framework is provided that addresses the mechanisms involved in the adoption of this model by companies, an important step in advancing knowledge on the theme. The postures adopted by the companies in the sample are also a relevant contribution for managers in the Brazilian software development context. An important finding of the study is that, regardless of the risk, the adoption of the open software development model by companies seems to be more strongly linked to change in organizational practice than to other factors. The findings allow researchers not only to see how the literature on the theme has evolved over time but also to advance, in a structured way, the study of the open software development model at the level of the organization, a level of analysis that has received little attention. For managers, the findings support a reevaluation of their strategies in relation to software development.
137

Development of Instrumentation during Compilation

Ševčík, Václav January 2020 (has links)
This master's thesis focuses on instrumentation during the compilation process in the LLVM compiler. The tool can instrument memory accesses and functions. The instrumentation is realized by adding a new pass to LLVM's optimization phase. Information about variables is managed by the created framework, which is linked with the program. The overhead of the instrumentation increases the program's runtime by about 14 % with indirect addressing switched off and about 23 % with it switched on. The main benefit of the work is easy instrumentation of a program, which can monitor even the operation of local variables (through indirect addressing) and supports multithreaded programs. The framework is part of the Testos tool set, where it provides automatic instrumentation in the Spectra tool.
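A real LLVM pass is written in C++ against the pass-manager API; as a language-neutral illustration of what such a pass does, the sketch below shows a function before and after source-level instrumentation, with hypothetical callbacks standing in for the thesis's framework.

```python
# Source-level analogy of memory-access instrumentation: the "before"
# function is rewritten so every load and store first reports to a
# runtime framework. The callbacks are hypothetical; a real pass
# rewrites LLVM IR, not Python source.

TRACE = []

def on_load(name: str, value) -> None:
    TRACE.append(("load", name, value))

def on_store(name: str, value) -> None:
    TRACE.append(("store", name, value))

def swap(a: list, i: int, j: int) -> None:
    """Original code: two loads and two stores, invisible to any monitor."""
    a[i], a[j] = a[j], a[i]

def swap_instrumented(a: list, i: int, j: int) -> None:
    """The same code after instrumentation: each memory access reports
    to the runtime framework as it happens."""
    x = a[i]; on_load(f"a[{i}]", x)    # load  a[i]
    y = a[j]; on_load(f"a[{j}]", y)    # load  a[j]
    a[i] = y; on_store(f"a[{i}]", y)   # store a[i]
    a[j] = x; on_store(f"a[{j}]", x)   # store a[j]

data = [3, 1]
swap_instrumented(data, 0, 1)
print(data)   # [1, 3]
print(TRACE)  # every access in order: the information the framework manages
```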
138

Small CNC machine

Moštěk, Jiří January 2014 (has links)
This diploma thesis deals with the design and construction of a three-axis CNC machine intended primarily for PCB drilling and for producing front panels for various electronic devices. All three axes are driven by NEMA 23 stepper motors connected to L6470 stepper motor drivers. An STM32F407 processor controls the whole machine. The wiring is completed by an LCD display with a touchscreen, which is used to communicate with the user. Data for drilling can be entered manually or via a USB interface. The thesis covers the selection of a suitable construction and components, the assembly of the equipment, the design of the electronic circuits, and the code that controls the machine. Finally, the parameters of the designed device were measured.
139

Minimal Representation of the Božetěchova Complex

Král, Tomáš Unknown Date (has links)
The document describes the development of a graphical application with a limited executable size. It presents suitable techniques for compressing a polygonal mesh. The second part focuses on the practical use of these techniques to build the scene in a 3D modeling environment and describes how to transfer this model into the executable file. The final chapters address optimizations of source code compilation and the compression of executables.
140

Information-Theoretic aspects of quantum key distribution

Van Assche, Gilles 26 April 2005 (has links)
Quantum key distribution is a cryptographic technique that allows two parties to exchange secret keys whose confidentiality is guaranteed by the laws of quantum mechanics, exploiting the peculiar behavior of elementary particles. In quantum mechanics, any measurement of the state of a particle irreversibly modifies this state. By taking advantage of this property, two parties, often called Alice and Bob, can encode a secret key into quantum information carriers such as single photons. Any attempt at eavesdropping requires the spy, Eve, to measure the state of the photon carrying a key bit and thus to perturb that state. Alice and Bob can then detect Eve's presence by an unusually high number of transmission errors.

The information exchanged by quantum key distribution is not directly usable but must first be processed. Transmission errors, whether caused by an eavesdropper or simply by noise in the transmission channel, must be corrected with a technique called reconciliation. Then, the partial knowledge of an eavesdropper, who may have perturbed only a fraction of the carriers, must be wiped out of the final key with a technique called privacy amplification.

The context of this thesis is quantum key distribution with continuous states of light as carriers. An important part of the work deals with the processing of the continuous information exchanged by a particular protocol in which the carriers are coherent states of light. The continuous nature of this information requires changes to the reconciliation techniques, which have mostly been developed to process binary information. We propose a technique called sliced error correction, which can process continuous information efficiently. The set of developed techniques was used, in collaboration with the Institut d'Optique, Orsay, France, to set up the first experiment of quantum key distribution with continuously modulated coherent states of light.

Other important aspects are also treated in this thesis, such as placing quantum key distribution in the context of a cryptosystem, the specification of a complete protocol, the creation of new, faster privacy amplification techniques, and the theoretical and practical study of alternative reconciliation algorithms.

Finally, we study the security of the coherent-state protocol by establishing its equivalence with an entanglement purification protocol. Without going into the details, this formal equivalence validates the robustness of the protocol against any kind of eavesdropping, even the most intricate allowed by the laws of quantum mechanics. In particular, we generalize the sliced error correction algorithm so as to transform it into a purification protocol, and we thus establish a quantum key distribution protocol secure against any eavesdropping strategy.
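For context, the quantities that reconciliation and privacy amplification trade off can be stated compactly. The formulas below are standard textbook forms, not expressions taken from the thesis: beta is the reconciliation efficiency, I(A;B) the Alice-Bob mutual information, and chi(E;B) a bound on Eve's information about the reconciled data.

```latex
% Standard secret-key rate for one-way post-processing with
% reconciliation efficiency \beta (textbook form, assumed here):
\[
  K \;=\; \beta\, I(A;B) \;-\; \chi(E;B)
\]
% Sliced error correction replaces the continuous variable X by m binary
% slices, each of which can then be reconciled with an ordinary binary
% error-correcting protocol:
\[
  X \;\longmapsto\; \bigl(S_1(X), \dots, S_m(X)\bigr),
  \qquad S_i(X) \in \{0, 1\},
\]
% with the information disclosed over the m binary rounds approaching the
% optimum H(S_1, \dots, S_m \mid X') as the slice codes approach capacity.
```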
