201

Closure: Transforming Source Code for Faster Fuzzing

Paterson, Ian G. 27 May 2022 (has links)
Fuzzing, the method of generating inputs to run on a target program while monitoring its execution, is a widely adopted and pragmatic methodology for bug hunting as a means of software hardening. Technical improvements in throughput have repeatedly proven critical to increasing the rate at which new bugs can be discovered. Persistent fuzzing, which keeps the fuzz target alive by looping, provides increased throughput at the cost of manually developing harnesses that account for invalid states and for coverage of the program's code base, while relying on forking to reset the state accrued by looping over the same piece of code multiple times. Stale state can lead to wasted fuzzing effort, as certain areas of code may be conditionally ignored because of a stale global. I propose Closure, a toolset which enables programs to run at persistent speeds while avoiding the downsides of stale state and other bottlenecks associated with persistent fuzzing. / Master of Science / The process of testing programs to find bugs is becoming increasingly automated. A current method called "fuzzing" is a widely adopted means of finding bugs and is required in the program-development life cycle by major companies and the US Government. I examine current improvements in fuzzing and, with my tool Closure, extend the cutting-edge method of persistent fuzzing to a wider array of applications. With Closure, fuzzing practitioners can achieve faster fuzzing performance with less manual effort.
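The stale-state problem described above can be illustrated with a minimal sketch. This is not Closure itself: the target, the global flag, and the reset policy are hypothetical, and a real persistent fuzzer would feed coverage-guided inputs rather than random choices. The point is that a persistent-style loop reusing one process lets a global accrue state across iterations, so a branch guarded by that global quietly stops being exercised unless the harness restores the state the way a fork would.

```python
import random
from collections import Counter

# Hypothetical fuzz target (not from the thesis): a parser caching a flag in a global.
seen_magic = False  # state that accrues across persistent-loop iterations

def parse(data: bytes) -> str:
    global seen_magic
    if data.startswith(b"MAGC"):
        seen_magic = True
    # Once seen_magic goes stale (stays True), every later input takes the fast path,
    # so the "slow path" branch below is conditionally ignored for the rest of the run.
    return "fast path" if seen_magic else "slow path"

def persistent_loop(iterations: int, reset_state: bool) -> Counter:
    """Persistent-style harness: one process, many inputs, no fork per execution."""
    global seen_magic
    hits = Counter()
    for _ in range(iterations):
        if reset_state:
            seen_magic = False  # what forking (or an explicit reset) would restore
        data = random.choice([b"MAGC....", b"plain input"])
        hits[parse(data)] += 1
    return hits

if __name__ == "__main__":
    random.seed(0)
    print("stale globals:", persistent_loop(1000, reset_state=False))
    print("reset globals:", persistent_loop(1000, reset_state=True))
```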
202

Studying 3D Spherical Shell Convection using ASPECT

Euen, Grant Thomas 08 January 2018 (has links)
ASPECT is a new convection code that uses more modern and advanced solver methods than legacy geodynamics codes. I use ASPECT to calculate 2-dimensional Cartesian as well as 2- and 3-dimensional spherical-shell convection cases. All cases use the Boussinesq approximation. The 2D cases come from Blankenbach et al. (1989), van Keken et al. (1997), and Davies et al. (in preparation). Results for the 2D cases agree well with their respective benchmark papers; the time evolutions of the root mean square velocity (Vrms) and the Nusselt number agree, often to within 1%. The 3D cases come from Zhong et al. (2008). Modifications were made to the simple.cc and harmonic_perturbation.cc files in the ASPECT code in order to reproduce the initial conditions and the temperature dependence of the rheology used in the benchmark. Cases are compared between CitcomS and ASPECT at different levels of grid spacing, as well as between uniform grid spacing and ASPECT's default grid spacing, which refines toward the center. Results for Vrms, average temperature, and the Nusselt numbers at the top and bottom of the shell range from better than 1% agreement between CitcomS and ASPECT for cases with tetragonal planforms and a Rayleigh number of 7000, to as much as 44% difference for cases with cubic planforms and a Rayleigh number of 10^5. For all benchmarks, the top Nusselt number from ASPECT is farthest from the reported benchmark values. The 3D planform and radially averaged quantity plots agree. I present these results, as well as recommendations and possible fixes for the discrepancies in the results, specifically in the Nusselt numbers, Vrms, and average temperature. / Master of Science / Mantle convection is the primary process by which heat is transferred from the interior of Earth to its exterior. It involves the physical movement of material in the mantle: hot material rises toward the surface and cools, while cold material sinks to the base and warms. This transfer of heat and energy is also the driving force behind plate tectonics, the process by which the surface of the Earth moves and changes with time. Plate tectonics is responsible for the formation of oceans, mountains, volcanoes, and trenches, to name a few. Understanding the behavior of the mantle as it convects is crucial to understanding how the Earth, and planetary bodies like it, develop over time. In this work, I use the new modeling code ASPECT (Advanced Solver for Problems in Earth's ConvecTion) to test various models in 2 and 3 dimensions. This is done to compare the results calculated by ASPECT with those of older, legacy codes, for the purpose of benchmarking and growing ASPECT. Insight is also gained into the large-scale factors that influence mantle convection and planetary development. My results show good agreement between results calculated by ASPECT and those of legacy codes, though there is some discrepancy in some values. The main values I present here are Vrms (the root mean square velocity), the average temperature, and the Nusselt number calculated at both the top and base of the models. In this work, I present these results and potential solutions to the discrepancies encountered.
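As a rough illustration of the quantities compared above, the following sketch computes the root mean square velocity, Vrms = sqrt(<|u|^2>), and the percent difference between two codes. It assumes velocity fields have already been exported from ASPECT and CitcomS onto matching grids; the arrays here are synthetic placeholders, not benchmark data.

```python
import numpy as np

def v_rms(velocity, weights=None):
    """Root mean square velocity, Vrms = sqrt(<|u|^2>), with optional cell-volume weights
    (needed when the mesh refines toward the center, as ASPECT's default grid does)."""
    speed_sq = np.sum(np.asarray(velocity) ** 2, axis=-1)  # |u|^2 at each point / cell
    return float(np.sqrt(np.average(speed_sq, weights=weights)))

def percent_difference(value, reference):
    """Difference between two codes, in percent of the reference value."""
    return 100.0 * abs(value - reference) / abs(reference)

# Synthetic stand-ins for velocity fields exported from the two codes on matching grids.
rng = np.random.default_rng(0)
u_aspect = rng.normal(size=(5000, 3))                          # ASPECT velocity vectors
u_citcoms = u_aspect + rng.normal(scale=0.01, size=(5000, 3))  # slightly different CitcomS field

v_a, v_c = v_rms(u_aspect), v_rms(u_citcoms)
print(f"Vrms (ASPECT)  = {v_a:.4f}")
print(f"Vrms (CitcomS) = {v_c:.4f}")
print(f"difference     = {percent_difference(v_a, v_c):.2f} %")
```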
203

Systema

Merkel, Evan Andrew 27 February 2018 (has links)
This thesis is a three-part creative coding exploration of generative typography and pixel-based image manipulation. Systema is composed of three distinct projects named Lyra, Mensa, and Vela, respectively, that investigate and demonstrate the advantages and drawbacks of generative graphic design. / Master of Fine Arts
204

  • Analyse de discours sur l'éthique en publicité / Discourse analysis of ethics in advertising

Bergeron, Caroline January 1993 (has links)
Master's thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
205

Semi-automatic code-to-code transformer for Java : Transformation of library calls / Halvautomatisk kodöversättare för Java : Transformation av biblioteksanrop

Boije, Niklas, Borg, Kristoffer January 2016 (has links)
Having the ability to perform large automatic software changes in a code base opens up new possibilities for software restructuring and cost savings. The possibility of replacing software libraries in a semi-automatic way has been studied. String metrics are used to find equivalents between two libraries by looking at class and method names. Rules based on these equivalents are then used to describe how the transformation is applied to the code base. Using the abstract syntax tree, locations for replacements are found and the transformations are performed. After the transformations have been performed, the effort saved by doing the replacement automatically rather than manually is evaluated; it shows that a large part of the cost can be saved. An additional evaluation, calculating the maintenance cost saved annually by changing libraries, is also performed to support the claim that such an exchange can reduce the annual cost of the project.
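As a sketch of the name-matching step described above, the snippet below scores class and method names with difflib's sequence-similarity ratio as a stand-in for whatever string metric the thesis actually uses; the library APIs and the threshold are invented for illustration.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """String metric on identifiers in [0, 1]; the thesis may use a different metric."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_equivalents(old_api, new_api, threshold=0.5):
    """Pair each old-library identifier with its best-scoring candidate in the new library.
    Pairs above the (illustrative) threshold become the rules that drive the AST rewrite."""
    rules = {}
    for old_name in old_api:
        score, best = max((similarity(old_name, cand), cand) for cand in new_api)
        if score >= threshold:
            rules[old_name] = (best, score)
    return rules

# Invented class.method names, purely for illustration:
old_api = ["JsonParser.parseString", "JsonParser.parseFile", "JsonWriter.writeString"]
new_api = ["Parser.fromString", "Parser.fromFile", "Writer.toString", "Writer.toFile"]

for old, (new, score) in find_equivalents(old_api, new_api).items():
    print(f"{old:26s} -> {new:18s} (score {score:.2f})")
```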
206

Exploiting abstract syntax trees to locate software defects

Shippey, Thomas Joshua January 2015 (has links)
Context. Software defect prediction aims to reduce the large costs involved with faults in a software system. A wide range of traditional software metrics have been evaluated as potential defect indicators. These traditional metrics are derived from the source code or from the software development process. Studies have shown that no metric clearly outperforms another, and identifying defect-prone code using traditional metrics has reached a performance ceiling. Less traditional metrics have been studied, derived from the natural language of the source code. These newer, less traditional, and finer grained metrics have shown promise within defect prediction. Aims. The aim of this dissertation is to study the relationship between short Java constructs and the faultiness of source code. To study this relationship, this dissertation introduces the concepts of a Java sequence and a Java code snippet. Sequences are created using the Java abstract syntax tree: the ordering of the nodes within the abstract syntax tree creates the sequences, while small subsequences of each sequence are the code snippets. The dissertation tries to find a relationship between the code snippets and faulty and non-faulty code. This dissertation also looks at the evolution of the code snippets as a system matures, to discover whether the code snippets significantly associated with faulty code change over time. Methods. To achieve the aims of the dissertation, two main techniques have been developed: finding defective code, and extracting Java sequences and code snippets. Finding defective code has been split into two areas: finding the defect fix points and the defect insertion points. To find the defect fix points, an implementation of the bug-linking algorithm, called S+e, has been developed. Two algorithms were developed to extract the sequences and the code snippets. The code snippets are analysed using the binomial test to find which ones are significantly associated with faulty and non-faulty code. These techniques have been applied to five different Java datasets: ArgoUML, AspectJ, and three releases of Eclipse.JDT.core. Results. There are significant associations between some code snippets and faulty code. Frequently occurring fault-prone code snippets include those associated with identifiers, method calls, and variables. Some code snippets significantly associated with faults are always in faulty code. There are 201 code snippets significantly associated with faults across all five of the systems. The technique is unable to find any significant associations between code snippets and non-faulty code. The relationship between code snippets and faults seems to change as the system evolves, with more snippets becoming fault-prone as Eclipse.JDT.core evolved over the three releases analysed. Conclusions. This dissertation has introduced the concept of code snippets into software engineering and defect prediction. The use of code snippets offers a promising approach to identifying potentially defective code. Unlike previous approaches, code snippets are based on a comprehensive analysis of low-level code features and potentially allow the full set of code defects to be identified. Initial research into the relationship between code snippets and faults has shown that some code constructs or features are significantly related to software faults.
The significant associations between code snippets and faults have provided additional empirical evidence for some already-researched bad constructs within defect prediction. The code snippets have shown that some constructs significantly associated with faults are located in all five systems, and although this set is small, finding any defect indicators that transfer successfully from one system to another is rare.
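The dissertation extracts its sequences and snippets from Java abstract syntax trees; as a language-shifted illustration only, the sketch below does the analogous thing with Python's own ast module. It turns a source file into a depth-first sequence of node-type names, slices that sequence into fixed-length snippets, and applies a one-sided binomial test to ask whether a snippet occurs in faulty files more often than the overall fault rate predicts. The corpus, snippet length, and cut-off are toy values, not those used in the dissertation.

```python
import ast
from math import comb

def node_sequence(source: str):
    """Depth-first sequence of AST node-type names for one source file."""
    def visit(node):
        yield type(node).__name__
        for child in ast.iter_child_nodes(node):
            yield from visit(child)
    return list(visit(ast.parse(source)))

def snippets(sequence, n=3):
    """Fixed-length sub-sequences ('code snippets') of the node sequence."""
    return {tuple(sequence[i:i + n]) for i in range(len(sequence) - n + 1)}

def binomial_p_value(k: int, n: int, p: float) -> float:
    """One-sided P(X >= k) for X ~ Binomial(n, p): does a snippet sit in faulty
    files more often than the base fault rate p predicts?"""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Tiny invented corpus of (source, is_faulty) pairs standing in for mined repository data.
corpus = [
    ("def f(x):\n    return x.get()\n", True),
    ("def g(y):\n    return y + 1\n", False),
    ("def h(z):\n    z.get()\n    return None\n", True),
]

base_rate = sum(faulty for _, faulty in corpus) / len(corpus)
all_snippets = set().union(*(snippets(node_sequence(src)) for src, _ in corpus))
for snip in sorted(all_snippets):
    in_files = [faulty for src, faulty in corpus if snip in snippets(node_sequence(src))]
    p = binomial_p_value(sum(in_files), len(in_files), base_rate)
    if p < 0.5:  # illustrative cut-off; the dissertation uses a proper significance level
        print(f"{' -> '.join(snip):45s} p = {p:.3f}")
```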
207

L'influence du modèle français sur les codifications congolaises : cas du droit des personnes et de la famille / The influence of the French model on Congolese codifications: the case of the law of persons and the family

Bokolombe, Bokina 14 December 2013 (has links)
The French Civil Code has exercised considerable influence on Congolese civil codification. In 1895, through colonization, the Belgians imported into the Congo the Napoleonic Code that they had themselves inherited from the conquests of the French Emperor. The Congolese legal system, which had formerly been based on unwritten customary law made up of multiple local customs and mores, was thereby given a rationalized Code modeled on the French one. After independence, the Congolese political authorities wanted to replace the colonial Code, which was not only incomplete but above all ill-suited to Congolese mentality and traditions. The legislative work undertaken, notably on the part relating to the rights of persons and the family, called for recourse to Congolese authenticity… In 1987, the Congolese legislature enacted the law establishing the Family Code. Did this Code, which nevertheless advocated a break with the old colonial Code, not in the end align itself with that same contested model? What choice did the Congolese legislature make between tradition and modernity? What are the main innovations of this Code? What criticisms have been made of it? Today, twenty years after its drafting, does the aging of the Family Code not call for recodification? / French law has exercised significant influence on Congolese codifications; the most outstanding example is without doubt civil codification. The Congolese legal system, once based on unwritten customary law made up of multiple customs and community practices, received through Belgian colonization, with some adjustments, the Napoleonic Code that Belgium had itself received from the Napoleonic conquests; this Code is still applied in Belgium today. After Congolese national independence, the political authorities wanted to replace the colonial Code, which was ill-suited to Congolese mentality and customs and, moreover, incomplete. The legislative work initiated on the part relating to the rights of persons and the family, which led in 1987 to the enactment of the Family Code, advocated recourse to traditional law (authenticity). However, apart from the integration of a few customary institutions, this new Congolese Code is fundamentally modern (the imperative of development); in fact, it renewed and even amplified French law, combined with other European and post-colonial African influences. Today, however, this Code has clearly aged; what remedies would best restore its value?
208

IVCon: A GUI-based Tool for Visualizing and Modularizing Crosscutting Concerns

Saigal, Nalin 10 April 2009 (has links)
Code modularization provides benefits throughout the software life cycle; however, the presence of crosscutting concerns (CCCs) in software hinders its complete modularization. This thesis describes IVCon, a GUI-based tool that provides a novel approach to modularization of CCCs. IVCon enables users to create, examine, and modify their code in two different views, the woven view and the unwoven view. The woven view displays program code in colors that indicate which CCCs various code segments implement. The unwoven view displays code in two panels, one showing the core of the program and the other showing all the code implementing each concern in an isolated module. IVCon aims to provide an easy-to-use interface for conveniently creating, examining, and modifying code in, and translating between, the woven and unwoven views.
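IVCon itself is a GUI tool; the toy sketch below only illustrates the kind of data structure that could sit behind the two views described above. Each code segment is tagged with the concern it implements, the woven view lists the segments in their original order with the tag standing in for IVCon's coloring, and the unwoven view groups the same segments into a core panel plus one isolated module per concern. The example program and concern names are invented.

```python
from collections import defaultdict

# Hypothetical annotated program: (code line, concern) pairs; None marks core code.
segments = [
    ("def transfer(src, dst, amount):", None),
    ("    log.info('transfer requested')", "logging"),
    ("    check_permissions(src)",        "security"),
    ("    src.balance -= amount",         None),
    ("    dst.balance += amount",         None),
    ("    log.info('transfer done')",     "logging"),
]

def woven_view(segments):
    """Code in original order; the concern tag stands in for IVCon's coloring."""
    for line, concern in segments:
        print(f"{(concern or 'core'):10s} | {line}")

def unwoven_view(segments):
    """Core panel plus one isolated module per crosscutting concern."""
    modules = defaultdict(list)
    for line, concern in segments:
        modules[concern or "core"].append(line)
    for name, lines in modules.items():
        print(f"--- {name} ---")
        print("\n".join(lines))

woven_view(segments)
unwoven_view(segments)
```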
209

Code violations and other blight indicators : a study of Colony Park/Lakeside (Austin, Texas)

Durden, Teri Deshun 11 December 2013 (has links)
Blight and the elimination thereof have profoundly impacted urban areas. In Colony Park/Lakeside (Austin, Texas), community leaders and members of the local neighborhood association have come together to mitigate and reverse social, economic, and physical symptoms of blight in their neighborhood. Following the approval of a HUD Community Challenge Planning Grant application that was submitted by the Austin Neighborhood Housing and Community Development (NHCD) department, these individuals utilized the media attention surrounding the grant to campaign for code enforcement, landlord-tenant accountability, policing, and the clean-up of illegal dumping in the area. Moreover, after much ado between residents and City workers, the neighborhood association devised a community-focused partnership with the City to ensure that current residents would reap the benefits of the planning process and help define the collective will and interests of the community. Utilizing publicly available data and first-hand knowledge from one City code compliance investigator and local residents, this report attempts to provide a blight indicator analysis of the Colony Park/Lakeside planning area as defined by NHCD. In other words, this report uses quantitative data to create descriptive maps of current neighborhood conditions with particular attention to code violations and community discussions surrounding them. The results of this work are intended to shed light on where resources should be directed to further research in the area and to resolve issues that threaten the health, safety, and viability of the neighborhood today. / text
210

Near Shannon Limit and Reduced Peak to Average Power Ratio Channel Coded OFDM

Kwak, Yongjun 24 July 2012 (has links)
Solutions to the problem of large peak to average power ratio (PAPR) in orthogonal frequency division multiplexing (OFDM) systems are proposed. Although the design of PAPR-reduction codewords has been extensively studied and the existence of asymptotically good codes with low PAPR has been proved, no reduced-PAPR capacity-achieving code has yet been constructed. This is the topic of the current thesis. This goal is achieved by implementing a time-frequency turbo block coded OFDM. In this scheme, we design the frequency-domain component code to have a PAPR bounded by a small number. The time-domain component code is designed to obtain good performance while the decoding algorithm has reasonable complexity. Through comparative numerical evaluation we show that our method achieves considerable improvement in terms of PAPR with slight performance degradation compared to capacity-achieving codes with similar block lengths. For the frequency-domain component code, we use the realization of Golay sequences as cosets of the first-order Reed-Muller code and a modification of the dual BCH code. A simple MAP decoding algorithm for the modified dual BCH code is also provided. Finally, we provide a flexible and practical scheme based on a probabilistic approach to the PAPR problem. This approach decreases the PAPR without any significant performance loss and without any adverse impact or required change to the system. / Engineering and Applied Sciences
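For reference, the quantity at stake is usually defined as PAPR = max_n |x[n]|^2 / E[|x[n]|^2], where x is the (oversampled) IDFT of the frequency-domain codeword. The sketch below compares random BPSK codewords against a Golay complementary sequence built by the standard concatenation recursion; Golay sequences, which arise as cosets of the first-order Reed-Muller code as noted above, keep the PAPR at or below 2, i.e. about 3 dB. Block length, constellation, and construction are illustrative rather than those of the thesis.

```python
import numpy as np

def papr_db(freq_codeword, oversample=4):
    """PAPR (in dB) of the OFDM time-domain signal for one frequency-domain codeword.
    Zero-padding the IFFT oversamples the signal to approximate the continuous-time peak."""
    n = len(freq_codeword)
    padded = np.concatenate([freq_codeword, np.zeros((oversample - 1) * n)])
    x = np.fft.ifft(padded)
    papr = np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2)
    return 10 * np.log10(papr)

def golay_pair(m):
    """Binary (+/-1) Golay complementary pair of length 2^m via the standard recursion."""
    a, b = np.array([1.0]), np.array([1.0])
    for _ in range(m):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

# Golay sequence of length 64: its PAPR is guaranteed to stay at or below 2 (about 3 dB).
golay, _ = golay_pair(6)
print(f"Golay sequence PAPR : {papr_db(golay):5.2f} dB")

# Random BPSK codewords of the same length typically peak much higher.
rng = np.random.default_rng(0)
worst = max(papr_db(rng.choice([-1.0, 1.0], size=64)) for _ in range(1000))
print(f"worst of 1000 random: {worst:5.2f} dB")
```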
