1. Isothermality: making speculative optimizations affordable. Pereira, David John, 22 December 2007.
Partial Redundancy Elimination (PRE) is a ubiquitous optimization used by compilers to remove
repeated computations from programs. Speculative PRE (SPRE), which uses program profiles
(statistics obtained from running a program), is more cognizant of trends in run-time behaviour
and therefore produces better-optimized programs. Unfortunately, optimal SPRE is a very
expensive algorithm, with high-order polynomial time complexity, unlike most compiler
optimizations, which run in time effectively linear in the size of the program being optimized.
This dissertation uses the concept of “isothermality”—the division of a program into a hot region
and a cold region—to create the Isothermal SPRE (ISPRE) optimization, an approximation to
optimal SPRE. Unlike SPRE, which creates and solves a flow network for each program expression
being optimized—a very expensive operation—ISPRE uses two simple bit-vector analyses, optimizing
all expressions simultaneously. We show, experimentally, that the ISPRE algorithm runs, on
average, nine times faster than the SPRE algorithm, while producing competitively optimized
programs.
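The abstract does not spell out the transformation itself, so the following is a hand-worked sketch (not the dissertation's algorithm) of the effect speculative PRE aims for: an expression that is redundant on the frequently executed ("hot") path is hoisted above it, trading a possibly wasted computation on the cold path for savings on the hot one. The function names and the loop are illustrative only.

```python
def before_spre(a, b, flag):
    """a * b is recomputed on every hot-path iteration even though
    a and b never change: the computation is partially redundant."""
    total = 0
    for _ in range(1000):
        if flag:                 # profiled as almost always true ("hot")
            total += a * b       # same value recomputed each trip
    return total


def after_spre(a, b, flag):
    """Speculatively hoist a * b out of the loop. The multiply now
    executes even when flag is False (the "cold" path), which is safe
    here because it has no side effects, and cheaper on average."""
    t = a * b                    # computed once, speculatively
    total = 0
    for _ in range(1000):
        if flag:
            total += t
    return total
```

Both versions compute the same result; the speculative one simply moves the work to where the profile says it pays off.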
This dissertation also harnesses isothermality to improve another ubiquitous compiler
optimization, Partial Dead Code Elimination (PDCE), which removes computations
whose values are never used. Isothermal Speculative PDCE (ISPDCE) is a new, simple, and efficient
optimization which requires only three bit-vector analyses. We show, experimentally, that ISPDCE
produces better optimization than PDCE, while keeping a competitive running time.
On account of their small analysis costs, ISPRE and ISPDCE are especially appropriate for use in
Just-In-Time (JIT) compilers.
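For readers unfamiliar with partial dead code, here is a minimal sketch (not taken from the dissertation) of the situation PDCE targets: a computed value that is used on only one path is "partially dead", and the fix is to sink the computation onto the path that actually uses it. The helper name `expensive` is a made-up stand-in.

```python
def expensive(x):
    """Stand-in for a costly, side-effect-free computation."""
    return x * x


def before_pdce(x, flag):
    y = expensive(x)        # partially dead: y is used only when flag holds
    if flag:
        return y
    return 0


def after_pdce(x, flag):
    if flag:
        return expensive(x)  # computation sunk onto the only path using it
    return 0
```

When `flag` is usually false, the transformed version avoids the wasted work entirely while returning identical results.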
2. Improving maintenance of CSS through structure and refactoring strategies. Rydfalk, Villiam, January 2022.
This study assesses how the maintainability of CSS can be improved by manual and automatic refactoring strategies. Firstly, the structure of the CSS matters: rules should follow a consistent order depending on their purpose. Secondly, dead code makes files unnecessarily large and harder to read, so removing it is an important step toward better maintainability. These refactoring strategies were taken from the book CSS Refactoring by Lindstrom, and were tested on a large CSS code base from an ASP.NET application called AktiveraMera.

The structure was applied manually in a step-by-step process in which each CSS rule was classified and sorted into files and folders. For example, rules styling a specific button were put in a file placed in the component-styles folder. The separate files are then concatenated into a single file, which is minified to keep it small when served to the user. The divided structure makes it easier for developers to add new component-, structure-, or browser-specific CSS in the correct place; it also keeps refactoring localized, since all rules relevant to a given element live in the same place. The new structure introduced some visual regression. It is difficult to estimate how much extra time remedying all of it would add, but some of the changes could probably be ignored while the more egregious ones are fixed. Either way, some extra work is needed, but according to the interviewed developers it would probably be worth it. The assessment is that the new structure does improve maintainability.

The tool CSS-analyser was then used to find and remove duplicated CSS, a form of dead CSS code, in order to reduce the size of the CSS files. Smaller files (fewer lines of code) correlate with readability and, by extension, maintainability, since there is less code to read through and understand. However, while the results did show a reduction in file size, the removal introduced a great deal of very obvious visual regression, which means more work. In an industry setting there are often time constraints, and anything that adds work may not be a good strategy. The visual regression was akin to a broken website whose entire layout had changed, so it could not be ignored if the change were deployed to the live application in question.

In summary, maintainability can be improved by introducing structure to the CSS code in the form of a folder hierarchy with sorted and classified rules. Removing dead CSS code automatically is difficult without introducing site-breaking visual regression. Ensuring a good CSS structure matters more than directly removing dead code, especially since the structure makes other refactoring strategies more effective.
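The concatenate-then-minify build step the study describes can be sketched as a small script. The folder names below are inferred from the abstract's component-, structure-, and browser-specific categories (only "component-styles" is named explicitly), and the minification is deliberately naive; this is an illustration of the pipeline, not the study's actual tooling.

```python
import re
from pathlib import Path


def build_css(root: str, out: str) -> None:
    """Concatenate classified CSS files into one stylesheet, then apply
    a naive minification (strip comments, collapse whitespace)."""
    parts = []
    # Folder names assumed from the abstract's rule categories.
    for folder in ("browser-styles", "structure-styles", "component-styles"):
        for f in sorted(Path(root, folder).glob("*.css")):
            parts.append(f.read_text())
    css = "\n".join(parts)
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop /* comments */
    css = re.sub(r"\s+", " ", css).strip()           # collapse whitespace
    Path(out).write_text(css)
```

A real build would use a proper minifier, but even this sketch shows why the single served file stays small while the source stays organized by purpose.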
3. Optimisation of a compiler for agent languages at different levels of abstraction. Kalmár, Róbert, January 2012.
The aim of this work is the optimization of the AHLL language compiler. Several intermediate representations of the compiled code are introduced, along with code optimization techniques. The main part of the work focuses on implementing these optimization techniques and on generating target code in the ALLL language. Finally, the results achieved by the new version of the AHLL compiler are presented, together with some ideas for future work on AHLL and the compiler.
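The abstract does not detail AHLL's intermediate representations, so the following is a generic sketch of one classic optimization over a tree-shaped IR, constant folding, purely to illustrate the kind of technique such a compiler pass applies. The node types are invented for the example.

```python
from dataclasses import dataclass


@dataclass
class Const:
    value: int


@dataclass
class Add:
    left: object
    right: object


def fold(node):
    """Recursively replace Add(Const, Const) subtrees with a single Const."""
    if isinstance(node, Add):
        left, right = fold(node.left), fold(node.right)
        if isinstance(left, Const) and isinstance(right, Const):
            return Const(left.value + right.value)
        return Add(left, right)
    return node
```

Running `fold` over an expression tree collapses fully constant subtrees before code generation, so the target code never computes them at run time.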