  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Scaling Context-Sensitive Points-To Analysis

Nasre, Rupesh 02 1900 (has links) (PDF)
Pointer analysis is one of the key static analyses performed during compilation. The efficiency of several compiler optimizations and transformations depends directly on the scalability and precision of the underlying pointer analysis. Despite recent advances, an efficient and scalable context-sensitive inclusion-based pointer analysis is still lacking. In this work, we propose four novel techniques to improve the scalability of context-sensitive points-to analysis for C/C++ programs. First, we develop an efficient way of storing approximate points-to information using a multi-dimensional Bloom filter (multibloom). By using fast hash functions and exploiting the spatial locality of the points-to information, our multibloom-based points-to analysis offers significant savings in both analysis time and memory requirement. Since the representation never resets any bit in the multibloom, no points-to information is ever lost, and the analysis is sound, though approximate. This allows a client to trade off a minimal amount of precision for huge savings (two orders of magnitude) in memory requirement. By using multiple random and independent hash functions, the algorithm also achieves high precision and runs, on average, 2× faster than Andersen's points-to analysis. Using Mod/Ref analysis as a client, we show that the precision is above 98% of that of Andersen's analysis. Second, we devise a sound randomized algorithm that processes a group of constraints in a less precise but efficient manner and the remaining constraints in a more precise manner. By randomly choosing different groups of constraints across different runs, the analysis produces different points-to information, each of which is guaranteed to be sound. By joining the results of a few runs, the analysis obtains an approximation that is very close to the one obtained by the more precise analysis, while remaining efficient in terms of analysis time.
We instantiate our technique to develop a randomized context-sensitive points-to analysis. By varying the level of randomization, a client of points-to analysis can trade off minimal precision (less than 5%) for a large gain in efficiency (over 50% reduction in analysis time). We also develop an adaptive version of the randomized algorithm that carefully varies the randomization across runs to achieve maximum benefit in analysis time and precision, without pre-setting the randomization percentage or the number of runs. Third, we transform the points-to analysis problem into finding a solution to a system of linear equations. Making novel use of prime factorization, we show how to transform complex points-to constraints into a set of linear equations and map the solution back to a points-to solution. We prove that our algorithm is sound and show that our technique is 1.8× faster than Andersen's analysis for large benchmarks. Finally, we observe that the order in which points-to constraints are processed plays a vital role in algorithm efficiency. We prove that finding an optimal ordering to compute the fixpoint solution is NP-hard. We then propose a greedy heuristic that prioritizes constraints by the amount of points-to information they compute. This results in a dynamic ordering of constraint evaluation which, in turn, results in a skewed evaluation of constraints, where each constraint is evaluated repeatedly and a different number of times within a single iteration. Our prioritized analysis achieves, on average, a 33% improvement over Andersen's points-to analysis. We show that our algorithms help scale state-of-the-art pointer analyses, and we believe the techniques developed will be useful for other program analyses and transformations.
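The abstract's exact multibloom layout is not reproduced here, but its key soundness property can be illustrated with a minimal single-vector Bloom filter sketch (class name and sizes are illustrative, not from the thesis): points-to pairs are hashed into an add-only bit vector, so a lookup may return a false positive (lost precision) but never a false negative (the analysis stays sound).

```python
import hashlib

class BloomPointsToSet:
    """Approximate points-to set backed by an add-only bit vector.

    Bits are set by k independent hash functions and never reset, so
    membership queries can be imprecise (false positives) but never
    unsound (false negatives).
    """
    def __init__(self, num_bits=1024, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = 0  # big integer used as a bit vector

    def _positions(self, pointer, pointee):
        # derive k bit positions from salted SHA-256 digests of the pair
        key = f"{pointer}->{pointee}".encode()
        for i in range(self.num_hashes):
            digest = hashlib.sha256(key + bytes([i])).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, pointer, pointee):
        for pos in self._positions(pointer, pointee):
            self.bits |= 1 << pos

    def may_point_to(self, pointer, pointee):
        # True iff every hashed bit is set; inserted pairs always answer True
        return all(self.bits >> pos & 1 for pos in self._positions(pointer, pointee))

pts = BloomPointsToSet()
pts.add("p", "x")   # record p -> &x
pts.add("q", "y")   # record q -> &y
```

A client such as Mod/Ref analysis would query `may_point_to` instead of a concrete points-to set, accepting the occasional false positive in exchange for the compact representation.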
2

Realization Methods for the Quadtree Morphological Filter with Their Applications

Chen, Yung-lin 07 September 2011 (has links)
The proposed method combines the quadtree algorithm with morphological image processing, improving on the previous pattern mapping method for faster processing. The previous pattern mapping method stores the tree pattern in string form, which is a pointerless data structure. In the proposed method, the tree pattern is stored in a pointer-based data structure, so the pointer tree can be applied to the quadtree immediately, without the transformation time that the previous pattern mapping method required. In this work, the pointerless quadtree is thus converted into a pointer quadtree to reduce processing time. The modified algorithm is applied to circuit detection, image restoration, image segmentation and cell counting.
3

Indicadores de desempenho como facilitadores das implementações de melhorias / Pointers of performance as facilitators of the implementations of improvements

Pereira, Francisco Antonio 10 April 2006 (has links)
Orientador: Ademir Jose Petenate / Dissertação (mestrado profissional) - Universidade Estadual de Campinas, Faculdade de Engenharia Mecanica / Abstract: Organizations operate in turbulent environments of rapid and radical change, which is why great importance has been attached to the implementation of strategic actions in companies. This justifies a more careful analysis of how companies can pursue successful strategies for process improvement and organizational performance, aligning and integrating those actions with their objectives and goals. For effective improvement management, however, the organization must have a system for evaluating and measuring its performance. This work therefore opens with a theoretical discussion of three topics: strategy, process improvement management, and measurement of organizational performance. On that theoretical basis, a model for the management of strategic improvements is proposed and detailed into processes and activities, and a practical application of the model is presented in the design and rollout of a "Housekeeping-5S" program at the newspaper "O Estado de S. Paulo". One of the actions foreseen in the strategic planning was the adoption of "Perceived Total Quality"; to that end the "Housekeeping-5S" program was chosen, and it contributed significantly to the development of a "Culture of Quality". With the adoption of the model presented in this work, the monitoring and adjustment of the actions yielded excellent results, mainly in the improvement of the work environment and quality of life. This was only possible because the model guaranteed the corrections needed to reach the goals defined in the company's strategic planning.
Through this model, it is possible to ensure the effectiveness of the design and execution of strategic action plans. / Mestrado / Gestão da Qualidade Total / Mestre Profissional em Engenharia Mecanica
4

Kombinatorisk Optimering med Pointer Networks och Reinforcement Learning / Combinatorial optimization with Pointer Networks and Reinforcement Learning

Holmberg, Axel, Hansson, Wilhelm January 2021 (has links)
Given the complexity and range of combinatorial optimization problems, solving them can be computationally easy or hard. There are many ways to solve them, but all available methods share a problem: they take a long time to run and have to be rerun when new cases are introduced. Machine learning could prove a viable approach to combinatorial optimization because models can learn and generalize, eliminating the need to run a complex algorithm every time a new instance is presented. Uniter is a management consulting firm that provides services within product modularization. Product modularization makes it possible to create many different product variations based on customer needs, and finding the best combination for a specific customer's need requires solving a combinatorial optimization problem. Based on Uniter's need, this thesis sought to develop and evaluate a machine learning model consisting of a Pointer Network architecture trained using Reinforcement Learning. The task was to find the combination of parts yielding the lowest cost, given a use case. Each use case had different attributes that specified the need for the final product. For each use case, the model was tasked with selecting the most appropriate combination from a set of 4000 distinct combinations. Three experiments were conducted, examining whether the model could suggest an optimal solution after being trained on one use case, for a previously seen use case, and for an unseen use case. A single data set was used for all experiments. The suggested model was compared to three baselines: a weighted random selection, a naive model implementing a feed-forward network, and an exhaustive search. The results showed that the proposed model could not suggest an optimal solution in any of the experiments.
In most tests conducted, the proposed model was also significantly slower at suggesting a solution than any baseline. The proposed model had high accuracy in all experiments, meaning that nearly all of its suggestions were feasible solutions in the allowed solution space. However, once the model converged, it suggested only one combination for every use case, and the feed-forward baseline showed the same behavior. This suggests that the model misinterpreted the task and identified a solution that would work in most cases instead of suggesting the optimal solution for each use case. The discussion concludes that an exhaustive search is preferable for the studied data set and that an alternative approach using supervised learning may be a better solution.
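The thesis's exact architecture is not reproduced in the abstract, so the following is only a minimal NumPy sketch of the defining step of a Pointer Network in the style of Vinyals et al.: instead of scoring a fixed output vocabulary, the decoder scores each input position and "points" at one of the inputs. All weight shapes and names here are illustrative.

```python
import numpy as np

def pointer_attention(encoder_states, decoder_state, W1, W2, v):
    """One pointing step: additive attention over the input positions.

    Computes u_j = v^T tanh(W1 e_j + W2 d) for each encoder state e_j,
    then softmaxes the scores into a distribution over the inputs.
    """
    scores = np.tanh(encoder_states @ W1 + decoder_state @ W2) @ v
    exp = np.exp(scores - scores.max())   # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(0)
n_inputs, hidden = 5, 8
enc = rng.normal(size=(n_inputs, hidden))   # one state per candidate input
dec = rng.normal(size=hidden)               # current decoder state
W1 = rng.normal(size=(hidden, hidden))
W2 = rng.normal(size=(hidden, hidden))
v = rng.normal(size=hidden)

probs = pointer_attention(enc, dec, W1, W2, v)
chosen = int(np.argmax(probs))   # index of the input the network "points" at
```

In a combination-selection setting like the one studied, each candidate combination would be encoded as one of the input states, and reinforcement learning would adjust the weights so that low-cost combinations receive high pointing probability.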
5

Dynamic pointer tracking and its applications

Zhang, Kun 12 January 2010 (has links)
Due to the significant limitations of static analysis and the dynamic nature of pointers in weakly typed programming languages like C and C++, the points-to sets obtained at compile time are quite conservative. Most static pointer analysis methods trade precision for analysis speed. The methods that perform the analysis in a reasonable amount of time are often context- and/or flow-insensitive, while methods that are context-, flow-, and field-sensitive have to perform whole-program inter-procedural analysis and do not scale with the program size. A large class of optimizations, such as instruction prefetching, control and data speculation, redundant load/store removal, instruction scheduling, and memory disambiguation, suffers from the imprecise and conservative points-to sets computed statically. One could possibly live without optimizations, but in domains involving memory security and safety, the lack of precise points-to sets can jeopardize both. In particular, the lack of dynamic points-to sets drastically reduces the ability to reason about a program's memory access behavior, so illegal memory accesses can go unchecked, leading to bugs as well as security holes. On the other hand, points-to sets can also be very useful in other domains, such as heap shape analysis and garbage collection. Knowledge of precise points-to sets is therefore becoming very important, but it has received little attention so far beyond a few studies, which have shown that pointers exhibit very interesting behaviors during execution. How to track such behaviors dynamically and benefit from them is the topic of this research. In this work, we propose a technique to compute precise points-to sets through dynamic pointer tracking. First, the compiler performs pointer analysis to obtain the static points-to sets.
Then, the compiler analyzes the program and inserts the instructions necessary to refine the points-to sets. At runtime, the inserted instructions automatically update the points-to sets. Dynamic pointer tracking in software can be expensive, which can be a barrier to the practicality of such methods. Several optimizations are therefore proposed, including redundant update removal, post-loop update, special pattern-driven update removal, pointer initialization update removal, update propagation, invariant removal, and on-demand update optimization. Our experimental results demonstrate that our mechanism is able to compute the points-to sets dynamically with tolerable overhead. Finally, memory protection and garbage collection are presented as consumers of dynamic pointer tracking to illustrate its importance. In particular, we show how different memory properties can be easily tracked using the dynamic points-to sets, opening up new possibilities.
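The abstract does not give the inserted instructions themselves, so the sketch below models them as explicit update calls at each pointer assignment (all class and method names are hypothetical): the tracker starts from the conservative static sets and narrows them as assignments execute, and a memory-protection client checks accesses against the refined sets.

```python
class DynamicPointsTo:
    """Runtime points-to tracking.

    The compiler-inserted update instructions described in the thesis are
    modeled here as explicit calls made at every pointer assignment.
    """
    def __init__(self, static_sets):
        # begin with the conservative static points-to sets
        self.sets = {p: set(targets) for p, targets in static_sets.items()}

    def on_assign(self, pointer, target):
        # p = &x: at runtime p points to exactly this one object
        self.sets[pointer] = {target}

    def on_copy(self, dst, src):
        # p = q: dst now points wherever src currently points
        self.sets[dst] = set(self.sets.get(src, ()))

    def check_access(self, pointer, target):
        # memory-protection client: is the access consistent with tracking?
        return target in self.sets.get(pointer, ())

static = {"p": {"x", "y"}, "q": {"y", "z"}}  # imprecise static result
rt = DynamicPointsTo(static)
rt.on_assign("p", "x")   # models executing: p = &x
rt.on_copy("q", "p")     # models executing: q = p
```

After the two updates, `p` and `q` each have a singleton points-to set, far tighter than the static sets, which is what makes access checking and heap reasoning practical.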
6

Interaktyvaus pateikčių valdymo metodas / A method for interactive presentation control

Mandrijauskas, Renatas 13 August 2010 (has links)
Computer-based presentations are now the most common way to teach courses and to give scientific or business talks. To support this, most university classrooms and corporate conference rooms are equipped with high-definition projectors and computers. To control a presentation, teachers or conference speakers must use the computer's keyboard and mouse: operations such as moving to the next slide, highlighting a phrase or drawing a graph are performed at the computer instead of while communicating directly with the audience. Laser pointers are a tool often used during presentations.
They give the presenter more freedom, but to perform any presentation-control action the presenter must still walk back to the computer. By reusing the existing presentation equipment and adding an image-processing system, presentations can be made interactive: the presenter can step away from the computer, give the audience full attention and still perform the main presentation-control actions, delivering the material more effectively and persuasively. As part of this master's thesis, the "Fenix" system was developed; it detects a laser pointer in the video captured by a web camera, determines the pointer's position on the screen and, depending on that position, performs the main actions used during presentations. This document describes the method for interactive presentation control using a laser pointer, a web camera, a projector and image processing; the project part presents the main technical and design issues of the system, along with the developed system's test results and evaluation.
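The Fenix detection pipeline is not specified in the abstract, so here is a deliberately simplified sketch of the core image-processing step (function name and threshold are illustrative): find the bright laser dot in a captured RGB frame as the centroid of near-saturated red pixels.

```python
import numpy as np

def find_laser_dot(frame, threshold=240):
    """Locate a bright laser dot in an RGB frame.

    Simplification: the dot is taken to be the centroid of pixels whose
    red channel is near saturation.  Returns (row, col), or None when no
    pixel crosses the threshold.
    """
    mask = frame[:, :, 0] >= threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return int(rows.mean()), int(cols.mean())

# synthetic 480x640 frame with a 3x3 red dot centered near (101, 201)
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[100:103, 200:203, 0] = 255
pos = find_laser_dot(frame)   # -> (101, 201)
```

A real system would additionally map the detected camera coordinates onto projector/screen coordinates (e.g. via a calibration homography) before triggering a slide action, and would filter out non-laser highlights.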
7

Novel storage architectures and pointer-free search trees for database systems

Vasaitis, Vasileios January 2012 (has links)
Database systems research is an old and well-established field in computer science. Many of the key concepts appeared as early as the 60s, while the core of relational databases, which have dominated the database world for a while now, was solidified during the 80s. However, the underlying hardware has not displayed such stability in the same period, which means that a lot of assumptions that were made about the hardware by early database systems are not necessarily true for modern computer architectures. In particular, over the last few decades there have been two notable consistent trends in the evolution of computer hardware. The first is that the memory hierarchy of mainstream computer systems has been getting deeper, with its different levels moving away from each other, and new levels being added in between as a result, in particular cache memories. The second is that, when it comes to data transfers between any two adjacent levels of the memory hierarchy, access latencies have not been keeping up with transfer rates. The challenge is therefore to adapt database index structures so that they become immune to these two trends. The latter is addressed by gradually increasing the size of the data transfer unit; the former, by organizing the data so that it exhibits good locality for memory transfers across multiple memory boundaries. We have developed novel structures that facilitate both of these strategies. We started our investigation with the venerable B+-tree, which is the cornerstone order-preserving index of any database system, and we have developed a novel pointer-free tree structure for its pages that optimizes its cache performance and makes it immune to the page size. We then adapted our approach to the R-tree and the GiST, making it applicable to multi-dimensional data indexes as well as generalized indexes for any abstract data type. 
Finally, we have investigated our structure in the context of main memory alone, and have demonstrated its superiority over the established approaches in that setting too. While our research has its roots in data structures and algorithms theory, we have conducted it with a strong experimental focus, as the complex interactions within the memory hierarchy of a modern computer system can be quite challenging to model and theorize about effectively. Our findings are therefore backed by solid experimental results that verify our hypotheses and prove the superiority of our structures over competing approaches.
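The thesis's actual page structure is not reproduced in the abstract, so the following sketch only illustrates the general pointer-free idea it builds on: replacing child pointers with index arithmetic over a flat array, here via the well-known implicit BFS (Eytzinger) layout, where the children of slot i live at 2i+1 and 2i+2.

```python
def eytzinger(sorted_keys):
    """Lay out sorted keys in implicit BFS (Eytzinger) order.

    An in-order walk of the implicit tree assigns the keys in sorted
    order, so the result satisfies the search-tree property with no
    pointers stored at all.
    """
    out = [None] * len(sorted_keys)
    it = iter(sorted_keys)
    def fill(i):
        if i < len(out):
            fill(2 * i + 1)     # left subtree
            out[i] = next(it)   # this node
            fill(2 * i + 2)     # right subtree
    fill(0)
    return out

def search(layout, key):
    """Binary search over the implicit tree; index arithmetic replaces pointers."""
    i = 0
    while i < len(layout):
        if layout[i] == key:
            return True
        i = 2 * i + 1 + (key > layout[i])   # go left or right by index
    return False

layout = eytzinger(list(range(0, 30, 2)))   # 15 even keys, a complete tree
```

Because successive levels of the implicit tree sit contiguously in memory, a search touches a predictable, compact set of cache lines, which is the kind of locality benefit a pointer-free page layout is after.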
8

Le cadre de la parole et le cadre du signe : un rendez-vous développemental / The speech frame and the sign frame: a developmental rendezvous

Ducey Kaufmann, Virginie 26 January 2007 (has links) (PDF)
Our working hypothesis is that there is a developmental rendezvous between what we call the speech frame and the sign frame. While the speech frame is established in the form of canonical babbling, around 7 months, the sign frame first manifests itself as so-called imperative pointing, around 9 months, before giving way to so-called declarative pointing. The latter appears with the first words, at which point the speech frame makes it possible to coproduce (coarticulate) vowel and consonant (Sussman et al. 1999). The respective roles of the ingredients of this developmental rendezvous around the first word remain to be explored. In the present contribution, we set out to test the existence of a harmonic relationship between the speech frame and the sign frame. To do so, we first had to obtain the distribution of babbling frequencies, and then that of pointing-gesture durations. Our results for 6 subjects, followed over 12 months, show that with a babbling rate of 3 Hz and pointing strokes of 600-700 ms (1.5 Hz), we can account for the template of the first words. Since these "prosodic" words can vary from one to two "syllables", it is necessary to invoke the notion of the foot as a unit of metrical control anchored in the pointing gesture. This accounts for observations common in the literature, provided that instead of counting only syllables per word, one measures the number of mandibular cycles entering into the stroke of the pointing gestures.
9

Expression data flow graph: precise flow-sensitive pointer analysis for C programs

Thiessen, Rei Unknown Date
No description available.
10

Aide à la sélection de cibles pour des environnements de réalité virtuelle / Assistance tools for target selection in virtual reality environments

Wonner, Jonathan 16 December 2013 (has links)
Selection is one of the most common and fundamental interaction tasks. In a virtual reality environment, this task is performed in three dimensions but comes with difficulties inherent to such environments, such as poor depth perception. Techniques exist to overcome these obstacles. We present three new methods that improve the user's performance during the different stages of the selection process: the Ring Concept principle makes it possible to locate objects that are not visible in the scene; the Starfish technique guides the user's movement toward the target; and the SPEED algorithm predicts the endpoint of a selection movement.
