  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Object oriented database management systems

Nassis, Antonios 11 1900 (has links)
Modern data-intensive applications, such as multimedia systems, require the ability to store and manipulate complex data. Classical Database Management Systems (DBMS), such as relational databases, cannot support these types of applications efficiently. This dissertation presents the salient features of Object Database Management Systems (ODBMS) and Persistent Programming Languages (PPLs), which have been developed to address the data management needs of these demanding applications. An 'impedance mismatch' problem occurs in a traditional DBMS because the data and computational aspects of the application are implemented in two different systems: a query language and a programming language. PPLs provide facilities for both persistent and transient data within the same language, thereby avoiding the impedance mismatch problem. This dissertation presents a method of implementing a PPL by extending C++ with pre-compiled classes. The classes are first developed and then used to implement object persistence in two simple applications. / Computing / M. Sc. (Information Systems)
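The dissertation's implementation extends C++ with pre-compiled classes. As a language-agnostic sketch only (a Python stand-in with hypothetical names, not the dissertation's classes), the core PPL idea of handling persistent and transient objects in one language, with no separate query language, can be illustrated like this:

```python
import pickle

class PersistentStore:
    """Toy object store: objects of the host language are saved and loaded
    directly, with no separate query language (no 'impedance mismatch')."""
    def __init__(self):
        self._bytes = {}                      # key -> pickled bytes (stand-in for disk)

    def save(self, key, obj):
        self._bytes[key] = pickle.dumps(obj)  # persist the whole object graph

    def load(self, key):
        return pickle.loads(self._bytes[key])

class Media:
    """An ordinary application class; instances may be persistent or transient."""
    def __init__(self, title, frames):
        self.title, self.frames = title, frames

store = PersistentStore()
store.save("clip1", Media("intro", 240))      # persistent instance
scratch = Media("scratch", 1)                 # transient instance, never saved
clip = store.load("clip1")
print(clip.title, clip.frames)                # -> intro 240
```

The same application code creates, stores, and retrieves objects; persistence is a property of how an instance is used, not a separate data model.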
22

Concepts and applications of a decision-support management system for electric power distribution systems via the Internet. / Executive information system: concepts and application for utilities.

Leal, Adriano Galindo 13 July 1999 (has links)
This dissertation discusses the advantages and difficulties of implementing an Executive Information System (EIS) in an Intranet/Internet environment, as well as the use of database applications on the web. A system of this kind, named SAG (Sistema de Apoio Gerencial), was implemented at Eletropaulo Metropolitana (São Paulo, Brazil) to support the management, supervision, and control of the electric power distribution network, addressing the needs of the maintenance, operations, and engineering departments. SAG establishes a systematic supervision process: based on the analysis of current distribution network conditions and the available resources, it tracks the evolution of power supply quality and provides information that guides actions to correct possible inadequacies and sets policies and guidelines to be followed at the management level. As a result, faster decision-making is expected, along with faster access to technical and loading data for distribution network equipment.
23

Oak (Quercus robur L.) mortality in south-eastern Sweden: influence of weather and environmental variables

Andersson, Marie January 2009 (has links)
The complex interplay between biotic and abiotic factors, believed to be responsible for several oak declines in European oak stands during the last three decades, remains poorly understood. Hence, this study aims at clarifying the temporal process of oak declines, as well as identifying individual tree and environmental variables that increase the risk of oak mortality. The study was performed in one of the few areas in northern Europe still holding high densities of old oaks (Quercus robur L.). Cross dating revealed that most trees had died during the last decade. Averaged chronologies and multiple chronological clustering suggested that the onset of the oak decline happened in 1992, when a severe drought took place. Two of the sites showed a rather short time period of heavily reduced growth prior to death, most likely caused by an insect defoliation in combination with a mildew infection of the replacement shoots. Environmental variables presented a rather weak influence on oak mortality. The results support the idea of attributing oak mortality to a combination of long- and short-term stresses, and emphasize the importance of including present as well as past factors when analysing the causes of oak declines.
24

Designing and combining mid-air interaction techniques in large display environments

Nancel, Mathieu 05 December 2012 (has links) (PDF)
Large display environments (LDEs) are interactive physical workspaces featuring one or more static large displays as well as rich interaction capabilities, and are meant to visualize and manipulate very large datasets. Research on mid-air interaction in such environments has emerged over the past decade, and a number of interaction techniques are now available for most elementary tasks such as pointing, navigating, and command selection. However, these techniques are often designed and evaluated separately, on specific platforms and for specific use cases or operationalizations, which makes them hard to choose, compare, and combine. In this dissertation I propose a framework and a set of guidelines for analyzing and combining the input and output channels available in LDEs. I analyze the characteristics of LDEs in terms of (1) visual output and how it affects usability and collaboration, and (2) input channels and how to combine them into rich sets of mid-air interaction techniques. These analyses lead to design requirements intended to ensure that a set of interaction techniques can be used (i) at a distance, (ii) together with other interaction techniques, and (iii) when collaborating with other users. In accordance with these requirements, I designed and evaluated a set of mid-air interaction techniques for panning and zooming, for invoking commands while pointing, and for performing difficult pointing tasks with limited input requirements. For the latter I also developed two methods: one for calibrating high-precision techniques with two levels of precision, and one for tuning velocity-based transfer functions. Finally, I introduce two higher-level design considerations for combining interaction techniques in input-constrained environments.
Designers should take into account (1) the trade-off between minimizing limb usage and performing actions in parallel that affects overall performance, and (2) the decision and adaptation costs incurred by changing the resolution function of a pointing technique during a pointing task.
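As an illustration of the kind of velocity-based transfer function mentioned above (all parameter values here are hypothetical, not the calibrated ones from the dissertation), a minimal sketch:

```python
def transfer_gain(velocity, v_min=0.05, v_max=1.0, g_min=1.0, g_max=12.0):
    """Map input velocity (m/s) to a cursor gain (control-display ratio).
    Slow movements get low gain (precision); fast movements get high gain
    (speed); in between, gain is linearly interpolated. All constants are
    illustrative placeholders a designer would tune per device and display."""
    if velocity <= v_min:
        return g_min
    if velocity >= v_max:
        return g_max
    t = (velocity - v_min) / (v_max - v_min)
    return g_min + t * (g_max - g_min)

# Tuning such a function amounts to choosing the four constants.
print(transfer_gain(0.02), transfer_gain(2.0))  # -> 1.0 12.0
```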
27

Verification of Pointer Programs Using Regions and Permissions

Bardou, Romain 14 October 2011 (has links)
Deductive program verification consists in annotating programs with a specification, i.e. a set of logic formulas describing the program's behavior, and proving that the programs satisfy their specification. Tools such as the Why platform take a program and its specification as input and compute logic formulas such that, if they are valid, the program satisfies its specification. These logic formulas can be proven automatically or with proof assistants. When a program is written in a language that permits pointer aliasing, i.e. when several variables may denote the same memory cell, reasoning about the program becomes particularly difficult. It is necessary to specify which pointers may or may not be equal. Data structure invariants, in particular, are harder to maintain. This thesis proposes a type system that structures the heap in a modular fashion in order to control pointer aliasing and data invariants. It is based on the notions of region and permission. Programs are then translated to Why in such a way that pointers are separated as much as possible, facilitating reasoning. The thesis also proposes an inference mechanism that alleviates the need to annotate the region operations introduced by the language. A model is introduced to describe the semantics of the language and prove its safety. In particular, it is proven that if the type of a pointer states that its invariant holds, then the invariant indeed holds in the model. This work has been implemented as a tool named Capucine. Several examples have been written to illustrate the language and were verified using Capucine.
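The aliasing problem addressed here can be seen in a few lines. This is a toy Python illustration of why invariants break under uncontrolled aliasing, not the thesis's type system:

```python
class SortedPair:
    """Intended invariant: lo <= hi."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

p = SortedPair(1, 10)
q = p                 # alias: p and q denote the same heap cell
q.hi = 0              # a write through q silently breaks the invariant seen via p
print(p.lo <= p.hi)   # -> False
```

A region-and-permission discipline rules out such surprises statically: only the holder of the permission for a cell's region may mutate it, so the verifier knows exactly which invariants can still be assumed at each program point.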
29

Quantitative Metrics and Measurement Methodologies for System Security Assurance

Ahmed, Md Salman 11 January 2022 (has links)
Proactive approaches that prevent attacks through security measurement are crucial for stopping sophisticated attacks. However, proactive measures must employ quantitative security metrics and systematic measurement methodologies to assess security guarantees, as some metrics used for evaluating those guarantees (e.g., entropy) may not capture the capabilities of advanced attackers. Many proactive measures (e.g., data pointer protection or data flow integrity) also suffer from performance bottlenecks. This dissertation identifies and represents attack vectors as metrics using knowledge from advanced exploits, and demonstrates the effectiveness of these metrics by quantifying the attack surface and enabling ways to tune performance vs. security of existing defenses through identifying and prioritizing key attack vectors for protection. We measure the attack surface by quantifying the impact of fine-grained Address Space Layout Randomization (ASLR) on code reuse attacks under the Just-In-Time Return-Oriented Programming (JITROP) threat model. We conduct a comprehensive measurement study with five fine-grained ASLR tools, 20 applications (including six browsers and one browser engine), and 25 dynamic libraries. Experiments show that attackers need only a few seconds (1.5-3.5) to find various code reuse gadgets, such as a Turing-complete gadget set. Experiments also suggest that some code pointer leaks allow attackers to find gadgets more quickly than others. Moreover, instruction-level single-round randomization can restrict Turing-complete operations by preventing up to 90% of gadgets. This dissertation also identifies and prioritizes critical data pointers for protection, enabling a tunable trade-off between performance and security. We apply seven rule-based heuristics to prioritize externally manipulatable sensitive data objects/pointers. Our evaluation using 33 ground-truth vulnerable data objects/pointers shows successful detection of 32 of them, with a 42% reduction in performance overhead compared to AddressSanitizer. Our results also suggest that sensitive data objects account for as little as 3% of all objects and that, on average, 82% of data objects do not need protection in real-world applications. / Doctor of Philosophy / Proactive approaches that prevent attacks through security measurement are crucial because reactive measures can become challenging, especially once attackers enter sophisticated attack phases. A key challenge for proactive measures is identifying representative metrics and measurement methodologies to assess security guarantees, as some metrics used for evaluating security guarantees may not capture the capabilities of advanced attackers. Many proactive measures also suffer from performance bottlenecks. This dissertation identifies and represents attack elements as metrics using knowledge from advanced exploits, and demonstrates the effectiveness of these metrics by quantifying the attack surface and enabling the capability to tune performance vs. security of existing defenses by identifying and prioritizing key attack elements. We measure the attack surface of various software applications by quantifying the available attack elements of code reuse attacks in the presence of fine-grained Address Space Layout Randomization (ASLR), a defense in modern operating systems. ASLR makes code reuse attacks difficult by making attack components unavailable. We perform a comprehensive measurement study with five fine-grained ASLR tools, real-world applications, and libraries under an influential code reuse attack model. Experiments show that attackers need only a few seconds (1.5-3.5) to find various code reuse elements. Results also show the influence of one attack element over another and of one defense strategy over another. This dissertation also applies seven rule-based heuristics to prioritize externally manipulatable sensitive data objects/pointers (a type of attack element) to enable the capability to tune between performance and security. Our evaluation using 33 ground-truth vulnerable data objects/pointers shows successful identification of 32 of them, with a 42% reduction in performance overhead compared to AddressSanitizer, a memory error detector. Our results also suggest that sensitive data objects account for as little as 3% of all objects and that, on average, 82% of objects do not need protection in real-world applications.
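The gadget-finding step of such measurements can be sketched naively as a byte-level scan for x86 `ret` opcodes. The function name and window size here are hypothetical, and real JITROP tooling disassembles and classifies gadgets rather than just locating them:

```python
RET = 0xC3  # one-byte x86 'ret' opcode

def find_gadget_tails(code, max_len=3):
    """Return (start, end) offsets of byte windows ending in RET.
    Each window is a candidate code-reuse gadget; a real scanner would
    disassemble the window and classify the operations it performs
    (load, store, arithmetic, jump, ...)."""
    tails = []
    for i, b in enumerate(code):
        if b == RET:
            tails.append((max(0, i - max_len), i))
    return tails

# pop rax; pop rdi; ret; nop; mov rdi, rax; ret
blob = bytes([0x58, 0x5F, 0xC3, 0x90, 0x48, 0x89, 0xC7, 0xC3])
print(find_gadget_tails(blob))  # -> [(0, 2), (4, 7)]
```

Under fine-grained ASLR the question becomes how quickly an attacker who leaks code pages can rebuild such a gadget inventory, which is what the timing measurements above quantify.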
30

Access Path Based Dataflow Analysis For Sequential And Concurrent Programs

De, Arnab 12 1900 (has links) (PDF)
In this thesis, we have developed a flow-sensitive dataflow analysis framework for value set analyses of Java-like languages. Our analysis framework is based on access paths: a variable followed by zero or more field accesses. We express our abstract states as maps from bounded access paths to abstract value sets. Using access paths instead of allocation sites enables us to perform strong updates on assignments to dynamically allocated memory locations. We also describe several optimizations to reduce the number of access paths that need to be tracked in our analysis. We have instantiated this framework for flow-sensitive pointer and null-pointer analysis for Java and implemented it in the Chord framework. A major part of our implementation is written declaratively using Datalog, and we leverage Chord's use of BDDs to keep memory usage low. We show that our analysis is more precise and faster than traditional flow-sensitive and flow-insensitive pointer and null-pointer analyses for Java. We further extend our access-path-based analysis framework to concurrent Java programs. We use the synchronization structure of the programs to transfer abstract states from one thread to another, so we do not need to make conservative assumptions about reads or writes to shared memory. We prove our analysis sound for the happens-before memory model, which is weaker than most common memory models, including sequential consistency and the Java Memory Model. We implement a null-pointer analysis for concurrent Java programs and show it to be more precise than the traditional analysis.
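The core idea, abstract states as maps from access paths to value sets with strong updates on assignment, can be sketched as follows. This is a toy Python model with hypothetical names, not the Chord/Datalog implementation:

```python
# An access path is a tuple: a variable followed by zero or more field names.
# An abstract state maps bounded access paths to sets of abstract values.
state = {("x",): {"objA"}, ("x", "f"): {"null"}}

def assign(state, path, values):
    """A flow-sensitive assignment performs a *strong* update: the old
    binding for this access path is overwritten, not merged. Allocation-site
    abstractions must usually do a weak update (merge) instead, because one
    site summarizes many runtime objects."""
    new_state = dict(state)
    new_state[path] = set(values)
    return new_state

after = assign(state, ("x", "f"), {"objB"})   # models: x.f = new B()
print(after[("x", "f")])                      # -> {'objB'}  (null-ness is gone)
```

Because the update is strong, a subsequent null-pointer check on `x.f` can be discharged; a weak update would have kept `"null"` in the set and forced a warning.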
