1.
Dokumentų valdymo sistemos metaduomenų apdorojimo modelio sudarymas ir tyrimas / Analysis and Development of Metadata Processing Model for Document Management
Žukaitis, Rimantas, 25 May 2004
The increasing use of personal computers and the Internet in organizations has made it possible to create, edit, and share documents among employees. However, document management becomes troublesome, especially when several employees contribute changes to a single document: it is hard to locate the latest version of a document or to determine which changes were made by which employee. Document management systems aim to solve these problems, but they are often either highly specialized and costly to implement, or general-purpose and difficult to customize for an organization's business domain. This inability to customize frequently stems from the strict and inflexible metadata model used by the document management system. The aim of this work is to propose an abstract document metadata definition and processing model based on an XML data definition language and the concept of an XML data processing pipeline. The proposed model is both general-purpose and highly flexible, allowing it to be applied to any business domain and customized to reflect features specific to that domain.
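A minimal sketch of this idea, under stated assumptions, might look like the following Python fragment: document metadata is defined as XML and pushed through an ordered chain of processing stages. The element names (title, author, version, status), the stage functions, and the PIPELINE list are hypothetical illustrations and are not taken from the thesis.

```python
# Hypothetical sketch: XML-defined document metadata flowing through an
# XML processing pipeline. Element names and stages are illustrative only,
# not taken from the thesis.
import xml.etree.ElementTree as ET

METADATA_XML = """
<document>
  <title>Quarterly report</title>
  <author>J. Smith</author>
  <version>3</version>
  <status>draft</status>
</document>
"""

def parse_stage(xml_text):
    """Turn the raw XML metadata definition into an element tree."""
    return ET.fromstring(xml_text)

def validate_stage(doc):
    """Check that the fields this (imaginary) domain requires are present."""
    for field in ("title", "author", "version"):
        if doc.find(field) is None:
            raise ValueError(f"missing metadata field: {field}")
    return doc

def enrich_stage(doc):
    """Add a derived field; a domain-specific stage could be swapped in here."""
    label = ET.SubElement(doc, "label")
    label.text = f"{doc.findtext('title')} (v{doc.findtext('version')})"
    return doc

# The pipeline is just an ordered list of stages, so tailoring the model to a
# business domain means inserting, replacing, or reordering stages.
PIPELINE = [parse_stage, validate_stage, enrich_stage]

def run(xml_text):
    data = xml_text
    for stage in PIPELINE:
        data = stage(data)
    return data

if __name__ == "__main__":
    print(ET.tostring(run(METADATA_XML), encoding="unicode"))
```

The point of the sketch is only that customisation lives in the pipeline stages rather than in a fixed metadata schema.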
2.
Understanding and Improving Object-Oriented Software Through Static Software Analysis
Irwin, Warwick Allan, January 2007
Software engineers need to understand the structure of the programs they construct. This task is made difficult by the intangible nature of software, and its complexity, size and changeability. Static analysis tools can help by extracting information from source code and conveying it to software engineers. However, the information provided by typical tools is limited, and some potentially rich veins of information - particularly metrics and visualisations - are under-utilised because developers cannot easily acquire or make use of the data. This thesis documents new tools and techniques for static analysis of software. It addresses the problem of generating parsers directly from standard grammars, thus avoiding the common practice of customising grammars to comply with the limitations of a given parsing algorithm, typically LALR(1). This is achieved by a new parser generator that applies a range of bottom-up parsing algorithms to produce a hybrid parsing automaton. Consequently, we can generate more powerful deterministic parsers - up to and including LR(k) - without incurring the combinatorial explosion that makes canonical LR(k) parsers impractical. The range of practical parsers is further extended to include GLR, which was originally developed for natural language parsing but is shown here to also have advantages for static analysis of programming languages. This emphasis on conformance to standard grammars improves the rigour of static analysis tools and allows clearer definition and communication of derived information, such as metrics. Beneath the syntactic structure of software (exposed by parsing) lies the deeper semantic structure of declarations, scopes, classes, methods, inheritance, invocations, and so on. In this work, we present a new tool that performs semantic analysis on parse trees to produce a comprehensive semantic model suitable for processing by other static analysis tools. An XML pipeline approach is used to expose the syntactic and semantic models of the software and to derive metrics and visualisations. The approach is demonstrated producing several types of metrics and visualisations for real software, and the value of static analysis for informing software engineering decisions is shown.
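The abstract gives no code, but a small sketch of the XML pipeline idea, assuming an invented schema, might look like the following: one pipeline stage reads an XML semantic model of classes and methods and emits a methods-per-class metric. The model, class, and method elements and the methods_per_class function are assumptions made for illustration and do not correspond to the thesis tool's actual format.

```python
# Hypothetical sketch of one XML-pipeline metric stage: an XML semantic model
# of classes and methods goes in, an XML metrics document comes out. The
# schema is invented for illustration and is not the thesis tool's format.
import xml.etree.ElementTree as ET

SEMANTIC_MODEL = """
<model>
  <class name="Order">
    <method name="total"/>
    <method name="addLine"/>
  </class>
  <class name="Invoice">
    <method name="issue"/>
  </class>
</model>
"""

def methods_per_class(model_xml):
    """One pipeline stage: parse the semantic model, emit a metrics document."""
    model = ET.fromstring(model_xml)
    metrics = ET.Element("metrics")
    for cls in model.findall("class"):
        ET.SubElement(metrics, "metric", {
            "class": cls.get("name"),
            "methods": str(len(cls.findall("method"))),
        })
    return ET.tostring(metrics, encoding="unicode")

if __name__ == "__main__":
    # Prints one <metric> element per class, e.g. class="Order" methods="2".
    print(methods_per_class(SEMANTIC_MODEL))
```

Because each stage consumes and produces XML, further stages (for example, a visualisation generator) could be chained after the metric stage in the same way.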