  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
191

Σπειροειδής κίνηση και έλεγχος σε μικρο/νανο-ηλεκτρομηχανικά συστήματα αποθήκευσης πληροφορίας / Spiral motion and control in micro/nano-electromechanical information storage systems

Κωτσόπουλος, Ανδρέας 16 April 2013 (has links)
Οι τεχνικές Μικροσκοπίας Ατομικής Δύναμης που χρησιμοποιούν ακίδες σάρωσης έχουν την ικανότητα όχι μόνο να παρατηρούν επιφάνειες σε ατομικό επίπεδο αλλά και να τις τροποποιούν σε πολύ μικρή κλίμακα. Αυτό αποτελεί και το κίνητρο για τη χρησιμοποίηση των τεχνικών αυτών στη δημιουργία συσκευών αποθήκευσης με πολύ μεγαλύτερη πυκνότητα από τις συμβατικές συσκευές. Σε διάφορα ερευνητικά προγράμματα αποθήκευσης δεδομένων τεχνολογίας MEMS/NEMS με ακίδες, η σχετική τροχιά κίνησης της ακίδας ως προς το αποθηκευτικό μέσο ακολουθεί ένα μοτίβο raster. Παρά την απλή υλοποίησή της, η προαναφερθείσα κίνηση σάρωσης έχει σημαντικά μειονεκτήματα. Στο πλαίσιο της εργασίας αυτής προτείνεται μια εναλλακτική τοπολογία σπειροειδούς κίνησης. Η προτεινόμενη μέθοδος μπορεί να εφαρμοσθεί σε οποιοδήποτε σύστημα που βασίζεται σε διαδικασίες σάρωσης, όπως συστήματα αποθήκευσης και AFM συστήματα απεικόνισης. Στην εργασία αυτή μελετάται η περίπτωση των συσκευών αποθήκευσης με ακίδες, όπου η τροχιά που διαγράφει η ακίδα σε σχέση με το επίπεδο x/y που ορίζεται από το μέσο αποθήκευσης, είναι η σπειροειδής καμπύλη του Αρχιμήδη. Η χρήση μιας τέτοιας σπειροειδούς τροχιάς οδηγεί σε σήμα θέσης αναφοράς με εξαιρετικά στενό συχνοτικό περιεχόμενο, το οποίο ολισθαίνει πολύ αργά στον χρόνο. Για πειραματική επιβεβαίωση, ο προτεινόμενος τρόπος σπειροειδούς κίνησης εφαρμόστηκε σε σύστημα αποθήκευσης πληροφορίας με ακίδες με δυνατότητες θερμομηχανικής εγγραφής και ανάγνωσης δεδομένων σε φιλμ πολυμερούς. Επιπλέον, μελετήθηκε η αξιοποίηση των ιδιοτήτων του νέου τύπου κίνησης από αρχιτεκτονικές ελέγχου ειδικά σχεδιασμένες και βελτιστοποιημένες για τη συγκεκριμένη οικογένεια τροχιών αναφοράς, με στόχο την επίτευξη πολύ υψηλότερων συχνοτήτων σάρωσης για την ίδια ακρίβεια θέσης. Προς επιβεβαίωση των θεωρητικών αναλύσεων, παρουσιάζονται αποτελέσματα εξομοιώσεων καθώς και πειραματικά αποτελέσματα από πειραματική διάταξη. 
Στο πλαίσιο της διατριβής πραγματοποιήθηκε και η μοντελοποίηση του καναλιού θερμομηχανικής αποθήκευσης με ακίδες σε μεμβράνες πολυμερούς υλικού. Ενώ η θεωρητική μορφή των θερμομηχανικά εγγεγραμμένων κοιλωμάτων είναι κωνική, στην πράξη η μορφή του απέχει πολύ από το θεωρητικό μοντέλο. Για τον λόγο αυτό, αναπτύχθηκε μοντέλο του συμβόλου ως προς την ταχύτητα σάρωσης κατά τη διαδικασία εγγραφής, με βάση πειραματικά δεδομένα. Στο πλαίσιο της διατριβής μελετήθηκε επίσης η δυνατότητα ανάπτυξης συνδυασμένων αρχιτεκτονικών ελέγχου παρακολούθησης και ανάκτησης χρονισμού συμβόλου, όπου η πληροφορία για τη στιγμιαία ταχύτητα του σαρωτή παρέχεται από το μέσο αποθήκευσης μέσω των κυκλωμάτων συγχρονισμού. Τα αποτελέσματα των εξομοιώσεων επιβεβαιώνουν την δυνατότητα αυτή, και επιπλέον δείχνουν ότι υπό προϋποθέσεις η ακρίβεια παρακολούθησης του συστήματος βελτιώνεται. Τέλος, διερευνήθηκε η απόδοση των προτεινόμενων μεθόδων στην περίπτωση φορητών συσκευών, τα οποία υπόκεινται σε εξωτερικές διαταραχές. Στο πλαίσιο της διερεύνησης αυτής, συλλέχθηκαν πειραματικά αποτελέσματα και αναλύθηκαν μετρήσεις τυπικών εξωτερικών διαταραχών. / The AFM techniques using scanning probes have the capacity not only to observe surfaces in atomic level but also to modify them at a very small scale. This feature motivates the use of these techniques to create storage devices capable of storing data in a much higher density than conventional devices. In various MEMS/NEMS-based data storage technology research projects with probes, the relative trajectory follows a raster pattern or similar. Despite its simple implementation, the aforementioned scanning pattern has significant disadvantages. In this work, an alternative spiral motion topology is proposed. The proposed method can be applied to any system based on scanning probes, such as storage systems and AFM imaging systems. 
In this work, the case of probe-based storage devices is studied, in which the trajectory of the probe with respect to the x/y plane of the storage medium is the Archimedean spiral. The use of such a spiral trajectory leads to a reference position signal with extremely narrowband frequency content, which drifts very slowly in time. For experimental verification, the proposed spiral motion was applied on a single-probe experimental setup with thermomechanical read and write capabilities on thin polymer films. The aforementioned inherent properties of the proposed approach enable system designs with improved tracking performance and with non-intermittent, high-speed storage capabilities. Thus, the exploitation of these properties by control architectures specifically designed and optimized for this family of reference trajectories is studied, in order to achieve much higher scanning frequencies for the same positioning accuracy. To verify the theoretical analysis, simulation results are presented, as well as experimental results from the application of the proposed techniques and architectures in experimental AFM systems with a single probe. In this dissertation, the modeling of the thermomechanical probe-based storage channel in thin polymer films was also carried out. While the theoretical form of thermomechanically written indentations is conical, in practice their form is far from this theoretical model. Hence, a symbol model was developed with respect to the scanning speed during the write process, based on experimental data. This model can be used to properly design the equalization circuits depending on the scanning speed of operation. Moreover, the possibility of developing combined architectures for tracking control and symbol timing recovery was also investigated, where the information regarding the instantaneous scanner speed is provided by the storage medium via symbol timing synchronization circuits.
The simulation results confirm this approach and, furthermore, show that, under certain conditions, the system’s tracking accuracy is improved. Finally, the performance of the proposed methods in the case of portable storage devices was investigated, where the systems are subjected to external disturbances. As part of this investigation, experimental results were collected and measurements of external disturbances, typical for such devices, were analyzed.
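The Archimedean spiral reference described above can be sketched numerically. The following is a minimal illustration (the function name and parameter values are hypothetical, not taken from the thesis): tracing the spiral at a constant angular rate yields x/y position references that are sinusoids at that single rate with slowly growing amplitude — the narrowband, slowly drifting spectrum the abstract refers to. In practice such systems may instead hold the linear velocity constant, in which case the frequency itself drifts slowly.

```python
import numpy as np

def archimedean_spiral(pitch, omega, t):
    """Archimedean spiral r = (pitch / 2*pi) * theta, traced at a
    constant angular rate omega; returns x/y reference positions.

    The x and y references are sinusoids at the fixed frequency omega
    whose amplitude grows linearly -- an extremely narrowband signal.
    """
    theta = omega * t
    r = (pitch / (2 * np.pi)) * theta   # radius grows by `pitch` per turn
    return r * np.cos(theta), r * np.sin(theta)

# Illustrative values: 100 nm track pitch, 50 revolutions per second.
t = np.linspace(0.0, 1.0, 10_000)
x, y = archimedean_spiral(pitch=100e-9, omega=2 * np.pi * 50, t=t)
```

After exactly one revolution the radius has grown by one track pitch, which is what keeps adjacent spiral tracks uniformly spaced on the medium.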
192

Arquitetura com elevada taxa de processamento e reduzida largura de banda de memória para a estimação de movimento em vídeos digitais

Lopes, Alba Sandyra Bezerra 30 March 2011 (has links)
Nowadays many electronic devices support digital video: cellphones, digital cameras, camcorders and digital televisions are some examples. However, raw video as captured represents a huge amount of data, millions of bits. Storing it in this primary form would require an enormous amount of disk space, and transmitting it an enormous bandwidth. Video compression therefore becomes essential to make storage and transmission feasible. Motion estimation is a technique used in the video coder that exploits the temporal redundancy present in video sequences to reduce the amount of data necessary to represent the information. This work presents a hardware architecture of a motion estimation module for high-resolution videos according to the H.264/AVC standard. H.264/AVC is the most advanced video coding standard, with several new features that allow it to achieve high compression rates. The architecture presented in this work was developed to provide high data reuse; the data reuse scheme adopted reduces the bandwidth required to execute motion estimation. Motion estimation is the task responsible for the largest share of the gains obtained with the H.264/AVC standard, so this module is essential for the final video coder performance. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian Digital Television System. / Diversos aparelhos eletrônicos atuais dão suporte à utilização de vídeos digitais: celulares, câmeras fotográficas, filmadoras e TVs digitais são alguns exemplos. Entretanto, esses vídeos, tal como foram capturados, apresentam uma grande quantidade de informação, utilizando milhões de bits para sua representação.
Para realizar o armazenamento dos dados na sua forma primária, seria necessária uma quantidade enorme de espaço e uma grande largura de banda para realizar a transmissão. A compressão de vídeos torna-se, então, essencial para possibilitar o armazenamento e a transmissão destes dados. O estimador de movimento, um dos módulos do codificador, explora a redundância temporal existente nas sequências de vídeo para reduzir a quantidade de dados necessária à representação da informação. Este trabalho apresenta uma arquitetura em hardware para o estimador de movimento para vídeos de alta resolução, segundo o padrão H.264/AVC. O padrão H.264/AVC é o mais novo padrão de compressão de vídeos que possibilita, graças a uma série de inovações, alcançar elevadas taxas de compressão. A arquitetura apresentada neste trabalho foi projetada para permitir o máximo reuso de dados, visando a diminuição da largura de banda necessária para realizar o processo de estimação de movimento. É na estimação de movimento que residem os maiores ganhos do padrão e, por isso, este módulo é essencial para a eficiência do codificador como um todo. Este trabalho está inserido no projeto Rede H.264, que visa desenvolver tecnologia brasileira para o Sistema Brasileiro de Televisão Digital.
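The core operation this architecture accelerates can be illustrated in software. Below is a hedged sketch of exhaustive (full-search) block matching with the sum of absolute differences (SAD) criterion — the basic computation behind H.264/AVC motion estimation — not the thesis's hardware design or its data-reuse scheme; all names and parameters are illustrative.

```python
import numpy as np

def full_search_sad(cur_block, ref_frame, top, left, search_range):
    """Exhaustive block matching: find the motion vector (dy, dx)
    minimizing the sum of absolute differences (SAD) between the
    current block and candidate blocks within +/- search_range pixels
    of the block's position (top, left) in the reference frame."""
    n = cur_block.shape[0]
    h, w = ref_frame.shape
    best, best_sad = (0, 0), np.inf
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + n > h or x + n > w:
                continue  # candidate block falls outside the frame
            cand = ref_frame[y:y + n, x:x + n]
            sad = np.abs(cur_block.astype(int) - cand.astype(int)).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best, best_sad
```

Every candidate position re-reads an n×n window that overlaps heavily with its neighbours; it is exactly this overlap that hardware data-reuse schemes exploit to cut the memory bandwidth.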
193

ZABEZPEČENÍ A OCHRANA DAT PŘI ČINNOSTI HASIČSKÉHO ZÁCHRANNÉHO SBORU JIHOČESKÉHO KRAJE / Data security of the fire department of the South Bohemian region

REMIÁŠ, František January 2012 (has links)
The thesis focuses on how data and information are handled at the Fire and Rescue Service of the South Bohemian Region. Its goal is to determine whether the data and information that members and employees of the Fire and Rescue Service encounter in their work are treated in accordance with applicable legislation, and whether they are adequately secured and protected against misuse and loss. Based on the organizational structure, qualitative research maps in detail the security and protection of data at individual departments and workplaces. From this information, risk areas are compiled for the different types of data, and a risk map is prepared for these areas. Based on the findings, specific measures are designed to mitigate the risks in certain fields of work with data and information. The first part explains the basic concepts and definitions of the terms data, information and knowledge. The next section discusses the historical context of data storage and the handling of data in paper and electronic form, including the development of information systems and technologies. Possible threats to these data from various influences are then described. Another chapter describes the current state of work with data and the technologies used for it at the Fire and Rescue Service, after which the organizational structure is elaborated with a focus on work with data. In the following part, a risk map is prepared for each area, and several solutions and particular measures are proposed to reduce the greatest of the resulting risks. The benefit of this work is the implementation of several specific solutions to eliminate and reduce risks when working with data at the Fire and Rescue Service of the South Bohemian Region, together with several other proposed measures and steps for additional data security and protection.
194

Codes With Locality For Distributed Data Storage

Moorthy, Prakash Narayana 03 1900 (has links) (PDF)
This thesis deals with the problem of code design in the setting of distributed storage systems consisting of multiple storage nodes storing many different data files. A primary goal in such systems is the efficient repair of a failed node. Regenerating codes and codes with locality are two classes of coding schemes that have recently been proposed in the literature to address this goal. While regenerating codes aim to minimize the amount of data download needed to carry out node repair, codes with locality seek to minimize the number of nodes accessed during node repair. Our focus here is on linear codes with locality, a concept originally introduced by Gopalan et al. in the context of recovering from a single node failure. A code-symbol of a linear code C is said to have locality r if it can be recovered via a linear combination of r other code-symbols of C. The code C is said to have (i) information-symbol locality r, if all of its message symbols have locality r, and (ii) all-symbol locality r, if all the code-symbols have locality r. We make the following three contributions to the area of codes with locality. Firstly, we extend the notion of locality in two directions, so as to permit local recovery even in the presence of multiple node failures. In the first direction, we consider codes with "local error correction", in which a code-symbol is protected by a local error-correcting code having local minimum distance 3, thus allowing local recovery of the code-symbol even in the presence of 2 other code-symbol erasures. In the second direction, we study codes with all-symbol locality that can recover from two erasures via a sequence of two local parity-check computations. When restricted to the case of all-symbol locality and two erasures, the second approach allows, in general, for the design of codes having larger minimum distance than what is possible via the first approach.
Under both approaches, by studying the generalized Hamming weights of the dual codes, we derive tight upper bounds on their respective minimum distances. Optimal code constructions are identified under both approaches, for a class of code parameters. A few interesting corollaries result from this part of our work. Firstly, we obtain a new upper bound on the minimum distance of concatenated codes and secondly, we show how it is always possible to construct the best-possible code (having largest minimum distance) of a given dimension when the code's parity check matrix is partially specified. In a third corollary, we obtain a new upper bound for the minimum distance of codes with all-symbol locality in the single erasure case. Secondly, we introduce the notion of codes with local regeneration that seek to combine the advantages of both codes with locality as well as regenerating codes. These are vector-alphabet analogues of codes with local error correction in which the local codes themselves are regenerating codes. An upper bound on the minimum distance is derived when the constituent local codes have a certain uniform rank accumulation (URA) property. This property is possessed by both the minimum storage regenerating (MSR) and the minimum bandwidth regenerating (MBR) codes. We provide several optimal constructions of codes with local regeneration, where the local codes are either the MSR or the MBR codes. The discussion here is also extended to the case of general vector-linear codes with locality, in which the local codes do not necessarily have the URA property. Finally, we evaluate the efficacy of two specific coding solutions, both possessing an inherent double replication of data, in a practical distributed storage setting known as Hadoop. Hadoop is an open-source platform dealing with distributed storage of data in which the primary aim is to perform distributed computation on the stored data via a paradigm known as Map Reduce. 
Our evaluation shows that while these codes have efficient repair properties, their vector-alphabet nature can negatively affect Map Reduce performance if they are implemented under the current Hadoop architecture. Specifically, we see that under the current architecture, the choice of the number of processor cores per node and the Map-task scheduling algorithm plays a major role in determining their performance. The performance evaluation is carried out via a combination of simulations and actual experiments in Hadoop clusters. As a remedy to the problem, we also propose a modified architecture in which one allows erasure coding across blocks belonging to different files. Under the modified architecture, the new coding solutions will not suffer from any Map Reduce performance loss as seen in the original architecture, while retaining all of their desired repair properties.
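The notion of locality can be made concrete with a toy binary construction (hypothetical, and far simpler than the optimal codes studied in the thesis): split the symbols into groups and add one XOR parity per group, so that any single erased symbol is repaired by reading only its own group — all-symbol locality r equal to the group size.

```python
def encode_with_locality(data_bits, group_size):
    """Toy binary code with all-symbol locality r = group_size:
    data symbols are split into groups and each group gets an XOR
    parity, so any single erased symbol (data or parity) can be
    recovered from the group_size other symbols of its group."""
    assert len(data_bits) % group_size == 0
    codeword = []
    for i in range(0, len(data_bits), group_size):
        group = data_bits[i:i + group_size]
        codeword.extend(group)
        parity = 0
        for b in group:
            parity ^= b
        codeword.append(parity)  # local parity closes the group
    return codeword

def repair(codeword, erased_idx, group_size):
    """Recover one erased symbol by XOR-ing the rest of its group,
    touching only group_size symbols instead of the whole codeword."""
    g = group_size + 1                      # group length incl. parity
    start = (erased_idx // g) * g
    val = 0
    for j in range(start, start + g):
        if j != erased_idx:
            val ^= codeword[j]
    return val
```

A plain single-parity-check code over all symbols would force a repair to read every other node; the grouping trades a little extra storage for a much smaller repair set, which is exactly the trade-off codes with locality formalize.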
195

Active Analytics: Adapting Web Pages Automatically Based on Analytics Data

Carle, William R., II 01 January 2016 (has links)
Web designers are expected to perform the difficult task of adapting a site's design to fit changing usage trends. Web analytics tools give designers a window into website usage patterns, but the data must be analyzed and applied to a website's user interface design manually. A framework for marrying live analytics data with user interface design could allow for interfaces that adapt dynamically to usage patterns, with little or no action from the designers. The goal of this research is to create a framework that utilizes web analytics data to automatically update and enhance web user interfaces. In this research, we present a solution for extracting analytics data via web services from Google Analytics and transforming them into reporting data that informs user interface improvements. Once the data are extracted and summarized, we expose the summarized reports via our own web services in a form that can be used by our client-side user interface (UI) framework. This client-side framework dynamically updates the content and navigation on the page to reflect the data mined from the web usage reports. The resulting system reacts to changing usage patterns of a website and updates the user interface accordingly. We evaluated our framework by assigning navigation tasks to users on the UNF website and measuring the time it took them to complete those tasks, one group with our framework enabled and one group using the original website. We found that the group that used the modified version of the site with our framework enabled was able to navigate the site more quickly and effectively.
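The adaptation step on the client side can be sketched abstractly. The snippet below (function and field names are hypothetical, not the thesis's actual framework API) reorders navigation items by observed page views — the kind of usage-driven rule such a framework might apply once the summarized analytics reports are available:

```python
def rank_navigation(page_views, nav_items, top_n=5):
    """Reorder navigation links by observed page views, a minimal
    stand-in for a usage-driven UI adaptation rule.

    page_views -- dict mapping item name to its view count
    nav_items  -- the site's navigation items, in designer order
    top_n      -- how many items the adapted navigation shows
    """
    return sorted(nav_items,
                  key=lambda item: page_views.get(item, 0),
                  reverse=True)[:top_n]
```

Items absent from the report default to zero views, so newly added pages simply start at the bottom until analytics data accumulate for them.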
196

Performance Evaluation of LINQ to HPC and Hadoop for Big Data

Sivasubramaniam, Ravishankar 01 January 2013 (has links)
There is currently considerable enthusiasm around MapReduce, a distributed computing paradigm for the analysis of large volumes of data. Apache Hadoop is the most popular open-source implementation of the MapReduce model, and LINQ to HPC is Microsoft's alternative to open-source Hadoop. In this thesis, the performance of LINQ to HPC and Hadoop is compared using different benchmarks. To this end, we identified four benchmarks (Grep, Word Count, Read and Write) that we ran on LINQ to HPC as well as on Hadoop. For each benchmark, we measured each system's performance metrics (execution time, average CPU utilization and average memory utilization) for various degrees of parallelism on clusters of different sizes. Results revealed some interesting trade-offs. For example, LINQ to HPC performed better on three out of the four benchmarks (Grep, Read and Write), whereas Hadoop performed better on the Word Count benchmark. While extensive research has focused on Hadoop, there are not many references to similar research on the LINQ to HPC platform, which was still evolving during the writing of this thesis.
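The Word Count benchmark that separated the two platforms follows the classic MapReduce pattern, which can be sketched in a few lines of plain Python (a single-process illustration of the programming model, not how Hadoop or LINQ to HPC actually execute it):

```python
from collections import defaultdict
from itertools import chain

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word, as in Word Count.
    return chain.from_iterable(
        ((w.lower(), 1) for w in doc.split()) for doc in documents)

def reduce_phase(pairs):
    # Reduce: sum the counts per word; grouping by key (the shuffle
    # step of a real MapReduce run) is implicit in the dictionary.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)
```

In a real deployment the map tasks run in parallel over input splits and the shuffle moves each key to one reducer; it is that data movement, not the arithmetic, that dominates the cost being benchmarked.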
197

• RFID in Virtuellen Unternehmen: Potenziale von Data-on-Tag / RFID in virtual enterprises: the potential of data-on-tag

Werner, Kerstin, Grummt, Eberhard January 2007 (has links)
No description available.
198

Marknadsföring via sökmotorer : Planering av sökord

Magnusson, Wilhelm, Schreil, Christian January 2015 (has links)
Närvaron på Internet har aldrig varit större och användningen av sökmotorer motsvarar en stor del av hur människor upptäcker tillgängligt innehåll på Internet. Innehållet som utgörs av hemsidor kategoriseras av sökmotorers automatiserade tjänster. Ett arbete för att förbättra hemsidors möjlighet att kategoriseras görs för att få ökad synlighet genom sökmotorer. Förbättrad synlighet kan också erhållas genom sökmotormarknadsföring, vilket är en strategi för att annonsera via sökmotorer. Genom sökmotorer finns verktyg som assisterar detta ändamål. Företaget som rapporten berör erbjuder tjänster både för sökmotoroptimering och marknadsföring. Problemet med företagets befintliga hantering är att det inte erbjuder tillräckliga möjligheter till analys av statistik för annonseringar på sökmotorerna. Begränsningarna kommer av att statistiken för företagets kunder är separerade på flera olika platser. Konsulter på företaget som ansvarar för vissa kunder får inte dela med sig av statistik för administrerade annonseringar till andra konsulter på företaget. Projektet syftar finna en metod för företagets anställda att göra bättre analyser av sökord och annonseringar. Metoden realiseras i en applikationsplattform, där de anställda erhåller en webbapplikation som kan användas för att finna relevant statistik. Med en kvalitativ metod samlas underlag om företaget och deras förväntningar på systemet in, den insamlade informationen används i den fas som mynnar ut i systemets framställning. Systemet utvecklas iterativt och implementeras sedan i företagets befintliga driftmiljö. Genom arbetet som gjorts i projektet har ett användbart system kunnat realiseras för relevanta användare på företaget. Applikationen erbjuder djupare insikt i vilka faktorer som bidrar till annonseringars framgång. Dock har framgången kring systemets införande inte kunnat mätas i tillräckligt stor utsträckning. En sammanfattning av studien bidrar till rekommendationer till framtida arbete. 
/ Internet usage has never been greater. Search engines provide a gateway to discovering content online. The content comes in the form of web pages, which are categorized by the automated services of a search engine. Certain steps can be taken to optimize the way web sites are categorized, all of which are done to improve visibility on search engines. The visibility of a web site can also be increased with search engine marketing, the process of advertising content on search engines. Search engine providers offer different tools that may assist the process of advertising. The company that this report concerns provides services in the areas of optimizing and advertising content on search engines. The problem manifests in the company's current system, which does not provide an acceptable way of analyzing statistics of their search engine marketing efforts. Employees involved with their respective customers may not share information about their strategies and statistics with co-workers. The purpose of the project is to define a method for the company and its employees to achieve a higher level of analysis of keyword and advertisement data. The work is realized in a platform that provides employees a way to extract relevant statistics from a web application. A qualitative methodology is defined to collect descriptive information about the company's processes and their expectations on the system to be developed. The system is developed in a series of iterations and is deployed on a server provided by the company. The efforts have resulted in a useful system that may provide employees with deeper insights as to which factors might be key for the success of certain advertisement strategies. However, the effects of the system have not been measured to confirm whether the method actually improved the company's search engine marketing efforts. To conclude the study, a set of recommendations is given for future work.
199

Den digitalt suveräna staten : En undersökning av inställningen till nationell datalagring av personuppgifter hos statliga myndigheter / The digitally sovereign state : An investigation into the attitude towards national data storage of personal data within Swedish public authorities

Gordon Hultsjö, Joel January 2021 (has links)
The number of scandals during the past years regarding the use and misuse of digital storage of personal information, in combination with the implementation of the General Data Protection Regulation (GDPR) within the EU member states, has resulted in a resurfaced discussion of sovereignty within the public sphere in relation to the storage of digital information. This master's thesis examines the attitudes towards national data storage of personal data within twenty Swedish public agencies, in the context of the analytical term Digital sovereignty. The thesis uses semi-structured interviews with employees working with data protection, and qualitative content analysis of internal documents connected to personal data management, in order to examine Swedish government agencies' attitudes towards national data storage of personal information. The responses from the interviews and the internal policy documents in the area of personal data protection are viewed through the analytical term Digital sovereignty. The thesis uses the definition of Digital sovereignty given by the Swedish social security agency, which focuses on national governments' ability to have control over both the technical and geographical processing and storage of their citizens' personal data. The thesis concludes that Swedish authorities take the risk of transfer of personal data to third countries outside of the EU very seriously, while they also see the need to find legal ways to transfer personal data to these same countries. The thesis also concludes that Swedish government agencies try to avoid cloud services and are cautious in their use due to the implications they have for information and data security, while other research has shown that cloud services are used extensively within Swedish government agencies.
The thesis also concludes that there is a lack of interest in national data storage of personal information within the Swedish government, which can partially be attributed to the relationship between the General Data Protection Regulation and data storage regulation on a national level in Sweden. This leads to the final conclusion of the thesis: there is some indication that the future of storage of personal data within the EU member states lies not in nationally managed cloud services, but rather in a federated cloud service on the EU level, such as the currently ongoing project Gaia-X. This is a two-year master's thesis in archival science.
200

Integrované řešení diagnostiky výrobního zařízení v energetice České republiky / INTEGRATED SOLUTION OF DIAGNOSTICS OF PRODUCTION EQUIPMENT IN THE CZECH ENERGY INDUSTRY

Cvešpr, Pavel January 2013 (has links)
This dissertation thesis is concerned with the diagnostics of the main production facilities in Czech power engineering, with a focus on its integrating role in the process of gaining, saving and processing information for the purpose of evaluating the technical state of the operated facilities and planning the management of their lifetime. It is divided into two parts, theoretical and practical. The theoretical part presents the conclusions of an examination of the needs of the workers involved in the areas of diagnostics, maintenance and expert assessment of the technical state of equipment. These conclusions were formulated based on a completed analysis of the current status ("as-is" analysis) of performing diagnostics of systems operated in the technological units of both nuclear and conventional power engineering. The monitored equipment involves electrical installations and machinery, steel and building constructions, measuring instruments and vibrodiagnostics. Based on the analysis results, process diagrams are created for the solution of partial tasks. The analysis of the proposed solution for the problems in question ("should-be" analysis) includes a design of the fundamental scheme of the data model for a software solution and a design of the data flows from the individual data sources. The following part presents an application layer, including a detailed description of its major functionalities. Further, important activities and procedures are described that are necessary to evaluate the technical state of equipment. The practical part deals with the implementation of the LTO suite software product in the environment of power engineering in the Czech Republic, specifically within ČEZ, a.s. The LTO suite functionality is demonstrated in this part of the thesis by screens recorded within the LTO suite's individual modules, which operate on actual data.
It starts with the initial screen for configuration of the displayed data, continues with examples of the equipment register and of the planning, processing and saving of information collected through diagnostic activities, and proceeds to the integration module – the analytical layer, which is designed for evaluation of the technical state of equipment as of the entered date, with a reporting output. The thesis also includes the chapters "Aims of the Study" and "Conclusion". The key chapter presents the "Benefits of the Study", whose overview describes the original results of the research as well as those applicable outside the power engineering area.
