41

Evaluation of Load Scheduling Strategies for Real-Time Data Warehouse Environments

Thiele, Maik, Lehner, Wolfgang 13 January 2023 (has links)
The demand for so-called living or real-time data warehouses is increasing in many application areas, including manufacturing, event monitoring and telecommunications. In fields like these, users normally expect short response times for their queries and high freshness for the requested data. However, it is truly challenging to meet both requirements at the same time because of the continuous flow of write-only updates and read-only queries as well as the latency caused by arbitrarily complex ETL processes. To optimize the update flow in terms of data freshness maximization and load minimization, we propose two algorithms - local and global scheduling - that operate on the basis of different system information. We discuss the benefits and drawbacks of both approaches in detail and derive recommendations regarding the optimal scheduling strategy for any given system setup and workload.
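The trade-off described above - gaining freshness while keeping load low - can be illustrated with a small greedy sketch. This is a hypothetical illustration of freshness-driven update scheduling, not the thesis's actual algorithms; the table names, staleness values and cost weights are invented for the example.

```python
import heapq

def schedule_updates(pending_updates):
    """Order pending table updates so that stale-but-cheap updates run first.

    Each update is a dict with 'table', 'staleness' (seconds since the last
    refresh) and 'cost' (estimated load time). We greedily maximize the
    freshness gained per unit of load by popping the highest ratio first.
    """
    # heapq is a min-heap, so negate the ratio to pop the best update first.
    heap = [(-u["staleness"] / u["cost"], u["table"]) for u in pending_updates]
    heapq.heapify(heap)
    order = []
    while heap:
        _, table = heapq.heappop(heap)
        order.append(table)
    return order

updates = [
    {"table": "sales",  "staleness": 120, "cost": 4},   # ratio 30
    {"table": "clicks", "staleness": 300, "cost": 60},  # ratio 5
    {"table": "stock",  "staleness": 90,  "cost": 2},   # ratio 45
]
print(schedule_updates(updates))  # ['stock', 'sales', 'clicks']
```

A global scheduler in the paper's sense would additionally weigh the query workload against this ratio; the sketch only captures the local, per-update view.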
42

Using the SAS System for Building Data Warehouses and Optimization of ETL Processes

Pešička, Michal January 2008 (has links)
This diploma thesis examines the usability of the SAS system and its components for building and running a data warehouse and a complete Business Intelligence solution. It begins by introducing the meaning and benefits of adopting Business Intelligence and its place in an organization, focusing in particular on the ongoing BI project at the insurance company Kooperativa, a.s. The main goal of the thesis is to examine the ETL processes of a data warehouse: their specifics, their characteristics, the regular tasks solved across data layers, the measurement of their performance, and the feasibility of ETL optimization. This optimization can be approached from two points of view: the creation and maintenance of the ETL source code, and tuning for faster data processing. Log files, which are the main source for performance monitoring, are processed by a macro program tailored specifically to this purpose. The results are analyzed, and on that basis I outline the spots that need attention. The last part compares some alternatives to the data transformation process typically handled by ETL tasks. The results can serve as hints for designing and tuning other, similar ETL processes.
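The log-based performance monitoring described above can be sketched as a small aggregation over timing lines. This is a hypothetical Python illustration, not the SAS macro program from the thesis; the log format and step names are invented for the example.

```python
import re
from collections import defaultdict

# Assumed log line format (illustrative, not real SAS log output):
#   "STEP <name> finished in <seconds> seconds"
STEP_RE = re.compile(r"STEP (\w+) finished in ([\d.]+) seconds")

def summarize(log_lines):
    """Aggregate total runtime per ETL step from timing lines in a log."""
    totals = defaultdict(float)
    for line in log_lines:
        m = STEP_RE.search(line)
        if m:
            totals[m.group(1)] += float(m.group(2))
    # Slowest steps first: these are the candidates for tuning.
    return sorted(totals.items(), key=lambda kv: -kv[1])

log = [
    "STEP extract finished in 12.5 seconds",
    "STEP transform finished in 48.0 seconds",
    "STEP load finished in 9.1 seconds",
    "STEP transform finished in 51.0 seconds",
]
print(summarize(log))  # transform dominates: the place to optimize first
```

Ranking steps by accumulated runtime is exactly the kind of signal the thesis derives from log files before deciding where tuning pays off.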
43

Efficient Schema Migration in Model-Driven Database Application Development

Claußnitzer, Ralf 25 March 2008 (has links)
MDA (Model Driven Architecture) denotes a method of specifying applications in UML and producing executable program code through automatic generation. In this context, the database chair runs the GignoMDA project, which is concerned with the model-driven development of database applications. As an essential part of an application, however, data models - just like the application architecture itself - are subject to adaptation to changing goals and environmental conditions. This creates the need to transfer existing data into newly generated target systems as part of a fully model-driven approach. This thesis presents a concept for schema and data migration during the evolution of application database models. Following the MDA approach, data migrations are expressed as models in UML and then used to automatically generate platform-specific migration models. From these migration models, database-level programs (ETL, stored procedures) can be generated for the efficient execution of migrations.
44

Data Quality: A Necessity in Business Intelligence

Halvarsson Eklund, Tom, Sjövall, Julia January 2019 (has links)
Business Intelligence (BI) aims to refine data in order to improve organizational decision-making. Poor data quality causes many failed BI projects, and studies emphasize that ensuring high-quality data is fundamental to success in BI. The purpose of this thesis was therefore to build an understanding of practical approaches to such assurance. The concept of data quality can be broken down into dimensions to make it more manageable, and this study builds on previously identified key dimensions. Interviews were conducted with four consultants with expertise and long experience in BI. Beyond an effective ETL process and controls in the BI system, the results are permeated by a need for preparatory work before storage in the source systems, as well as continuous reconciliation routines, in order to ensure high-quality data to a greater extent. There are also indications that the approaches used to ensure high-quality data are governed by the requirements the data must meet in order to be useful for its intended purpose.
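The dimension-based view of data quality mentioned above lends itself to simple, automatable checks. The following is a minimal sketch of two common dimensions (completeness and validity); the field names, sample records and business rule are illustrative assumptions, not taken from the thesis.

```python
def completeness(rows, field):
    """Share of rows where the field is present and non-empty."""
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def validity(rows, field, predicate):
    """Share of rows whose field value satisfies a business rule."""
    valid = sum(1 for r in rows if predicate(r.get(field)))
    return valid / len(rows)

customers = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "",              "age": 29},   # missing email
    {"id": 3, "email": "c@example.com", "age": -5},   # invalid age
]
print(completeness(customers, "email"))  # 2 of 3 rows have an email
print(validity(customers, "age", lambda a: isinstance(a, int) and 0 <= a < 130))
```

Checks of this kind are typically run both inside the ETL process and as the continuous reconciliation routines the interviewees describe.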
45

A Study of Electron Transport Layers in Polymer Light-Emitting Devices

Gimaiel, Helena Liberatori 15 October 2008 (has links)
This work studies charge transport layers, especially electron transport layers, in polymer light-emitting devices (PLEDs). These devices are obtained essentially by depositing a polymeric emissive layer between two electrodes. Polymer luminescent devices offer advantages over conventional LEDs, such as the possibility of manufacturing ultra-thin, flexible displays at low cost. The efficiency of PLEDs is closely linked to the commercial feasibility of these displays. In the first structure fabricated in this work, a film of the emissive polymer OC1C10-PPV [poly(2-methoxy-5-(3,7-dimethyloctyloxy)-1,4-phenylene vinylene)] is deposited between an ITO (indium tin oxide) anode and an aluminum cathode. On the transparent ITO electrode, the OC1C10-PPV polymer and the Al are deposited as films by spin coating and thermal evaporation, respectively. In a second stage, samples were fabricated with a modified structure in which additional layers related to charge-transport phenomena were introduced to increase device efficiency. These layers reduce the potential barrier between the emissive polymer and the electrodes and can also confine a larger number of holes inside the emissive layer. This increases the electron and hole populations in the active layer, producing greater charge recombination. Accordingly, Alq3 [tris-(8-hydroxyquinolate)-aluminum] films of different thicknesses were introduced between the active layer and the cathode, yielding higher luminous emission.
Finally, different concentrations of Alq3 were added to the OC1C10-PPV solution, and the resulting OC1C10-PPV:Alq3 blends were used as the active layer, improving the luminous efficiency by more than a factor of ten compared with the devices fabricated in the first stage of the work. The use of the polymer PEDOT:PSS [poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate)] as a hole transport layer (HTL) deposited between the aluminum and the electroluminescent polymer proved indispensable for the operation of the fabricated devices, preventing short circuits between the electrodes.
46

Evangelist Marketing of the CloverETL Software

Štýs, Miroslav January 2011 (has links)
The diploma thesis Evangelist Marketing of the CloverETL Software proposes a new marketing strategy for an ETL tool, CloverETL. The theoretical part comprises chapters two and three. Chapter two covers the term ETL, which, as a separate component of the Business Intelligence architecture, is given little space in the literature. Chapter three introduces evangelist marketing and explains its origins and best practices. The practical part introduces the company Javlin, a.s. and its CloverETL software product. After an assessment of the current marketing strategy, a new strategy is proposed, built on the pillars of evangelist marketing. Finally, the benefits of the new approach are discussed using statistics and data, mostly Google Analytics outputs.
47

Modeling and Manipulation of Complex, Historized Data Warehouses

Teste, Olivier 18 December 2000 (has links) (PDF)
This thesis addresses conceptual modeling and data manipulation (through algebras) in decision-support systems. Our approach rests on a dichotomy between two storage spaces: the data warehouse gathers the extracts of source databases that are useful to decision-makers, while data marts are derived from the warehouse and dedicated to a particular analysis need.
At the warehouse level, we define a data model for describing the temporal evolution of complex objects. In our proposal, a warehouse object integrates current, past and archived states, modeling decision data and their evolution. Extending the object concept entails an extension of the class concept. This extension consists of filters (temporal and archival) for building past and archived states, as well as a construction function modeling the extraction process (source origin). We also introduce the concept of environment, which defines coherent temporal partitions sized to decision-makers' requirements. Data manipulation is an extension of object algebras that accounts for the characteristics of the warehouse representation model; the extension concerns temporal operators and operators for manipulating sets of states.
At the mart level, we define a multidimensional data model that represents information as a constellation of facts and of dimensions with multiple hierarchies. Data manipulation relies on an algebra covering the usual multidimensional operations and offering operations specific to our model.
We propose a method for deriving the marts from the warehouse. To validate our proposals, we present the GEDOOH software (Générateur d'Entrepôts de Données Orientées Objet et Historisées, a generator of object-oriented, historized data warehouses), which supports the design and creation of warehouses, in the context of the medical application REANIMATIC.
48

Calibration-Free LIBS Spectroscopy: Critical Evaluation and Application to the Analysis of Polluted Soils

Travaillé, Grégoire 02 December 2010 (has links) (PDF)
Even today, despite the use of reference spectrochemical techniques, the quantitative side of laser-induced breakdown spectroscopy (LIBS) remains largely unexplored because of crippling matrix effects. Calibration-free LIBS (CF-LIBS) is one of the routes the scientific community is considering to meet this challenge. Despite its idealized assumptions about the plasma (local thermodynamic equilibrium, homogeneity and stationarity), few extensive studies have so far carried out a critical evaluation of the method. Using theoretical tools (a collisional-radiative model developed for aluminum, argon and cadmium) and experimental tools (a Thomson scattering experiment) drawn from plasma physics, together with practical analysis cases, we attempt to establish the conditions most favorable to quantitative analysis by CF-LIBS. The application of this method to the analysis of complex samples (minerals and polluted soils) is discussed in the light of our results and of the knowledge currently available on the subject.
49

NIG distribution in modelling stock returns with assumption about stochastic volatility: Estimation of parameters and application to VaR and ETL

Kucharska, Magdalena, Pielaszkiewicz, Jolanta January 2009 (has links)
We model Normal Inverse Gaussian distributed log-returns under the assumption of stochastic volatility. We consider different methods of parametrizing returns and, following the paper of Lindberg [21], we assume that the volatility is a linear function of the number of trades. In addition to Lindberg's paper, we suggest daily stock volumes and amounts as alternative measures of the volatility. As an application of the models, we perform Value-at-Risk and Expected Tail Loss predictions with Lindberg's volatility model and with our own suggested model; these applications are new and not described in the literature. For a better understanding of our calculations, programs and simulations, basic information about and properties of the Normal Inverse Gaussian and Inverse Gaussian distributions are provided. Practical applications of the models are implemented on Nasdaq-OMX data, where we calculate Value-at-Risk and Expected Tail Loss for the Ericsson B stock over the period 1999 to 2004.
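The two risk measures named above can be computed empirically from a return sample. The following is a minimal sketch of sample-based Value-at-Risk and Expected Tail Loss at tail probability alpha; it illustrates the measures themselves, not the NIG-based estimators of the thesis, and the normal sample stands in for NIG log-returns purely for demonstration.

```python
import numpy as np

def var_etl(returns, alpha=0.05):
    """Empirical VaR and ETL as positive loss numbers at tail level alpha.

    VaR is the (1 - alpha) quantile of the loss distribution; ETL (also
    called Expected Shortfall) is the mean loss beyond that quantile.
    """
    losses = -np.asarray(returns, dtype=float)  # losses are negated returns
    var = np.quantile(losses, 1 - alpha)
    etl = losses[losses >= var].mean()
    return var, etl

rng = np.random.default_rng(0)
sample = rng.normal(0.0, 0.02, size=100_000)  # stand-in for NIG log-returns
var, etl = var_etl(sample, alpha=0.05)
print(var, etl)  # ETL is never smaller than VaR at the same level
```

By construction ETL averages the worst alpha-fraction of outcomes, so it is always at least as large as VaR; this coherence property is one reason ETL is preferred alongside VaR in the thesis's applications.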
