121

Measuring bank market risk: an application of VaR and ETL models

陳嘉敏, Chen, Jia Min Unknown Date (has links)
This thesis introduces an emerging risk measure, Expected Tail Loss (ETL). Unlike Value at Risk (VaR), which is a percentile measure that ignores the tail risk of the return distribution, ETL is intended to capture more completely all the risks a portfolio may face and thus allow more effective control of market risk. The empirical part first examines the stability of VaR and ETL. Although VaR has been shown theoretically not to satisfy subadditivity, in our empirical results VaR remains subadditive even when the return distributions are fat-tailed. This suggests that, in practice, it is hard to abandon VaR merely because it lacks subadditivity in theory; ETL nevertheless adds value, since it incorporates tail information that VaR ignores and can serve as a complementary reference measure. This is the first contribution of the thesis. The empirical work also investigates whether the length of historical data in the rolling window affects the accuracy of VaR and ETL estimates. The results show that a longer window (1,000 days) does not forecast VaR and ETL correctly, whereas a 500-day rolling window makes the internal model more accurate; the window length should therefore be chosen carefully when applying VaR models. This is the second contribution of the thesis.
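To make the relationship between the two measures concrete, the sketch below computes historical-simulation VaR and ETL over a rolling window. It is a minimal illustration assuming plain historical simulation; it is not the internal model or the specific back-testing procedure used in the thesis.

```python
import numpy as np

def var_etl(returns, alpha=0.99):
    """Historical-simulation VaR and ETL at confidence level alpha."""
    losses = -np.asarray(returns)          # work with losses (negated returns)
    var = np.quantile(losses, alpha)       # VaR: the alpha-quantile of the loss distribution
    etl = losses[losses >= var].mean()     # ETL: average loss in the tail beyond the VaR threshold
    return var, etl

def rolling_estimates(returns, window=500, alpha=0.99):
    """Re-estimate VaR/ETL each day from the preceding `window` observations."""
    returns = np.asarray(returns)
    return np.array([var_etl(returns[t - window:t], alpha)
                     for t in range(window, len(returns))])
```

By construction ETL is at least as large as VaR at the same confidence level, which is why it is the more conservative of the two measures.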
122

Non-LTE radiative transfer for the study of stellar photospheres and chromospheres: applications to magnesium, calcium and iron atoms in late-type stars

Merle, Thibault 21 March 2012 (has links) (PDF)
Stellar abundance analyses often assume that spectral lines form in local thermodynamic equilibrium (LTE). This assumption is not always appropriate, in particular for metal-poor and/or evolved stars. To understand these stars better and to grasp their role in the chemical enrichment of the Galaxy, it has become necessary to adopt a non-LTE (NLTE) description, which is more realistic but also more complex to implement. My thesis work consisted in building model atoms from the most recent atomic-physics databases for two alpha elements, magnesium and calcium. These elements are of great astrophysical interest because they trace the chemical enrichment of stellar populations. I therefore developed a code for constructing model atoms, FORMATO, for the study of NLTE spectral lines. I used these models to compute a grid of NLTE corrections to be applied to the equivalent widths of the main lines of these elements, some of which will be observed by the Gaia mission, for giants and supergiants. I also applied these results to compute NLTE centre-to-limb darkening laws for the CaII IR triplet, which made it possible to determine, for the first time, the chromospheric extension of the giant β Cet from interferometric measurements (VEGA@CHARA). Finally, within the Carina Project, I demonstrated NLTE effects on the iron ionization equilibrium (~0.1 dex) in a sample of 44 red giants of the Carina dSph galaxy, based on a comparative LTE and NLTE study of FeI and FeII lines.
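For readers unfamiliar with the NLTE formalism: model atoms such as those described above feed into the standard statistical-equilibrium equations, sketched here in generic form (the thesis's specific rate data and numerical scheme are not reproduced).

```latex
% Statistical equilibrium for level i of a model atom:
% radiative (R) and collisional (C) rates out of level i balance those into it,
% subject to particle conservation.
\sum_{j \ne i} n_i \left( R_{ij} + C_{ij} \right)
  = \sum_{j \ne i} n_j \left( R_{ji} + C_{ji} \right),
\qquad \sum_i n_i = n_{\mathrm{tot}}.
```

Departures from LTE are then usually quantified through the coefficients $b_i = n_i / n_i^{\mathrm{LTE}}$, from which corrections to equivalent widths and abundances follow.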
123

Detection of heavy metals in soils by laser-induced plasma emission spectroscopy (LIBS)

Sirven, Jean-Baptiste 18 September 2006 (has links) (PDF)
In the fields of analysis, monitoring and physical measurement, the laser is a particularly powerful and versatile metrological tool, capable of providing concrete answers to a wide range of problems, including societal ones. Among the latter, the contamination of sites and soils by heavy metals is a major public-health issue that requires measurement methods suited to existing regulations and flexible enough to use in the field. As a fast technique requiring no sample preparation, laser-induced breakdown spectroscopy (LIBS) offers very attractive advantages for on-site measurement of heavy-metal content at the level of tens of ppm, and the design of a portable instrument is conceivable in the medium term. In this thesis we first show that the femtosecond regime offers no advantage over the standard nanosecond regime for our problem. We then apply an advanced treatment of LIBS spectra based on chemometric methods, whose performance appreciably improves the results of qualitative and quantitative analyses of soil samples.
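As an illustration of the kind of chemometric treatment mentioned above, the sketch below fits a partial least squares (PLS) regression to placeholder spectra. PLS is one common choice for calibrating LIBS spectra against reference concentrations; the specific methods and data used in the thesis are not reproduced here, and all names and values below are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2000))        # placeholder spectra: 60 samples x 2000 wavelength channels
y = rng.uniform(0, 100, size=60)       # placeholder reference concentrations (ppm)

pls = PLSRegression(n_components=8)    # latent components capture covarying spectral features
r2 = cross_val_score(pls, X, y, cv=5, scoring="r2")  # cross-validated calibration quality
print(r2.mean())
```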
124

Using ODS to integrate enterprise systems: A case study

黃琬婷, Huang, Wan Ting Unknown Date (has links)
With rapid advances in technology, the way enterprises operate has changed significantly: demand shifts quickly, and firms must respond to the external environment in real time. Enterprises therefore pay growing attention to information system integration, hoping to move from function-oriented to process-oriented systems and to integrate and standardize information effectively, so that the business can connect quickly with its environment and improve overall operating performance. There are many integration approaches, which fall broadly into four categories, and there is as yet no consensus on which is the most efficient and effective. The main reason is that different integration cases have different requirements, so discussing the benefits of systems integration only at the theoretical level cannot show its value concretely. In view of this, the purpose of this study is to analyze the case company's operating model and construct an integrated data model, using a staged approach, as sketched below. In phases one and two, the study models the case company's system data flows and business processes, identifies where and why information flow breaks down between the business and system views, and states the problems concretely. In phase three, the study selects the integration approach best suited to this case, namely data-level integration, and designs an integrated data model that links the information flow end to end to support the enterprise's decision needs. In the final phase, ETL is used to illustrate how the integrated system operates, and the problems the case company may encounter when using ETL, together with preliminary solutions, are described.
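A minimal sketch of what data-level integration into an ODS can look like in practice is given below; the table and column names are hypothetical and are not taken from the case company.

```python
import pandas as pd

# Hypothetical operational extracts; column names are illustrative only.
orders = pd.DataFrame({"order_id": [1, 2], "cust_no": ["C01", "C02"], "amount": [100, 250]})
crm = pd.DataFrame({"customer_id": ["C01", "C02"], "customer_name": ["Alpha Co.", "Beta Ltd."]})

# Data-level integration: conform the customer key across sources,
# then join the records into a single ODS-style table for downstream ETL.
ods_orders = (orders.rename(columns={"cust_no": "customer_id"})
                    .merge(crm, on="customer_id", how="left"))
print(ods_orders)
```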
125

Essays on asset allocation strategies for defined contribution plans

Basu, Anup K. January 2008 (has links)
Asset allocation is the most influential factor driving investment performance. While researchers have made substantial progress in the field of asset allocation since the introduction of the mean-variance framework by Markowitz, there is little agreement about the appropriate portfolio choice for multi-period, long-horizon investors. Nowhere is this more evident than among trustees of retirement plans choosing different asset allocation strategies as default investment options for their members. This doctoral dissertation consists of four essays, each of which explores either a novel or an unresolved issue in the area of asset allocation for individual retirement plan participants. The goal of the thesis is to provide greater insight into the subject of portfolio choice in retirement plans and to advance scholarship in this field. The first study evaluates different constant-mix or fixed-weight asset allocation strategies and comments on their relative appeal as default investment options. In contrast to past research, which deals mostly with theoretical or hypothetical models of asset allocation, we investigate asset allocation strategies that are actually used as default investment options by superannuation funds in Australia. We find that strategies with moderate allocation to stocks are consistently outperformed, both in the upside potential of exceeding the participant's wealth accumulation target and in the downside risk of falling below that target, by very aggressive strategies whose allocation to stocks approaches 100%. The risk of extremely adverse wealth outcomes for plan participants does not appear to be very sensitive to asset allocation. Drawing on the evidence of the previous study, the second essay explores possible solutions to the well-known problem of gender inequality in retirement investment outcomes. Using non-parametric stochastic simulation, we simulate and compare the retirement wealth outcomes for a hypothetical female and male worker under different assumptions about breaks in employment, superannuation contribution rates, and asset allocation strategies. We argue that modest changes in contribution and asset allocation strategy for the female plan participant are necessary to ensure an equitable wealth outcome in retirement. The findings provide strong evidence against the gender-neutral default contribution and asset allocation policy currently institutionalized in Australia and other countries. In the third study we examine the efficacy of lifecycle asset allocation models, which allocate aggressively to risky asset classes when employee participants are young and gradually switch to more conservative asset classes as they approach retirement. We show that conventional lifecycle strategies make a costly mistake by ignoring the change in portfolio size over time as a critical input in the asset allocation decision. Due to this portfolio size effect, which has hitherto remained unexplored in the literature, the terminal value of accumulation in the retirement account depends critically on the asset allocation strategy adopted by the participant in later years relative to early years. The final essay extends the findings of the previous chapter by proposing an alternative approach to lifecycle asset allocation which incorporates performance feedback.
We demonstrate that strategies that dynamically alter allocation between growth and conservative asset classes at different points on the investment horizon based on cumulative portfolio performance relative to a set target generally result in superior wealth outcomes compared to those of conventional lifecycle strategies. The dynamic allocation strategy exhibits clear second-degree stochastic dominance over conventional strategies which switch assets in a deterministic manner as well as balanced diversified strategies.
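A toy version of such a feedback rule is sketched below; the thresholds and weights are illustrative assumptions, not the calibrated strategy evaluated in the essays.

```python
def growth_weight(balance, target_to_date, base=0.80):
    """Feedback rule: de-risk when cumulative wealth is ahead of target, stay aggressive when behind."""
    if balance >= target_to_date:
        return max(base - 0.40, 0.0)   # ahead of target: shift toward conservative assets
    return min(base + 0.20, 1.0)       # behind target: keep or raise the growth allocation

# Example: the member is behind the wealth target to date, so the rule keeps a high growth weighting.
print(growth_weight(balance=150_000, target_to_date=180_000))  # -> 1.0
```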
126

An approach to automating the maintenance of load-procedure code for business intelligence environments

Costa, Juli Kelle Góis 27 August 2015 (has links)
Business Intelligence (BI) relies on the Data Warehouse (DW), a historical data repository designed to support the decision-making process. Without an effective data warehouse, organizations cannot extract, in acceptable time, the data needed to enable more effective strategic, tactical, and operational action. This thesis proposes and evaluates a Rapid Application Development (RAD) approach to improve the efficiency and effectiveness of creating and maintaining the SQL load procedures used in ETL (Extract, Transform and Load) processes, examining the relationship between its use and the quality of the data moved, generated, and updated while populating a data warehouse, an environment that demands closer integration between the Software Engineering and Database disciplines. The automatic creation and maintenance of procedures written in SQL extensions was assessed in two controlled experiments carried out in an industrial setting, using a tool that encapsulates and automates part of the approach. The results indicate that the approach can indeed serve as a method for accelerating and improving the development and maintenance of ETL processes.
127

Optical emission spectroscopy and laser scattering for laser-induced plasma diagnostics

Farah Sougueh, Ali 07 September 2015 (has links)
Laser-induced plasmas (LIPs), first reported in the early 1960s, have attracted great interest, notably as a source of spectroscopic data. They also have many applications, such as X-ray sources for lithography, plasma ignition, and pulsed laser deposition, and they underlie a very popular analytical technique, laser-induced breakdown spectroscopy (LIBS), which can be applied in situ to virtually any type of sample without preparation. However, LIBS measurements are laterally integrated, so inversion techniques such as Abel inversion are required, and the method also assumes the plasma to be in local thermodynamic equilibrium (LTE). To validate LIBS measurements, Thomson scattering, a spatially resolved method that is independent of equilibrium assumptions, was applied to characterize LIPs. Ablation and breakdown plasmas were therefore characterized by both optical emission spectroscopy and Thomson scattering. Comparing the temperatures and electron densities obtained by the two methods, together with the McWhirter criterion and the relaxation times and diffusion lengths of the species in the plasma, made it possible to assess whether LTE holds.
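For reference, the McWhirter criterion mentioned above is commonly written as the following necessary (but not sufficient) condition on the electron density; this is the standard textbook formulation, and the exact form used in the thesis may differ in detail.

```latex
% Necessary condition for LTE in an optically thin plasma:
n_e \gtrsim 1.6 \times 10^{12} \, T_e^{1/2} \, (\Delta E)^3 \ \mathrm{cm^{-3}},
% with T_e the electron temperature in kelvin and \Delta E (in eV)
% the largest energy gap between adjacent levels of the transition considered.
```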
128

Design of a methodology for testing BI solutions

Jakubičková, Nela January 2011 (has links)
This thesis deals with Business Intelligence and its testing. It seeks to highlight the differences from classical software testing and to design a methodology for testing BI solutions that could be used in practice on real projects of BI companies. The aim is to design a methodology for BI solution testing based on theoretical knowledge of Business Intelligence and software testing, with an emphasis on the specific characteristics and requirements of BI and in accordance with Clever Decision's requirements, and to test it in practice on a real project at that company. The thesis draws on the Czech and foreign literature on Business Intelligence and software testing, as well as on the recommendations and experience of Clever Decision's employees. It is one of the few sources, if not the first, dealing with a methodology for BI solution testing in the Czech language, and it could also serve as a basis for more comprehensive BI testing methodologies. The thesis can be divided into a theoretical and a practical part. The theoretical part explains the purpose of using Business Intelligence in enterprises, describes the individual components of a BI solution, and then covers software testing itself and the various types of tests, with emphasis on the differences and specifics of Business Intelligence. It is followed by the designed methodology for testing BI solutions, which uses a generic model of a BI/DW solution. The highlight of the practical part is the description of testing a real BI project at Clever Decision according to the designed methodology.
129

Data warehouse development on the Teradata and Informatica platform in the insurance industry

Šiler, Zdeněk January 2012 (has links)
This thesis focuses on data warehousing on the Teradata and Informatica PowerCenter (hereafter IFPC) technology platform. Teradata provides a robust database system for storing large volumes of data and processing queries over them, while Informatica PowerCenter is a tool for developing ETL processes; together they form a mature technology stack for building large enterprise data warehouses. The thesis analyses both tools for data warehouse development and the specifics of their use in the insurance sector, and it is divided into a theoretical and a practical part. The theoretical part describes the Teradata database system and the IFPC ETL tool in detail, including an analysis of the Business Intelligence architecture in the insurance segment, which often uses this platform for data warehouse development. It describes the architecture of Teradata, the way it stores data and processes queries, the specific features that must be considered when developing a Teradata data warehouse, and its advantages and disadvantages compared with competing database systems. It also covers the general characteristics of IFPC, its software architecture and components, and its advantages and disadvantages relative to competitors on the market. The theoretical part concludes by analysing the synergy between Teradata and IFPC and the real benefits of combining them. The practical part demonstrates the use of these tools on a real project, Unification of Client Data, describing the entire development process in a data warehouse from business requirements through functional and technical design to the implementation of ETL mappings in Informatica PowerCenter, including bug fixing during ETL development and the testing methods used. It focuses on the implementation of selected mappings in IFPC deployed in the insurance sector, and it compares IFPC with the SSIS ETL tool integrated into MS SQL Server 2008 R2.
130

E-commerce System Database Migration

Zkoumalová, Barbora January 2016 (has links)
The object of this master's thesis is the design and creation of a tool for migrating an e-commerce system database from the ZenCart platform to the PrestaShop platform. Both system databases are described and analysed, and, based on the information gained, the migration tool is built according to the customer's requirements; the final data migration from the original database to the new one is then executed.
