  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
191

Approche de conception haut-niveau pour l'accélération matérielle de calcul haute performance en finance / High-level approach for hardware acceleration of high-performance computing in finance

Mena Morales, Valentin 12 July 2017 (has links)
The need for resources in High-Performance Computing (HPC) is generally met by scaling up server farms, to the detriment of the energy consumption of such a solution. Accelerating HPC applications on heterogeneous platforms, such as FPGAs or GPUs, offers a better architectural compromise, as they can reduce the energy consumption of a deployed system. This acceleration, however, comes with a change of programming paradigm and an increased level of programming complexity that must be tackled by software experts. This is most notably the case for developers in quantitative finance. Applications in this field are constantly evolving and increasing in complexity to stay competitive and comply with legislative changes, which puts even more pressure on the programmability of acceleration solutions. In this context, the use of high-level development and design flows, such as High-Level Synthesis (HLS) for programming FPGAs, is not enough. A domain-specific approach can help reach performance requirements without impairing the programmability of accelerated applications. We propose in this thesis a high-level design approach that relies on OpenCL as a heterogeneous programming standard; more precisely, it builds on a recent implementation of OpenCL for Altera FPGAs. Four main contributions are proposed: (1) an initial study of the integration of hardware computing cores into a software library for quantitative finance (QuantLib); (2) an exploration of different architectures and their respective performance, as well as the design of a dedicated architecture for the pricing of American options and their implied volatility, based on a high-level design flow; (3) a detailed characterization of an Altera OpenCL platform, from elementary operators, memory accesses and control overlays up to the communication links it is made of; (4) a compilation flow specific to the quantitative finance domain, relying on the aforementioned characterization and on a description of the considered financial applications (option pricing).
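The dedicated architectures themselves are hardware designs, but the numerical core they accelerate can be sketched in plain software. Below is a minimal Cox-Ross-Rubinstein binomial pricer for an American put, a standard formulation of the option-pricing problem this thesis targets; it is an illustrative sketch only, not the thesis's actual FPGA kernel, and the parameter values are assumptions.

```python
import math

def american_put_crr(S0, K, r, sigma, T, steps=200):
    """Price an American put with the Cox-Ross-Rubinstein binomial tree."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1.0 / u                           # down factor
    disc = math.exp(-r * dt)              # one-step discount
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability

    # Payoffs at maturity
    values = [max(K - S0 * u**j * d**(steps - j), 0.0) for j in range(steps + 1)]

    # Backward induction with an early-exercise check at every node
    for i in range(steps - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            exercise = max(K - S0 * u**j * d**(i - j), 0.0)
            values[j] = max(cont, exercise)
    return values[0]
```

With the assumed inputs S0 = K = 100, r = 5%, sigma = 20%, T = 1 year, the American put is worth slightly more than its European counterpart, reflecting the value of early exercise.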
192

Evaluating credal set theory as a belief framework in high-level information fusion for automated decision-making

Karlsson, Alexander January 2010 (has links)
High-level information fusion is a research field in which methods for achieving an overall understanding of the current situation in an environment of interest are studied. The ultimate goal of these methods is to provide effective decision support for human or automated decision-making. One of the main proposed ways of achieving this is to reduce the uncertainty coupled with the decision by utilizing multiple sources of information. Handling uncertainty in high-level information fusion is performed through a belief framework, and one of the most commonly used such frameworks is based on Bayesian theory. However, Bayesian theory has often been criticized for utilizing a representation of belief and evidence that does not sufficiently express some types of uncertainty. For this reason, a generalization of Bayesian theory has been proposed, denoted credal set theory, which allows one to represent belief and evidence imprecisely. In this thesis, we explore whether credal set theory yields measurable advantages, compared to Bayesian theory, when used as a belief framework in high-level information fusion for automated decision-making, i.e., when decisions are made by some pre-determined algorithm. We characterize the Bayesian and credal operators for belief updating and evidence combination and perform three experiments in which the Bayesian and credal frameworks are evaluated with respect to automated decision-making. The decision performance of the frameworks is measured both by enforcing a single decision and by allowing a set of decisions, based on the frameworks' belief and evidence structures. We also construct anomaly detectors based on the frameworks and evaluate these detectors with respect to maritime surveillance.
The main conclusion of the thesis is that although the credal framework uses considerably more expressive structures to represent belief and evidence, compared to the Bayesian framework, the performance of the credal framework can be significantly worse, on average, than that of the Bayesian framework, irrespective of the amount of imprecision.

Examining Committee: Arnborg, Stefan, Professor (KTH Royal Institute of Technology); Kjellström, Hedvig, Associate Professor (Docent) (KTH Royal Institute of Technology); Saffiotti, Alessandro, Professor (Örebro University)
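The contrast between the two frameworks can be shown in a few lines of code. The sketch below is an illustration of the general idea only, not of the thesis's actual operators or experiments: a Bayesian update transforms a single prior, while a credal update applies Bayes' rule to every extreme point of a set of priors, yielding an imprecise (interval-valued) posterior.

```python
def bayes_update(prior, likelihood):
    """Single-prior Bayesian update: posterior proportional to prior * likelihood."""
    post = [p * l for p, l in zip(prior, likelihood)]
    z = sum(post)
    return [p / z for p in post]

def credal_update(priors, likelihood):
    """Credal update: Bayes' rule applied to each extreme point of the prior set."""
    return [bayes_update(p, likelihood) for p in priors]

# Binary state space; the credal prior is the set spanned by two extreme points.
likelihood = [0.9, 0.2]
bayes_post = bayes_update([0.5, 0.5], likelihood)
credal_post = credal_update([[0.4, 0.6], [0.6, 0.4]], likelihood)
```

Any Bayesian posterior obtained from a prior inside the credal set falls within the interval spanned by the credal posteriors, which is the extra expressiveness (and the extra decision-making ambiguity) the thesis evaluates.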
193

Maturation of Clay Seals in Deep Bore Holes for Disposal of Radioactive Waste: Theory and Experiments

Yang, Ting January 2017 (has links)
The KBS-3 and very deep borehole (VDH) concepts are two major types of long-term geological disposal methods for isolating high-level radioactive waste (HLW) from the biosphere. The KBS-3V concept, which isolates the HLW at a depth of 400-500 m, is the officially proposed option in Sweden and has been the subject of considerable research over the past few decades, while the VDH concept was considered as an option in the 1950s but was later discouraged because of insufficient experience in drilling technology. The greatest merit of the VDH concept is that the almost stagnant groundwater in the deep boreholes prevents the transport of possibly released radionuclides into the rock or up to the ground level. Since various disadvantages of the KBS-3V concept were found in previous research, the potential superiority of the VDH concept led researchers to continue studying it into the late 1980s. The geological repositories of both the KBS-3V and VDH types primarily consist of a natural barrier (host rock) and an engineered barrier (also known as a buffer/backfill barrier). According to the principles of the IAEA and national research organizations, the buffer/backfill material should have low permeability and good expandability, as well as suitable physical and sealing properties. The thesis concerns the VDH concept and is focused on the construction and performance of those parts of the sealed repository that are not affected by high temperature or gamma radiation. In the lower part of a VDH repository, the clay packages containing HLW will be exposed to high temperature (100-150 °C) in the borehole and to highly saline groundwater. In the installation phase of the HLW, the groundwater will be pumped out and replaced by medium-soft smectite clay mud in which the HLW packages are installed vertically.
During the hydration and maturation of the clay components, microstructural reorganization, water transport, migration of clay particles and redistribution of the density of the components take place. The maturation determines the transient evolution of the clay seals and influences the rheological and soil-mechanical behavior in the installation phase. The maturation of the clay system also determines the ultimate sealing potential of VDH repositories. This study presents the work carried out to investigate the maturation of the buffer/backfill clay in the HLW deep borehole. Initially, three types of clay (Na-montmorillonite, magnesium-rich, and illite-smectite mixed-layer clays) were examined to estimate their performance as candidate barrier materials; this is mainly presented in the literature review. The experimental study was conducted on montmorillonite-rich GMZ clay and I/S mixed-layer Holmehus clay. Expandability and permeability tests were carried out to interpret the recorded swelling development and to assess the effect of saline conditions, with the goal of deriving a relationship between swelling pressure and hydraulic conductivity for different dry densities. Maturation tests of initially fully saturated Holmehus clay and partly saturated GMZ clay were performed. During the tests, the shear strength mobilized by the relative movement of the densified mud and the migrated dense clay (contained in a perforated central tube) was determined. Based on the results of the shear strength tests, the maximum operation time and the number of clay packages to be placed in a single operation were evaluated, and the suitable saturation degree of the dense clay was discussed as well. A model of the maturation of initially water-saturated clay seals, based on Darcy's law, was worked out, and the evolution of the clay components in a lab-scale borehole using Holmehus clay was simulated and compared with the experimental recordings.
Good agreement between the theoretically simulated and measured physical behavior was achieved, which verified the validity of the model. Using the results, the hydration and soil migration over the entire maturation process were presented in diagrams. The model was also used for a preliminary evaluation of the maturation products in real boreholes, assuming the same Holmehus clay as used in the tests. Two constellations of borehole and dense-clay diameters were assumed: 80 cm borehole / 60 cm clay, and 80 cm / 50 cm. The results with respect to the dry density and hydraulic conductivity of the ultimate maturation products, and the degree of homogeneity of the buffer and backfill clay system in the assumed boreholes, are presented and discussed. The options of different mineral types and initial physical properties of the candidate buffer clays provide a reference for the engineered-barrier design of HLW disposal in VDH.
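The maturation model above is built on Darcy's law. As a back-of-the-envelope illustration of why a tight clay seal works (not a reproduction of the thesis's model), the sketch below computes the Darcy seepage through a sealed borehole section; the conductivity, gradient and geometry values are assumed for illustration only.

```python
import math

def darcy_flux(K, dh, L):
    """Darcy's law: specific discharge q = K * i, with hydraulic gradient i = dh / L (m/s)."""
    return K * dh / L

def annual_seepage(K, dh, L, diameter):
    """Water volume (m^3/year) percolating axially through a sealed borehole section."""
    area = math.pi * (diameter / 2.0) ** 2        # borehole cross-section
    seconds_per_year = 365.25 * 24 * 3600
    return darcy_flux(K, dh, L) * area * seconds_per_year
```

With an assumed hydraulic conductivity of 1e-12 m/s (a typical order of magnitude for dense smectite), a 10 m head difference over a 100 m seal and an 80 cm borehole, the annual seepage is on the order of a millilitre, which conveys the near-stagnant conditions the VDH concept relies on.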
194

Le modèle flot de données appliqué à la synthèse haut-niveau pour le traitement d’images sur caméra intelligente à base de FPGA. Application aux systèmes d’apprentissage supervisés / The dataflow model for High-Level Synthesis on FPGA-based smart camera. Application to supervised machine learning algorithms

Bourrasset, Cédric 09 February 2016 (has links)
High-Level Synthesis (HLS) is a field of research that aims to automate the transformation from a high-level algorithmic description to a register-level representation for its implementation on a digital circuit. Most existing tools, based on imperative languages, try to provide a general solution for any type of algorithm. This approach can be inefficient for applications where the algorithm description relies on a paradigm different from the hardware execution model. This major drawback can be addressed by the use of specific languages, named Domain Specific Languages (DSLs). Applied to the image processing field, the dataflow model appears as a good compromise between the expressiveness of the algorithm description and the efficiency of the final implementation. This thesis addresses the use of the dataflow programming model as a response to high-level synthesis problems for image processing algorithms on FPGAs. To demonstrate the effectiveness of the proposed method, but also to bring out the algorithmic reformulation effort required of the developer, an ambitious class of applications was chosen: supervised machine learning systems. Two algorithms in particular are addressed: a classification system based on Support Vector Machines (SVM) and a convolutional neural network (CNN). Experiments are carried out with the CAPH language, an HLS tool based on the dataflow programming model.
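CAPH programs are networks of stream-processing actors. The generator sketch below mimics that style in ordinary software to convey the kind of reformulation involved; it is an analogy for illustration, not CAPH syntax. A stateful 3-tap filter actor consumes a pixel stream and a threshold actor binarizes its output, and composing the two mirrors wiring up a dataflow graph.

```python
def conv3(stream, kernel):
    """Dataflow-style actor: 3-tap FIR filter over an incoming pixel stream."""
    window = []
    for px in stream:
        window.append(px)
        if len(window) == 3:
            # Emit one filtered value per fully populated window
            yield sum(w * k for w, k in zip(window, kernel))
            window.pop(0)

def threshold(stream, t):
    """Actor that binarizes the filtered stream."""
    for v in stream:
        yield 1 if v > t else 0

def pipeline(pixels):
    """Actors compose into a pipeline, mirroring a dataflow graph."""
    return threshold(conv3(pixels, [1, 2, 1]), 8)
```

Each actor only sees tokens arriving on its input stream and holds explicit local state, which is exactly the discipline that makes a dataflow description mappable to an FPGA pipeline.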
195

Sport de haut niveau et formation : l'exemple du Football-Club de Sochaux-Montbéliard / High performance sport and vocational training

Gaborel, Anne-Sophie 02 December 2015 (has links)
In the human sciences, there are two ways to study football: amateur football and professional football, which gets wide media coverage. Seen as two impermeable worlds, the passage from one to the other, achieved by a handful of passionate young players, is rarely studied. Accounts of football stars' lives, often written in a hagiographic style, do not mention the foundations of sporting commitment or the years spent as apprentices in centres of excellence. The research presented here aims to better grasp, sociologically, the stages of entry into high-level football, from the first balls exchanged within the family to the signing of a contract. At the centre of this inclusion process, the time spent in a training club has to be examined in depth. This work, conceived from an interactionist perspective, offers a comprehensive study of the interindividual relationships underlying access to a high-level career. From this point of view, the Lionceaux academy in Sochaux provides a good example of how aspirants are taken in hand, because of its youth-oriented policy and its good results. The data rest on a triangulation of sources: questionnaires sent to training centres across France, observations of matches and of boarding-school life, and interviews with apprentices and coaches. The material collected makes it possible to deconstruct the trajectories of those who embark on the adventure of sporting excellence, whether or not they reach the Grail. Indeed, out of each cohort of about forty aspirants, only one or two will sign a contract, while the rejected candidates must return to a daily life far from the stadiums. Beyond individual trajectories, what comes into view is the whole dynamic of recruitment and retention that keeps clubs at the highest level.
196

Soft Data-Augmented Risk Assessment and Automated Course of Action Generation for Maritime Situational Awareness

Plachkov, Alex January 2016 (has links)
This thesis presents a framework capable of integrating hard (physics-based) and soft (people-generated) data for the purpose of achieving improved situational assessment (SA) and effective course of action (CoA) generation upon risk identification. The proposed methodology is realized through the extension of an existing Risk Management Framework (RMF). In this work, the RMF's SA capabilities are augmented via the injection of soft data features into its risk modeling; the performance of these capabilities is evaluated via a newly proposed risk-centric information fusion effectiveness metric. The framework's CoA generation capabilities are also extended through the inclusion of people-generated data, capturing important subject matter expertise and providing mission-specific requirements. Furthermore, this work introduces a variety of CoA-related performance measures, used to assess the fitness of each individual potential CoA, as well as to quantify the overall improvement in the chance of mission success brought about by the inclusion of soft data. This conceptualization is validated via experimental analysis performed on a combination of real-world and synthetically generated maritime scenarios. It is envisioned that the capabilities put forth herein will take part in a greater system, capable of ingesting and seamlessly integrating vast amounts of heterogeneous data, with the intent of providing accurate and timely situational updates, as well as assisting in operational decision-making.
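The RMF's actual risk model and effectiveness metric are not reproduced in this abstract. Purely as an illustrative sketch of the idea of soft-data augmentation, the snippet below blends a physics-based (hard) risk estimate with a human-report (soft) estimate and ranks candidate CoAs by benefit discounted for residual risk; all weights, field names and numbers are hypothetical.

```python
def fuse_risk(hard, soft, soft_weight=0.3):
    """Convex combination of hard (sensor-derived) and soft (report-derived) risk in [0, 1]."""
    r = (1.0 - soft_weight) * hard + soft_weight * soft
    return min(1.0, max(0.0, r))

def rank_coas(coas, risk):
    """Order candidate courses of action by benefit discounted by residual risk exposure."""
    return sorted(coas,
                  key=lambda c: c["benefit"] * (1.0 - risk * c["exposure"]),
                  reverse=True)

# Hypothetical maritime example: a hard kinematic anomaly score of 0.8 is
# tempered by a soft report scoring the vessel as low-risk (0.2).
fused = fuse_risk(0.8, 0.2)
options = [{"name": "intercept", "benefit": 1.0, "exposure": 0.9},
           {"name": "shadow", "benefit": 0.6, "exposure": 0.1}]
preferred = rank_coas(options, fused)[0]
```

The point of the sketch is only that injecting soft evidence can change both the assessed risk and, through it, which CoA is preferred.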
197

Développement systématique et sûreté d’exécution en programmation parallèle structurée / Systematic development and safety of execution in structured parallel programming

Gesbert, Louis 05 March 2009 (has links)
Expressing parallelism in programming in a simple and efficient way is a challenge that computer science currently faces, owing to the evolution of hardware architectures towards multi-core processors. BSML is a high-level, structured parallel programming language that contributes to this research. Building upon the existing core of the language, this thesis proposes, on the one hand, extensions that make it a more general and simpler language (imperative traits such as references and exceptions, a dedicated syntax, etc.) while preserving and extending its safety (formal semantics, type system, etc.), and on the other hand, a methodology for the development of certified parallel applications, given in the form of fully proved parallel programs.
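BSML's programming model is built around BSP parallel vectors. The sketch below simulates, sequentially and purely for illustration, primitives in the spirit of BSML's mkpar and proj; the real language is an OCaml extension with formal semantics, and the names and signatures here are simplified assumptions, not BSML's API.

```python
NPROCS = 4  # simulated number of BSP processors

def mkpar(f):
    """Build a parallel vector whose component at processor pid is f(pid)."""
    return [f(pid) for pid in range(NPROCS)]

def apply_par(vf, vx):
    """Pointwise application of a parallel vector of functions to a parallel vector."""
    return [f(x) for f, x in zip(vf, vx)]

def proj(v):
    """Leave the parallel section: make every component globally readable
    (corresponds to a total exchange at a BSP synchronization barrier)."""
    return lambda pid: v[pid]

def parallel_sum(values_by_pid):
    """Example: each processor holds one value; proj gathers them for a global sum."""
    v = mkpar(lambda pid: values_by_pid[pid])
    get = proj(v)
    return sum(get(pid) for pid in range(NPROCS))
```

The structured discipline (local computation inside the vector, communication only at explicit exchange points) is what makes cost prediction and formal proof tractable in this style of language.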
198

Analyse de scène temps réel pour l'interaction 3D / Real-time scene analysis for 3D interaction

Kaiser, Adrien 01 July 2019 (has links)
This PhD thesis focuses on the visual analysis of indoor scenes captured by commodity depth sensors, with the goal of converting their data into high-level understanding of the scene. It explores the use of 3D geometry analysis tools on visual depth data in terms of enhancement, registration and consolidation. In particular, we aim to show how shape abstraction can generate lightweight representations of the data for fast analysis with low hardware requirements. This property matters because one of our goals is to design algorithms suitable for live embedded operation in, e.g., wearable devices, smartphones or mobile robots. The context of this thesis is the live operation of 3D interaction on a mobile device, which raises numerous issues, including placing 3D interaction zones in relation to real surrounding objects, tracking the interaction zones in space as the sensor moves, and providing a meaningful and understandable experience to non-expert users. Towards solving these problems, we make contributions in which scene abstraction leads to fast and robust sensor localization as well as efficient frame-data representation, enhancement and consolidation. While simple geometric shapes are not as faithful as dense point sets or volumes for representing observed scenes, we show that they are an acceptable approximation, and their light weight strikes a good balance between accuracy and performance.
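Shape abstraction of depth data typically starts by fitting simple primitives to measured points. As a minimal stand-alone illustration (not the thesis's actual algorithm), the sketch below fits a least-squares plane z = a*x + b*y + c to 3D points via the normal equations, the kind of lightweight primitive that can replace thousands of raw depth samples.

```python
def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 linear system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):  # back substitution
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_plane(points):
    """Least-squares plane z = a*x + b*y + c through 3D points (normal equations)."""
    sxx = sxy = sx = syy = sy = n = sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; sx += x
        syy += y * y; sy += y; n += 1.0
        sxz += x * z; syz += y * z; sz += z
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]
    return solve3(A, rhs)  # (a, b, c)
```

A handful of such plane (or other primitive) parameters is far cheaper to track and update per frame than a dense point cloud, which is the accuracy/performance trade-off argued above.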
199

Development of Coupled Thermal-Hydraulic-Mechanical-Chemical Models for Predicting Rock Permeability Change / 岩盤の透水性変化を予測する熱・水・応力・化学連成モデルの開発

Ogata, Sho 24 September 2019 (has links)
Kyoto University / Doctor of Philosophy (Engineering) / Dissertation no. 甲第22051号 (工博第4632号) / Graduate School of Engineering, Department of Urban Management / Examining committee: Prof. 岸田 潔 (chair), Prof. 木村 亮, Prof. 小池 克明
200

Implementace algoritmu dekompozice matice a pseudoinverze na FPGA / Implementation of matrix decomposition and pseudoinversion on FPGA

Röszler, Pavel January 2018 (has links)
The purpose of this thesis is to implement algorithms for matrix eigendecomposition and pseudoinverse computation on a Field Programmable Gate Array (FPGA) platform. First, the matrix decomposition methods that are broadly used in such algorithms are described. The next section focuses on the basic theory and methods for computing eigenvalues and eigenvectors, as well as the matrix pseudoinverse. Several examples of implementation using Matlab are attached. The Vivado High-Level Synthesis tools and libraries were used for the final implementation. After a brief introduction to FPGA fundamentals, the thesis continues with a description of the implemented blocks. The results of each variant are compared in terms of timing and FPGA utilization. The selected block was validated on a development board and its arithmetic precision was analyzed.
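As a plain-software sketch of the two operations the thesis implements in hardware (the FPGA blocks themselves use Vivado HLS and are not reproduced here), the snippet below computes the eigendecomposition of a symmetric matrix with classical Jacobi rotations and builds the Moore-Penrose pseudoinverse from it; this is illustrative pure Python, not the thesis's design.

```python
import math

def jacobi_eig(A, max_iter=100, tol=1e-12):
    """Classical Jacobi method for a symmetric matrix.
    Returns (eigenvalues, V) with A = V * diag(eigenvalues) * V^T;
    the eigenvectors are the columns of V."""
    n = len(A)
    a = [row[:] for row in A]
    V = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(max_iter):
        # Pick the largest off-diagonal element to annihilate
        p, q = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                   key=lambda ij: abs(a[ij[0]][ij[1]]))
        if abs(a[p][q]) < tol:
            break
        phi = 0.5 * math.atan2(2.0 * a[p][q], a[q][q] - a[p][p])
        c, s = math.cos(phi), math.sin(phi)
        for k in range(n):  # rotate columns p and q
            akp, akq = a[k][p], a[k][q]
            a[k][p], a[k][q] = c * akp - s * akq, s * akp + c * akq
        for k in range(n):  # rotate rows p and q
            apk, aqk = a[p][k], a[q][k]
            a[p][k], a[q][k] = c * apk - s * aqk, s * apk + c * aqk
        for k in range(n):  # accumulate the eigenvector matrix
            vkp, vkq = V[k][p], V[k][q]
            V[k][p], V[k][q] = c * vkp - s * vkq, s * vkp + c * vkq
    return [a[i][i] for i in range(n)], V

def pinv_sym(A, eps=1e-10):
    """Moore-Penrose pseudoinverse of a symmetric matrix via its eigendecomposition:
    near-zero eigenvalues are dropped rather than inverted."""
    lam, V = jacobi_eig(A)
    n = len(A)
    inv = [1.0 / l if abs(l) > eps else 0.0 for l in lam]
    return [[sum(V[i][k] * inv[k] * V[j][k] for k in range(n)) for j in range(n)]
            for i in range(n)]
```

The same rotation-based structure is one reason Jacobi-style methods map well to hardware: each sweep is a sequence of small, regular, parallelizable updates.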
