  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Analyse réaliste d'algorithmes standards / Realistic analysis of standard algorithms

Auger, Nicolas 20 December 2018 (has links)
At first, we were interested in TimSort, a sorting algorithm designed in 2002, at a time when it was hard to imagine new results on sorting. Although it is used in many programming languages, the efficiency of this algorithm had not been formally analysed before our work. The fine-grained study of TimSort led us to take into account, in our theoretical models, some modern features of computer architecture. In particular, we studied the branch prediction mechanism. This theoretical analysis allowed us to design variants of some elementary algorithms (such as binary search or exponentiation by squaring) that rely on this feature to achieve better performance on recent computers. Even if uniform distributions are usually considered for the average-case analysis of algorithms, they may not be the best framework for studying sorting algorithms. The choice of using TimSort in many programming languages, such as Java and Python, is probably driven by its efficiency on almost-sorted input. To conclude this dissertation, we propose a mathematical model of non-uniform distributions on permutations, under which almost-sorted permutations are more likely, and we provide a detailed probabilistic analysis.
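To give a flavour of the kind of branch-prediction-aware variant mentioned in this abstract, the sketch below contrasts the classic square-and-multiply exponentiation (whose inner `if` tests an essentially random bit of the exponent) with a selection-based variant that avoids the data-dependent branch. This is only a structural illustration, not the thesis's analysed algorithms, and Python itself does not expose hardware branch prediction; measuring the effect would require a compiled implementation.

```python
# Illustrative sketch only: shows where the hard-to-predict branch sits in
# exponentiation by squaring and how a selection-based variant avoids it.

def pow_branchy(x: int, n: int, mod: int) -> int:
    """Classic square-and-multiply; the `if` tests an essentially random bit."""
    result = 1
    while n > 0:
        if n & 1:              # hard-to-predict branch: low bit of n
            result = result * x % mod
        x = x * x % mod
        n >>= 1
    return result

def pow_selected(x: int, n: int, mod: int) -> int:
    """Variant that always computes both options and selects arithmetically."""
    result = 1
    while n > 0:
        bit = n & 1
        candidate = result * x % mod
        # branch-free select: bit is 0 or 1, so this keeps result or takes candidate
        result = (1 - bit) * result + bit * candidate
        x = x * x % mod
        n >>= 1
    return result

if __name__ == "__main__":
    assert pow_branchy(7, 123456, 10**9 + 7) == pow_selected(7, 123456, 10**9 + 7)
```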
2

Development Of Strategies For Reducing The Worst-case Message Response Times On The Controller Area Network

Celik, Vakkas 01 January 2012 (has links) (PDF)
The controller area network (CAN) is the de-facto standard for in-vehicle communication. The growth of time-critical applications in modern cars leads to a considerable increase in the message traffic on CAN. Hence, it is essential to determine efficient message schedules on CAN that guarantee that all communicated messages meet their timing constraints. The aim of this thesis is to develop offset scheduling strategies that
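For context, the sketch below implements the standard fixed-point worst-case response-time analysis for CAN messages (highest-priority-first, blocking by one lower-priority frame), which is the baseline such scheduling work builds on. It is not the offset-scheduling strategy developed in the thesis, and the message parameters are hypothetical.

```python
# Hedged illustration: standard CAN worst-case response-time analysis
# (no offsets, no jitter), not the thesis's offset-scheduling method.
import math

def can_wcrt(messages, tau_bit=0.002):
    """
    messages: list of dicts with 'C' (transmission time) and 'T' (period),
    ordered by decreasing priority (index 0 = highest priority).
    Returns worst-case response times, or None if a deadline (= period) is missed.
    """
    results = []
    for i, m in enumerate(messages):
        # blocking by at most one lower-priority frame already on the bus
        B = max((lp["C"] for lp in messages[i + 1:]), default=0.0)
        w = B  # queuing delay, solved by fixed-point iteration
        while True:
            w_next = B + sum(
                math.ceil((w + tau_bit) / hp["T"]) * hp["C"]
                for hp in messages[:i]
            )
            if w_next + m["C"] > m["T"]:
                return None
            if abs(w_next - w) < 1e-12:
                break
            w = w_next
        results.append(w + m["C"])
    return results

# Example with made-up frame parameters (times in milliseconds):
msgs = [{"C": 0.25, "T": 5.0}, {"C": 0.25, "T": 10.0}, {"C": 0.25, "T": 20.0}]
print(can_wcrt(msgs))
```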
3

An efficient analysis of Pareto optimal solutions in multidisciplinary design

Erfani, Tohid January 2011 (has links)
Optimisation is one of the most important and challenging parts of any engineering design. In real-world design problems one faces multiobjective optimisation under constraints. The optimal solution in these cases is not unique because the objectives can contradict each other. In such cases, a set of optimal solutions, which forms a Pareto frontier in the objective space, is considered. There are many algorithms to generate the Pareto frontier. However, only a few of them are potentially capable of providing an evenly distributed set of solutions. Such a property is especially important in real-life design because a decision maker is usually able to analyse only a very limited number of solutions. This thesis consists of two main parts. First, it develops and gives a detailed description of two different algorithms that are able to generate an evenly distributed Pareto set in a general formulation. One is a classical approach called the Directed Search Domain (DSD) method; the other, the Cylindrical Constraint Evolutionary Algorithm (CCEA), is a hybrid population-based method. The efficiency of the algorithms is demonstrated on a number of challenging test cases and through comparisons with the results of other existing methods. It is shown that the proposed methods succeed in generating Pareto solutions even when some existing methods fail. In real-world design problems, deterministic approaches cannot provide a reliable solution: in the event of uncertainty, the deterministic optimal solution would be infeasible in many instances. Therefore a solution less sensitive to problem perturbation is desirable. This leads to the robust solution, which is the focus of the second part of the thesis. In the literature, there are some techniques tailored for robust optimisation. However, most of them are either computationally expensive or do not systematically articulate the designer's preferences into a robust solution. In this thesis, by introducing a measure of robustness in the multiobjective context, a tunable robust function (TRF) is presented. By including the TRF in the problem formulation, it is demonstrated that the desired robust solution, based on designer preferences, can be obtained. This not only provides the robust solution but also gives control over the robustness level. The method is efficient as it increases the dimension of the problem by only one, irrespective of the dimension of the original problem.
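The DSD and CCEA methods themselves are not reproduced here, but the Pareto-dominance notion they both rely on can be stated compactly. The sketch below, assuming minimisation of every objective, filters a finite set of objective vectors down to its non-dominated (Pareto) subset.

```python
# Minimal sketch of Pareto dominance and the non-dominated subset; this is
# the underlying notion, not the DSD or CCEA algorithm.
from typing import List, Sequence

def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points: List[Sequence[float]]) -> List[Sequence[float]]:
    """Return the non-dominated subset of a finite set of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Example with two conflicting objectives:
pts = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
print(pareto_front(pts))   # (3.0, 4.0) is dominated by (2.0, 3.0)
```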
4

A Pareto-Frontier Analysis of Performance Trends for Small Regional Coverage LEO Constellation Systems

Hinds, Christopher Alan 01 December 2014 (has links) (PDF)
As satellites become smaller, cheaper, and quicker to manufacture, constellation systems will be an increasingly attractive means of meeting mission objectives. Optimizing satellite constellation geometries is therefore a topic of considerable interest. As constellation systems become more achievable, providing coverage to specific regions of the Earth will become more commonplace. Small countries or companies that are currently unable to afford large and expensive constellation systems will now, or in the near future, be able to afford their own constellation systems to meet their individual requirements for small coverage regions. The focus of this thesis was to optimize constellation geometries for small coverage regions, with the constellation design limited to 1-6 satellites in a Walker-delta configuration at an altitude of 200-1500 km, providing remote sensing coverage with a minimum ground elevation angle of 60 degrees. Few Pareto-frontiers have been developed and analyzed to show the tradeoffs among various performance metrics, especially for this type of constellation system. The performance metrics focus on geometric coverage and include revisit time, daily visibility time, constellation altitude, ground elevation angle, and the number of satellites. The objective space containing these performance metrics was characterized for 5 different regions at latitudes of 0, 22.5, 45, 67.5, and 90 degrees. In addition, the effect of the minimum ground elevation angle on the achievable performance of this type of constellation system was studied. Finally, the traditional Walker-delta pattern constraint was relaxed to allow for asymmetrical designs, which were compared to see how the Walker-delta pattern performs relative to a more relaxed design space. The goal of this thesis was to provide a framework as well as to obtain and analyze Pareto-frontiers for constellation performance relating to small regional coverage LEO constellation systems. This work provides an in-depth analysis of the trends in both the design and objective spaces of the obtained Pareto-frontiers. A variation on the εNSGA-II algorithm, an evolutionary algorithm developed by Kalyanmoy Deb to solve complex multi-objective optimization problems, was utilized along with a MATLAB/STK interface to produce these Pareto-frontiers. The algorithm used in this study proved to be very efficient at obtaining the various Pareto-frontiers. This study was also successful in characterizing the design and solution space surrounding small LEO remote sensing constellation systems providing small regional coverage.
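As a point of reference for the symmetric designs searched here, the sketch below generates the Walker-delta parameterisation (t satellites in p equally spaced planes with phasing f). It only lays out the geometry; the coverage evaluation and the εNSGA-II/STK optimisation used in the thesis are not reproduced, and the example parameters are hypothetical.

```python
# Hedged sketch of the Walker-delta pattern (i: t/p/f); geometry only.

def walker_delta(inclination_deg, t, p, f):
    """
    Return (inclination, RAAN, mean anomaly) in degrees for each of the t
    satellites, arranged in p equally spaced planes with integer phasing f.
    """
    if t % p != 0:
        raise ValueError("t must be divisible by p")
    s = t // p                              # satellites per plane
    sats = []
    for plane in range(p):
        raan = plane * 360.0 / p            # planes evenly spaced in RAAN
        for slot in range(s):
            mean_anomaly = (slot * 360.0 / s + plane * f * 360.0 / t) % 360.0
            sats.append((inclination_deg, raan, mean_anomaly))
    return sats

# Example: a 6-satellite, 3-plane design within the 1-6 satellite range studied.
for sat in walker_delta(55.0, t=6, p=3, f=1):
    print(sat)
```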
5

Structural-acoustic properties of automotive panels with shell elements

Kumar, Gaurav January 2014 (has links)
The automotive industry has witnessed a trend in recent years of reducing the bulk weight of the vehicle in order to achieve improved ride dynamics and economical fuel consumption. Unfortunately, reducing the bulk weight often compromises the noise, vibration, and harshness (NVH) characteristics of the vehicle. In general, automotive body panels are made out of thin sheet metals (steel and aluminium) that have a very low bending stiffness. Hence, it becomes important to find countermeasures that increase the structural stiffness of these thin body panels without affecting their bulk weight. One such countermeasure is to introduce geometrical indentations on various body panels. The geometrical indentation explained in this thesis is in the shape of an elliptical dome, which supports an increase in structural stiffness whilst keeping the bulk weight constant. The primary reason for choosing elliptical domes as the applied geometrical indentation is the significant interest shown by Jaguar Land Rover. Moreover, elliptical domes, because of the nature of their design, can cover a larger surface area with minimal depth, thereby eliminating the possibility of sharp and pointy indentations. This thesis presents a comprehensive study of the structural-acoustic behaviour of automotive-type panels with dome-shaped indentations. The ultimate aim of this research is to establish a set of design guidelines for producing automotive-type panels with optimised dome-shaped indentations. To this end, a new design optimisation strategy is proposed that results in the optimal placement of the required dome-shaped indentations. The optimisation problem addressed in this thesis is unlike a general mathematical problem and requires specific methodologies for its solution; a genetic algorithm is therefore regarded as the most suitable method for tackling this type of design optimisation problem. During the development of the optimisation procedure, the preliminary results showed a consistency in the design patterns. This motivated the investigation of a few intuitively designed panels, inspired by the initial trial optimisation results. Four intuitively designed panels were therefore investigated for their structural-acoustic characteristics. The study of the intuitively designed panels provided essential physical insight into the design optimisation problem, which ultimately defined the guidelines for developing the proposed optimisation procedure. This type of optimisation procedure is completely new in the domain of structural-acoustic optimisation. The efficiency of the underlying work lies in the separate investigation of the structural and the acoustic properties of panels with various dome-shaped indentations, and then in utilising the insights gained to develop a specific optimisation algorithm that streamlines the dome-shaped panel design procedure.
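To illustrate the shape of such a placement optimisation, the sketch below is a generic genetic-algorithm skeleton over a binary "dome at site i or not" genome. The real objective in the thesis is a structural-acoustic evaluation of the panel; here a made-up surrogate fitness, the number of candidate sites, and all GA settings are placeholders, so this shows only the optimisation loop, not the thesis's procedure.

```python
# Illustrative GA skeleton for binary dome placement; surrogate fitness only.
import random

N_SITES = 20          # candidate dome locations on the panel (hypothetical)
POP, GENS, PMUT = 30, 50, 0.05

def surrogate_fitness(genome):
    # Placeholder objective: reward a moderate number of domes spread over the panel.
    n = sum(genome)
    spread = len({i // 5 for i, g in enumerate(genome) if g})
    return spread - abs(n - 8) * 0.5

def evolve():
    pop = [[random.randint(0, 1) for _ in range(N_SITES)] for _ in range(POP)]
    for _ in range(GENS):
        scored = sorted(pop, key=surrogate_fitness, reverse=True)
        parents = scored[: POP // 2]                 # truncation selection
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_SITES)       # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < PMUT else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=surrogate_fitness)

if __name__ == "__main__":
    best = evolve()
    print("best layout:", best, "fitness:", surrogate_fitness(best))
```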
6

Architecture Framework for Trapped-ion Quantum Computer based on Performance Simulation Tool

Ahsan, Muhammad January 2015 (has links)
The challenge of building a scalable quantum computer lies in striking an appropriate balance between designing a reliable system architecture from a large number of faulty computational resources and improving the physical quality of system components. A detailed investigation of how performance varies with the physics of the components and with the system architecture requires an adequate performance simulation tool. In this thesis we demonstrate a software tool capable of (1) mapping and scheduling a quantum circuit onto a realistic quantum hardware architecture with physical resource constraints, (2) evaluating performance metrics such as the execution time and the success probability of the algorithm execution, and (3) analyzing the constituents of these metrics and visualizing resource utilization to identify the system components that crucially define the overall performance.

Using this versatile tool, we explore the vast design space for a modular quantum computer architecture based on trapped ions. We find that while the success probability is uniformly determined by the fidelity of physical quantum operations, the execution time is a function of the system resources invested at various layers of the design hierarchy. At the physical level, the number of lasers performing quantum gates impacts the latency of fault-tolerant circuit-block execution. When these blocks are used to construct a meaningful arithmetic circuit such as a quantum adder, the number of ancilla qubits for complicated non-Clifford gates and the entanglement resources needed to establish long-distance communication channels become the major performance-limiting factors. Next, in order to factorize large integers, these adders are assembled into the modular exponentiation circuit that comprises the bulk of Shor's algorithm. At this stage, the overall scaling of resource-constrained performance with the size of the problem describes the effectiveness of the chosen design. By matching the resource investment with the pace of advancement in hardware technology, we find optimal designs for different types of quantum adders. In conclusion, we show that 2,048-bit Shor's algorithm can be reliably executed within the resource budget of 1.5 million qubits.
7

[pt] ALOCAÇÃO DE RECURSOS ONLINE DA PERSPECTIVA DE ANUNCIANTES / [en] ONLINE ADVERTISER-CENTRIC BUDGET ALLOCATION

EDUARDO CESAR NOGUEIRA COUTINHO 18 August 2020 (has links)
In this work, we propose the AdInvest problem, which models the decision-making process of allocating investment in digital marketing from the advertiser's perspective. For the proposed problem, we define an algorithm called balGreedy and prove its guarantees on deterministic and stochastic instances of AdInvest. The theorems guarantee that our algorithm achieves worst-case results relatively close to OPT on several types of instances considered throughout the work. In particular, we focus on instances that model the audience-saturation effect, which is present in the dynamics of online advertisements. As shown in the computational experiments, the balGreedy algorithm proved consistently efficient compared to the alternative policies adopted, both on instances generated by simulation and on real instances built from the data of a Facebook Ads advertiser.
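The balGreedy algorithm itself and the AdInvest formulation are only available in the thesis; as a rough illustration of the general idea of budget allocation under audience saturation, the sketch below greedily assigns budget units to whichever audience currently offers the highest marginal return under a saturating response curve. The response model, parameters, and function names here are assumptions for illustration only.

```python
# Hedged sketch: generic marginal-gain greedy under saturating returns,
# not the thesis's balGreedy algorithm.
import math

def greedy_allocate(budget_units, audiences):
    """
    audiences: list of (value, saturation_scale) pairs, one per audience;
    the response of audience i to x units is value * (1 - exp(-x / scale)).
    Returns the number of budget units assigned to each audience.
    """
    alloc = [0] * len(audiences)

    def response(i, x):
        value, scale = audiences[i]
        return value * (1.0 - math.exp(-x / scale))

    for _ in range(budget_units):
        # marginal gain of spending one more unit on each audience
        gains = [response(i, alloc[i] + 1) - response(i, alloc[i])
                 for i in range(len(audiences))]
        best = max(range(len(audiences)), key=lambda i: gains[i])
        alloc[best] += 1
    return alloc

# Example: three audiences with different values and saturation scales.
print(greedy_allocate(20, [(10.0, 5.0), (6.0, 2.0), (8.0, 10.0)]))
```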
8

Klasifikace testovacích manévrů z letových dat / Classification of Testing Maneuvers from Flight Data

Funiak, Martin January 2015 (has links)
A flight data recorder is a device designed to record flight data from various sensors in an aircraft. The analysis of flight data plays an important role in the development and testing of avionics. Testing and evaluating the characteristics of an aircraft is often carried out by means of testing maneuvers. The data measured during a single flight are stored in one flight record, which may contain several testing maneuvers. The aim of this work is to identify basic testing maneuvers from the measured flight data. The theoretical part describes flight maneuvers and the format of the measured flight data. The analytical part surveys research on classification based on statistics and probability theory, which is needed to understand Gaussian mixture models. The thesis presents an implementation in which Gaussian mixture models are used for the classification of testing maneuvers. The proposed solution was tested on data obtained from a flight simulator and from a real aircraft. Gaussian mixture models proved to provide a suitable solution for this task. Possible further development of the work is described in the final chapter.
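As a minimal sketch of the classification idea described here, one Gaussian mixture model can be fitted per maneuver class and a new flight-data segment assigned to the class whose model gives the highest log-likelihood. The features, class names, and synthetic data below are stand-ins; the thesis's actual feature extraction and datasets are not shown.

```python
# Hedged sketch: per-class GMMs for maneuver classification on synthetic data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic "flight data" segments for two hypothetical maneuver classes
# in a 3-D feature space.
train = {
    "steady_turn": rng.normal([0.0, 1.0, 0.0], 0.3, size=(200, 3)),
    "climb":       rng.normal([1.0, 0.0, 1.0], 0.3, size=(200, 3)),
}

models = {
    label: GaussianMixture(n_components=2, covariance_type="full",
                           random_state=0).fit(X)
    for label, X in train.items()
}

def classify(segment):
    """segment: array of shape (n_samples, n_features) for one maneuver."""
    scores = {label: m.score_samples(segment).sum() for label, m in models.items()}
    return max(scores, key=scores.get)

test_segment = rng.normal([1.0, 0.0, 1.0], 0.3, size=(50, 3))
print(classify(test_segment))   # expected: "climb"
```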
9

Transparent Forecasting Strategies in Database Management Systems

Fischer, Ulrike, Lehner, Wolfgang 02 February 2023 (has links)
Whereas traditional data warehouse systems assume that data is complete or has been carefully preprocessed, increasingly more data is imprecise, incomplete, and inconsistent. This is especially true in the context of big data, where massive amounts of data arrive continuously in real time from vast data sources. Moreover, modern data analysis involves sophisticated statistical algorithms that go well beyond traditional BI and, additionally, is increasingly performed by non-expert users. Both trends require transparent data mining techniques that efficiently handle missing data and present a complete view of the database to the user. Time series forecasting estimates future, not yet available, data of a time series and represents one way of dealing with missing data. Moreover, it enables queries that retrieve a view of the database at any point in time - past, present, and future. This article presents an overview of forecasting techniques in database management systems. After discussing possible application areas for time series forecasting, we give a short mathematical background on the main forecasting concepts. We then outline various general strategies for integrating time series forecasting inside a database and discuss some individual techniques from the database community. We conclude the article by introducing a novel forecasting-enabled database management architecture that natively and transparently integrates forecast models.
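To make the "transparent forecasting" idea concrete, the sketch below answers an aggregate query whose range extends past the observed data by filling the future time points with a forecast. The simple exponential-smoothing model and the query helper are illustrative stand-ins, not the model hierarchy or query processing described in the article.

```python
# Hedged sketch: a query over future time points answered by transparently
# substituting forecast values for the missing observations.

def ses_forecast(history, steps, alpha=0.3):
    """Simple exponential smoothing: flat forecast equal to the last level."""
    level = history[0]
    for y in history[1:]:
        level = alpha * y + (1 - alpha) * level
    return [level] * steps

def query_sum(series, start, end):
    """SUM(value) over periods [start, end); future periods are forecast."""
    known = series[start:end]                 # observed part of the range
    missing = max(0, end - len(series))       # periods beyond the observed data
    forecast = ses_forecast(series, missing) if missing else []
    return sum(known) + sum(forecast)

# Example: 12 observed periods, query spans 4 future ones.
sales = [10, 12, 13, 12, 14, 15, 16, 15, 17, 18, 19, 20]
print(query_sum(sales, 8, 16))   # observed periods 8-11 plus 4 forecast periods
```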
10

Autour de quelques statistiques sur les arbres binaires de recherche et sur les automates déterministes / Around a few statistics on binary search trees and on accessible deterministic automata

Amri, Anis 19 December 2018 (has links)
This PhD thesis is divided into two independent parts. In the first part, we provide an asymptotic analysis of some statistics on binary search trees. In the second part, we study the coupon collector problem with a constraint. In the first part, following the model introduced by Aguech, Lasmar and Mahmoud [Probab. Engrg. Inform. Sci. 21 (2007) 133–141], the weighted depth of a node in a labelled rooted tree is the sum of all labels on the path connecting the node to the root. We analyze the following statistics in a random binary search tree: the weighted depths of nodes with given labels, of the last inserted node, and of nodes ordered as visited by the depth-first search process, as well as the weighted path length, the weighted Wiener index, and the weighted depths of nodes with at most one child. In the second part, we study the asymptotic shape of the completion curve of the collection conditioned on T_n ≤ (1+Λ), Λ > 0, where T_n ≃ n ln n is the time needed to complete the collection. Then, as an application, we study accessible deterministic automata and provide a new derivation of a formula due to Korsunov [Kor78, Kor86].
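For readers unfamiliar with the statistic analysed in the first part, the sketch below builds a random binary search tree and computes the weighted depth of each node, taken here as the sum of all keys on the root-to-node path (root key included; the exact convention used in the thesis may differ). It only illustrates the definition, not the asymptotic analysis.

```python
# Minimal sketch of the "weighted depth" statistic on a random BST.
import random

def insert(tree, key):
    """Insert key into a BST represented as nested dicts; return the tree."""
    if tree is None:
        return {"key": key, "left": None, "right": None}
    side = "left" if key < tree["key"] else "right"
    tree[side] = insert(tree[side], key)
    return tree

def weighted_depths(tree, path_sum=0, out=None):
    """Map each key to the sum of keys on its root-to-node path."""
    if out is None:
        out = {}
    if tree is None:
        return out
    total = path_sum + tree["key"]
    out[tree["key"]] = total
    weighted_depths(tree["left"], total, out)
    weighted_depths(tree["right"], total, out)
    return out

# Random BST on keys 1..10 (uniformly random insertion order).
keys = list(range(1, 11))
random.shuffle(keys)
bst = None
for k in keys:
    bst = insert(bst, k)
print(weighted_depths(bst))
```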
