131

Jeux concurrents enrichis : témoins pour les preuves et les ressources / Enriched concurrent games : witnesses for proofs and resource analysis

Alcolei, Aurore 17 October 2019 (has links)
This thesis presents a general framework for enriching causal concurrent games, built on event structures, with annotations. These annotations can be viewed as meta-data on strategies: they are modified throughout interactions but do not affect the flow of control. The data can be of various kinds; in particular, the enrichment is parametrised by an arbitrary multi-sorted (in)equational theory and can also reflect structure on the data, such as a partial order, and the annotated model preserves the compact closed categorical structure of the unannotated one. From a semantic point of view, the construction is motivated by problems from both logic and programming languages. On the logic side, the annotated games model specialised to first-order terms yields a novel interpretation of classical first-order proofs as concurrent strategies carrying first-order witnesses; in particular, this answers the question of giving a compositional version of Herbrand's theorem while avoiding the proof sequentialization required by other denotational approaches. On the programming-language side, annotations on games provide intrinsically quantitative models, which can be used to give denotational semantics for resource-consumption analysis (for instance of time) in concurrent higher-order programming languages with shared memory. These enrichments, strongly tied to the causal structure of concurrent games, argue in favor of a causal reading of computation.
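
A standard textbook illustration of the first-order witnesses mentioned above (not taken from the thesis): the formula ∃x (P(x) → P(f(x))) is valid but has no single witnessing term; its Herbrand disjunction (P(c) → P(f(c))) ∨ (P(f(c)) → P(f(f(c)))) is a propositional tautology, so the witness information is a finite set of terms rather than one term, and a compositional interpretation has to track how such terms are produced and exchanged when proofs interact.
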
132

Respuesta sísmica de estructuras de concreto armado con un análisis tiempo historia no lineal usando acelerogramas artificiales / Seismic response of reinforced concrete structures by a nonlinear time-history analysis using artificial accelerograms

Coronel Huanca, Dennys Luis, Mamani Rojas, Marcos Visney 01 December 2020 (has links)
The scarcity of large-magnitude seismic records in some regions of the world limits the determination of the seismic response of a building. Artificial accelerograms therefore offer an alternative way to define the seismic event, because they take into account the specific conditions of the site under study. This research analyses the seismic response of structures for several artificial seismic records generated from design spectra for different geotechnical conditions. A nonlinear dynamic time-history analysis is used to obtain greater precision in the seismic response. The results show that the inter-storey drifts of the artificial signals created with the Liu intensity function best fit the drifts obtained from the scaled real earthquake record. / Research paper
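
The general recipe behind spectrum-based artificial accelerograms can be sketched as follows: a stationary sum of sinusoids with random phases is shaped by an intensity (envelope) function, and in practice the response spectrum of the result is then iteratively matched to the target design spectrum. The trapezoidal envelope and all numbers below are hypothetical stand-ins; the Liu intensity function used in the study is not reproduced here, and no spectrum-matching step is included.

# Hedged sketch: envelope-modulated sum of random-phase sinusoids (Python).
import numpy as np

rng = np.random.default_rng(1)
dt, duration = 0.01, 30.0
t = np.arange(0.0, duration, dt)

def envelope(t, rise=5.0, strong=15.0):
    """Trapezoidal intensity function: ramp up, strong-motion plateau, exponential decay."""
    env = np.ones_like(t)
    env[t < rise] = t[t < rise] / rise
    decay = t > strong
    env[decay] = np.exp(-0.3 * (t[decay] - strong))
    return env

freqs = np.arange(0.5, 25.0, 0.5)                  # Hz, band of engineering interest
phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
amps = np.ones_like(freqs)                         # would be derived from the target design spectrum
stationary = sum(a * np.cos(2 * np.pi * f * t + p) for a, f, p in zip(amps, freqs, phases))
accelerogram = envelope(t) * stationary / len(freqs)
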
133

Teknikutrymmen på yttertak : Stommaterialets påverkan på materialkostnaden och byggtiden / Plant rooms on roofs : The impact of the structural material on the material cost and construction time

Du, Jenny, Sjögren, Freja January 2018 (has links)
The construction industry constantly strives for efficiency and optimization that can reduce a project's construction cost and construction time. The construction of plant rooms on roofs is perceived by both structural engineers and project managers as a step that carries a high cost and a long construction time. A plant room on a roof has different conditions from the underlying building, which can delay production and extend the construction time. One condition that separates the plant room from the building below is that the plant room's outer walls require an offset from the facade line, so the outer walls are not always placed in line with the underlying vertical structure. The loads carried down through the plant room's outer walls create point loads on the plant room's floor slab. These point loads may require the slab to be strengthened, which is both costly and time-consuming. An investigation of the structural material's impact on the material cost and construction time of a plant room has been carried out. Three material options were analysed against the conditions and requirements of a specific reference project: a steel frame, lightweight timber elements, and concrete. The aim of the investigation is to determine which structural material is optimal for the plant room with regard to cost, time and load-bearing capacity. The results show that lightweight timber elements give the lowest material cost and that concrete gives the shortest construction time. The conclusion is that the optimal structural material for the reference project's plant room is concrete.
134

Techniques d'analyse et d'optimisation pour la synthèse architecturale de systèmes temps réel embarqués distribués : problèmes de placement, de partitionnement et d'ordonnancement / Analysis and optimization techniques for the architectural synthesis of real time embedded and distributed systems

Mehiaoui, Asma 16 June 2014 (has links)
Modern development methodologies in industry and academia increasingly exploit the concept of a "model" to address the complexity of critical real-time systems. In particular, they define a key stage in which a functional model, designed as a network of function blocks communicating through exchanged data signals, is deployed onto a hardware execution platform model and implemented in a software model consisting of a set of tasks and messages. This deployment stage establishes an operational architecture of the system, which must then be evaluated and validated with respect to its temporal properties. In the context of event-driven real-time systems, the verification of temporal properties is performed using schedulability analysis based on response-time analysis. Each deployment choice has an essential impact on the validity and the quality of the system. However, existing methodologies do not provide support to guide the application designer in the exploration of the space of operational architectures. The objective of this thesis is to develop analysis and automatic synthesis techniques that guide the designer towards a valid operational architecture optimized with respect to the system performance. The proposal addresses the exploration of the architecture space considering, at the same time, the four degrees of freedom determined during the deployment phase: (i) the placement of functional elements onto the computing and communication resources of the execution platform, (ii) the partitioning of functional elements into real-time tasks and of data signals into messages, (iii) the assignment of execution priorities to tasks and messages, and (iv) the assignment of the protection mechanism for shared data in periodic real-time systems. We are mainly interested in satisfying the timing constraints and the capacity constraints of the target platform resources, and in optimizing end-to-end latencies and memory consumption. The design-space exploration approaches presented in this thesis are based on MILP (mixed integer linear programming) optimization and apply both to periodically activated applications and to data-driven ones. Unlike many earlier approaches that provide only a partial solution to the deployment problem, the proposed methods consider the deployment problem as a whole. The approaches are evaluated on both synthetic and industrial applications.
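
The deployment exploration described above is, in spirit, a search over placements checked by schedulability analysis. The sketch below illustrates only that spirit: an exhaustive task-to-core placement search under deadline-monotonic priorities, with the classic response-time recurrence R = C + sum over higher-priority tasks of ceil(R/T_j)*C_j. It is not the thesis's MILP formulation, and the task set is hypothetical.

# Hedged sketch: placement exploration with response-time analysis (Python).
import itertools
import math

def response_time(task, higher_prio):
    """Worst-case response time by fixed-point iteration."""
    r = task["C"]
    while True:
        r_next = task["C"] + sum(math.ceil(r / hp["T"]) * hp["C"] for hp in higher_prio)
        if r_next == r:
            return r
        if r_next > task["D"]:
            return math.inf      # deadline already exceeded
        r = r_next

def schedulable(core_tasks):
    """Deadline-monotonic priorities: shorter deadline = higher priority."""
    ordered = sorted(core_tasks, key=lambda t: t["D"])
    return all(response_time(t, ordered[:i]) <= t["D"] for i, t in enumerate(ordered))

tasks = [  # hypothetical task set: execution time C, period T, deadline D (ms)
    {"name": "t1", "C": 2, "T": 10, "D": 10},
    {"name": "t2", "C": 4, "T": 20, "D": 20},
    {"name": "t3", "C": 5, "T": 25, "D": 25},
    {"name": "t4", "C": 8, "T": 40, "D": 40},
]
cores = 2
valid = []
for placement in itertools.product(range(cores), repeat=len(tasks)):
    per_core = [[t for t, c in zip(tasks, placement) if c == k] for k in range(cores)]
    if all(schedulable(ts) for ts in per_core):
        valid.append(placement)
print(f"{len(valid)} schedulable placements out of {cores ** len(tasks)}")
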
135

Využití a interpretace seismických povrchových vln v širokém oboru frekvencí / Application and interpretation of seismic surface waves in broad frequency range

Gaždová, Renata January 2012 (has links)
The submitted Ph.D. thesis concerns the application and interpretation of seismic surface waves over a broad range of frequencies and scales. Using surface waves as a supplement to methods based on body waves is worth the effort: surface-wave interpretation can provide new information about the studied medium and, in some cases, overcome the limitations of other seismic techniques. Moreover, surface waves are usually present in measured records, so using them does not require modifying the standard measuring procedures. One of the results of this thesis is an original algorithm for calculating dispersive waveforms. The program works in an arbitrary range of frequencies and scales. The input parameter for the calculation is the dispersion curve; in this respect the algorithm differs from all other approaches used so far. The algorithm is based on a summation of frequency components with time shifts corresponding to the velocity dispersion and the propagation distance. The resulting waveform contains only an individual dispersive wave of the selected mode, which makes it particularly suitable for testing methodologies for dispersive-wave analysis. The algorithm was implemented in the program DISECA. Furthermore, a new procedure was designed to calculate the dispersion...
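
The summation idea can be sketched directly: each frequency component is delayed by the propagation distance divided by its phase velocity, and the components are summed. The linear dispersion curve and all parameters below are hypothetical illustrations; this is not the DISECA code.

# Hedged sketch: dispersive waveform from a prescribed dispersion curve (Python).
import numpy as np

def dispersive_waveform(distance, freqs, phase_velocity, t, amplitude=None):
    """Sum cosine components, each delayed by distance / c(f)."""
    if amplitude is None:
        amplitude = np.ones_like(freqs)
    u = np.zeros_like(t)
    for a, f in zip(amplitude, freqs):
        delay = distance / phase_velocity(f)   # arrival time of this frequency component
        u += a * np.cos(2 * np.pi * f * (t - delay))
    return u / len(freqs)

# hypothetical dispersion: phase velocity decreasing from 800 m/s at 2 Hz to 350 m/s at 20 Hz
c_of_f = lambda f: 800.0 - 25.0 * (f - 2.0)
t = np.linspace(0.0, 5.0, 2001)                # 5 s record, 2.5 ms sampling
freqs = np.linspace(2.0, 20.0, 181)
trace = dispersive_waveform(distance=1000.0, freqs=freqs, phase_velocity=c_of_f, t=t)
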
136

Investigations on CPI Centric Worst Case Execution Time Analysis

Ravindar, Archana January 2013 (has links) (PDF)
Estimating the worst-case execution time (WCET) of a program is an important problem in the domain of real-time and embedded systems, which are deadline-centric. If the WCET of a program is found to exceed the deadline, either the program is recoded or the target architecture is modified to meet the deadline. There are three broad approaches to estimating WCET: static WCET analysis, hybrid measurement-based analysis and statistical WCET analysis. Although measurement-based analysers benefit from knowledge of run-time behaviour, the amount of instrumentation they need remains a concern. This thesis proposes a CPI-centric WCET analyser that estimates WCET as the product of the worst-case instruction count (IC), estimated using static analysis, and the worst-case cycles per instruction (CPI), computed using a function of measured CPI. In many programs, the IC and CPI values are observed to be correlated, and five different kinds of correlation are found. This correlation lets us tighten the WCET estimate from the product of worst-case IC and worst-case CPI to the product of worst-case IC and the corresponding CPI. A prime advantage of viewing time in terms of CPI is that it lets us exploit program phase behaviour. In many programs, CPI varies in phases during execution. Within each phase, the variation is homogeneous and lies within a few percent of the mean, and the coefficient of variation of CPI across phases is much greater than within a phase. Using this observation, we estimate program WCET in terms of its phases. Because of the nature of the CPI variation within a phase in such programs, we can use a simple probabilistic inequality, the Chebyshev inequality, to bound CPI within a desired probability. In some programs that execute many paths depending on if-conditions, CPI variation is observed to be high. The thesis proposes a PC signature, a low-cost way of profiling path information, which is used to isolate points of high CPI variation and to divide a phase into smaller sub-phases of lower CPI variation. Applying the Chebyshev inequality to the sub-phases then yields much tighter bounds. A phase can also be divided into smaller sub-phases based on an allowable variance of CPI within a sub-phase. The proposed technique is implemented on simulators and on a native platform. Other advantages of phases in the context of timing analysis are also presented, including parallelized WCET analysis and estimation of the remaining worst-case execution time for a particular program run.
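
The probabilistic bound mentioned above follows from the Chebyshev inequality P(|X - mu| >= k*sigma) <= 1/k^2: choosing k from the desired confidence gives a CPI upper bound mu + k*sigma for a phase, and the phase WCET is then bounded by the worst-case instruction count times that CPI bound. The sketch below uses hypothetical measurements, not the thesis's data.

# Hedged sketch: Chebyshev-based CPI bound and WCET estimate (Python).
import math
import statistics

def cpi_upper_bound(cpi_samples, confidence=0.99):
    """CPI bound mu + k*sigma such that P(CPI >= bound) <= 1 - confidence (Chebyshev)."""
    mu = statistics.mean(cpi_samples)
    sigma = statistics.pstdev(cpi_samples)
    k = math.sqrt(1.0 / (1.0 - confidence))    # from 1/k**2 = 1 - confidence
    return mu + k * sigma

phase_cpi = [1.42, 1.45, 1.43, 1.47, 1.44, 1.46]   # hypothetical CPI samples within one phase
worst_case_ic = 2_000_000                          # hypothetical worst-case IC from static analysis
wcet_cycles = worst_case_ic * cpi_upper_bound(phase_cpi, confidence=0.99)
print(f"WCET bound: {wcet_cycles:.0f} cycles")
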
137

Statické řešení střešní konstrukce / Static design of roof structure

Duda, Tomáš January 2014 (has links)
The subject of this diploma thesis is the design and static analysis of a roof structure: a prestressed roof shell (membrane) supported by a cable. The thesis is divided into several separate units according to the task specification. The text section introduces the issue of prestressed suspended cable structures and shows their possible shapes, types and realizations. It also contains the technical report of the analysed shell, written on the basis of the project documents, the produced drawings and the structural (static) analysis.
138

Model-Based Exploration of Parallelism in Context of Automotive Multi-Processor Systems

Höttger, Robert Martin 15 July 2021 (has links)
This dissertation, entitled 'Model-Based Exploration of Parallelism in the Context of Automotive Multi-Core Systems', deals with the analytical investigation of different temporal relationships in automotive multi-processor systems subject to critical, embedded, real-time, distributed, and heterogeneous domain requirements. Vehicle innovation increasingly demands high-performance platforms, for example for highly assisted or autonomous driving, so established software development processes must be examined, revised, and advanced. The goal is not to develop application software itself, but to improve the model-based development process, which is subject to numerous constraints and requirements. Model-based software development is an established process that allows systems to be analysed and simulated in an abstracted, standardized, modular, isolated, or integrated manner. Verifying real-time behaviour while taking into account various constraints and modern architectures, which include graphics and heterogeneous processors as well as dedicated hardware accelerators, is one of many challenges in the real-time and automotive community. The distribution of software across hardware entities and the identification of software that can be executed in parallel are crucial steps in the development process. Since these steps usually optimize one or more properties, they belong to the class of problems that are only known to be solvable in polynomial time by non-deterministic methods, and they therefore rely on (meta-)heuristics. Such (meta-)heuristics require sophisticated implementation and configuration, because the properties to be optimized are usually subject to many different analyses. With the results of this dissertation, development processes can be adapted to modern architectures through new and extended processes that, on the one hand, enable future computationally intensive vehicle applications and, on the other hand, improve existing processes in terms of efficiency and effectiveness. These processes include runnable partitioning, task mapping, data allocation, and timing verification, which are addressed with the help of constraint programming, genetic algorithms, and heuristics.
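
One of the steps listed above, runnable partitioning, can be illustrated with a deliberately simple heuristic: greedily assign runnables of the same period to the currently least-loaded task, in longest-processing-time-first order. This is a hedged stand-in with hypothetical runnables, not the dissertation's constraint-programming or genetic-algorithm formulation.

# Hedged sketch: greedy runnable-to-task partitioning balancing execution load (Python).
from collections import defaultdict

runnables = [  # (name, period in ms, execution time in ms): hypothetical values
    ("r1", 10, 0.8), ("r2", 10, 0.5), ("r3", 10, 0.9), ("r4", 10, 0.2),
    ("r5", 20, 1.5), ("r6", 20, 0.7), ("r7", 20, 0.4),
]

def partition(runnables, tasks_per_period=2):
    """Assign each runnable to the least-loaded task of its period (LPT order)."""
    loads_by_period = defaultdict(lambda: defaultdict(float))   # period -> task index -> load
    mapping = {}
    for name, period, c in sorted(runnables, key=lambda r: -r[2]):
        loads = loads_by_period[period]
        idx = min(range(tasks_per_period), key=lambda i: loads[i])
        loads[idx] += c
        mapping[name] = (period, idx)
    return mapping, loads_by_period

mapping, loads_by_period = partition(runnables)
for period, loads in loads_by_period.items():
    print(period, dict(loads))
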
139

Two-phase WCET analysis for cache-based symmetric multiprocessor systems

Tsoupidi, Rodothea Myrsini January 2017 (has links)
The estimation of the worst-case execution time (WCET) of a task is a problem that concerns the field of embedded systems and, especially, real-time systems. Estimating a safe WCET for single-core architectures without speculative mechanisms is already a challenging task and an active research topic, and the advent of advanced hardware mechanisms, which often lack predictability, further complicates current WCET analysis methods. The field of embedded systems has high safety requirements and is therefore conservative with speculative mechanisms; nevertheless, even safety-critical applications are nowadays moving in the direction of multiprocessor systems. In a multiprocessor system, each task that runs on a processing unit may affect the execution time of the tasks running on other processing units. In shared-memory symmetric multiprocessor systems, this interference occurs through the shared memory and the common bus, and the presence of private caches introduces cache-coherence issues that result in further dependencies between the tasks. The purpose of this thesis is twofold: (1) to evaluate the feasibility of an existing one-pass WCET analysis method with an integrated cache analysis and (2) to design and implement a cache-based multiprocessor WCET analysis by extending the single-core method. The single-core analysis is part of the KTH Timing Analysis (KTA) tool, whose WCET analysis uses Abstract Search-based WCET Analysis, a one-pass technique based on abstract interpretation. Evaluating the feasibility of this analysis includes integrating microarchitectural features, such as the cache and the pipeline, into KTA; these features are necessary for extending the analysis to hardware models of modern embedded systems. The multiprocessor analysis of this work uses the single-core analysis in two stages to estimate the WCET of a task running in the presence of temporally and spatially interfering tasks. The first phase records the memory accesses of all the temporally interfering tasks, and the second phase uses this information to perform the multiprocessor WCET analysis. The multiprocessor analysis assumes private caches and a shared communication bus, and implements the MESI protocol to maintain cache coherence.
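
A hedged sketch of the two-phase idea: phase one collects the memory blocks touched by the temporally interfering tasks, and phase two inflates the analysed task's isolated WCET with a worst-case penalty for each of its accesses that may conflict on a shared block. All addresses and the penalty are hypothetical, and the real analysis in KTA works on abstract states rather than concrete traces.

# Hedged sketch: two-phase interference accounting (Python).
def phase1_record(interfering_traces):
    """Collect the set of memory blocks accessed by any interfering task."""
    shared = set()
    for trace in interfering_traces:
        shared.update(trace)
    return shared

def phase2_wcet(isolated_wcet, task_trace, shared_blocks, penalty_cycles=40):
    """Add a worst-case bus/coherence penalty for each access that may conflict."""
    conflicts = sum(1 for block in task_trace if block in shared_blocks)
    return isolated_wcet + conflicts * penalty_cycles

# hypothetical block addresses, already at cache-line granularity
interfering = [[0x100, 0x140, 0x180], [0x140, 0x200]]
task_trace = [0x100, 0x140, 0x1C0, 0x300]
shared = phase1_record(interfering)
print(phase2_wcet(isolated_wcet=12_000, task_trace=task_trace, shared_blocks=shared))
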
140

A Case Study: Optimising PAP ambulance location with data and travel time analysis

Lukas, Kurasinski, Jason, Tan January 2022 (has links)
Mental health concerns in Sweden have been increasing since the beginning of the 2000s, and Skåne County in the south of Sweden has shown a slightly higher proportion of reported cases than other regions. To address the growing need for psychiatric healthcare, the health services of Region Skåne have introduced a psychiatric ambulance unit as part of its first responders. The Prehospital Acute Psychiatry (PAP) ambulance is staffed by healthcare professionals trained in mental health issues, and Region Skåne's goal is to reach 90% of the population within 20 minutes and 99% of the population within 35 minutes. This case study aims to provide valuable and useful information to the decision-makers in Region Skåne when placing additional PAP ambulance units. The placement of a PAP ambulance at an ambulance station uses a previously created optimisation model and is based on data and travel-time analysis. The data analysis consists of K-Means clustering and linear regression, used to find similarities in the data as well as trends in the number of cases. The travel-time analysis and the area and population coverage are based on 20, 35, and 60 minutes of travel time from a station; the travel time depends on road conditions as well as population density when considering stations for additional PAP ambulances. The Malmö, Helsingborg, and Kristianstad/Hässleholm PAP stations are shown to be optimal choices, owing to favourable road conditions and densely populated surroundings: ambulances placed at these stations can cover much ground while also being able to attend to a major portion of the population. The data analysis also shows that it is beneficial to place ambulances at these stations, because these areas show an increasing trend of mental illness cases and a medium to high number of cases relative to the rest of Skåne.
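
As a hedged illustration of the analysis pattern described above, the sketch below clusters case locations with a plain k-means and reports the share of cases within a travel-time threshold of the nearest candidate station. The coordinates, the straight-line speed conversion and the numbers are hypothetical simplifications; the study itself uses real case data and road-network travel times.

# Hedged sketch: k-means on case locations plus coverage share (Python).
import numpy as np

rng = np.random.default_rng(0)
cases = rng.uniform(low=[55.3, 12.8], high=[56.5, 14.4], size=(500, 2))  # hypothetical (lat, lon)

def kmeans(points, k=3, iters=50):
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = np.argmin(dists, axis=1)
        new_centers = []
        for j in range(k):
            members = points[labels == j]
            new_centers.append(members.mean(axis=0) if len(members) else centers[j])
        centers = np.array(new_centers)
    return centers, labels

def coverage(points, stations, minutes=20, speed_deg_per_min=0.02):
    """Fraction of cases reachable within `minutes`, using a crude straight-line speed."""
    dist = np.min(np.linalg.norm(points[:, None] - stations[None], axis=2), axis=1)
    return float(np.mean(dist <= minutes * speed_deg_per_min))

centers, _ = kmeans(cases, k=3)
print("cluster centres (candidate station areas):", centers.round(3))
print("share of cases within 20 min:", coverage(cases, centers, minutes=20))
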
