  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Game theoretic optimization for product line evolution

Song, Ruoyu 07 January 2016 (has links)
Product line planning aims at the optimal planning of product variety. Traditionally, the product line planning problem develops new product lines from product attributes without considering existing product lines. In reality, however, almost all new product lines evolve from existing ones, which leads to the product line evolution problem. Product line evolution involves trade-offs between the marketing and engineering perspectives: the marketing concern focuses on maximizing utility for customers, while the engineering concern focuses on minimizing engineering cost. Utility represents the satisfaction experienced by the customers of a product; engineering cost is the total cost of developing a product line. These two goals conflict, since high utility requires high-end product attributes, which increase engineering cost, and vice versa. Rather than aggregating both problems into a single-level optimization problem, this research treats the marketing and engineering concerns as what they inherently are: a non-collaborative game. A game-theoretic approach to the product line evolution problem is investigated, and a leader-follower joint optimization model is developed to reconcile the conflicting marketing and engineering goals within a coherent framework of game-theoretic optimization. To solve the joint optimization model efficiently, a bi-level nested genetic algorithm is developed. A case study of smart watch product line evolution illustrates the feasibility and potential of the proposed approach.
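The leader-follower structure described above can be sketched as a bi-level nested genetic algorithm: an outer GA searches leader proposals, and each fitness evaluation runs an inner GA for the follower's best response. This is a minimal illustrative sketch, not the thesis's model — the `utility` and `cost` functions, the attribute encoding, and all parameters are invented stand-ins.

```python
import random

# Hypothetical toy setup: a product line is 3 products x 4 attributes,
# each attribute at a discrete level 0..2. Both objectives are stand-ins.
N_PRODUCTS, N_ATTRS, LEVELS = 3, 4, 3

def utility(line):                      # leader objective (marketing)
    return sum(sum(p) for p in line)    # higher attribute levels -> more utility

def cost(line):                         # follower objective (engineering)
    return sum(l * l for p in line for l in p)  # high-end attributes cost more

def mutate(line, rate=0.2):
    return [[random.randrange(LEVELS) if random.random() < rate else l
             for l in p] for p in line]

def follower_best_response(line, gens=30, pop=20):
    """Inner (nested) GA: engineering minimizes cost around the leader's proposal."""
    population = [mutate(line) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=cost)
        population = population[:pop // 2]
        population += [mutate(random.choice(population)) for _ in range(pop // 2)]
    return min(population, key=cost)

def leader_fitness(line):
    """Leader evaluates utility at the follower's rational reaction."""
    reaction = follower_best_response(line)
    return utility(reaction) - 0.5 * cost(reaction)   # illustrative trade-off weight

def bilevel_ga(gens=20, pop=10):
    """Outer GA over leader proposals; each evaluation nests a follower GA."""
    population = [[[random.randrange(LEVELS) for _ in range(N_ATTRS)]
                   for _ in range(N_PRODUCTS)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=leader_fitness, reverse=True)
        population = population[:pop // 2]
        population += [mutate(random.choice(population)) for _ in range(pop // 2)]
    return max(population, key=leader_fitness)

best = bilevel_ga()
```

The nesting is the key design point: the leader cannot simply fix the follower's variables, so every outer fitness call re-solves the follower's problem.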

OPTIMIZED LOW BIT RATE PCM/FM TELEMETRY WITH WIDE IF BANDWIDTHS

Law, Eugene October 2002 (has links)
International Telemetering Conference Proceedings / October 21, 2002 / Town & Country Hotel and Conference Center, San Diego, California / This paper will present the results of some experiments with non-coherent, single-symbol detection of pulse code modulation (PCM)/frequency modulation (FM) where the receiver intermediate frequency (IF) bandwidth is much wider than the bit rate. The experiments involved varying the peak deviation and measuring the bit error probability (BEP) at various signal energy per bit to noise power spectral density ratios (Eb/N0). The experiments showed that the optimum peak-to-peak deviation was about 0.7 to 0.8 times the –3 dB IF bandwidth and that the Eb/N0 required for a given BEP increased as the ratio of IF bandwidth to bit rate increased. Further, bi-phase-level/FM performed slightly better than non-return-to-zero-level (NRZ-L)/FM with an ac-coupled RF signal generator and IF bandwidths much wider than the bit rate.
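For context, a classical baseline against which such measured curves are often compared is the BEP of non-coherent, orthogonal binary FSK with matched filtering. The closed form below is the standard textbook result, not the wide-IF PCM/FM measurements reported in this paper:

```python
import math

def bep_noncoherent_fsk(ebno_db):
    """Theoretical BEP for non-coherent orthogonal binary FSK:
    Pb = 0.5 * exp(-(Eb/N0) / 2). Baseline only, not the wide-IF result."""
    ebno = 10 ** (ebno_db / 10)          # dB -> linear ratio
    return 0.5 * math.exp(-ebno / 2)

def required_ebno_db(target_bep):
    """Invert the formula: Eb/N0 (dB) needed to reach a target BEP."""
    return 10 * math.log10(-2 * math.log(2 * target_bep))
```

For a target BEP of 1e-5 this baseline needs roughly 13.4 dB; the paper's finding is that widening the IF bandwidth relative to the bit rate pushes the required Eb/N0 above such a baseline.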

Examining the Souls Series' Level Design

Ribbing, Valdemar, Melander, Laban January 2016 (has links)
The Souls series has grown increasingly popular in recent years. The games have defined their own genre and are widely praised, not least for their level design. The purpose of this paper is to examine how the level design has developed from game to game. To gather data from the games, we selected a number of principles found in level-design literature, identified those principles in the games, and converted them into quantifiable data. We found that the results varied considerably, and it was difficult to discern any clear development. The study would need to be repeated on a larger scale to produce more accurate and interesting results; gathering player data could also reveal interesting findings, such as the paths players tend to traverse.

Unlabeled Level Planarity

Fowler, Joe January 2009 (has links)
Consider a graph G with vertex set V in which each of the n vertices is assigned a number from the set {1, ..., k} for some positive integer k. This assignment phi is a labeling if all k numbers are used. If phi does not assign adjacent vertices the same label, then phi partitions V into k levels. In a level drawing, the y-coordinate of each vertex matches its label and the edges are drawn strictly y-monotone. This leads to level drawings in the xy-plane where all vertices with label j lie along the line lj = {(x, j) : x in Reals} and where each edge crosses any of the k horizontal lines lj for j in [1..k] at most once. A graph with such a labeling forms a level graph and is level planar if it has a level drawing without crossings. We first consider the class of level trees that are level planar regardless of their labeling. We call such trees unlabeled level planar (ULP). We describe which trees are ULP and provide linear-time level planar drawing algorithms for any labeling. We characterize ULP trees in terms of two forbidden subdivisions, so that any other tree must contain a subtree homeomorphic to one of these. We also provide linear-time recognition algorithms for ULP trees. We then extend this characterization to all ULP graphs with five additional forbidden subdivisions, and provide linear-time recognition and drawing algorithms for any given labeling.
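The definition of a level labeling above can be checked directly. A small sketch — the function name and the adjacency-dict representation are ours, assuming vertices carry labels 1..k:

```python
def is_level_labeling(adj, phi, k):
    """Check that phi uses all labels 1..k and assigns no edge two equal
    labels, so phi partitions the vertices into k levels (as defined above).
    adj: {vertex: [neighbors]}, phi: {vertex: label}."""
    if set(phi.values()) != set(range(1, k + 1)):
        return False                      # some label in 1..k is unused
    return all(phi[u] != phi[v] for u in adj for v in adj[u])

# A path on three vertices: labels 1,2,1 form a valid 2-level labeling.
path = {1: [2], 2: [1, 3], 3: [2]}
```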

Modelling exchange rates and monetary policy in emerging Asian economies : non-linear econometric approach

Anwar, Muslimin January 2007 (has links)
In this thesis we examine the exchange rates and monetary policy of four emerging Asian countries: Indonesia, Malaysia, the Philippines and South Korea. We model equilibrium exchange rates using a general behavioural specification consistent with a variety of theoretical approaches, and short-run dynamics using a general non-linear adjustment model. We find that, in all countries examined, equilibrium nominal and real exchange rates are a function of permanent relative output and one or more variables from domestic and foreign price levels, nominal and real interest rate differentials, the level of and changes in net foreign assets, and a time trend. These results imply that individual countries exhibit significant elements of idiosyncratic behaviour, casting doubt on empirical models using panel-data techniques. We also obtain evidence of non-linear exchange rate dynamics, with the speed of adjustment to equilibrium being in all cases a function of the size, and in two cases the sign, of the misalignment term. With respect to monetary policy, we examine these countries' monetary policy reaction functions based on an open-economy augmented Taylor rule including the exchange rate and the foreign interest rate. Using a formal testing approach, our tests reject linearity, suggesting that monetary authorities in these four emerging economies are subject to non-linear inflation effects and respond more vigorously to inflation when it is further from the target. Our results also lead us to speculate that policymakers in three countries may have been attempting to keep inflation within a range, while those in the other country may have been pursuing a point inflation target. Finally, we find that monetary policy is asymmetric, as policymakers respond differently to upward and downward deviations of inflation from the target.
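An open-economy augmented Taylor rule with a non-linear inflation response might be sketched as follows. The cubic term is one common way to make the response strengthen as inflation moves further from target; the functional form and all coefficients here are illustrative, not the thesis's estimates:

```python
def policy_rate(inflation, target, output_gap, ex_rate_dev, foreign_rate,
                r_star=2.0, a=1.5, b=0.5, c=0.25, d=0.25, nonlin=0.3):
    """Illustrative open-economy Taylor rule (percent units throughout).
    The cubic gap term makes the inflation response non-linear: small
    deviations get a mild response, large deviations a vigorous one."""
    gap = inflation - target
    return (r_star + inflation
            + a * gap + nonlin * gap ** 3   # non-linear inflation response
            + b * output_gap
            + c * ex_rate_dev               # exchange rate deviation
            + d * foreign_rate)             # foreign interest rate
```

With the cubic term, the marginal rate response to a one-point inflation rise far from target exceeds the response near target, which is the qualitative pattern the linearity tests in the thesis point to.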

The impact of future sea-level rise on the London-Penzance railway line

Dawson, David January 2012 (has links)
The coastal section of the London to Penzance railway line (Dawlish-Teignmouth) lies very close to sea level and has been susceptible to frequent closure during high seas and storm events. As the main railway connection for the southwest of England to the rest of Great Britain, it is a vital transport link for the Devon and Cornwall economy. Current understanding of future sea-level rise in the region is compromised by a lack of reliable geological data on which to establish accurate future sea-level projections. Furthermore, the impacts – in engineering and economic terms – of potential sea-level change on the long-term functioning of the main railway are unclear, and future policy making and planning are compromised by a similar gap in scientific knowledge. The central aim of this thesis is to establish the extent to which future sea-level changes will impact upon the Southwest’s main railway line. This aim carries three objectives: (1) to establish accurate sea-level trends over the last 4000 years (late Holocene) in order to validate geophysical models used in current future sea-level projections in the southwest of England; (2) to establish the likely impacts of future sea-level change on the functioning of the Dawlish-Teignmouth railway line; and (3) to integrate climate and socio-economic futures (scenarios) in an internally consistent manner for future use in regional policy debates. In addressing these objectives, we estimate that during the last 2000 years the coast of south Devon has subsided at a rate of ~1.1 mm/yr, generating a relative sea-level rise of ~0.9 mm/yr. The geophysical model (used to determine regional sea-level projections) underestimates the geologically estimated coastal subsidence rate by only 17%, which would generate an additional sea-level rise, compared to predicted values, of 0.014 m by 2100. 
Based on an empirical trend between sea-level changes and rail functioning during the last 40 years, the corrected sea-level projections provide input for estimating future days with line restrictions due to overtopping on the Southwest Mainline. Impacts on both the Southwest economy (e.g., rail users) and the infrastructure owners have been determined, and integrating these forecasts with socio-economic scenarios (SES) has highlighted the important interaction between climate and socio-economic trends and future vulnerability. In a worst-case scenario (e.g., high emissions), rail services are predicted to be disrupted (on average) for around 35% of the winter by 2060. By this stage, the cost of these disruptions will have exceeded the capital needed to construct a new alternative inland route.
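The correction quoted above can be checked with back-of-envelope arithmetic. The accumulation period below is our assumption, since the abstract does not state the baseline year:

```python
# A 17% underestimate of ~1.1 mm/yr coastal subsidence means the geophysical
# model misses roughly 0.19 mm/yr of relative sea-level rise. Accumulated
# over ~75 years (assumed; baseline year not given in the abstract), that is
# about 0.014 m of additional rise by 2100, matching the figure in the text.
subsidence = 1.1                      # mm/yr, geological estimate
extra_rate = 0.17 * subsidence        # mm/yr missed by the model
years = 75                            # assumed accumulation period to 2100
extra_rise_m = extra_rate * years / 1000
```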

Complementing user-level coarse-grain parallelism with implicit speculative parallelism

Ioannou, Nikolas January 2012 (has links)
Multi-core and many-core systems are the norm in contemporary processor technology and are expected to remain so for the foreseeable future. Parallel programming is thus here to stay, and programmers have to embrace it if they are to exploit such systems for their applications. Programs using parallel programming primitives like PThreads or OpenMP often exploit coarse-grain parallelism, because it offers a good trade-off between programming effort and performance gain. Some parallel applications, however, show limited or no scaling beyond a certain number of cores. Given the abundant number of cores expected in future many-cores, several cores would remain idle in such cases while execution performance stagnates. This thesis proposes using the cores that do not contribute to performance improvement for running implicit fine-grain speculative threads. In particular, we present a many-core architecture and protocols that allow applications with coarse-grain explicit parallelism to further exploit implicit speculative parallelism within each thread. We show that complementing parallel programs with implicit speculative mechanisms offers significant performance improvements for a large and diverse set of parallel benchmarks. Implicit speculative parallelism frees the programmer from the additional effort of explicitly partitioning the work into finer, properly synchronized tasks. Our results show that, for a many-core comprising 128 cores supporting implicit speculative parallelism in clusters of 2 or 4 cores, performance improves on top of the highest scalability point by 44% on average for the 4-core cluster and by 31% on average for the 2-core cluster. We also show that this approach often leads to better performance and energy efficiency compared to existing alternatives such as Core Fusion and Turbo Boost. Moreover, we present a dynamic mechanism to choose the number of explicit and implicit threads, which performs within 6% of the static oracle selection of threads.
To improve energy efficiency, processors allow for Dynamic Voltage and Frequency Scaling (DVFS), which enables changing their performance and power consumption on the fly. We evaluate the amenability of the proposed explicit-plus-implicit threads scheme to traditional power management techniques for multithreaded applications and identify room for improvement. We thus augment prior schemes and introduce a novel multithreaded power management scheme that accounts for implicit threads and aims to minimize the Energy·Delay² product (ED²). Our scheme comprises two components: a “local” component that tries to adapt to the different program phases on a per-explicit-thread basis, taking into account implicit thread behavior, and a “global” component that augments the local components with information regarding inter-thread synchronization. Experimental results show a reduction in ED² of 8% compared to having no power management, with an average reduction in power of 15% that comes at a minimal loss of performance of less than 3% on average.
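The ED² objective the power manager minimizes is simple to state, and choosing a DVFS operating point by it can be sketched directly. The operating points below are invented numbers, not measurements from the thesis:

```python
def ed2(energy_j, delay_s):
    """Energy-Delay^2 product: weights performance (delay) more heavily
    than energy, the metric the power manager minimizes."""
    return energy_j * delay_s ** 2

# Hypothetical V/f operating points: (average power in W, runtime in s).
# Lowering frequency saves power but inflates delay; with ED2's quadratic
# delay weighting, the fast point can still win.
points = {
    "high":   (100.0, 1.0),
    "medium": ( 60.0, 1.4),
    "low":    ( 35.0, 2.0),
}
# Energy = power * delay, so ED2 per point is P * d^3.
best = min(points, key=lambda k: ed2(points[k][0] * points[k][1], points[k][1]))
```

Here the "high" point has the lowest ED² even though it draws the most power, which is exactly why ED²-driven management differs from plain energy minimization.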

Ljusets påverkan på ljudnivån i ett klassrum i grundskolan / The effect of lighting on the sound level in a primary school classroom

Sethberg, Frida, Wik, Nina January 2016 (has links)
A good work environment in school is crucial for effective teaching and learning. Two factors with a strong impact on the classroom environment are the lighting and the sound level. Research has shown that both light and sound affect health, and while a lot of research has been done on the work environment and the factors influencing it, very few studies have focused on how light and sound affect each other. The purpose of this study is to shed light on how lighting can affect the sound level in a classroom. The results could serve as a reference for how lighting should be set up to create the most effective work environment in a classroom. To study whether the sound level changes depending on the lighting, a quantitative field experiment was performed in a classroom of 6th graders in primary school. During 12 weekdays over a period of 3 weeks, decibel values were logged to find out whether the sound level changed. During the first week, the original lighting was used; during the second week, the light was dimmed; and during the third and final week, the light was unevenly distributed. The results show that the sound level is indeed affected by the lighting in the classroom. The unevenly distributed lighting setup, with a focus on vertical surfaces, resulted in the lowest sound level. One conclusion that can be drawn is that more focus on the ambient light affects the students positively. The most important conclusion is that a lighting setup with different luminaires and the ability to vary light levels and distribution depending on the time of day and season affects the sound level and work environment in the classroom positively.
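One practical detail when summarizing logged decibel values per lighting condition: sound levels are logarithmic, so averages should be taken in the linear power domain rather than over the dB values themselves. A sketch — the function is ours, not taken from the study:

```python
import math

def mean_db(levels_db):
    """Energetically average sound levels: convert dB to linear power,
    take the arithmetic mean, convert back. Averaging the dB values
    directly would understate the contribution of loud events."""
    powers = [10 ** (l / 10) for l in levels_db]
    return 10 * math.log10(sum(powers) / len(powers))
```

For example, a week logged at a steady 60 dB averages to 60 dB, but a week alternating 50 dB and 70 dB averages to roughly 67 dB, not 60.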

Examining Levels of Automation in the Wood Processing Industry - A case study

Schneider, Christian, Andersson, Oscar January 2016 (has links)
Companies operating in the wood processing industry need to increase their productivity by implementing automation technologies in their production systems, as increasing global competition and rising raw material prices challenge their competitiveness. Yet too extensive automation brings risks such as deteriorated situation awareness and operator deskilling. The concept of Levels of Automation is generally seen as a means to achieve a balanced task allocation between the operators' skills and competences and the automation technology relieving humans from repetitive or hazardous work activities. The aim of this thesis was to examine to what extent existing methods for assessing Levels of Automation in production processes are applicable in the wood processing industry when focusing on improved competitiveness of production systems. This was done by answering the following research questions (RQ): RQ1: Which method is most appropriate for measuring Levels of Automation in the wood processing industry? RQ2: How can the measurement of Levels of Automation contribute to improved competitiveness of the wood processing industry's production processes? Literature reviews were used to identify the main characteristics of the wood processing industry affecting its automation potential, and appropriate assessment methods for Levels of Automation, in order to answer RQ1. When selecting the most suitable method, factors like relevance to the target industry, application complexity, and the operational level the method penetrates were important. The DYNAMO++ method, which covers both a rather quantitative technical-physical dimension and a more qualitative social-cognitive dimension, was seen as most appropriate in light of these factors.
To answer RQ2, a case study was undertaken at a major Swedish manufacturer of interior wood products to point out how the measurement of Levels of Automation can contribute to improved competitiveness of the wood processing industry. The focus was on the task level on the shop floor, and concrete improvement suggestions were elaborated after applying the measurement method for Levels of Automation. The main aspects considered for generalization were enhancements regarding ergonomics in process design and cognitive support tools for shop-floor personnel through task standardization. Furthermore, difficulties in automating grading and sorting processes, due to the heterogeneous material properties of wood, argue for a suitable arrangement of human intervention options in terms of work task allocation. The application of a modified version of DYNAMO++ revealed its pros and cons during a case study that featured high operator involvement in the improvement process, as well as the distinct predisposition of DYNAMO++ toward application in assembly systems.

Génération de maillage à partir d'images 3D en utilisant l'adaptation de maillage anisotrope et une équation de réinitialisation / Direct multiphase mesh generation from 3D images using anisotropic mesh adaptation and a redistancing equation

Zhao, Jiaxin 03 March 2016 (has links)
Imaging techniques have improved considerably over the last decades.
They may accurately provide numerical descriptions from 2D or 3D images, revealing inner structure not otherwise visible, with applications in fields such as medical studies, materials science, and urban environments. In this work, a technique to build a numerical description in mesh format has been implemented and used in numerical simulations coupled to finite element solvers. First, mathematical morphology techniques are introduced to handle image information, providing the specific features of interest for the simulation. The immersed image method is then proposed to interpolate the image information on a mesh. Next, an iterative anisotropic mesh adaptation operator is developed to construct the optimal mesh, based on the estimated error of the image interpolation. The mesh is thus constructed directly from the image information. We also propose a new methodology to build a regularized phase function, corresponding to the objects we wish to distinguish in the image, using a redistancing method. Two main advantages of such a function are that its gradient performs better for mesh adaptation, and that it may be used directly by the finite element solver. Stabilized finite element flow and advection solvers are coupled to the constructed anisotropic mesh and the redistancing function, allowing application to multiphase flow numerical simulations. All these developments have been extended to a massively parallel context. An important objective of this work is the simplification of image-based computations, through a modified way to segment the image, coupled with an automatic way to construct the mesh used in the finite element simulations.
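The redistancing idea — rebuilding a signed-distance function from the zero level set of a phase function, then smoothing it into a bounded function whose gradient behaves well for mesh adaptation — can be sketched in 1D. This brute-force version is illustrative only, not the PDE-based reinitialization equation developed in the thesis:

```python
import math

def redistance_1d(phi, dx):
    """Rebuild each node's value as its signed distance to the zero level
    set of phi, locating interface crossings by linear interpolation
    between sign changes on a uniform grid of spacing dx."""
    crossings = []
    for i in range(len(phi) - 1):
        if phi[i] == 0:
            crossings.append(i * dx)
        elif phi[i] * phi[i + 1] < 0:
            t = phi[i] / (phi[i] - phi[i + 1])   # fraction of cell to the zero
            crossings.append((i + t) * dx)
    return [math.copysign(min(abs(i * dx - c) for c in crossings), v or 1)
            for i, v in enumerate(phi)]

def regularize(dist, eps):
    """Smooth the signed distance into a bounded phase function; eps sets
    the interface thickness (a tanh profile, one common choice)."""
    return [math.tanh(d / eps) for d in dist]
```

A solver can then adapt the mesh to the gradient of the regularized function, which is steep only in a band of width ~eps around the interface.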
