431.
Viktiga faktorer produktutveckling enligt set based concurrent engineering / Important factors when developing products using set based concurrent engineering. Häkkinen, Markus. January 2016.
Product development usually follows the same general approach: a specification for a new product is passed from the marketing department to the product development department, product developers then generate concepts, and the concepts are developed into prototypes before the products are manufactured and sold. The approach usually differs only in which model, such as Lean product development or integrated product development, is used in the process. The part of Lean product development (LPD) used during concept development is called set based concurrent engineering (SBCE), and these concepts are relatively new in Sweden. What is required to work with set based concurrent engineering successfully? Is it possible to identify important factors when companies develop products using SBCE? A literature study, which resulted in a list of five potentially important factors for product development with SBCE, was carried out ahead of a qualitative study of five companies. Semi-structured interviews were conducted at Husqvarna, Saab, Furhoffs, Ericsson Radio and GKN Aerospace to gather data for the analysis. In the analysis, the companies' ways of developing products were compared with the list produced by the literature study. The study concluded that important factors for succeeding with product development according to SBCE include a wide solution space, knowledge-based screening of concepts, recycling of knowledge, technically competent project management, and cross-functional teams. The study also showed that investing in the right type of leadership can be an important factor when implementing SBCE, since company management must understand the working model if SBCE is to have a positive effect.
432.
'O Hidden Face!': an analysis and contextualisation of Priaulx Rainier's 'Requiem'. Van Rhyn, Chris.
Thesis (MMUS (Music))--Stellenbosch University, 2010. / ENGLISH ABSTRACT: The South African-born British composer Priaulx Rainier (1903-1986) wrote her
Requiem (1955-1956) for solo tenor and choir to a text by surrealist poet David
Gascoyne. The poem (completed in 1940) contradicts the commemorative genre of the
requiem and instead anticipates the prospective victims of the war that was to come. Due
to the disturbances caused by the war, Rainier only started working on the music fifteen
years later in 1955.
Existing discourse on Rainier has been shaped by the thoughtless regurgitation of
opinions, reviews and clichéd biographical models. Since very few detailed analyses of
Rainier’s works exist, this thesis attempts to address this gap in research on this
composer. It therefore aims to contribute to a more balanced, evidence-based discourse.
The significance of the Requiem is that it was said by commentators to indicate a period
of increasing abstraction in Rainier’s oeuvre. The findings regarding tonality in this work
may therefore serve as a point of reference in future analyses of works preceding and
following the Requiem.
In the literature review, recurring issues in the discourse on Rainier (such as the
numerous references to her childhood in Natal as an influence on her works and the
descriptions of her works as possessing a masculine gender identity) are highlighted. The
contextualisation that follows includes a reception-based periodisation of Rainier’s works
and a “biography” of the Requiem, with a special focus on the intersection between the
symbolic world of David Gascoyne and the Requiem and the sculptor Barbara
Hepworth’s influence on Rainier’s work. Underpinning these contextual considerations
is a set theory analysis of the work that attempts to illustrate the composer’s aspiration
towards musical abstraction as a creative force. The findings of the analysis are also
contextualised with regard to existing notions of Rainier's style and tonality in her music.
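As a hedged illustration of what a pitch-class set theory analysis involves (a generic sketch of the standard normal-form/prime-form reduction, not the specific analytical apparatus used in the thesis), the following Python snippet reduces pitch-class sets to their prime forms, the basic equivalence step underlying such analyses:

```python
def normal_form(pcs):
    """Return the most compact ('packed to the left') rotation of a pitch-class set."""
    pcs = sorted(set(p % 12 for p in pcs))
    n = len(pcs)
    if n <= 1:
        return pcs
    rotations = [pcs[i:] + [p + 12 for p in pcs[:i]] for i in range(n)]

    def packing_key(rot):
        # Compare the overall span first, then the intervals to successively earlier notes.
        return [rot[i] - rot[0] for i in range(n - 1, 0, -1)]

    return [p % 12 for p in min(rotations, key=packing_key)]

def prime_form(pcs):
    """Prime form: the more compact of the set and its inversion, transposed to start on 0."""
    def zeroed(form):
        return [(p - form[0]) % 12 for p in form]
    candidates = [zeroed(normal_form(pcs)),
                  zeroed(normal_form([(-p) % 12 for p in pcs]))]
    return min(candidates)

if __name__ == "__main__":
    print(prime_form([0, 1, 6]))   # -> [0, 1, 6]
    print(prime_form([0, 1, 7]))   # -> [0, 1, 6]: {0,1,7} is an inverted form of {0,1,6}
```

Collections related by transposition or inversion reduce to the same prime form, which is what lets an analysis speak of recurring set classes across a work.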
COMPOSITIONS:
In Paradisum for SATB choir
Droom for mezzo-soprano and piano:
• Kuspad
• Sy
• Glasblaser
• Glasprater
String trio arrangements from Droom:
• Glasblaser
• Glasprater
Het Hom! for tenor saxophone
The beauty in sorrow for string quartet:
• First Movement
• Second Movement
• Third Movement
Symphonata for orchestra:
• First Movement
• Second Movement
• Third Movement
• Fourth Movement
433.
Nouvelles approximations numériques pour les équations de Stokes et l'équation Level Set / New numerical approximations for the Stokes equations and the Level Set equation. Djenno Ngomanda, Malcom. 14 December 2007.
This thesis is devoted to two research themes in Scientific Computing, linked by the numerical approximation of problems in fluid mechanics. The first theme concerns the numerical approximation of the Stokes equations, which model incompressible fluid flow at low velocity; this theme appears in several works in Scientific Computing. The time discretisation is carried out with the projection method. The spatial discretisation uses the hybrid finite element method, which makes it possible to impose the incompressibility constraint exactly. The approach is original: the hybrid mixed finite element method is coupled with a standard finite element method, and the order of convergence of both methods is preserved. The second theme concerns the development of finite-volume numerical methods for solving the Level Set equation. These equations play an essential role in interface-propagation problems. In this part, we develop a new second-order MUSCL-type method for solving the hyperbolic system arising from the Level Set equation. We illustrate its properties with numerical applications. In particular, we consider the two half-planes problem, for which our scheme gives an approximation of the gradient of the Level Set function. Moreover, the expected order of accuracy is obtained in the L1 and L-infinity norms for smooth functions. Finally, it should be noted that our method can easily be extended to first- and second-order Hamilton-Jacobi problems.
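As a hedged, minimal sketch of the kind of scheme referred to above (a generic second-order MUSCL finite-volume update with a minmod limiter for one-dimensional level-set advection at constant speed, not the thesis's actual scheme; forward Euler is used in time for brevity, where a Runge-Kutta step would normally restore full second-order accuracy):

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: the smaller slope when signs agree, zero otherwise."""
    return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def muscl_step(phi, a, dx, dt):
    """One limited MUSCL finite-volume step for phi_t + a * phi_x = 0.

    phi : cell averages of the level-set function on a periodic domain
    a   : constant advection speed (assumed a > 0 for the upwinding below)
    """
    slope = minmod(phi - np.roll(phi, 1), np.roll(phi, -1) - phi)
    phi_face = phi + 0.5 * slope          # reconstructed value at each cell's right face
    flux = a * phi_face                   # upwind flux at face i+1/2
    return phi - dt / dx * (flux - np.roll(flux, 1))

if __name__ == "__main__":
    # Advect a smooth level-set profile once around a periodic domain.
    n, a = 200, 1.0
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    dx = 1.0 / n
    dt = 0.4 * dx / a                     # CFL-limited time step
    phi = np.sin(2 * np.pi * x)           # smooth initial profile
    for _ in range(int(round(1.0 / (a * dt)))):
        phi = muscl_step(phi, a, dx, dt)
    print("max error after one period:", np.max(np.abs(phi - np.sin(2 * np.pi * x))))
```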
434.
Improvements in ranked set sampling. Haq, Abdul. January 2014.
The main focus of many agricultural, ecological and environmental studies is to develop well-designed, cost-effective and efficient sampling designs. Ranked set sampling (RSS) is one of those sampling methods that can help accomplish such objectives by incorporating prior information and expert knowledge into the design. In this thesis, new RSS schemes are suggested for efficiently estimating the population mean. These sampling schemes can be used as cost-effective alternatives to the traditional simple random sampling (SRS) and RSS schemes. It is shown that the mean estimators under the proposed sampling schemes are at least as efficient as the mean estimator with SRS. We consider the best linear unbiased estimators (BLUEs) and the best linear invariant estimators (BLIEs) for the unknown parameters (location and scale) of a location-scale family of distributions under the double RSS (DRSS) scheme. The BLUEs and BLIEs with DRSS are more precise than their counterparts based on SRS and RSS schemes. We also consider the BLUEs based on DRSS and ordered DRSS (ODRSS) schemes for the unknown parameters of a simple linear regression model using replicated observations. It turns out that, in terms of relative efficiencies, the BLUEs under ODRSS are better than the BLUEs with SRS, RSS, ordered RSS (ORSS) and DRSS schemes.
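As a hedged illustration of the baseline idea that the thesis builds on (standard balanced RSS with perfect rankings, compared with SRS by simulation; the population, set size and cycle count below are illustrative assumptions, not the thesis's proposed schemes):

```python
import numpy as np

rng = np.random.default_rng(1)

def srs_mean(pop_sampler, n):
    """Mean of a simple random sample of size n."""
    return pop_sampler(n).mean()

def rss_mean(pop_sampler, set_size, cycles):
    """Balanced ranked set sample mean under perfect ranking.

    In each cycle, draw `set_size` independent sets of `set_size` units,
    rank each set, and measure the r-th smallest unit from the r-th set.
    """
    measured = []
    for _ in range(cycles):
        for r in range(set_size):
            judgment_set = np.sort(pop_sampler(set_size))
            measured.append(judgment_set[r])   # only one unit per set is quantified
    return np.mean(measured)

if __name__ == "__main__":
    sampler = lambda n: rng.normal(loc=10.0, scale=2.0, size=n)
    set_size, cycles = 3, 4                    # 12 measured units in total
    n = set_size * cycles                      # matched SRS sample size
    reps = 20000
    srs = [srs_mean(sampler, n) for _ in range(reps)]
    rss = [rss_mean(sampler, set_size, cycles) for _ in range(reps)]
    print("SRS variance:", np.var(srs))
    print("RSS variance:", np.var(rss))        # typically noticeably smaller
```

Both estimators are unbiased; the smaller Monte Carlo variance of the RSS mean is what "at least as efficient as SRS" refers to.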
Quality control charts are widely recognized for their potential to be a powerful process monitoring tool of statistical process control. These control charts are frequently used in many industrial and service organizations to monitor in-control and out-of-control performances of a production or manufacturing process. RSS schemes have received considerable attention in the construction of quality control charts. We propose new exponentially weighted moving average (EWMA) control charts for monitoring the process mean and the process dispersion based on the BLUEs obtained under ORSS and ODRSS schemes. We also suggest an improved maximum EWMA control chart for simultaneously monitoring the process mean and dispersion based on the BLUEs with the ORSS scheme. The proposed EWMA control charts perform substantially better than their counterparts based on SRS and RSS schemes. Finally, some new EWMA charts are also suggested for monitoring the process dispersion using the best linear unbiased absolute estimators of the scale parameter under SRS and RSS schemes.
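A hedged sketch of the EWMA recursion and time-varying limits that such charts are built on (the textbook mean chart with known in-control parameters, not the ORSS/ODRSS-based charts developed in the thesis; the smoothing constant and limit width are common illustrative choices):

```python
import numpy as np

def ewma_chart(x, mu0, sigma0, lam=0.2, L=3.0):
    """Return EWMA statistics and control limits for a sequence of sample statistics x.

    z_t = lam * x_t + (1 - lam) * z_{t-1}, with z_0 = mu0.
    Limits: mu0 +/- L * sigma0 * sqrt(lam/(2-lam) * (1 - (1-lam)**(2t))).
    """
    z = np.empty(len(x))
    prev = mu0
    for t, xt in enumerate(x):
        prev = lam * xt + (1 - lam) * prev
        z[t] = prev
    t_idx = np.arange(1, len(x) + 1)
    half_width = L * sigma0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t_idx)))
    return z, mu0 - half_width, mu0 + half_width

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    x = rng.normal(0.0, 1.0, 40)
    x[20:] += 1.0                        # sustained shift in the process mean
    z, lcl, ucl = ewma_chart(x, mu0=0.0, sigma0=1.0)
    signals = np.where((z > ucl) | (z < lcl))[0]
    print("first out-of-control signal at sample:", signals[0] + 1 if signals.size else None)
```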
435.
Customising compilers for customisable processors. Murray, Alastair Colin. January 2012.
The automatic generation of instruction set extensions to provide application-specific acceleration for embedded processors has been a productive area of research in recent years. There have been incremental improvements in the quality of the algorithms that discover and select which instructions to add to a processor. The use of automatic algorithms, however, results in instructions which are radically different from those found in conventional, human-designed RISC or CISC ISAs. This has resulted in a gap between the hardware's capabilities and the compiler's ability to exploit them. This thesis proposes and investigates the use of a high-level compiler pass that uses graph-subgraph isomorphism checking to exploit these complex instructions. Operating in a separate pass permits techniques to be applied that are uniquely suited for mapping complex instructions, but unsuitable for conventional instruction selection. The existing, mature compiler back-end can then handle the remainder of the compilation. With this method, the high-level pass was able to use 1965 different automatically produced instructions to obtain an initial average speed-up of 1.11x over 179 benchmarks evaluated on a hardware-verified cycle-accurate simulator. This result was improved following an investigation of how the produced instructions were being used by the compiler. It was established that the models the automatic tools were using to develop instructions did not take account of how well the compiler could realistically use them. Adding additional parameters to the search heuristic to account for compiler issues increased the speed-up from 1.11x to 1.24x. An alternative approach using a re-designed hardware interface was also investigated and this achieved a speed-up of 1.26x while reducing hardware and compiler complexity. A complementary, high-level method of exploiting dual memory banks was created to increase memory bandwidth to accommodate the increased data-processing bandwidth provided by extension instructions. Finally, the compiler was considered for use in a non-conventional role where, rather than generating code, it is used to apply source-level transformations prior to the generation of extension instructions and thus affect the shape of the instructions that are generated.
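As a hedged sketch of the core idea (matching a complex extension instruction's dataflow pattern against a program's dataflow graph via subgraph isomorphism), the following Python snippet uses networkx; the tiny multiply-accumulate pattern and node labels are illustrative assumptions, not the thesis's actual intermediate representation:

```python
import networkx as nx
from networkx.algorithms import isomorphism

def build_dfg(ops):
    """Build a directed dataflow graph from (node_id, opcode, operand_ids) triples."""
    g = nx.DiGraph()
    for node, opcode, operands in ops:
        g.add_node(node, op=opcode)
        for src in operands:
            g.add_edge(src, node)
    return g

# Pattern for a hypothetical multiply-accumulate extension instruction: (a * b) + c
pattern = build_dfg([
    ("a", "in", []), ("b", "in", []), ("c", "in", []),
    ("mul", "mul", ["a", "b"]),
    ("add", "add", ["mul", "c"]),
])

# A fragment of program dataflow that happens to contain that shape.
program = build_dfg([
    ("x", "in", []), ("y", "in", []), ("z", "in", []), ("w", "in", []),
    ("t1", "mul", ["x", "y"]),
    ("t2", "add", ["t1", "z"]),
    ("t3", "sub", ["t2", "w"]),
])

# Match opcodes on nodes; pattern inputs ("in") act as wildcards for any producer.
def node_match(prog_attrs, pat_attrs):
    return pat_attrs["op"] == "in" or prog_attrs["op"] == pat_attrs["op"]

matcher = isomorphism.DiGraphMatcher(program, pattern, node_match=node_match)
for mapping in matcher.subgraph_isomorphisms_iter():
    print("extension instruction covers:", sorted(mapping))
```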
436.
Score-level fusion for multimodal biometrics. Alsaade, Fawaz. January 2008.
This thesis describes research into the score-level fusion process in multimodal biometrics. The emphasis of the research is on the fusion of face and voice biometrics in the two recognition modes of verification and open-set identification. The growing interest in the use of multiple modalities in biometrics is due to its potential capabilities for eradicating certain important limitations of unimodal biometrics. One of the factors important to the accuracy of a multimodal biometric system is the choice of the technique deployed for data fusion. To address this issue, investigations are carried out into the relative performance of several statistical data fusion techniques for combining the score information in both unimodal and multimodal biometrics (i.e. speaker and/or face verification). Another important issue associated with any multimodal technique is that of variations in the biometric data. Such variations are reflected in the corresponding biometric scores, and can thereby adversely influence the overall effectiveness of multimodal biometric recognition. To address this problem, different methods are proposed and investigated. The first approach is based on estimating the relative quality aspects of the test scores and then passing them into the fusion process either as features or weights. The approach provides the possibility of tackling the data variations by adjusting the weights for each of the modalities involved according to its relative quality. Another approach considered for tackling the effects of data variations is based on the use of score normalisation mechanisms. Whilst score normalisation has been widely used in voice biometrics, its effectiveness in other biometrics has not been previously investigated. This method is shown to considerably improve the accuracy of multimodal biometrics by appropriately correcting the scores from degraded modalities prior to the fusion process. The investigations in this work are also extended to the combination of score normalisation with relative quality estimation. The experimental results show that such a combination is more effective than the use of only one of these techniques with the fusion process. The thesis presents a thorough description of the research undertaken, details the experimental results and provides a comprehensive analysis of them.
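A hedged, minimal sketch of the two ingredients discussed above, score normalisation followed by quality-weighted sum fusion; the min-max normalisation, the quality-derived weights and the decision threshold are illustrative assumptions, not the thesis's tuned configuration:

```python
import numpy as np

def min_max_normalise(scores, low, high):
    """Map raw matcher scores onto [0, 1] using development-set bounds."""
    return np.clip((scores - low) / (high - low), 0.0, 1.0)

def quality_weighted_fusion(face_scores, voice_scores, face_quality, voice_quality):
    """Weighted-sum fusion where each modality's weight follows its relative quality."""
    weights = np.stack([face_quality, voice_quality])
    weights = weights / weights.sum(axis=0)           # normalise weights per trial
    return weights[0] * face_scores + weights[1] * voice_scores

if __name__ == "__main__":
    # Raw scores for four verification trials (higher = more likely genuine).
    face_raw = np.array([62.0, 40.0, 71.0, 35.0])
    voice_raw = np.array([-1.2, -3.0, 0.4, -2.5])      # e.g. log-likelihood ratios

    face = min_max_normalise(face_raw, low=20.0, high=80.0)
    voice = min_max_normalise(voice_raw, low=-4.0, high=1.0)

    # Per-trial quality estimates in [0, 1]; a noisy audio trial pulls weight to the face.
    face_q = np.array([0.9, 0.8, 0.9, 0.9])
    voice_q = np.array([0.8, 0.2, 0.9, 0.3])

    fused = quality_weighted_fusion(face, voice, face_q, voice_q)
    decisions = fused >= 0.5                           # illustrative operating threshold
    print(fused.round(3), decisions)
```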
437.
The construction of a model for lean product development. Khan, Muhammad Sharjeel. January 2012.
‘Lean’ or ‘lean thinking’ refers to an improvement philosophy which focuses on the fulfilment of customer value and the reduction of waste. This philosophy is credited with the extraordinary rise of Toyota, one of the largest and most profitable automotive companies in the world. This thesis presents a pioneering study investigating how lean thinking should be applied to product development (PD). The aim of the research was to construct an innovative model which supports the implementation of lean thinking in PD. This was achieved through progressive collaboration with practitioners from European manufacturing companies. The model provides a process for the conceptual development of an engineering project, and is composed of phases and activities for which methodologies have been defined. The construction of the lean PD model was preceded by a systematic literature review and an industrial field study, wherein 36 semi-structured interviews were conducted in five manufacturing companies in Europe. The constructed model was later applied in two real-life case studies via action research. The two case studies involved the product architecture design for a car audio head unit and the development of a helicopter engine. It was concluded that the lean PD model addresses various industrial challenges including customer value, communication, and innovation. Furthermore, by focusing on conceptual design, the lean PD model is expected to reduce design rework. As a result of the positive effects of the model, one of the companies involved intends to implement the lean PD model further, and wishes to extend the model to the rest of the organisation. This research makes four main contributions: (1) a novel lean PD model; (2) a number of tools developed to support the model; (3) a framework for lean PD enablers; and (4) a categorisation of challenges faced by PD in industry used to verify the relevance of the lean PD model.
438.
Comparative study of oscillatory integral, and sub-level set, operator norm estimates. Kowalski, Michael Władisław. January 2010.
Oscillatory integral operators have been of interest to both mathematicians and physicists ever since the emergence of Joseph Fourier's Théorie Analytique de la Chaleur in 1822, in which his chief concern was to give a mathematical account of the diffusion of heat. For example, oscillatory integrals naturally arise when one studies the behaviour at infinity of the Fourier transform of a Borel measure supported on a certain hypersurface; the study of such a problem reduces to obtaining estimates on oscillatory integrals. Sub-level set operators, however, have only come to the fore at the end of the 20th century, when it was discovered that the decay rate of an oscillatory integral I(lambda) may be obtainable once the measures of the associated sub-level sets are known. This discovery has been fully developed in a paper of A. Carbery, M. Christ and J. Wright. A principal goal of this thesis is to explore certain uniformity issues arising in the study of sub-level set estimates.
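The displayed formula that "I(lambda)" referred to did not survive extraction; as a hedged reconstruction of the standard objects involved (the generic model, not necessarily the exact normalisation used in the thesis):

```latex
\[
  I(\lambda) = \int e^{i\lambda \varphi(x)}\, \psi(x)\, dx ,
  \qquad
  E_{\varepsilon} = \{\, x : |\varphi(x)| \le \varepsilon \,\},
\]
```

where φ is a real-valued phase and ψ a smooth cut-off. Roughly speaking, uniform sub-level set bounds of the form |E_ε| ≲ ε^{1/k} correspond to decay |I(λ)| ≲ λ^{-1/k}, in the spirit of van der Corput's lemma.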
439.
Increasing the efficacy of automated instruction set extension. Bennett, Richard Vincent. January 2011.
The use of Instruction Set Extension (ISE) in customising embedded processors for a specific application has been studied extensively in recent years. The addition of a set of complex arithmetic instructions to a baseline core has proven to be a cost-effective means of meeting design performance requirements. This thesis proposes and evaluates a reconfigurable ISE implementation called “Configurable Flow Accelerators” (CFAs), a number of refinements to an existing Automated ISE (AISE) algorithm called “ISEGEN”, and the effects of source form on AISE. The CFA is demonstrated repeatedly to be a cost-effective design for ISE implementation. A temporal partitioning algorithm called “staggering” is proposed and demonstrated on average to reduce the area of CFA implementation by 37% for only an 8% reduction in acceleration. This thesis then turns to concerns within the ISEGEN AISE algorithm. A methodology for finding a good static heuristic weighting vector for ISEGEN is proposed and demonstrated. Up to 100% of merit is shown to be lost or gained through the choice of vector. ISEGEN early-termination is introduced and shown to improve the runtime of the algorithm by up to 7.26x, and 5.82x on average. An extension to the ISEGEN heuristic to account for pipelining is proposed and evaluated, increasing acceleration by up to an additional 1.5x. An energy-aware heuristic is added to ISEGEN, which reduces the energy used by a CFA implementation of a set of ISEs by an average of 1.6x, up to 3.6x. This result directly contradicts the frequently espoused notion that “bigger is better” in ISE. The last stretch of work in this thesis is concerned with source-level transformation: the effect of changing the representation of the application on the quality of the combined hardware-software solution. A methodology for combined exploration of source transformation and ISE is presented, and demonstrated to improve the acceleration of the result by an average of 35% versus ISE alone. Floating point is demonstrated to perform worse than fixed point, for all design concerns and applications studied here, regardless of the ISEs employed.
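As a hedged sketch of what a static heuristic weighting vector over candidate instruction-set extensions might look like (the feature names, weights and greedy area-constrained selection below are illustrative assumptions, not ISEGEN's actual formulation):

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    speedup: float     # estimated cycles saved per invocation
    frequency: int     # dynamic execution count of the covered region
    area: float        # hardware cost of the extension instruction
    inputs: int        # register-file read ports required
    outputs: int       # register-file write ports required

def merit(c: Candidate, weights: dict[str, float]) -> float:
    """Weighted linear heuristic: reward expected cycle savings, penalise cost."""
    return (weights["speedup"] * c.speedup * c.frequency
            - weights["area"] * c.area
            - weights["ports"] * (c.inputs + c.outputs))

def select(candidates: list[Candidate], weights: dict[str, float], area_budget: float) -> list[Candidate]:
    """Greedy selection of extension instructions under an area budget."""
    chosen, used = [], 0.0
    for c in sorted(candidates, key=lambda c: merit(c, weights), reverse=True):
        if merit(c, weights) > 0 and used + c.area <= area_budget:
            chosen.append(c)
            used += c.area
    return chosen

if __name__ == "__main__":
    candidates = [
        Candidate("mac3", speedup=3.0, frequency=120_000, area=1.2, inputs=3, outputs=1),
        Candidate("bitrev", speedup=5.0, frequency=8_000, area=0.6, inputs=1, outputs=1),
        Candidate("wide_shift_add", speedup=2.0, frequency=40_000, area=2.5, inputs=2, outputs=2),
    ]
    weights = {"speedup": 1.0, "area": 50_000.0, "ports": 10_000.0}
    for c in select(candidates, weights, area_budget=2.0):
        print("selected:", c.name)
```

Tuning the weight vector changes which candidates clear the merit threshold, which is why the choice of vector can swing the achieved merit so widely.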
440.
A Collapsing Result Using the Axiom of Determinacy and the Theory of Possible Cofinalities. May, Russell J.
Assuming the axiom of determinacy, we give a new proof of the strong partition relation on ω1. Further, we present a streamlined proof that J<λ+(a) (the ideal of sets which force cof Πa < λ+) is generated from J<λ(a) by adding a singleton. Combining these results with a polarized partition relation on ω1, we obtain the collapsing result of the title.
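For readers unfamiliar with the notation, a hedged reminder of what the strong partition relation asserts, in its standard form (the thesis may state it with additional measurability side conditions):

```latex
\[
  \omega_1 \longrightarrow (\omega_1)^{\omega_1}_{2}
  \;:\quad
  \text{for every } F \colon [\omega_1]^{\omega_1} \to 2
  \text{ there is } H \subseteq \omega_1 \text{ with }
  \mathrm{otp}(H) = \omega_1
  \text{ such that } F \text{ is constant on } [H]^{\omega_1}.
\]
```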