321

Integrated temperature sensors in deep sub-micron CMOS technologies

Chowdhury, Golam Rasul 03 July 2014 (has links)
Integrated temperature sensors play an important role in enhancing the performance of on-chip power and thermal management systems in today's highly integrated system-on-chip (SoC) platforms, such as microprocessors. Accurate on-chip temperature measurement is essential to maximize the performance and reliability of these SoCs. However, because different functional blocks consume power non-uniformly, microprocessors exhibit fairly large thermal gradients (and variation) across their chips. In multi-core microprocessors, for example, there are task-specific thermal gradients across different cores on the same die. As a result, multiple temperature sensors are needed to measure the temperature profile at all relevant coordinates of the chip. The measurement results are then used to take corrective measures that enhance performance or protect the SoC from catastrophic over-heating, which can cause permanent damage. Furthermore, in a large multi-core microprocessor it is also imperative to continuously monitor potential hot spots that are prone to thermal runaway. The locations of such hot spots depend on the operations and instructions the processor carries out at a given time. Due to practical limitations, it is overkill to place a large temperature sensor next to every possible hot spot. An ideal on-chip temperature sensor should therefore have minimal area so that it can be placed non-invasively across the chip without drastically changing the chip floor plan. In addition, the power consumption of the sensors should be very low to reduce the power-budget overhead of the thermal monitoring system and to minimize measurement inaccuracies due to self-heating. The objective of this research is to design an ultra-small, ultra-low-power temperature sensor that can be placed in close proximity to all possible hot spots across the chip. The general idea is to use the leakage current of a reverse-biased p-n junction diode as the operand for temperature sensing. The tasks within this project are to examine the theoretical aspects of such sensors in both Silicon-On-Insulator (SOI) and bulk Complementary Metal-Oxide-Semiconductor (CMOS) technologies, implement them in deep sub-micron technologies, and ultimately evaluate their performance and compare it to existing solutions.
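
As background for the sensing principle above, the sketch below (Python, with textbook silicon constants that are assumptions rather than values from this thesis) illustrates how strongly reverse-bias p-n junction leakage depends on temperature, since the leakage scales roughly with the intrinsic carrier concentration n_i(T) or its square:

    # Illustrative sketch only: temperature dependence of reverse-bias p-n
    # junction leakage, the quantity proposed above as the sensing operand.
    # Constants are textbook-style assumptions, not thesis parameters.
    import math

    K_B = 8.617e-5   # Boltzmann constant in eV/K
    E_G = 1.12       # silicon bandgap in eV, treated as constant here

    def leakage_relative(temp_k, ref_k=300.0):
        """Reverse-bias leakage relative to its value at ref_k, using the
        diffusion-dominated approximation I ~ n_i^2 with
        n_i(T) ~ T^1.5 * exp(-E_G / (2*k*T))."""
        def ni(t):
            return t ** 1.5 * math.exp(-E_G / (2.0 * K_B * t))
        return (ni(temp_k) / ni(ref_k)) ** 2

    if __name__ == "__main__":
        for t_c in (25, 50, 75, 100, 125):
            t_k = t_c + 273.15
            print(f"{t_c:4d} C  leakage x{leakage_relative(t_k):12.1f}")

The steep, near-exponential dependence is what makes such a tiny reverse-biased structure attractive as a sensing element, at the cost of careful calibration of the absolute leakage level.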
322

Analog-to-digital converter circuit and system design to improve with CMOS scaling

Mortazavi, Yousof 08 September 2015 (has links)
Analog/mixed-signal circuit design must be rethought to remain viable in state-of-the-art nanometer-scale CMOS processes, which create a hostile environment for analog circuits. Reduced supply voltages and smaller capacitances benefit circuit speed and digital power efficiency; however, these changes, along with smaller dimensions and close coupling to fast-switching digital circuits, have made high-accuracy voltage-domain analog processing increasingly difficult. In this work, techniques to improve analog-to-digital converters (ADCs) for nanometer-scale processes are explored. First, I propose a mostly-digital, time-based oversampling delta-sigma (∆Σ) ADC architecture. This system uses time, rather than voltage, as the analog variable for its quantizer, and realizes noise shaping by modulating the width of a variable-width digital "pulse." The merits of this architecture render it not only viable under scaling but also able to exploit the ever-increasing time resolution of scaled CMOS processes for improved circuit performance. This is in contrast to traditional voltage-based analog circuit design, whose performance generally degrades with scaling because supply-voltage reduction and short-channel effects increase voltage uncertainty. In conjunction with Dr. Woo Young Jung while he was a Ph.D. student at The University of Texas at Austin, two prototype implementations of the proposed architecture were designed and fabricated in TSMC 180 nm CMOS and IBM 45 nm Silicon-On-Insulator (SOI) processes. The prototype ADCs demonstrate that the architecture can achieve bandwidths of 5-20 MHz and ∼50 dB SNR in a very small area. The first-generation ADC core occupies only 0.0275 mm², while the second-generation core occupies 0.0192 mm². The two prototypes rank among the smallest-area modulators in the literature. Second, I analyze the measured results of the prototype ADC chips and determine the source of the harmonic distortion. I then demonstrate a digital calibration algorithm that sufficiently mitigates the distortion. This calibration approach follows the general philosophy of digitally-assisted analog systems, in which digital calibration and post-correction are favored over traditional analog solutions whenever the analog solution carries a high cost in complexity, power, or area.
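
The "time as the analog variable" idea can be illustrated with a behavioral model. The sketch below is only a generic first-order discrete-time delta-sigma loop whose multibit output is interpreted as a pulse width; it is not the architecture of the prototypes described above, and all parameters are assumed:

    # Behavioral sketch: first-order oversampled noise shaping in which each
    # quantized output code would be realized on chip as a digital pulse whose
    # width is code/levels of a clock period. Generic illustration only.
    import math

    def first_order_dsm(x, levels=16):
        """Return integer codes in [0, levels-1] for inputs x in [0, 1)."""
        integ, codes = 0.0, []
        for sample in x:
            # loop integrator: accumulate input minus previous feedback value
            integ += sample - (codes[-1] / levels if codes else 0.0)
            codes.append(max(0, min(levels - 1, int(round(integ * levels)))))
        return codes

    if __name__ == "__main__":
        osr, n, levels = 64, 4096, 16
        x = [0.5 + 0.4 * math.sin(2 * math.pi * i / (2 * osr)) for i in range(n)]
        y = first_order_dsm(x, levels)
        # the average pulse width over one signal period tracks the input mean
        print(sum(y[:2 * osr]) / (2 * osr * levels), sum(x[:2 * osr]) / (2 * osr))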
323

Preliminary study with change proposals for the reorganization of a smaller company

Stenberg, Sandra January 2014 (has links)
BAGA Water Technology is a Swedish company active in water treatment, focused on the design and manufacture of treatment plants specifically adapted to withstand Nordic conditions. It is one of the market-leading players in the field, which requires continuous development to maintain competitiveness. The company's head office is located in Karlskrona, while production and warehousing are largely outsourced to one subcontractor in Norway and one in Blekinge (UL.B). The outsourcing to UL.B has proven problematic: product quality is unsatisfactory, and the dispersed warehousing and production make the logistics system complex and hard to manage. The purpose of this work is therefore to conduct a preliminary study that provides BAGA with data for deciding whether a reorganization of the company's outsourced warehousing and production is a reasonable solution to these problems, together with implementable change proposals. Using the Lean Six Sigma improvement model DMAIC and its recommended tools, a current-state analysis of the processes and flows connected to UL.B was first carried out. Interviews and observations at both BAGA and UL.B served as the underlying data. Based on the collected information, a cause-and-effect analysis was performed with focus on the two problem areas, after which the causes were prioritized in an FMEA. Three improvement proposals were then developed; the one considered most advantageous is that BAGA should invest in expanding its own operations by building a warehouse adjacent to the office in Karlskrona, which would enable moving warehousing and production from UL.B to BAGA. The description of the change proposal presents, among other things, changes in processes and flows as well as a proposed physical layout in which an efficiency mindset according to Lean Six Sigma permeates the work. Finally, a cost calculation comparing the current state with the proposed scenario shows that the proposal yields a financial gain. Given the many positive arguments that emerged, including reduced transports and increased influence over production, implementing the improvement proposal is considered a strategically sound decision.
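
The cause-prioritization step mentioned above is normally done in an FMEA by ranking each cause on a risk priority number, RPN = severity x occurrence x detection. The sketch below shows only that standard calculation; the causes and scores are invented placeholders, not findings from the study:

    # Standard FMEA prioritization: rank causes by risk priority number,
    # RPN = severity * occurrence * detection, each scored 1-10.
    # The causes and scores below are invented placeholders.
    causes = [
        ("inconsistent supplier quality checks", 8, 6, 7),
        ("inventory split across several sites", 6, 8, 4),
        ("manual order handoffs",                5, 7, 5),
    ]

    for cause, s, o, d in sorted(causes, key=lambda c: c[1] * c[2] * c[3],
                                 reverse=True):
        print(f"RPN {s * o * d:4d}  {cause}")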
324

Tunable mismatch shaping for bandpass Delta-Sigma data converters

Akram, Waqas 16 June 2011 (has links)
Oversampled digital-to-analog converters typically employ an array of unit elements to produce the analog output signal. Manufacturing variations create mismatch between the unit elements, and the resulting errors sharply reduce the effective dynamic range of the converter. Mismatch noise shaping is an established technique for alleviating these effects, but it usually anchors the signal band to a fixed frequency location. In order to extend these advantages to tunable applications, this work explores a series of techniques that allow the suppression band of the mismatch noise-shaping function to have an adjustable center frequency. The proposed techniques are implemented in hardware and evaluated in terms of mismatch-shaping performance, latency, and hardware complexity.
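
As background for the fixed-frequency limitation mentioned above, the sketch below shows plain data-weighted averaging (DWA), the classic element-rotation scheme whose mismatch suppression is anchored at DC; the tunable, adjustable-center-frequency selection logic developed in this work is not reproduced here:

    # Background sketch: data-weighted averaging (DWA). Unit elements are used
    # in strict rotation, so each element is exercised equally often and the
    # mismatch error is first-order shaped away from DC.
    def dwa_select(codes, n_elements):
        """For each input code, return indices of the unit elements to fire."""
        pointer, selections = 0, []
        for code in codes:                    # code in [0, n_elements]
            selections.append([(pointer + k) % n_elements for k in range(code)])
            pointer = (pointer + code) % n_elements
        return selections

    if __name__ == "__main__":
        print(dwa_select([3, 2, 4, 1], n_elements=8))
        # -> [[0, 1, 2], [3, 4], [5, 6, 7, 0], [1]]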
325

Verslo procesų imitavimas / Business process simulation

Zarembaitė, Vitalija 08 September 2009 (has links)
Over the last decade, companies have devoted a great deal of attention to process analysis and to increasing operational efficiency. Business process management is attracting ever greater attention from companies, and this attention allows them to move from simulated business processes to processes that operate in reality. Business process management encompasses process design, mapping, control, and analysis. Companies increase work efficiency by continuously evaluating the added value of their processes. Business process improvement is a continuous cycle in which process design and redesign play a particularly important role. There are countless ways to change existing processes, and only the best alternative should replace the process actually in operation. Choosing a process intuitively can bring unpleasant surprises and reduce business efficiency instead of achieving the intended goals. Process simulation is one suitable approach to process redesign. Business process simulation helps to understand, analyze, and design processes. Using simulation, processes can be evaluated and compared. Simulation provides a quantitative estimate of a process's impact on business efficiency, which makes it easy to choose the most suitable process. A series of steps can be distinguished in business process simulation. First, the business process is mapped in a process model. Then sub-processes and events are identified. The process flow is defined, its entities are determined, and the relationships between the different parts of the process are established. Finally, resources are planned and assigned. The process model should only be approved once it has been verified... [see full text] / Business process is: "A collection of related, structured activities – a chain of events – that produce a specific service or product for a particular customer or customers" [How06]. A simulation is an imitation of some real thing, state of affairs, or process. The supporting tools of process mapping and business process simulation are used in the change process and assist in communicating the current process design and people's roles in the overall performance of that design. The simulation model is also used to predict the performance of new designs incorporating the use of information technology. The approach is seen to have a number of advantages in the context of a public sector organization. These include the ability for personnel to move from a traditional grouping of staff in occupational groups, with relationships defined by reporting requirements, to a view of their role in a process that delivers a performance to a customer. By running the simulation through time it is also possible to gauge how changes at an operational level can lead to the meeting of strategic targets over time. Business processes are increasingly recognized as key to competitive survival. The important opportunities inherent in this invisible economic asset are the foundations of process-centered management. Simulation of business processes creates added value in understanding, analyzing, and designing processes by introducing dynamic aspects. It provides decision support by anticipation of... [to full text]
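
The simulation steps listed above (map the process, identify sub-processes and events, define the flow, assign resources) can be grounded in a minimal discrete-event sketch. The process, timings and resource count below are illustrative assumptions only:

    # Minimal discrete-event sketch of a business process: entities (orders)
    # arrive, queue for a single shared resource, and are processed.
    # All timings and capacities are illustrative assumptions.
    import random

    def simulate(n_orders=1000, mean_arrival=10.0, mean_service=8.0, seed=1):
        random.seed(seed)
        clock = busy_until = total_cycle = 0.0
        for _ in range(n_orders):
            clock += random.expovariate(1.0 / mean_arrival)   # next arrival
            start = max(clock, busy_until)                    # wait if busy
            busy_until = start + random.expovariate(1.0 / mean_service)
            total_cycle += busy_until - clock                 # wait + service
        return total_cycle / n_orders

    if __name__ == "__main__":
        print("average cycle time, current process :", round(simulate(), 1))
        print("average cycle time, faster service  :",
              round(simulate(mean_service=6.0), 1))

Comparing such runs for alternative process designs is the kind of quantitative evaluation the abstract refers to.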
326

Integrated Circuit Blocks for High Performance Baseband and RF Analog-to-Digital Converters

Chen, Hongbo December 2011 (has links)
Nowadays, multi-standard wireless receivers and multi-format video processors have created great demand for integrating multiple standards into a single chip. Multiple standards usually require several analog-to-digital converters (ADCs) with different specifications. A promising solution is a power- and area-efficient reconfigurable ADC with tunable bandwidth and dynamic range. The advantage of a reconfigurable ADC over customized ADCs is that its power consumption can be scaled to the active specification, enabling optimized power consumption over a wide range of sampling rates and resulting in a more power-efficient design. Moreover, a reconfigurable ADC provides IP reuse, which reduces design effort, development cost, and time to market. On the other hand, the software radio transceiver has been introduced to minimize RF blocks and support multiple standards on the same chip. The basic idea is to perform the analog-to-digital (A/D) and digital-to-analog (D/A) conversion as close to the antenna as possible; the back-end digital signal processor (DSP) can then be programmed to handle the digital data. A continuous-time (CT) bandpass (BP) sigma-delta ADC with good SNR and low power consumption is a good choice for the software radio transceiver. In this work, a proposed 10-bit reconfigurable ADC is presented, and its non-overlapping clock generator and state machine are implemented in UMC 90 nm CMOS technology. The state machine generates control signals for each MDAC stage so that the speed can be reconfigured while the power consumption scales accordingly. Measurement results show that the reconfigurable ADC achieves 0.6-200 MSPS with 1.9-27 mW power consumption; the ENOB is about 8 bits over the whole speed range. In the second part, a 2-bit quantizer with a tunable delay circuit and 2-bit DACs are implemented in TSMC 0.13 µm CMOS technology for a 4th-order CT BP sigma-delta ADC. The 2-bit quantizer and DACs provide a 6 dB SNR improvement and better stability than a single-bit quantizer and DACs. The penalty is that the linearity of the feedback DACs must be considered carefully so that their nonlinearity does not degrade the ADC performance. The tunable delay circuit in the quantizer is designed to adjust the excess loop delay by up to ±10% to achieve stability and optimal performance.
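
The "about 8 bits" effective resolution quoted above follows from the standard relation ENOB = (SNDR - 1.76 dB) / 6.02 dB; the short check below uses example SNDR values around 50 dB, which are assumed numbers rather than measurements from this work:

    # Quick check of the ENOB figure from the standard relation
    # ENOB = (SNDR_dB - 1.76) / 6.02. The SNDR values are assumed examples.
    def enob(sndr_db):
        return (sndr_db - 1.76) / 6.02

    if __name__ == "__main__":
        for sndr in (46.0, 50.0, 54.0):
            print(f"SNDR {sndr:5.1f} dB -> ENOB {enob(sndr):.2f} bits")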
327

DAC Linearization Techniques for Sigma-delta Modulators

Godbole, Akshay December 2011 (has links)
Digital-to-analog converters (DACs) form the feedback element in sigma-delta modulators. Any non-linearity in the DAC directly degrades the linearity of the modulator at low and medium frequencies; hence, highly linear DACs are needed for high-performance sigma-delta modulators. In this work, the impact of current mismatch on the linearity performance (IM3 and SQNR) of a 4-bit current-steering DAC is analyzed. A selective calibration technique is proposed that aims to reduce the area occupied by conventional linearization circuits. A statistical element-selection algorithm for linearizing DACs is proposed: current sources that fall within the required accuracy are selected from a larger set of available current sources. Compared with existing calibration techniques, this technique achieves higher accuracy and is more robust to variations in process and temperature. In contrast to existing data-weighted averaging techniques, it does not degrade the SNR performance of the ADC. A 5th-order, 500 MS/s, 20 MHz sigma-delta modulator macro-model was used to test the linearity of the DAC.
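
A much-simplified version of "selecting current sources within the required accuracy from a larger available set" is sketched below; the mismatch model, selection rule and thresholds are assumptions for illustration and are not the statistical algorithm proposed in this work:

    # Illustrative sketch: fabricate more unit current sources than needed,
    # then keep the subset whose measured values lie closest to the median.
    # Mismatch statistics are assumed, not taken from the thesis.
    import random

    def select_elements(measured, n_needed):
        """Return indices of the n_needed sources closest to the median."""
        med = sorted(measured)[len(measured) // 2]
        ranked = sorted(range(len(measured)), key=lambda i: abs(measured[i] - med))
        return sorted(ranked[:n_needed])

    if __name__ == "__main__":
        random.seed(0)
        nominal, sigma = 100e-6, 1.0e-6       # 100 uA units with 1% mismatch
        sources = [random.gauss(nominal, sigma) for _ in range(32)]
        chosen = select_elements(sources, n_needed=16)
        spread = max(sources[i] for i in chosen) - min(sources[i] for i in chosen)
        print("spread of selected units: %.2f uA" % (spread * 1e6))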
328

Gauge Theory Dynamics and Calabi-Yau Moduli

Doroud, Nima January 2014 (has links)
We compute the exact partition function of two-dimensional N=(2,2) supersymmetric gauge theories on S². For theories with SU(2|1)_A invariance, the partition function admits two equivalent representations corresponding to localization on the Coulomb branch or on the Higgs branch, the latter including vortex and anti-vortex excitations at the poles. For SU(2|1)_B invariant gauge theories, the partition function localizes to the Higgs branch, which is generically a Kähler quotient manifold. The resulting partition functions are invariant under the renormalization group flow. For gauge theories that flow in the infrared to Calabi-Yau nonlinear sigma models, the partition functions of the SU(2|1)_A (resp. SU(2|1)_B) invariant theories compute the Kähler potential on the Kähler moduli (resp. complex structure moduli) of the Calabi-Yau manifold. We also compute the elliptic genus of such theories in the presence of Stückelberg fields and show that they are modular completions of mock Jacobi forms.
329

The impact of six sigma on operational efficiency / Andreas Machinini

Machinini, Mazondeki Andreas January 2010 (has links)
Globalisation of markets has brought enormous challenges and opportunities for business organisations. The prevailing business environment compels organisations to improve and create value in order to remain competitive. Improvement and value creation begin internally and are reflected externally in the form of value-added propositions to the market. Six Sigma is a methodology known for creating value within organisations, across all industries, through process improvement that translates into enormous savings for the organisation. Six Sigma is widely used globally and has existed for many years, yet it is not as prevalent in the South African business environment. This research explores the principles and approach that distinguish the Six Sigma methodology from other improvement programmes. In the manufacturing industry, operational efficiencies are essential to enhance value creation and profitability. The study begins by discussing the origin, history and evolution of Six Sigma into a methodology recognised and espoused by leading world-class organisations. The technique used to implement Six Sigma is entrenched and enforced through adherence to stipulated basic principles, the breakthrough strategy and Six Sigma tools for identifying and eliminating variation. The study then applies some of the Six Sigma tools to an operational entity, verifying and translating theoretical knowledge into practical knowledge that can be exploited for process improvement and, consequently, enhanced operational efficiency. The impact of Six Sigma on operational efficiency lies in its ability to drive process effectiveness and capability towards near perfection, expressed as a defect rate of no more than 3.4 defects per million opportunities. / Thesis (M.B.A.)--North-West University, Potchefstroom Campus, 2011.
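
The 3.4 defects per million opportunities cited above corresponds to a six-sigma process with the conventional 1.5-sigma long-term shift. The short calculation below reproduces that figure from the normal distribution; the 1.5-sigma shift is the usual industry convention, assumed here:

    # Reproduce the Six Sigma defect-rate figure: a process at k sigma with a
    # 1.5-sigma long-term shift has a one-sided tail probability of
    # 1 - Phi(k - 1.5), expressed as defects per million opportunities (DPMO).
    import math

    def dpmo(sigma_level, shift=1.5):
        tail = 0.5 * math.erfc((sigma_level - shift) / math.sqrt(2.0))
        return tail * 1e6

    if __name__ == "__main__":
        for k in (3, 4, 5, 6):
            print(f"{k} sigma -> {dpmo(k):10.1f} DPMO")
        # 6 sigma -> about 3.4 DPMO, the figure quoted in the abstract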
330

The Role of Frontline Leadership in Organizational Learning: Evidence from Incremental Business Process Improvement

Monlouis, Isabelle Nathalie 11 May 2013 (has links)
What is the role of frontline project leadership in organizational learning in incremental business process improvement (iBPI)? Current literature is sparse on the contributions that frontline employees leading iBPI projects make to organizational learning. To bridge this gap, we use an embedded longitudinal multiple-case design to study the leadership process of four frontline iBPI projects. The 4I model (intuiting, interpreting, integrating, and institutionalizing) of organizational learning serves as a theoretical lens for studying how insights originating from frontline employees unfold through group-level integration and organization-level institutionalization. Mapping the flow of key project events to the relevant social and psychological processes of the 4I model, we identify how organizational learning unfolds within and across the three levels of the model. The granularity of the 4I model creates a valuable foundation for understanding the role of frontline project leadership in iBPI programs and the capacity to leverage insights originating from frontline employees into organizational learning. Practitioners and engaged scholars will find this level of granularity helpful for program design, evaluation, and learning interventions.
