  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Adapting the limited memory of microcomputers to solve large scale manufacturing problems

Connolly, Michael January 1983 (has links)
No description available.
52

An Integrated Test Plan for an Advanced Very Large Scale Integrated Circuit Design Group

Didden, William S. 01 January 1984 (has links) (PDF)
VLSI testing poses a number of problems, which include the selection of test techniques, the determination of acceptable fault coverage levels, and test vector generation. Available device test techniques are examined and compared. Design rules should be employed to ensure the design is testable. Logic simulation systems and available test utilities are compared. The various methods of test vector generation are also examined. The selection criteria for test techniques are identified. A table of proposed design rules is included. Testability measurement utilities can be used to statistically predict the test generation effort. Field reject rates and fault coverage are statistically related. Acceptable field reject rates can be achieved with less than full test vector fault coverage. The methods and techniques examined form the basis of the recommended integrated test plan. The methods of automatic test vector generation are relatively primitive but are improving.
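The statistical relationship between fault coverage and field reject rate that the abstract alludes to is commonly modelled by the Williams–Brown defect-level formula; a minimal sketch (the 50% yield figure is an assumption for illustration, not a value from the thesis):

```python
# Williams-Brown model: defect level (field reject rate) as a function
# of process yield Y and fault coverage T:  DL = 1 - Y**(1 - T)
def defect_level(yield_y: float, fault_coverage: float) -> float:
    """Predicted fraction of shipped parts that are defective."""
    return 1.0 - yield_y ** (1.0 - fault_coverage)

if __name__ == "__main__":
    Y = 0.5  # assumed process yield, for illustration only
    for T in (0.90, 0.95, 0.99, 1.00):
        print(f"coverage {T:.2f} -> predicted reject rate {defect_level(Y, T):.4f}")
```

At full coverage the predicted reject rate is zero; at 50% yield, raising coverage from 90% to 99% cuts the predicted reject rate by roughly an order of magnitude, which is the trade-off an acceptable-reject-rate target makes explicit.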
53

Qualification Research for Reliable, Custom LSI/VLSI Electronics

Matsumori, Barry Alan. January 1985 (has links)
No description available.
54

Performance analysis of large-scale resource-bound computer systems

Pourranjbar, Alireza January 2015 (has links)
We present an analysis framework for performance evaluation of large-scale resource-bound (LSRB) computer systems. LSRB systems are those whose resources are continually in demand to serve resource users, who appear in large populations and cause high contention. In these systems, the delivery of quality service is crucial, even in the event of resource failure. Therefore, various techniques have been developed for evaluating their performance. In this thesis, we focus on the technique of quantitative modelling, where, in order to study a system, its model is first constructed and the system’s behaviour is then analysed via the model. A number of high-level formalisms have been developed to aid the task of model construction. We focus on PEPA, a stochastic process algebra that supports compositionality and enables us to easily build complex LSRB models. In spite of this advantage, however, the task of analysing LSRB models still poses unresolved challenges. LSRB models give rise to very large state spaces. This issue, known as the state space explosion problem, renders the techniques based on discrete state representation, such as numerical Markovian analysis, computationally expensive. Moreover, simulation techniques, such as Gillespie’s stochastic simulation algorithm, are also computationally demanding, as numerous trajectories need to be collected. Furthermore, as we show in our first contribution, the techniques based on the mean-field theory or fluid flow approximation are not readily applicable to this case. In LSRB models, resources are not assumed to be present in large populations and models exhibit highly noisy and stochastic behaviour. Thus, the mean-field deterministic behaviour might not be faithful in capturing the system’s randomness and is potentially too crude to show important aspects of its behaviour. In this case, the modeller is unable to obtain important performance indicators, such as the reliability measures of the system.
Considering these limitations, we contribute the following analytical methods particularly tailored to LSRB models. First, we present an aggregation method. The aggregated model captures the evolution of only the system’s resources and allows us to efficiently derive a probability distribution over the configurations they experience. This distribution provides full faithfulness for studying the stochastic behaviour of resources. The aggregation can be applied to all LSRB models that satisfy a syntactic aggregation condition, which can be checked quickly. We present an algorithm to generate the aggregated model from the original model when this condition is satisfied. Second, we present a procedure to efficiently detect time-scale near-complete decomposability (TSND). The method of TSND allows us to analyse LSRB models at a reduced cost, by dividing their state spaces into loosely coupled blocks. However, one important input is a partition of the transitions defined in the model, categorising them into slow or fast. Forming the necessary partition by analysing the model’s complete state space is costly. Our procedure derives this partition efficiently, relying on a theorem stating that our aggregation preserves the original model’s partition, and therefore it can be derived by an efficient reachability analysis on the aggregated state space. We also propose a clustering algorithm to implement this reachability analysis. Third, we present the method of conditional moments (MCM) to be used on LSRB models. Using our aggregation, a probability distribution is formed over the configurations of a model’s resources. The MCM outputs the time evolution of the conditional moments of the marginal distribution over resource users given the configurations of resources. Essentially, for each such configuration, we derive measures such as conditional expectation, conditional variance, etc. related to the dynamics of users.
This method has a high degree of faithfulness and allows us to capture the impact of the randomness of the behaviour of resources on the users. Finally, we present the advantage of the methods we proposed in the context of a case study, which concerns the performance evaluation of a two-tier wireless network constructed based on the femto-cell macro-cell architecture.
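Gillespie’s stochastic simulation algorithm, cited above as computationally demanding for LSRB models, can be sketched on a toy resource-contention model — a single resource serving a pool of users — which also shows why collecting many trajectories is costly; the population size and all rates are illustrative assumptions, not a PEPA model from the thesis:

```python
import random

def gillespie(n_users=50, acquire_rate=1.0, release_rate=5.0,
              t_end=10.0, seed=0):
    """Simulate users contending for a single resource.
    State: number of users currently waiting (the resource serves one
    waiter at a time). Returns the jump-time trajectory [(t, waiting), ...]."""
    rng = random.Random(seed)
    t, waiting = 0.0, 0
    traj = [(t, waiting)]
    while t < t_end:
        # propensities: each idle user may request; the resource releases one waiter
        a_req = acquire_rate * (n_users - waiting)
        a_rel = release_rate if waiting > 0 else 0.0
        a_total = a_req + a_rel
        if a_total == 0.0:
            break
        t += rng.expovariate(a_total)       # exponential time to next event
        if rng.random() < a_req / a_total:  # choose which event fires
            waiting += 1
        else:
            waiting -= 1
        traj.append((t, waiting))
    return traj

traj = gillespie()
print("events:", len(traj) - 1, "final waiting:", traj[-1][1])
```

A single run already generates hundreds of events for this tiny system; estimating distributions requires many such runs, which is the cost the aggregation-based methods above aim to avoid.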
55

Development of a computational and neuroinformatics framework for large-scale brain modelling

Sanz Leon, Paula 16 October 2014 (has links)
The central theme of this thesis is the development of both a generalised computational model for large-scale brain networks and the neuroinformatics platform that enables a systematic exploration and analysis of those models. In this thesis we describe the mathematical framework of the computational model at the core of the tool The Virtual Brain (TVB), designed to recreate collective whole brain dynamics by virtualising brain structure and function, allowing simultaneous outputs of a number of experimental modalities such as electro- and magnetoencephalography (EEG, MEG) and functional Magnetic Resonance Imaging (fMRI). The implementation allows for a systematic exploration and manipulation of every underlying component of a large-scale brain network model (BNM), such as the neural mass model governing the local dynamics or the structural connectivity constraining the space time structure of the network couplings. We also review previous studies related to brain network models and multimodal neuroimaging integration and detail how they are related to the general model presented in this work. Practical examples describing how to build a minimal *in silico* primate brain model are given. Finally, we explain how the resulting software tool, TVB, facilitates the collaboration between experimentalists and modellers by exposing both a comprehensive simulator for brain dynamics and an integrative framework for the management, analysis, and simulation of structural and functional data in an accessible, web-based interface.
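A brain network model of the kind described — local node dynamics coupled through a structural connectivity matrix — can be sketched with forward-Euler integration of linear damped nodes; the three-node connectome and all parameters are illustrative assumptions, not TVB's actual neural mass equations:

```python
def simulate_bnm(weights, g=0.1, decay=1.0, dt=0.01, steps=1000, drive=1.0):
    """Integrate x_i' = -decay*x_i + g * sum_j W[i][j]*x_j + drive
    with forward Euler. `weights` plays the role of the structural
    connectivity constraining the network couplings."""
    n = len(weights)
    x = [0.0] * n
    for _ in range(steps):
        dx = [
            -decay * x[i]
            + g * sum(weights[i][j] * x[j] for j in range(n))
            + drive
            for i in range(n)
        ]
        x = [x[i] + dt * dx[i] for i in range(n)]
    return x

# toy symmetric 3-node "connectome" (assumed weights)
W = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
x_final = simulate_bnm(W)
print(x_final)
```

Swapping in a different connectivity matrix or local model changes the collective dynamics without touching the integration loop, which is the kind of component-wise manipulation the abstract describes.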
56

Interference management in large-scale MIMO systems for 5G

Hajji, Zahran 17 December 2018 (has links)
The thesis is set in the context of the explosion of data traffic generated by the growing number of users and rising bit rates, which must be taken into account in the definition of future generations of radio-cellular communications. One solution is large-scale MIMO technology (MIMO systems of large dimension), which poses several challenges. The design of new low-complexity detection algorithms is indispensable, since conventional algorithms are no longer suited to this configuration because of their poor detection performance or their excessive complexity as a function of the number of antennas. A first contribution of the thesis is an algorithm based on compressed sensing that exploits the finite-alphabet property of the signals. Applied to determined and under-determined large-scale MIMO systems, this algorithm achieves promising performance (detection quality, complexity) superior to state-of-the-art algorithms. A thorough theoretical study was conducted to determine the optimal operating conditions and the statistical distribution of the outputs. A second contribution is the integration of the original algorithm into an iterative receiver, differentiating the coded (error-correcting code present) and uncoded cases. Another challenge in keeping the promises of large-scale MIMO systems (high spectral efficiency) is channel estimation. A third contribution of the thesis is the proposal of semi-blind channel estimation algorithms that work with a minimum training-sequence length (equal to the number of users) and reach performance very close to the theoretical bound.
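The finite-alphabet property that the thesis's detector exploits can be contrasted with the simplest baseline, exhaustive maximum-likelihood detection over a BPSK alphabet; this brute-force search grows as |A|^n in the number of transmit antennas, which is exactly why low-complexity detectors matter at large scale (the 2x2 channel matrix below is an illustrative assumption):

```python
from itertools import product

def ml_detect(H, y, alphabet=(-1, 1)):
    """Exhaustive maximum-likelihood MIMO detection: return the
    finite-alphabet vector x minimising ||y - H x||^2."""
    n_tx = len(H[0])
    def residual(x):
        return sum(
            (y[i] - sum(H[i][j] * x[j] for j in range(n_tx))) ** 2
            for i in range(len(H))
        )
    return min(product(alphabet, repeat=n_tx), key=residual)

# toy 2x2 channel (assumed), noiseless received vector for x_true = (+1, -1)
H = [[0.9, 0.2],
     [0.1, 1.1]]
x_true = (1, -1)
y = [sum(H[i][j] * x_true[j] for j in range(2)) for i in range(2)]
x_hat = ml_detect(H, y)
print(x_hat)  # -> (1, -1)
```

For 2 antennas the search visits 4 candidates; for 64 BPSK antennas it would visit 2^64, so practical large-scale receivers replace it with structured methods such as the compressed-sensing approach described above.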
57

Self-Reliance Guidelines for Large Scale Robot Colonies

Engwirda, Anthony, N/A January 2007 (has links)
A Large Scale Robot Colony (LSRC) is a complex artifact comprising a significant population of both mobile and static robots. LSRC research is in its infancy in the literature, and it is therefore necessary to rely upon external fields for an appropriate framework: Multi-Agent Systems (MAS) and Large-Scale Systems (LSS). At the intersection of MAS, LSS and LSRC exist near-identical issues, problems and solutions. If attention is paid to coherence, then solution portability is possible. The issue of Self-Reliance is poorly addressed by the MAS research field. Disparity between the real world and simulation is another area of concern. Despite these deficiencies, MAS and LSS are perceived as the most appropriate frameworks. MAS research focuses on three prime areas: cognitive science, management and interaction. LSRC is focused on Self-Sustainability, Self-Management and Self-Organization. While LSS research was not primarily intended for populations of mobile robots, it does address key issues of LSRC, such as effective sustainability and management. An implementation of an LSRC based upon the optimal solution for any one or two of the three aspects will be inferior to a coherent solution based upon all three. LSRCs are complex organizations with significant populations of both static and mobile robots. The increase in population size and the requirement to address the issue of Self-Reliance give rise to new issues. It is no longer sufficient to speak only in terms of robot intelligence, architecture, interaction or team behaviour, even though these are still valid topics. Issues such as population sustainability and management have greater significance within an LSRC. As the size of a robot population increases, minor uneconomical decisions and actions inhibit the performance of the population. Interaction must be made economical within the context of the LSRC.
Sustainability of the population becomes significant as it enables stable performance and an extended operational lifespan. Management becomes significant as a mechanism to direct the population so as to achieve near-optimal performance. The Self-Sustainability, Self-Management and Self-Organization of an LSRC are vastly more complex than in team robotics. Performance of the overall population becomes more significant than individual or team achievement. This thesis is a presentation of the Cooperative Autonomous Robot Colony (CARC) architecture. The CARC architecture is novel in that it offers a coherent baseline solution to the issue of mobile robot Self-Reliance. This research uses decomposition as a mechanism to reduce problem complexity. Self-Reliance is decomposed into Self-Sustainability, Self-Management, and Self-Organization. A solution to the issue of Self-Reliance will comprise conflicting sub-solutions. A product of this research is a set of guidelines that manages the conflict of sub-solutions and maintains a coherent solution. In addressing the issue of Self-Reliance, it became apparent that Economies of Scale played an important role. The effects of Economies of Scale directed the research towards LSRCs, which demonstrated improved efficiency and greater capability to achieve the requirements of Self-Reliance. LSRCs implemented with the CARC architecture would extend human capability, enabling large-scale operations to be performed in an economical manner, within real-world and real-time environments, including those of a remote and hostile nature. The theory and architecture are supported using published literature, experiments, observations and mathematical projections. Contributions of this work are focused upon the three pillars of Self-Reliance addressed by CARC: Self-Sustainability, Self-Management and Self-Organization.
The chapter on Self-Sustainability explains and justifies the relevance of this issue: what it is, why it is important and how it can be achieved. Self-Sustainability enables robots to continue to operate beyond disabling events by addressing failure and routine maintenance. Mathematical projections are used to compare populations of non-sustained and sustained robots. Computer modeling experiments are used to demonstrate the feasibility of Self-Sustainability, including extended operational life, the maintenance of optimal work flow and graceful physical degradation (GPD). A detailed explanation is presented of Sustainability Functions, Colony Sites, Static Robot Roles, Static Robot Failure Options, and Polymorphism. The chapter on Self-Management explores LSS research as a mechanism to exert influence over an LSRC. An experimental reactive management strategy is demonstrated. This strategy, while limited, does indicate promising directions for future research, including the Man-in-the-Loop (MITL) strategy highly desired by NASA JPL for off-world command and control of a significant robot colony (Huntsberger et al., 2000). Experiments on communication evaluate both Broadcast Conveyance (BC) and Message Passing Conveyance (MPC), and demonstrate the potential of Message Passing as a low-cost system for LSRC communication. Analysis of metrics indicates that a Performance Based Feedback Method (PBFM) and a Task Achievement Method (TAM) are both necessary and sufficient to monitor an LSRC. The chapter on Self-Organization describes a number of experiments, algorithms and protocols on Reasoning Robotics, a minor variant of Reactive Robotics. Reasoning Robotics utilizes an Event Driven Architecture (EDA) rather than the Stimulus Driven Architecture (SDA) common to Reactive Robotics. Enhanced robot performance is demonstrated by a combination of EDA and environmental modification enabling stigmergy.
These experiments cover Intersection Navigation with contingency for Multilane Intersections, a Radio Packet Controller (RPC) algorithm, Active and Passive Beacons including a communication protocol, and mobile robot navigation using Migration Decision Functions (MDFs), including MDF positional errors. The central issue addressed by this thesis is the production of Self-Reliance guidelines for LSRCs. Self-Reliance is perceived as a critical issue in advancing the useful and productive applications of LSRCs. LSRCs are complex, with many issues in the related fields of MAS and LSS. Decomposition of Self-Reliance into Self-Sustainability, Self-Management and Self-Organization was used to aid problem understanding. It was found that Self-Sustainability extends the operational life of individual robots and of the LSRC. Self-Management enables the exertion of human influence over the LSRC, such that the ratio of humans to robots is reduced but not eliminated. Self-Organization achieves and enhances performance through a routine and reliable LSRC environment. The product of this research was the novel CARC architecture, which consists of a set of Self-Reliance guidelines and algorithms. The Self-Reliance guidelines manage conflict between optimal solutions and provide a framework for LSRC design. This research was supported by literature, experiments, observations and mathematical projections.
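The mathematical projections comparing non-sustained and sustained robot populations can be sketched with a simple failure/repair model; the failure and repair rates below are illustrative assumptions, not figures from the thesis:

```python
def project(n0=1000, fail_rate=0.01, repair_rate=0.0, steps=365):
    """Discrete-time projection of operational robots: each step a
    fraction fail_rate of operational robots fails; a sustained colony
    repairs a fraction repair_rate of the failed pool back into service."""
    operational, failed = float(n0), 0.0
    history = []
    for _ in range(steps):
        failures = fail_rate * operational
        repairs = repair_rate * failed
        operational += repairs - failures
        failed += failures - repairs
        history.append(operational)
    return history

no_repair = project(repair_rate=0.0)   # non-sustained colony
sustained = project(repair_rate=0.2)   # self-sustaining colony
print(round(no_repair[-1]), round(sustained[-1]))
```

Under these assumed rates the non-sustained population decays geometrically towards zero within a year, while the sustained colony settles at a steady operational level near repair/(repair + fail) of its initial size — the graceful-degradation-versus-sustainability contrast the projections in the thesis quantify.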
58

Process development for Si-based nanostructures using pulsed UV laser-induced epitaxy

Deng, Chaodan 10 1900 (has links) (PDF)
Ph.D. / Electrical Engineering / Nanometer-scale devices have attracted great attention as the ultimate evolution of silicon integrated circuit technology. However, fabrication of nanometer-scale silicon-based devices has met great difficulty because it places severe constraints on process technology. This is especially true for SiGe/Si heterostructures because they are particularly sensitive to strain relaxation and/or process-induced defects. The recently developed Pulsed Laser Induced Epitaxy (PLIE) offers a promising approach for the fabrication of nanometer-scale SiGe/Si devices. It possesses the advantages of ultra-short process time, low thermal budget and full compatibility with current silicon technology. The selective nature of the process allows epitaxial growth of high-quality, localized SiGe layers in silicon. In this thesis, a process to fabricate SiGe nanowires in silicon using PLIE is described. In particular, Ge nanowires with a cross-section of ~6 x 60 nm² are first formed using a lift-off process on the silicon substrate with e-beam lithography, followed by a thin low-temperature oxide deposition. Defect-free SiGe nanowires with a cross-section of ~25 x 95 nm² are then produced by impinging the laser beam on the sample. We thus demonstrate that PLIE is a suitable fabrication technique for SiGe/Si nanostructures. Fabrication of Ge nanowires is also studied using Focused Ion Beam (FIB) micromachining techniques. Based on the SiGe nanowire process, we propose two advanced device structures, a quantum wire MOSFET and a lateral SiGe Heterojunction Bipolar Transistor (HBT). MEDICI simulation of the lateral SiGe HBT demonstrates high performance of the device. In order to characterize the SiGe nanowires using cross-sectional transmission electron microscopy, an advanced, versatile focused-ion-beam-assisted sample preparation technique using a multi-layer stack scheme for localized surface structures is developed and described in this thesis.
59

Single layer routing : mapping topological to geometric solutions

Hong, Won-kook. January 1986 (has links)
No description available.
