  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

On the efficiency of using multiple hops in fixed relay based wireless networks /

Florea, Adrian, January 1900 (has links)
Thesis (M. App. Sc.)--Carleton University, 2005. / Includes bibliographical references (p. 63-64). Also available in electronic format on the Internet.
12

Restarting the Lanczos algorithm for large eigenvalue problems and linear equations

Nicely, Dywayne A. Morgan, Ronald Benjamin, January 2008 (has links)
Thesis (Ph.D.)--Baylor University, 2008. / Includes bibliographical references (p. 70-74)
13

Low complexity diversity combining and carrier frequency offset compensation for ubiquitous OFDM based broadband wireless communications /

Huang, Defeng. January 2004 (has links)
Thesis (Ph. D.)--Hong Kong University of Science and Technology, 2004. / Includes bibliographical references (leaves 163-171). Also available in electronic version. Access restricted to campus users.
14

Efficient complexity reduction methods for short-frame iterative decoding /

Kei, Chun-Ling. January 2002 (has links)
Thesis (M. Phil.)--Hong Kong University of Science and Technology, 2002. / Includes bibliographical references (leaves 86-91). Also available in electronic version. Access restricted to campus users.
15

High level static analysis of system descriptions for taming verification complexity

Vasudevan, Shobha. January 1900 (has links)
Thesis (Ph. D.)--University of Texas at Austin, 2007. / Vita. Includes bibliographical references.
16

Sketch and project : randomized iterative methods for linear systems and inverting matrices

Gower, Robert Mansel January 2016 (has links)
Probabilistic ideas and tools have recently begun to permeate several fields where they traditionally did not play a major role, including numerical linear algebra and optimization. One of the key ways in which these ideas influence these fields is via the development and analysis of randomized algorithms for solving standard and new problems of these fields. Such methods are typically easier to analyze, and often lead to faster and/or more scalable and versatile methods in practice. This thesis explores the design and analysis of new randomized iterative methods for solving linear systems and inverting matrices. The methods are based on a novel sketch-and-project framework. By sketching we mean starting with a difficult problem and then randomly generating a simple problem that contains all the solutions of the original. After sketching the problem, we calculate the next iterate by projecting our current iterate onto the solution space of the sketched problem. The starting point for this thesis is the development of an archetype randomized method for solving linear systems. Our method has six different but equivalent interpretations: sketch-and-project, constrain-and-approximate, random intersect, random linear solve, random update and random fixed point. By varying its two parameters – a positive definite matrix (defining geometry), and a random matrix (sampled in an i.i.d. fashion in each iteration) – we recover a comprehensive array of well-known algorithms as special cases, including the randomized Kaczmarz method, randomized Newton method, randomized coordinate descent method and random Gaussian pursuit. We also naturally obtain variants of all these methods using blocks and importance sampling. However, our method allows for a much wider selection of these two parameters, which leads to a number of new specific methods.
We prove exponential convergence of the expected norm of the error in a single theorem, from which existing complexity results for known variants can be obtained. However, we also give an exact formula for the evolution of the expected iterates, which allows us to give lower bounds on the convergence rate. We then extend our problem to that of finding the projection of a given vector onto the solution space of a linear system. For this we develop a new randomized iterative algorithm: stochastic dual ascent (SDA). The method is dual in nature, and iteratively solves the dual of the projection problem. The dual problem is a non-strongly concave quadratic maximization problem without constraints. In each iteration of SDA, a dual variable is updated by a carefully chosen point in a subspace spanned by the columns of a random matrix drawn independently from a fixed distribution. The distribution plays the role of a parameter of the method. Our complexity results hold for a wide family of distributions of random matrices, which opens the possibility to fine-tune the stochasticity of the method to particular applications. We prove that primal iterates associated with the dual process converge to the projection exponentially fast in expectation, and give a formula and an insightful lower bound for the convergence rate. We also prove that the same rate applies to dual function values, primal function values and the duality gap. Unlike traditional iterative methods, SDA converges under virtually no additional assumptions on the system (e.g., rank, diagonal dominance) beyond consistency. In fact, our lower bound improves as the rank of the system matrix drops. By mapping our dual algorithm to a primal process, we uncover that the SDA method is the dual method with respect to the sketch-and-project method from the previous chapter.
Thus our new, more general convergence results for SDA carry over to the sketch-and-project method and all its specializations (randomized Kaczmarz, randomized coordinate descent, etc.). When our method specializes to a known algorithm, we either recover the best known rates or improve upon them. Finally, we show that the framework can be applied to the distributed average consensus problem to obtain an array of new algorithms; the randomized gossip algorithm arises as a special case. In the final chapter, we extend our method for solving linear systems to inverting matrices, and develop a family of methods with specialized variants that maintain symmetry or positive definiteness of the iterates. All the methods in the family converge globally and exponentially, with explicit rates. In special cases, we obtain stochastic block variants of several quasi-Newton updates, including bad Broyden (BB), good Broyden (GB), Powell-symmetric-Broyden (PSB), Davidon-Fletcher-Powell (DFP) and Broyden-Fletcher-Goldfarb-Shanno (BFGS). Ours are the first stochastic versions of these updates shown to converge to an inverse of a fixed matrix. Through a dual viewpoint we uncover a fundamental link between quasi-Newton updates and approximate inverse preconditioning. Further, we develop an adaptive variant of the randomized block BFGS (AdaRBFGS), where we modify the distribution underlying the stochasticity of the method throughout the iterative process to achieve faster convergence. By inverting several matrices from varied applications, we demonstrate that AdaRBFGS is highly competitive when compared to the well-established Newton-Schulz and approximate preconditioning methods. In particular, on large-scale problems our method outperforms the standard methods by orders of magnitude. The development of efficient methods for estimating the inverse of very large matrices is a much-needed tool for preconditioning and variable metric methods in the big data era.
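The sketch-and-project idea described in this abstract can be illustrated with its best-known special case, the randomized Kaczmarz method: each iteration "sketches" the full system down to a single row and projects the current iterate onto that row's hyperplane. The following NumPy sketch is illustrative only — the row-norm sampling rule and the random test system are standard choices, not details taken from the thesis:

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=5000, seed=0):
    """Sketch-and-project with single-row sketches (randomized Kaczmarz).

    Each step samples a row i (here with probability proportional to
    ||a_i||^2) and projects the current iterate onto the hyperplane
    a_i^T x = b_i, i.e. the solution space of the sketched problem.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum('ij,ij->i', A, A)   # squared row norms
    probs = row_norms / row_norms.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        a = A[i]
        # projection of x onto {z : a^T z = b_i}
        x = x - (a @ x - b[i]) / row_norms[i] * a
    return x

# Consistent synthetic system: b = A @ x_true, so Kaczmarz converges.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 10))
x_true = rng.standard_normal(10)
b = A @ x_true
x = randomized_kaczmarz(A, b)
print(np.linalg.norm(x - x_true))  # error is tiny after 5000 steps
```

In the thesis's terminology this corresponds to choosing the random sketching matrix as a randomly sampled standard basis vector and the geometry-defining matrix as the identity.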
17

Abordagem contingencial estruturada de gestão e o sucesso ou fracasso de projetos complexos e incertos em empresas no Brasil. / Adaptive structured approach to management and complex and uncertain projects' success and failure in companies in Brazil.

Luiz José Marques Junior 08 July 2009 (has links)
Projects are part of companies' strategic management, contributing to the creation and sustainment of competitive advantages. Companies' project portfolios are composed of initiatives related to innovation and efficiency, which involve different degrees of complexity and uncertainty. Given the well-documented poor performance of projects in general, amplified by the growing number of projects considered complex and uncertain, new project management approaches have emerged as a counterpoint to the conventional one.
Thus, this work analyses the dynamics of defining and managing complex and uncertain projects in companies in Brazil, with the purpose of observing how certain practices, or their absence, contribute to project success or failure. To meet this purpose, the multiple-case study methodology was used. The results of this research comprise three parts. First, a description of the dynamics of defining and managing complex and uncertain projects in the companies analyzed. Second, following the description of each company's dynamics, an analysis of how the described approaches contributed to project success or failure. Third, with the approaches and their contributions described, it was possible to understand, criticize and propose improvements to the structured contingency model of Shenhar and Dvir (2007), which was used as the theoretical reference for this research.
18

Dynamic simulation of solid state controlled machine systems including component failures

McHale, Timothy Luke January 1983 (has links)
A modeling approach suitable for simulating solid-state-controlled machine systems, including component failures within either the electronics or the machine(s), is presented in detail. The capability of modeling unbalanced-machine operation is included, and the approach is directly amenable to computer implementation. The modeling approach was implemented on a computer, and the simulated results were compared with actual oscillograms obtained from performance tests of an Electric Vehicle Propulsion Unit in order to verify the proposed approach. Excellent correlation between the simulated waveforms and the oscillograms existed in all the simulated cases. The modeling approach was also used to simulate the electrical behavior of a brushless-excitation system used for large turbine generators. The simulations covered normal steady-state operation as well as a scenario of fault conditions occurring within the rotating rectifier assembly of the brushless exciter. The simulated results are displayed, together with a discussion of the intrinsic features of these results needed to identify the specific fault. Fault-detection schemes are warranted for such expensive systems: actual voltage and/or current waveforms could be telemetered to a controller for fault detection and classification. The elements of this modeling approach that allow inexpensive computer simulation of such systems, which can contain nonlinearities and/or spontaneous faults in any of their components, are as follows: 1. The systems' governing state equations are generated automatically, from a minimal set of topological data and component values, at any point within the simulation run. 2. Unbalanced machine operation is included as a consequence of placing no topological restrictions on the mutual coupling. 3. Using piece-wise linear I-V characteristics for the solid-state switching components decreases the computation time needed for a given simulation run, since iteration for the status of the equivalent resistance values of each switch is required only at their threshold (I-V) points. 4. An implicit (predictor-corrector) integration algorithm, designed specifically for solving the stiff differential equations typically associated with solid-state-controlled machine systems, allows realistic modeling of the solid-state switches' equivalent resistance values. Implicit algorithms like the one employed in this work also yield a drastic reduction in computer execution time and an increase in accuracy, compared to explicit algorithms, when simulating these types of stiff systems. / Ph. D.
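The advantage of implicit integration for stiff systems, as described in point 4 above, can be shown on a toy linear system with widely separated time constants. This is an illustrative example only — the matrix below stands in for the kind of stiffness that switching resistances create, and is not the thesis's model or its predictor-corrector algorithm (backward Euler is used here for simplicity):

```python
import numpy as np

# Stiff linear test system x' = J x with eigenvalues -1000 and -1,
# i.e. a fast mode and a slow mode three orders of magnitude apart.
J = np.array([[-1000.0, 0.0],
              [0.0,      -1.0]])
x0 = np.array([1.0, 1.0])

def backward_euler(J, x0, h, steps):
    """Implicit (backward) Euler: solve (I - h*J) x_{k+1} = x_k each step.

    For a linear system the implicit equation is a single linear solve;
    the method is unconditionally stable, so the step size is limited
    by accuracy, not by the fastest time constant.
    """
    M = np.eye(len(x0)) - h * J
    x = x0.copy()
    for _ in range(steps):
        x = np.linalg.solve(M, x)
    return x

# h = 0.01 is 5x larger than explicit Euler's stability limit
# (h < 2/1000) for the fast mode, yet the solution stays bounded.
x = backward_euler(J, x0, h=0.01, steps=100)
print(x)  # fast mode decayed to ~0; slow mode close to exp(-1)
```

An explicit method would need h below 0.002 just to avoid blowing up on the fast mode, which is exactly the execution-time penalty the abstract attributes to explicit algorithms on stiff systems.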
19

Simulation of a building heating, ventilating and air-conditioning system

Botha, C P 03 July 2006 (has links)
Simulation is one of the oldest and most important tools available to engineers. In the building Heating, Ventilating and Air-Conditioning (HVAC) community the availability and/or functionality of simulation tools is limited, and it is difficult to determine whether the simulation models accurately represent reality. The purpose of this study was to rigorously verify one such simulation model and then to extend the study to two unique applications. Comprehensive structural, comfort and energy audits were performed to construct a suitable simulation model with the aid of the control simulation package QUICK Control. The model was then verified against measured building data to ensure an accurate representation of the actual dynamic building response. For the first application, various control retrofits were evaluated and the one with the highest potential for energy saving was identified. Thereafter the model was used to investigate the change in indoor air conditions due to failure of HVAC equipment. Heating, ventilating and air-conditioning in buildings consumes a significant portion of the available electrical energy in South Africa. Up to 30% of this energy can be saved by improving the HVAC systems currently installed in buildings, which could result in savings of up to R400 million. For the building used in this study it was found that up to 66% of the HVAC system's electrical energy consumption could be saved, with a payback period of only 9 months. These savings could be achieved by implementing a setback control strategy with an improved time management procedure. Predicting the impact of failing equipment is a difficult task because of the integrated dynamic effect every HVAC component has on the next. With the aid of a comprehensive integrated simulation model the implications of failure can be determined and the necessary assessments and precautions can be taken.
The results of this study showed that the air-conditioning system under investigation was approximately 100% overdesigned. Failure of up to 50% of the cooling equipment was allowable before any noticeable impact could be observed on the indoor climate; with further failure the required comfort conditions could not be sustained. The substantial savings calculation and the possibility of predicting climate deterioration would not have been possible without the aid of a comprehensive simulation package and model. This study clearly highlights the worth of integrated simulation. / Dissertation (MSc (Mechanical Engineering))--University of Pretoria, 2006. / Mechanical and Aeronautical Engineering / unrestricted
20

Support vector machine-based fuzzy systems for quantitative prediction of peptide binding affinity

Uslan, Volkan January 2015 (has links)
Reliable prediction of the binding affinity of peptides is one of the most challenging but important complex modelling problems of the post-genome era, due to the diversity and functionality of the peptides discovered. Peptide binding prediction models are commonly used to determine whether a binding exists between a certain peptide(s) and a major histocompatibility complex (MHC) molecule(s). Recent research efforts have focused on quantifying these binding predictions. The objective of this thesis is to develop reliable real-value predictive models through the use of fuzzy systems. A non-linear system is proposed, with the aid of support vector-based regression, to improve the fuzzy system, and is applied to real-value prediction of the degree of peptide binding. This research study introduces two novel methods to improve the structure and parameter identification of fuzzy systems. First, support vector-based regression is used to identify initial parameter values of the consequent part of type-1 and interval type-2 fuzzy systems. Second, an overlapping clustering concept is used to derive interval-valued parameters of the premise part of the type-2 fuzzy system. Publicly available peptide binding affinity data sets obtained from the literature are used in the experimental studies of this thesis. First, the proposed models are blind validated using peptide binding affinity data sets obtained from a modelling competition. In that competition, almost equal numbers of peptide sequences in the training and testing data sets (89, 76, 133 and 133 peptides for training and 88, 76, 133 and 47 peptides for testing) were provided to the participants. Each peptide in the data sets was represented by 643 biochemical descriptors assigned to each amino acid. Second, the proposed models are cross validated using mouse class I MHC alleles (H2-Db, H2-Kb and H2-Kk).
H2-Db, H2-Kb and H2-Kk consist of 65 nona-peptides, 62 octa-peptides and 154 octa-peptides, respectively. Compared to previously published results in the literature, the support vector-based type-1 and support vector-based interval type-2 fuzzy models yield an improvement in prediction accuracy. The quantitative predictive performance was improved by as much as 33.6% for the first group of data sets and 1.32% for the second group. The proposed models not only improved the performance of the fuzzy system (which used support vector-based regression); the support vector-based regression also benefited from the fuzzy concept. The results obtained here lay the groundwork for the presented models to be considered in other application domains in computational and/or systems biology. Apart from improving prediction accuracy, this research study also identified specific features that play a key role in making reliable peptide binding affinity predictions. The amino acid features "Polarity", "Positive charge", "Hydrophobicity coefficient" and "Zimm-Bragg parameter" are considered highly discriminating features in the peptide binding affinity data sets. This information can be valuable in the design of peptides with strong binding affinity to MHC class I molecules, and may also be useful when designing drugs and vaccines.
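The type-1 fuzzy system whose consequent parameters the thesis initializes via support-vector regression has the first-order Takagi-Sugeno structure: Gaussian premises compute each rule's firing strength, and the output is a firing-strength-weighted average of linear consequents. A minimal sketch of that inference step follows; all rule parameters below are toy values for illustration, not thesis results, and the SVR-based initialization itself is not reproduced here:

```python
import numpy as np

def ts_fuzzy_predict(x, centers, widths, coeffs, biases):
    """First-order Takagi-Sugeno (type-1) fuzzy inference.

    Premise part: Gaussian membership functions give each rule a firing
    strength. Consequent part: a linear function of the input. The
    prediction is the firing-strength-weighted average of rule outputs.
    """
    d = (x[None, :] - centers) / widths           # (rules, dims)
    w = np.exp(-np.sum(d ** 2, axis=1))           # rule firing strengths
    y_rules = coeffs @ x + biases                 # linear consequents
    return float(np.sum(w * y_rules) / np.sum(w))

# Two illustrative rules on a 1-D input: near x=0 predict 2*x,
# near x=5 predict the constant 3.
centers = np.array([[0.0], [5.0]])
widths  = np.array([[1.0], [1.0]])
coeffs  = np.array([[2.0], [0.0]])
biases  = np.array([0.0, 3.0])

y0 = ts_fuzzy_predict(np.array([0.0]), centers, widths, coeffs, biases)
y5 = ts_fuzzy_predict(np.array([5.0]), centers, widths, coeffs, biases)
print(y0, y5)  # ≈ 0.0 and ≈ 3.0 (each rule dominates near its center)
```

In the thesis's scheme, the consequent coefficients and biases would come from a support-vector regression fit rather than being set by hand, and the interval type-2 variant replaces the crisp Gaussian widths with interval-valued ones.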
